What a 2019 it has been for the Alces Flight Crew. We spent nearly the whole year engaged across a wide spectrum of community, client, and collaboration work, looking at everything from how cloud is taking hold in HPC to how to handle the rapidly growing need for skills in our field. What surprised us most was how excited and willing the HPC community is to take on new ideas, and how our little cloud project from 2016 is now, quite literally, moving clusters. Here’s our take on how 2019 went for us, and some insights into what 2020 might bring.
Cloud HPC becomes practical.
Our research over the past 3+ years (seen here and here) has highlighted how the broad potential of cloud HPC is now settling into practical projects, ranging from evaluating prospective hardware purchases and increasing adoption of HPC solutions to proving whether ideas are feasible. This constructive use of cloud is enabling smarter, more efficient HPC portfolio management for both commercial and research-focused customers. In 2020, we see the capabilities of cloud opening the door to better time management, smarter budgeting, and more efficient compute and data lifecycles.
Open-source HPC stacks find solid ground.
We’ve made real headway in how users approach HPC at the environment level through the launch of our OpenFlightHPC project. OpenFlightHPC is about getting the best use out of hardware and cloud without encumbering users with the nuances of each. We designed the project to work across any platform so that users can consolidate their operational knowledge into commands that work anywhere. In 2020, we anticipate an uptick in cross-platform environment requests as users become keen to leverage the processing, data, and collaboration capabilities that work best for their projects and keep research moving forward.
Collaborations and partnerships in HPC become paramount.
There is a very real concern amongst the HPC community that the skills required to keep up with everything from current HPC systems to future work in the fields of Artificial Intelligence and Machine Learning will stretch the capabilities of even the most talented administrator. Rather than asking the talent pool to take on ever more knowledge, effective collaboration and partnerships are allowing work to be shared intelligently. This is best seen in our work with the University of Liverpool, whose Advanced Research Computing Facilities team has taken the helm in driving innovation forward while working with Alces Flight to keep their day-to-day business on track. In 2020, we will likely see more institutions looking to collaborate and partner in order to tackle big issues like data management, disaster recovery, and building strong AI/ML foundations.
But in order to make our ideas around 2020 (and beyond) come true, one very important thing needs to happen first.
2020: The year HPC starts setting standards.
Next year will mark the start of standard setting in HPC. By this we mean that a general understanding needs to emerge around which components of HPC should become fixed processes and which should be handed over to automation. The desire of new fields to enter HPC (bringing their magnificently sized data sets with them) is shining a light on how systems should be less about crafting a unique solution and more about enabling efficient research. Allowing HPC skills to develop in areas focused on creating and maintaining the best research pipelines, regardless of platform, will open the floodgates to new ideas that pull HPC into its next decade. Building the right team, finding the right partners, and homing in on institutional strengths will be what sets each solution apart; what should bring them together is agreeing on ways forward that allow solutions to evolve over time in an efficient, time- and cost-aware manner.
We want to thank everyone who has been a part of our work in 2019, and we look forward to telling you more about what we’re learning in HPC in 2020. If you want to learn more about how we make HPC happen, feel free to contact us.