The future is all about High Performance Computing (HPC)

Jack Bedell-Pearce, CEO and co-founder of 4D Data Centres, looks at how High Performance Computing (HPC) might be the future of computing, and the challenges and opportunities it presents.


High Performance Computing (HPC) is now having a tremendous impact on a number of industries, helping to solve complex issues in science, engineering and business.

Once only available to universities and major corporations, it has now entered mainstream use – in part thanks to colocation data centres, which can offer the infrastructure needed to host HPC as a service for companies that wouldn’t otherwise be able to run it themselves.

The shift towards digital ways of working – accelerated by the COVID-19 pandemic – has many companies thinking about how to safely house their on-premises servers, which are consuming more processing power and data storage than ever before.

For science in particular, the impact of COVID-19 has shown the world how important HPC can be in the fight against the virus. Significantly more powerful than regular computers, HPC can process extremely large amounts of data quickly, which can lead to faster insights, quicker diagnosis and the ability to conduct research into potential vaccines.

While millions of people across the world have already been vaccinated against COVID-19, extensive research is still needed to develop future strategies and contain any further threats the virus may pose.

And work has already begun to do just that. At the height of the global pandemic in March 2020, the US federal government launched the COVID-19 High Performance Computing Consortium – with support from organisations such as Google, Hewlett Packard Enterprise, IBM, Microsoft and NASA.

This brought together industry and academic leaders from across the world and provided access to some of the most powerful high-performance computing resources in support of COVID-19 research. Made up of 43 members, the consortium currently supports around 100 projects utilising 50,000 GPUs, 165,000 nodes and 6.8 million CPU cores. Together, these can perform around 330 petaflops – roughly 330 quadrillion floating-point operations per second.
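As an illustrative sanity check on those unit prefixes (a sketch of the arithmetic, not a figure from the consortium itself): one petaflop is 10^15 floating-point operations per second, so 330 petaflops works out to quadrillions of operations, not trillions.

```python
# Illustrative unit check for the consortium's stated aggregate throughput.
# 1 petaflop/s = 10**15 floating-point operations per second.
PETA = 10**15
TRILLION = 10**12

aggregate_pflops = 330
ops_per_second = aggregate_pflops * PETA

# 330 petaflops is 3.3 * 10**17 ops/s, i.e. 330,000 trillion
# (330 quadrillion) operations per second.
print(f"{ops_per_second:.2e} ops/s")
print(f"= {ops_per_second // TRILLION:,} trillion ops/s")
```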

The powerful algorithms that run on supercomputers could be instrumental in reducing the time it takes to bring life-saving medicines to market. According to the Pharmaceutical Research and Manufacturers of America, clinical trials take six to seven years to complete on average, pushing the cost of R&D for a new drug to $2.6 billion.

While HPC offers significant advantages to medical science, it has also been put to good use in other sectors. These include media and entertainment – to edit feature films, render special effects and stream live events across the world. In artificial intelligence and machine learning, HPC has helped to detect credit card fraud, train self-driving vehicles and improve cancer screening techniques.

The future of HPC is fast

New supercomputers are being launched across the world – a welcome sight as organisations look for new ways to handle intensive data processing. In July 2021, the UK introduced the country’s most powerful supercomputer, Cambridge-1.

Developed by NVIDIA, Cambridge-1 delivers more than 400 petaflops of AI performance and eight petaflops of Linpack performance. It will help scientists and healthcare experts use the powerful combination of AI and simulation to bolster the country’s world-leading life sciences industry.

Cambridge-1 is the first supercomputer which NVIDIA has dedicated to advancing industry-specific research in the UK. The company also intends to build an AI Centre for Excellence in Cambridge featuring a new Arm-based supercomputer, which will support more industries across the country.

In the USA, Hewlett Packard Enterprise and AMD are planning to install what they expect to be the world’s fastest supercomputer, El Capitan, in 2023. It is expected to perform faster than today’s 200 fastest supercomputers combined, with both companies promising it will reach two exaflops, or two quintillion calculations per second.

To put this into context, if the world’s human population could perform one such calculation per second, it would take everybody eight years to match one second’s worth of El Capitan computing.
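That comparison can be checked with some back-of-the-envelope arithmetic (an illustrative sketch; the world population figure of roughly 7.8 billion is an assumption, not from the source):

```python
# Rough check of the El Capitan comparison: how long would it take
# everyone on Earth, at one calculation per second each, to match
# one second of a two-exaflop machine?
EXA = 10**18
el_capitan_ops_per_second = 2 * EXA   # two exaflops
world_population = 7.8e9              # assumed, ~7.8 billion people

seconds_needed = el_capitan_ops_per_second / world_population
years_needed = seconds_needed / (365 * 24 * 3600)

print(round(years_needed, 1))  # -> 8.1, i.e. roughly eight years
```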

Understanding the potential of HPC is key

These two examples of new supercomputers are proof there is an abundance of technology waiting to be deployed to tackle tomorrow’s challenges.

So it’s no surprise that more companies are looking at ways to tap into the lucrative HPC market, which is estimated to be worth $49.4 billion by 2025. In a world where technology continues to make rapid progress, the need and demand for HPC have become more urgent.

It does come with challenges though. While the ability to process large amounts of data quickly and accurately appeals to any company, the cost of setting up an HPC system and then maintaining it is high. There is huge capital expenditure (CAPEX) involved in the supporting infrastructure – cooling, backup power, fire suppression and so on – and this alone can put HPC adoption out of reach for startups and SMEs, especially as the energy requirements are enormous and HPC systems often need specialist water cooling.

As most companies don’t possess the infrastructure needed to host their own HPC system, working with a data centre allows them to take advantage of the benefits without it costing the earth. As HPC requirements differ from case to case, though, choosing the right solution will be one of the most important investment decisions a business can make – especially when looking at the bigger picture, so the choice doesn’t hold back the company’s growth plans.


The impact of HPC has already been felt across the world, but taking time to understand its benefits will allow companies to take advantage of its endless opportunities.

