Disadvantages of a supercomputer
A supercomputer may seem like the solution to your business's problems, but the realities of building and running such a powerful machine present serious issues that require consideration.
Supercomputers have numerous potential applications, including weather prediction, running the calculations needed to research cures for diseases such as AIDS, searching for oil and attempting to predict earthquakes. However, the very term "supercomputer" implies performance beyond the realm of ordinary computing, so constructing and maintaining one presents obvious challenges.
Cost
The most severe disadvantage of supercomputers is the sheer cost of operating them. The fastest supercomputer in the world, the US-based Titan (described as being faster than half a million laptops), cost 100 million dollars (around £63.6 million) to construct. It's important to remember that the huge construction cost is only the beginning of the expense. The electricity consumption is also enormous, at around 9 million dollars (£5.7 million) per year. They are so expensive that the standard benchmark for the cost of running a supercomputer is "Cray time," which works out to around £630 per hour.
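The per-hour figures quoted above can be sanity-checked with a little arithmetic. The USD-to-GBP rate used here (0.636) is inferred from the article's own $100 million to £63.6 million conversion, not an official exchange rate:

```python
# Rough cost-per-hour arithmetic for the figures quoted above.
# The USD->GBP rate is implied by the article's own conversions,
# not an official exchange rate.
ANNUAL_POWER_COST_USD = 9_000_000   # quoted yearly electricity bill
HOURS_PER_YEAR = 365 * 24           # 8,760 hours
USD_TO_GBP = 0.636                  # implied by $100M ~= £63.6M

usd_per_hour = ANNUAL_POWER_COST_USD / HOURS_PER_YEAR
gbp_per_hour = usd_per_hour * USD_TO_GBP

print(f"${usd_per_hour:,.0f} per hour")  # about $1,027
print(f"£{gbp_per_hour:,.0f} per hour")  # about £653
```

The electricity bill alone, spread evenly over the year, lands close to the £630-per-hour "Cray time" benchmark the article cites.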
Physical space
To conduct massive numbers of calculations simultaneously, a supercomputer requires a huge amount of physical space to house it. Titan occupies a 160,000 square foot facility. Another example is the Star of Arkansas supercomputer at the University of Arkansas, which is essentially 157 computers built into a Frankenstein's-monster-style amalgamation. This gives you an idea of the amount of physical space you'll need for a supercomputer.
Input and output speed
One suggested definition of a supercomputer is a machine limited only by its input and output speeds. As a definition this is something of a technicality, but it points to an important practical issue. Data needs to be transferred to and from the supercomputer for it to process and make calculations, so if your input and output connections aren't fast enough, you won't get full performance from your supercomputer.
Along with input and output speed, a supercomputer also needs massive amounts of memory at its disposal in order to operate effectively, much as a person needs a pen and paper to work through a complex equation. For example, Titan is equipped with 10 petabytes (10,000 terabytes) of storage space to accommodate the huge amount of data being processed. This isn't the biggest issue in terms of cost, but without sufficient space supercomputers aren't able to complete their all-important calculations.
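For a sense of scale, the unit conversion behind that storage figure is straightforward: a petabyte is 1,000 terabytes, so Titan's 10 petabytes is 10,000 terabytes.

```python
# Unit check for the storage figure quoted above:
# one petabyte is 1,000 terabytes.
TB_PER_PB = 1_000
titan_storage_pb = 10

print(f"{titan_storage_pb * TB_PER_PB:,} terabytes")  # 10,000 terabytes
```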
Processing time
Supercomputers work extremely quickly, but the tasks they are given are so mammoth that they can still take a very long time to complete. If 1,000 people each completed one calculation per second, it would take them 60,000 years to match the number of calculations Titan can do in a single second, but it isn't designed for such simple tasks. Although long run times are an inevitable part of difficult calculations, it's important to remember that supercomputers won't complete their intended task quickly, just many orders of magnitude quicker than an ordinary computer.
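The comparison above can be turned into a back-of-the-envelope figure: 1,000 people, each doing one calculation per second for 60,000 years, works out to roughly two quadrillion calculations.

```python
# Back-of-the-envelope check of the people-vs-Titan comparison above.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 31.6 million seconds
people = 1_000
years = 60_000

total_calcs = people * years * SECONDS_PER_YEAR
print(f"{total_calcs:.2e} calculations")  # roughly 1.89e+15
```

That is the number of calculations the article implies Titan performs in a single second.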
Lee Johnson has written for various publications and websites since 2005, covering science, music and a wide range of topics. He studies physics at the Open University, with a particular interest in quantum physics and cosmology. He's based in the UK and drinks too much tea.