Learning the lessons of HPC
Big Data is a huge growth area in computing today, as businesses grapple with increasing volumes of structured and unstructured data.
But Big Data isn’t really a new phenomenon. Scientific and engineering industries have been recording and digesting large, fast-flowing and diverse datasets for decades, using high-end processing platforms and tools.
Consequently, enterprises have a lot to learn from both High Performance Computing (HPC) and High Performance Technical Computing (HPTC): the forerunners of today’s Big Data technologies.
HPC and HPTC apply high-performance systems to scientific and technical problems, as opposed to business ones. Applications include computational fluid dynamics and seismic tomography, which is used to image the Earth’s sub-surface characteristics.
So, what can enterprises learn from HPC users?
The first lesson is to introduce higher performance into mainstream computing. This can be done by optimising your software on the one hand and strengthening your data centre infrastructure on the other.
You can increase your processing power, your multi-threading capabilities and floating point performance. You can also reduce network complexity and boost storage flexibility by unifying your network and storage fabrics.
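As a minimal sketch of the software side of this lesson, the snippet below splits a dataset into chunks and processes them concurrently with a thread pool. The chunk size and the sum-of-squares workload are illustrative placeholders; note that genuinely CPU-bound Python code usually benefits more from processes or native numerical libraries than from threads.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Placeholder workload: sum of squares over one chunk of the data.
    return sum(x * x for x in chunk)

# Split the dataset into independent chunks that can be worked on in parallel.
data = list(range(1_000))
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]

# Fan the chunks out across a small pool of workers and combine the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(process_chunk, chunks))
```

The same fan-out/combine shape applies whether the workers are threads, processes, or nodes in a cluster.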
The second lesson is to incorporate better mathematical data modelling, which is one of the strengths of HPC/HPTC.
Modelling goes hand-in-hand with simulation, which is the third lesson. Simulation enables you to explore numerous scenarios involving complex data, quickly and cost-effectively.
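To give a toy flavour of simulation, the sketch below runs a Monte Carlo estimate of pi: it samples many random points and measures how often they land inside a quarter circle. The sample count and seed are arbitrary choices for illustration.

```python
import random

def estimate_pi(samples, seed=42):
    # Monte Carlo simulation: the fraction of random points in the unit
    # square that fall inside the quarter circle approximates pi / 4.
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

estimate = estimate_pi(100_000)
```

More samples buy more accuracy, which is exactly the trade-off that makes simulation workloads hungry for HPC-class compute.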
Fourth is powerful analytics. Apache Hadoop is proving to be a great platform for handling Big Data, particularly when combined with a solution like Cloudera’s Distribution of Hadoop.
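To show the shape of the Hadoop MapReduce model without a cluster, here is a pure-Python sketch of word counting. The `mapper`, `shuffle`, and `reducer` names and the in-memory grouping are illustrative stand-ins for the steps Hadoop distributes across nodes.

```python
from collections import defaultdict

def mapper(line):
    # Map step: emit a (word, 1) pair for every word in a line.
    for word in line.lower().split():
        yield (word, 1)

def shuffle(pairs):
    # Shuffle step: group values by key (handled by the framework in Hadoop).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Reduce step: sum the counts for one word.
    return key, sum(values)

lines = ["big data meets hpc", "big compute for big data"]
pairs = [pair for line in lines for pair in mapper(line)]
counts = dict(reducer(k, v) for k, v in shuffle(pairs).items())
```

Because the map and reduce steps are independent per key, the framework can spread them over many machines, which is what makes the model scale.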
Finally, the HPC world has made advances in machine learning, using algorithms that have linear algebra at their core. This is something for the future but worth noting now.
Applied to enterprise Big Data scenarios, learning systems that can process large parallel jobs and optimise their own performance as they go will be highly useful to businesses.
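As one small example of linear algebra sitting at the core of learning, ordinary least-squares fitting of a straight line reduces to a handful of sums. The sample data below are made up for illustration.

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = slope * x + intercept,
    # via the closed-form normal-equation solution in one dimension.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)          # variance term
    sxy = sum((x - mean_x) * (y - mean_y)             # covariance term
              for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Illustrative data lying exactly on y = 2x + 1.
slope, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

The same normal-equation idea generalises to many features as a matrix solve, which is where HPC-grade linear algebra libraries earn their keep.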
It’s time to learn the lessons of the elders and get the best from our Big Data.