Learning the lessons of HPC
Big Data is a huge growth area in computing today, as businesses grapple with increasing volumes of structured and unstructured data.
But Big Data isn’t really a new phenomenon. Scientific and engineering industries have been recording and digesting large, fast-flowing and diverse datasets for decades, using high-end processing platforms and tools.
Consequently, enterprises have a lot to learn from both High Performance Computing (HPC) and High Performance Technical Computing (HPTC): the forerunners of today’s Big Data technologies.
HPC and HPTC apply high-end computing power to scientific and technical problems, as opposed to business ones. Applications include computational fluid dynamics and seismic tomography, which is used to image the Earth’s sub-surface characteristics.
So, what can enterprises learn from HPC users?
The first lesson is to introduce higher performance into mainstream computing. This can be done by optimising your software on the one hand, and boosting your data centre infrastructure on the other.
You can increase processing power, multi-threading capability and floating-point performance. You can also reduce network complexity and boost storage flexibility by unifying your network and storage fabrics.
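To make the multi-threading point concrete, here is a minimal Python sketch (the language, task and worker count are illustrative assumptions, not from the article) that farms independent tasks out to a thread pool instead of running them one by one:

```python
from concurrent.futures import ThreadPoolExecutor

def process_record(record_id):
    # Stand-in for an independent unit of work, e.g. pulling and
    # transforming one record; real workloads would do I/O here.
    return record_id * record_id

# Run the tasks across four worker threads rather than serially.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_record, range(10)))

print(results)  # squares of 0..9, in input order
```

Thread pools suit I/O-bound work; for CPU-bound jobs a process pool avoids contention between threads.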
The second lesson is to incorporate better mathematical data modelling, which is one of the strengths of HPC/HPTC.
Modelling goes hand-in-hand with simulation, which is the third lesson. Simulation enables you to explore numerous scenarios involving complex data, quickly and cost-effectively.
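As a toy illustration of the simulation idea, a Monte Carlo sketch in Python (the example and its parameters are my own, not drawn from the article) estimates pi by sampling random points, exploring many random scenarios cheaply:

```python
import random

def estimate_pi(samples, seed=0):
    """Monte Carlo simulation: sample random points in the unit
    square and count the share landing inside the quarter circle."""
    rng = random.Random(seed)  # seeded for a repeatable run
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

estimate = estimate_pi(100_000)
print(estimate)  # converges towards 3.14159 as samples grow
```

The same pattern scales out naturally: each batch of samples is independent, which is exactly the kind of job HPC clusters parallelise.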
Fourth is powerful analytics. Apache Hadoop is proving to be a great platform for handling Big Data, particularly when combined with a solution like Cloudera’s Distribution of Hadoop.
Finally, the HPC world has made advances in machine learning, using algorithms that have linear algebra at their core. This is something for the future but worth noting now.
When applied to enterprise Big Data scenarios, learning networks that can process large parallel jobs and optimise their own performance as they go will be highly useful to businesses.
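To show the linear-algebra flavour of such learning algorithms, here is a minimal least-squares fit using only the Python standard library (the data and function name are illustrative assumptions):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, solved with the
    closed-form normal equations -- plain linear algebra."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of x and y, and variance of x, drive the slope.
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    a = cov_xy / var_x
    b = mean_y - a * mean_x
    return a, b

# Noise-free points on the line y = 2x + 1 recover its coefficients.
slope, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(slope, intercept)  # 2.0 1.0
```

At scale these same matrix-style computations are what HPC hardware accelerates, which is why the two worlds are converging.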
It’s time to learn the lessons of the elders and get the best from our Big Data.