May 17, 2017
Hewlett Packard Enterprise (HPE) this week introduced what it called the world's largest single-memory computer, The Machine. The company said the prototype is the product of its largest R&D program ever, aimed at delivering a new paradigm called Memory-Driven Computing, an architecture custom-built for the big data era.
“The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day,” said Meg Whitman, the company’s CEO. “To realize this promise, we can’t rely on the technologies of the past, we need a computer built for the big data era.”
The prototype contains 160 terabytes (TB) of memory, enough to simultaneously work with the data held in every book in the Library of Congress five times over, or approximately 160 million books.
Based on the current prototype, HPE expects the architecture to scale easily to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory: 4,096 yottabytes.
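A quick back-of-the-envelope check puts these figures in perspective. The per-book size of roughly 1 MB is an assumption for illustration, not a number from the announcement; the unit prefixes are taken as decimal (SI):

```python
# Scale arithmetic for the announced figures (decimal SI units assumed).
TB = 10**12
EB = 10**18
YB = 10**24

prototype = 160 * TB        # shared memory in the prototype
book = 10**6                # assumed average digitized book size, ~1 MB

books_held = prototype // book
print(f"{books_held:,} books")          # 160,000,000 books

# How much larger an exabyte-scale system would be than the prototype:
print(f"{EB // prototype:,}x")          # 6,250x

# The projected 4,096-yottabyte pool relative to the prototype:
print(f"{(4096 * YB) // prototype:,}x") # 25,600,000,000,000x
```

So the stated ceiling is roughly thirteen orders of magnitude beyond the machine HPE actually built, which is why the company frames it as a projection of the architecture rather than a product roadmap.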
“We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO at HPE and director, Hewlett Packard Labs. “The architecture we have unveiled can be applied to every computing category — from intelligent edge devices to supercomputers.”
The new prototype also contains:
* An optimized Linux-based operating system (OS) running on ThunderX2, Cavium's flagship second-generation, dual-socket-capable, ARMv8-A workload-optimized system on a chip (SoC).
* Photonics/optical communication links, including the new X1 photonics module, now online and operational.
* Software programming tools designed to take advantage of abundant persistent memory.
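The programming model those tools target can be sketched in miniature. The core idea behind persistent-memory programming is that data lives in a byte-addressable region that survives beyond the process, accessed with ordinary loads and stores rather than file I/O calls. The sketch below simulates that with a memory-mapped file via Python's standard `mmap` module; it is a generic illustration of the model, not HPE's actual toolchain, and the file path is a placeholder:

```python
# A minimal sketch of the load/store persistent-memory model, simulated
# with a memory-mapped file. Not HPE's tooling; generic stdlib code only.
import mmap
import os

def persist_message(path, msg):
    """Write msg into a mapped region with plain byte stores, flush it,
    then re-map the region in a fresh mapping and read it back."""
    size = 4096
    data = msg.encode()

    with open(path, "wb") as f:          # back the "persistent" region
        f.truncate(size)

    with open(path, "r+b") as f:
        with mmap.mmap(f.fileno(), size) as region:
            region[:len(data)] = data    # an ordinary memory store
            region.flush()               # make the stores durable

    # A later process would see the data by mapping the same region:
    with open(path, "r+b") as f:
        with mmap.mmap(f.fileno(), size) as region:
            return bytes(region[:len(data)]).decode()

print(persist_message("/tmp/pmem_demo.bin", "hello, memory-driven world"))
```

On real persistent memory the `flush()` step corresponds to cache-line flushes rather than writeback to disk, but the key property is the same: the data structure is the durable artifact, with no separate serialization layer.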
Further information can be found at www.hpe.com/TheMachine.