Connections +
Feature

The Making Of The Machine


August 21, 2014  



Las Vegas, Nev. – Meg Whitman certainly chose the right venue to announce what could easily end up being the largest corporate gamble she and the company she heads have ever made.

The CEO of Hewlett-Packard Co. (HP), who was hired in 2011 to resurrect an organization in such complete turmoil that its board had burned through three CEOs in two years, has since cut more than 50,000 jobs and survived a shareholder lawsuit stemming from the ill-fated US$10 billion acquisition of U.K. software vendor Autonomy. Whitman, who has also overhauled the corporate culture, is now betting big on something called The Machine in the 75th year of the company’s existence.

Speaking here in June at the HP Discover 2014 user conference, Whitman, accompanied on stage during a keynote speech by Martin Fink, the company’s chief technology officer, director of HP Labs and general manager of its cloud business unit, said The Machine, the code name for a completely new computing architecture, changes everything.

“It is a continuum, a new approach to manage the distributed world,” she said. “It is the future of technology.”
Whitman is so sure of its value that she has made it a top R&D priority and plans to invest billions of dollars in the initiative if needed.

It was left up to Fink, a Canadian who joined HP in 1985 after graduating from Loyalist College in Belleville, Ont., to outline the more salient details of the project, including the name itself.

“Why do we call it The Machine? When we first started developing it, we wanted to be very careful not to call it a server, workstation, PC, device or phone, because it actually encompasses all of those things. As we were waiting for Marketing to come up with a cool code name for the project, we started calling it The Machine – and the name stuck.”

The Machine, said Fink, is made up of:

• A task-specific, system-on-chip ecosystem that achieves “quantum leaps” in computing performance and power efficiency compared with traditional computing architectures, which at their core have not changed in more than 60 years: a CPU, main memory and some form of I/O, typically storage and networking.

• New photonic interconnects that will eventually supplant the traditional copper cables currently in use.

• The replacement of the traditional memory hierarchy of working memory and mass storage with a single, universal memory.
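That last point is the most radical of the three. As a rough illustration of the concept only – not HP’s actual software, which had not shipped at press time – the short Python sketch below uses today’s memory-mapped files to show what it means for storage and working memory to behave as one pool: persistent data is read and updated with ordinary in-memory operations rather than explicit file or database I/O. The file name and layout here are hypothetical.

```python
import mmap
import os
import struct

# Hypothetical example: a persistent 64-bit counter kept in a memory-mapped
# region. With a true universal memory, the backing "file" and the explicit
# flush would disappear; the data structure would simply live in memory and
# survive power loss.
PATH = "counter.bin"          # hypothetical backing file for the illustration
SIZE = mmap.PAGESIZE

# Make sure the backing region exists and is large enough.
if not os.path.exists(PATH) or os.path.getsize(PATH) < SIZE:
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

with open(PATH, "r+b") as f:
    buf = mmap.mmap(f.fileno(), SIZE)
    # Read and update the counter with ordinary memory operations --
    # no read()/write() calls in the data path.
    count = struct.unpack_from("<Q", buf, 0)[0]
    struct.pack_into("<Q", buf, 0, count + 1)
    buf.flush()               # needed today; redundant if memory itself is non-volatile
    buf.close()
    print("times this program has run:", count + 1)
```

Run it twice and the counter carries over between runs; on The Machine, as Fink describes it, that kind of persistence would come from the memory itself, at memory speed, with no separate storage tier underneath.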

According to Fink, toward the end of this decade data growth will come at a rate that surpasses the ability of current infrastructure to ingest, store and analyze it. As a result, he added, a step change in computing technology is required.

In his keynote presentation he said that a new operating system is being created from the “ground up” and that it will enter public beta in 2017 and be released the following year.

Developing The Machine, he wrote in a blog post, not only means “building the hardware. But we are also investing in developing the software that will support it – the data algorithms, the operating systems, the security platform and the tools required to manage millions of compute nodes from servers and data centres to the smart sensors that will make up the Internet of Things.”

Last year, in another blog post, he discussed the need to “think differently about the future of IT.” “These days, enterprises commonly handle petabytes or more of data and increasing amounts of unstructured data. Tomorrow it will be exabytes. The sheer amount of computing horsepower required to handle all that data is enormous, taking a financial, logistical and environmental toll on organizations.

“If you looked at the existing public cloud as a country, it would be the fifth largest consumer of electricity in the world. For context, reducing public cloud electricity consumption in half would be enough to power the United Kingdom. This current path is unsustainable, as we are barely scratching the surface of the deluge of data that’s projected to come at us in the coming years. This reality requires transformative thinking that addresses energy consumption and other IT challenges.”

That this is no longer an organization in disarray was evident in the launch of a multitude of products and services at HP Discover, which the company said are meant to “bridge” current and future data centre technologies.

“Big data, mobility, security and cloud computing are forcing organizations to rethink their approach to technology, causing them to invest heavily in IT infrastructure,” HP said in a release. “Gartner estimates that data centre hardware spending from new types of products and big data deployments will reach US$9.4 billion this year.”

Announced were:

• New encryption capabilities as well as information protection and control systems that HP says help “safeguard data throughout its entire life cycle.” The HP Atalla offerings “support data whether it’s at rest, in motion or in use – across cloud, on-premises and mobile environments – to ensure continuous protection of an organization’s most sensitive information,” the company said in a release.

• The launch of the HP Apollo line of high-performance computing (HPC) systems, which the company said are capable of delivering up to four times the performance of standard rack servers while using less space and energy. “Demand for HPC applications across industries is growing rapidly, and today’s data centres are ill-equipped to handle the extensive space, power and infrastructure necessary to run the required level of processing power,” said Antonio Neri, senior vice president and general manager of servers at HP.

• Enhancements to the all-flash HP 3PAR StoreServ 7450 Storage array. IDC forecasts that by 2016 the market for all-flash storage arrays will increase to US$1.6 billion, representing a 59% compound annual growth rate (CAGR) over the 2012-2016 forecast period.

• The HP Virtual Cloud Networking (VCN) SDN Application, which is designed to simplify the transition to a software-defined data centre.

• New backup, recovery and archive offerings that allow organizations to protect increasing volumes of data, which IDC says will reach monumental levels by 2020. The research firm estimates that the amount of data created and copied each year, growing at 40% annually, will reach 44 zettabytes by then.
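For scale, that 44-zettabyte figure can be sanity-checked with a quick back-of-envelope calculation. Assuming the 40% annual rate runs over the seven years from roughly 2013 to 2020 (the baseline year is an assumption here, not part of IDC’s statement), working backwards gives the implied starting volume:

```python
# Back-of-envelope check of the IDC figure cited above.
# Assumption (ours): the 40% annual growth rate applies over the
# seven years from 2013 to 2020.
data_2020_zb = 44.0           # IDC forecast: zettabytes created and copied per year by 2020
annual_growth = 0.40

implied_2013_zb = data_2020_zb / (1 + annual_growth) ** 7
print(f"implied 2013 volume: {implied_2013_zb:.1f} ZB")   # ~4.2 ZB per year
```

An implied starting point on the order of four zettabytes a year, growing roughly tenfold by 2020, is the kind of deluge Fink argues current infrastructure cannot ingest, store and analyze.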

