December 22, 2015
LONDON – You need more than a new name and new branding to succeed in the enterprise computing space, a point that Meg Whitman and the rest of her senior team were well aware of in the days and weeks leading up to a user conference called HPE Discover 2015 held earlier this month in the bustling U.K. capital.
The event represented far more than a coming-out party for a new entity called Hewlett Packard Enterprise (HPE), which came into being on Nov. 1 and is headed by Whitman.
Touting everything from the Internet of Things, Big Data analytics and hybrid cloud to something called “Composable Computing,” it needed to do well out of the gate in this, the first user conference since the split of Hewlett-Packard into two separate entities – HP Inc., which is now all about personal systems and printers, and HPE, which, as the name suggests, is all about the enterprise.
According to Dave Pearson, research manager for storage and networking at IDC Canada, multiple leadership changes, a few notable unsuccessful acquisitions, and reductions in R&D and staff spending in pursuit of leaner profits have certainly led to much speculation about HPE’s long-term success.
“The challenge here is as much one of perception as it is of performance. A focused HPE, able to develop and communicate a clear path toward their next generations of technology, is a must to overcome these biases,” said Pearson, who attended HPE Discover.
To that end, on Day Two of the event, Robert Youngjohns, the firm’s executive vice president of software, told his audience that whatever industry they are in, the disruptors of the future will be those who can use “information at their fingertips to drive new business models, different ways of accessing their customers and different ways of competing. This is really, really important.”
Youngjohns also referenced a conversation that occurred at a recent customer event.
“Somebody said IoT was an invention by the IT industry simply to sell more storage. I don’t think that is the case. It is real and it is happening now. The key is in the infrastructure you build to support the Internet of Things.”
HPE announced new Internet of Things (IoT) systems and networking offerings that it said enable customers to more efficiently collect, process and analyze IoT data.
The rapid proliferation of IoT devices, data and connectivity has the power to enable the creation of new offerings, improved efficiency, better decision making, and more effective risk management across organizations, the company said.
* HPE IoT System EL10 – A gateway designed for entry-level deployments.
* HPE IoT System EL20 – A gateway with additional features for higher compute capabilities.
“Today, delivering business outcomes quickly and securely requires intelligence to enable real-time decisions at the edge,” HPE said in a release. “Moving computing power, data acquisition and data management to the edge of an organization’s network, outside of the traditional data centre, allows faster access to relevant data, requires less bandwidth to transport useless data, and ultimately accelerates the time to insight for organizations.”
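The edge-processing argument in the release can be illustrated with a minimal sketch (hypothetical names and thresholds, not HPE code): a gateway filters raw sensor readings locally and forwards only the outliers upstream, so the bulk of the data never consumes wide-area bandwidth.

```python
# Hypothetical sketch of edge filtering on an IoT gateway: only readings
# outside the normal operating band are transmitted to the data centre.

def filter_at_edge(readings, low=10.0, high=80.0):
    """Keep only readings outside the normal operating band [low, high]."""
    return [r for r in readings if r < low or r > high]

raw = [22.1, 23.0, 95.4, 21.8, 7.2, 24.5]  # one polling cycle from local sensors
to_transmit = filter_at_edge(raw)           # only the anomalies leave the gateway

print(to_transmit)                          # [95.4, 7.2]
print(len(to_transmit), "of", len(raw), "readings sent upstream")
```

In this toy example, two of six readings cross the wire; at the scale of thousands of sensors, that is the bandwidth saving the release is describing.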
Meanwhile, Aruba, an HPE company, launched a cloud-based beacon management offering.
The sensors combine a small Wi-Fi client and BLE radio, which enables organizations to remotely monitor and manage Aruba Beacons across existing multivendor Wi-Fi networks from a central location using the Meridian cloud service.
“What is going to make IoT successful in your enterprise is not the devices you put out on the edge of the network,” said Youngjohns. “They are pretty much trivial — $5 will buy you a sensor with the ability to create log data. What is going to matter here is the infrastructure you build around it to support the Internet of Things.”
In an interview with Connections+, Dr. Tom Bradicich, head of hyperscale servers and IoT systems at HPE, theorized that if you have enough data you negate the need to make a decision. “A decision is made because you don’t have enough data. The more data we can collect through these thousands of sensors,” the greater the insight that can be learned, he added, about anything and everything.
“Think about it. You walk up to a washing machine and you need this shirt washed tonight … and if the machine breaks, then that is a surprise. Contrary to that, if you can be told your washing machine is going to break on Wednesday of next week, you can plan around it.
“Wouldn’t you love to be able to plan around an automobile breakdown, instead of it happening while you are on the highway? This notion of being able to prognosticate and convert surprise outages into planned outages has tremendous personal value and business value.”
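Bradicich’s point about converting surprise outages into planned ones boils down to trend extrapolation over sensor data. A toy sketch (hypothetical readings and thresholds, not an HPE algorithm) projects when a rising vibration reading will cross a failure threshold:

```python
# Hypothetical sketch of simple predictive maintenance: fit an average daily
# trend to recent readings and project when a failure threshold is crossed.

def days_until_threshold(history, threshold):
    """Estimate days until readings cross the threshold; None if no rising trend."""
    n = len(history)
    if n < 2:
        return None
    slope = (history[-1] - history[0]) / (n - 1)  # average change per day
    if slope <= 0:
        return None  # not trending toward failure
    return (threshold - history[-1]) / slope

vibration = [1.0, 1.2, 1.5, 1.7, 2.0]  # daily readings from one machine
print(days_until_threshold(vibration, threshold=3.0))  # 4.0 days out
```

With enough sensors feeding enough history, the same basic idea lets an operator schedule the repair before the machine fails on the highway.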
Meanwhile, also launched at Discover were:
* HPE Synergy, a platform designed to run both traditional and cloud-native applications. Jed Scaramella, research director at IDC, said in order to “respond to the demands of the business, CIOs and IT executives need to deliver services that are increasingly application-centric. To remain relevant, IT needs to not only provide a reliable and cost-effective infrastructure that can support their legacy investments, but one that gives them the flexibility and speed to deliver services like a cloud provider.”
* HPE Helion Managed Cloud Broker, a managed service that allows businesses to provision, access and consolidate control of services across multiple cloud workloads and providers.
* The naming of Microsoft Azure as the “preferred public cloud partner” for HPE customers.
Neil MacDonald, vice president and general manager of HPE BladeSystem and converged data centre infrastructure, said the software-defined intelligence contained in Synergy now means the “infrastructure can play a much, much greater role in managing its own life cycle.
“For customers it helps them in four ways: It helps them reduce the over-provisioning of IT infrastructure, it helps them develop applications much faster, it helps them deploy those services much more rapidly and it greatly simplifies the whole life-cycle management that they have had to invest in managing that infrastructure and keeping it current over time.”
Composable computing, said Pearson, was definitely a “mind share leader at the conference and their announcements around the Synergy products showed great movement forward in terms of actually deploying true composable infrastructure.
“Composable computing is going to be big — we’ve already had some early entrants from Cisco and Dell, but HPE’s announcement was a forward look at the technology, with more powerful embedded software tools for deployment and management, and multi-chassis support with storage nodes in the chassis or external storage support.
“These capabilities, extended forward, can provide the agility so many business leaders are looking for from their IT department. Quick responsiveness from IT to meet business demands has come to be of utmost importance among line of business and IT leaders alike, according to our surveys and executive interviews.
“Expect to see more products in this vein from other vendors, which leads to the question — how can HPE execute on this vision, especially in a Canadian context? Synergy does have the potential to be a game changer, but it will rely on HPE demonstrating real value to customers, not just thought leadership.
“They were also early mind share leaders in the realm of converged infrastructure, being literally the first name on the list in terms of customer awareness when the buzz started. In Canada now, however, they don’t enjoy the same leadership position in market share that they once held in recognition.”
Meanwhile, in terms of cloud, Bill Hilf, the company’s senior vice president and general manager of HPE Cloud, said that in this particular sector, hybrid will rule. He also released details of a joint HPE-451 Research study that examined cloud investing plans in eight major verticals – manufacturing, telecom, retail, insurance, healthcare, government, finance/banking and education.
In each case, two-thirds of total spend is going into the private cloud. The vast majority of enterprises need the mix of both, he said, in order to make the “digital transformation happen to them in a real way.
“The first wave of cloud computing was heavily adopted by start-ups and younger companies taking advantage of going into primarily a public cloud context. They were no longer buying servers, they were no longer buying infrastructure.
“What we see now is that enterprises want those same benefits, but they have to deal with the environment they have today and they have to deal with a variety of other constraints that often start-ups or consumers do not have to deal with such as regulations and security and geo-political issues that are related to certain industries.
“Cloud computing is becoming a proxy for IT transformation in the industry. Customers now want to use the cloud for all of their workloads.”