The HDS hyper scalable platform (HSP) is an infrastructure building block that comes with computing, virtualisation and storage pre-configured, so that modules can be snapped together quickly without the need to integrate three separate systems. HDS has taken integration a stage further by embedding the big data technology it acquired when it bought Pentaho in 2015. The result, HDS claims, is the new HSP 400: a simple-to-install yet sophisticated system for building enterprise big data platforms fast.
HDS says the HSP’s software-defined architecture centralises the processing and management of large datasets and supports a pay-as-you-grow model. Systems can be supplied pre-configured, so installing and supporting production workloads takes hours rather than the months comparable systems can require. The order of the day, says HDS, is to make it simple for clients to create elastic data lakes by bringing all their data together and integrating it in preparation for advanced analytics.
The system’s virtualised environments work with open source big data frameworks such as Apache Hadoop and Apache Spark, as well as commercial open source stacks such as the Hortonworks Data Platform (HDP).
Few enterprises have the internal expertise to run analytics on complex big data sources in production environments, according to Nik Rouda, senior analyst at Enterprise Strategy Group. Most want to avoid experimenting with still-nascent technologies and want a clear direction without risk and complexity. “HSP addresses the primary adoption barriers to big data,” said Rouda.
Hitachi will offer HSP in two configurations: one with Serial Attached SCSI (SAS) disk drives, generally available now, and an all-flash version expected to ship in mid-2016. These will support all enterprise applications and performance requirements, HDS claims.
“Our enterprise customers say data silos and complexity are major pain points,” said Sean Moser, senior vice-president at HDS. “We have solved these problems for them.”