By Sumir Bhatia
A famous company once coined the phrase – “the Whole World is… Data!”
It all ties back to data, whether we’re talking about YouTube, Facebook, Uber, Deliveroo, SAP, High Performance Computing (HPC) or databases. Today’s increasingly connected world reflects that very phrase. Data has grown exponentially with the rise of the Internet of Things, social media and big data. For enterprises, this avalanche of data also presents a wellspring of untapped knowledge.
Mining Data for Endless Possibilities
The ability to store data is only half the story. To truly benefit from the new knowledge economy, organisations need to know how to manage data effectively, analyse it, and extract meaningful intelligence from it. Once businesses and governments understand this, the possibilities are endless, and some organisations have already put it into practice.
All-Flash Arrays and Storage Virtualisation
In general, data that is frequently accessed is termed ‘hot data’. Examples include databases, Enterprise Resource Planning systems and webpages. With flash drives being priced more competitively nowadays, the adoption of all-flash arrays for these mission-critical applications has become a reality. Today, the leader in this space is Nimble Storage.
The real growth in flash adoption, however, came with hybrid arrays, which tier data between flash and traditional disk drives. With the arrival of storage virtualisation, enterprises can achieve this entirely in software. In today’s landscape, IBM Spectrum Virtualize, IBM Storwize and DataCore offer such capabilities and are the most mature in the market.
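The idea behind hybrid-array tiering can be sketched in a few lines: track how often each block is accessed, promote frequently read ‘hot’ blocks to flash, and leave the rest on disk. The class, names and threshold below are illustrative only, not any vendor’s actual policy:

```python
from collections import Counter

class HybridTier:
    """Toy hot/cold tiering: promote frequently accessed blocks to flash."""

    def __init__(self, hot_threshold=3):
        self.hot_threshold = hot_threshold  # accesses before promotion (illustrative)
        self.access_counts = Counter()
        self.flash, self.disk = set(), set()

    def write(self, block_id):
        # Newly written data lands on the cheaper disk tier first
        self.disk.add(block_id)

    def read(self, block_id):
        self.access_counts[block_id] += 1
        # Promote a block to flash once it crosses the hot threshold
        if self.access_counts[block_id] >= self.hot_threshold and block_id in self.disk:
            self.disk.remove(block_id)
            self.flash.add(block_id)

tier = HybridTier()
for block in ("db-page-1", "archive-blob"):
    tier.write(block)
for _ in range(3):
    tier.read("db-page-1")   # read repeatedly: becomes hot, moves to flash
tier.read("archive-blob")    # touched once: stays cold, remains on disk
print(sorted(tier.flash))    # ['db-page-1']
print(sorted(tier.disk))     # ['archive-blob']
```

Real arrays apply far richer heuristics (recency decay, sequential-read detection, demotion), but the promote-on-heat loop above is the core of the tiering model.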
Software-Defined Storage (SDS) and Object Storage
Data that is infrequently accessed is called ‘cold data’. It can exist either in structured form, such as the backup and archival data most enterprises hold, or in unstructured form, such as large videos, pictures and blogs. Such data can range from a few kilobytes to terabytes per file, and from a few hundred files up to billions or trillions of them.
Managing millions of small files is altogether a different matter from managing a smaller number of ultra-large ones. While both cases can consume the same storage capacity, older architectures cannot support the level of granularity involved from a data management standpoint. SDS and Object Storage emerged to address this trend. Cloudian is a leader in the Object Storage SDS space.
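One reason object storage copes with billions of objects is that it replaces hierarchical directories with a flat key namespace, so any object can be placed on a node by hashing its key, with no central directory to become a bottleneck. A minimal sketch of that placement scheme follows; the node names, cluster size and key pattern are illustrative assumptions:

```python
import hashlib

NODES = ["node-a", "node-b", "node-c", "node-d"]  # illustrative scale-out cluster

def place(object_key: str) -> str:
    """Map an object key to a storage node by hashing the key itself."""
    digest = hashlib.sha256(object_key.encode()).hexdigest()
    return NODES[int(digest, 16) % len(NODES)]

# Many keys spread evenly across nodes without any per-directory metadata
keys = [f"photos/2017/img_{i}.jpg" for i in range(10_000)]
counts = {node: 0 for node in NODES}
for key in keys:
    counts[place(key)] += 1
print(counts)  # each node holds roughly a quarter of the objects
```

Production systems use consistent hashing so that adding or removing a node remaps only a fraction of the keys, but the principle — placement computed from the key, not looked up in a directory — is the same.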
Traditional File and Block storage is just as challenging to manage, especially at large capacities, owing to the escalating costs of Storage Area Networks. SDS, with its scale-out architectures, can reduce the cost of such storage significantly. Nexenta currently leads the File and Block SDS market.
For homogeneous workloads that demand performance, capacity and scale all at once, we are seeing specific HPC workloads, such as machine-learning systems, that require not just petaflops of compute but also petabytes of data. For these, specialised low-latency SDS offerings like IBM Spectrum Scale are ideal.
Bridging Traditional and New Storage Technologies
To cope with these different types of storage data, the industry has to take a more granular approach to how it addresses different workloads. Even as new software-defined technologies are rapidly developed, enterprises will still need a trusted technology provider to bridge traditional storage with the software-defined solutions of tomorrow. Having completed its acquisition of IBM’s x86 server and networking portfolio, Lenovo is well-positioned to serve this crucial role, helping enterprises explore this new chapter in storage innovation – and the future is looking bright.
The author is Vice President, Data Centre Group, Asia Pacific at Lenovo.