Making Your Data More Valuable

Data growth is not new. Data proliferation is not new. What has changed, however, is the value of data. In the past, anything older than a certain age was backed up and moved offsite to reclaim expensive storage capacity. But today, in the era of Big Data, analytics, regulatory compliance, and corporate governance, many enterprises have more than 5% of their data active on a daily basis.

With small variances, this approach treats all data as having similar value. But this is not true. It is not true at any given point in time, and as time and demands change, the value of each piece of data changes as well.

To address these issues, many storage systems vendors have advised using data management software to fully understand which data needs more performance, is more valuable, or is frequently accessed. For most enterprises, this exercise provides only a point-in-time view of data, and it is expensive in terms of human resources, the time needed to undertake such an activity, software costs, and the deep understanding of applications and workloads it requires.

In the end, Neuralytix research shows that only 5-10% of data is “hot” – i.e. frequently accessed, demanding high performance, or valuable due to the number of applications that depend on it.

A simpler approach to data management is necessary. In fact, rather than capacity-driven approaches, it is necessary to take a value-driven approach. Even with a deep understanding of applications and workloads, the multiple layers of virtualization – from server to network to storage virtualization – abstract the demands on the underlying data to the point that extraordinary effort is required to understand its value.

Companies such as NCS, a 44-year-old, Ohio-based commercial collection agency, have started to rely on solutions that can improve performance by orders of magnitude (driving more revenue and profit); improve capacity density (reducing the cost of storage capacity); and deliver the right level of service to each application/workload using storage QoS to guarantee performance for each workload (ensuring that mission-critical applications are prioritized over less critical ones).

NCS leveraged the NexGen N5 Hybrid Flash Array to achieve these benefits.

The NexGen solution improved business value by improving the user and customer experiences. It eliminated NCS’ need to conduct deep-dive analyses of all its data on a regular basis, yet it still improved overall system performance and reliability.

In today’s data-centric enterprise, Neuralytix believes that all levels of IT infrastructure need predictability, repeatability, and scalability (PRS). Predictable performance, repeatable (read: simple) management, and the ability to scale with an enterprise’s growth are major drivers, even over and above the Reliability, Availability, and Serviceability (RAS) concepts of the 1990s. RAS is table stakes. PRS drives business value and competitive advantage.

What NexGen has done is deliver on these business demands while simplifying the management of datacenter infrastructure. The result is simple: IT can now spend more time helping the enterprise seek insight from information, rather than on improving infrastructure.



This Research Note is sponsored by NexGen. All opinions expressed are those of Neuralytix and our analysts.