Data interaction is changing. Here’s what new, data-centric environments look like, and how to support the workloads of today while preparing for the fluidity of tomorrow.
Let’s consider a typical company; it might sound like yours. Its universe of data is evolving rapidly. There are copious amounts of data, a substantial portion of which is generated by line-of-business (LOB) applications (ERP, CRM, data warehouse, etc.), each with its own siloed storage. In addition, there is unstructured data from many diverse sources, including mobile devices, transportation vehicles, wearable devices, and an ever-growing variety of edge or IoT devices distributed across remote locations. If you could analyze all this data from multiple perspectives, you could deliver some phenomenal insights and unearth that coveted competitive advantage. Yet getting there is not so easy.
To fully unlock the value of data and identify a true competitive advantage, you need a data-centric, not application-centric, approach to storage infrastructure. This is a radical shift. For a very long time, IT infrastructure strategy has focused on how best to support a specific application. Yet by placing data at the center, you can allow multiple disparate applications to leverage the same data.
Consider the greater insights possible when your customer-retention data is interconnected with your marketing and lead-generation data, or with your business finances and operations.
Now, it’s possible for sales and marketing to share data to create better campaigns, gain enhanced insights into buyers’ behaviors, and understand the impact on the bottom line. Yet, this is just one example of how data and its insights beget more data and insights.
The Impact on Data Infrastructure
As data interaction changes, infrastructure is evolving to take on new characteristics, very different from how IT has traditionally been done.
First, you need to think about scalability; after all, your quantity of data is growing every day. Plus, you are trying to support multiple disparate applications. Thus, agility is paramount, as needs are constantly changing with the time, date, month, and season. Can your infrastructure adjust on the fly to meet the peaks and valleys inherent in your corporate workflow?
In addition, your infrastructure must be cost-effective, i.e., demonstrate a high degree of operational and financial efficiency. How will you ensure that needed data is available at the requisite performance level without busting the budget? Aligning the value of data with the cost of its storage can deliver faster insights and a competitive advantage.
The Rise of Composability
A new approach that is gaining rapid interest is to evolve from dedicated platforms to a more fluid, composable infrastructure defined by software. In a software-composable environment, you disaggregate everything into resource pools and, through software, grab just the amount of compute, network, and storage needed, with the ability to change it on the fly. This allows for seamless scalability, cloud-like agility, and exceptional resource efficiency.
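As a rough illustration, the pool-and-compose idea can be sketched in a few lines of Python. The class names, pool sizes, and `compose` function here are hypothetical, purely for illustration; a real composable-infrastructure controller would expose equivalent operations through its management software, not through classes like these.

```python
from dataclasses import dataclass

@dataclass
class ResourcePool:
    """A disaggregated pool of one resource type (hypothetical model)."""
    name: str
    capacity: int      # total units in the pool (cores, Gb/s, GB, ...)
    allocated: int = 0

    def grab(self, amount: int) -> int:
        # Carve out just the amount needed, if the pool can supply it.
        if self.allocated + amount > self.capacity:
            raise RuntimeError(f"{self.name} pool exhausted")
        self.allocated += amount
        return amount

    def release(self, amount: int) -> None:
        # Return units to the pool when the workload shrinks.
        self.allocated -= amount

@dataclass
class ComposedNode:
    """A logical server stitched together from the shared pools."""
    cpu_cores: int
    net_gbps: int
    storage_gb: int

def compose(cpu_pool, net_pool, storage_pool, cores, gbps, gb):
    # Software grabs compute, network, and storage from separate pools.
    return ComposedNode(
        cpu_pool.grab(cores),
        net_pool.grab(gbps),
        storage_pool.grab(gb),
    )

cpu = ResourcePool("cpu", capacity=256)
net = ResourcePool("net", capacity=400)
storage = ResourcePool("storage", capacity=100_000)

node = compose(cpu, net, storage, cores=16, gbps=25, gb=2_000)
print(node)           # the composed logical server
print(cpu.allocated)  # 16 cores now drawn from the shared pool
```

The point of the sketch is the lifecycle: resources are borrowed from shared pools on demand and handed back on `release`, rather than being welded permanently into one server chassis.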
This type of disaggregation relies on a connective tissue, or extensible interconnect, to stitch the compute, memory, network, and storage elements together. This fabric (or network) enables resources to communicate with one another and makes data directly accessible across multiple compute resources. The good news is that in most cases, the fabric or network for composable infrastructure is a high-speed Fibre Channel or Ethernet based solution. Hence, some of the underlying plumbing is already in place. Protocols such as NVMe™ and NVMe over Fabrics (NVMe-oF™) help reduce I/O bottlenecks within storage arrays and across networks, respectively, a key consideration for on-the-fly distributed composability and maximum application performance.
Communicating Over Fabrics
When a high-speed fabric is combined with NVMe-oF, we have the essential underpinnings for a base fabric; however, the different elements and devices need to speak a common language. For example, a set of APIs is needed to control and manage media, automate discovery of resources, figure out peer-to-peer connections, and orchestrate resources. For a robust solution, a Software Composable Infrastructure and Extended Fabric, a new high-speed memory fabric, is necessary. Western Digital recently announced our proposal for these technologies. You can read more about it in this technology brief.
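To make the discovery-and-orchestration idea concrete, here is a minimal sketch of the kind of inventory such APIs might return and how an orchestrator could consume it. The device fields and the shape of the response are illustrative assumptions only; they are not Western Digital’s actual API, and a real controller would serve this over the fabric rather than from a local string.

```python
import json

# Hypothetical JSON a fabric-attached controller might return from a
# resource-discovery call (the fields are assumptions for illustration).
DISCOVERY_RESPONSE = json.dumps({
    "devices": [
        {"id": "nvme-0", "type": "storage", "proto": "NVMe-oF", "gb": 7680},
        {"id": "nvme-1", "type": "storage", "proto": "NVMe-oF", "gb": 7680},
        {"id": "cpu-0",  "type": "compute", "cores": 64},
    ]
})

def discover(raw: str):
    """Parse a discovery response into a device inventory."""
    return json.loads(raw)["devices"]

def find_storage(devices, min_gb: int):
    """Pick fabric-attached storage devices large enough for a workload."""
    return [d for d in devices if d["type"] == "storage" and d["gb"] >= min_gb]

devices = discover(DISCOVERY_RESPONSE)
targets = find_storage(devices, min_gb=4000)
print([d["id"] for d in targets])  # the NVMe-oF devices that qualify
```

Once discovery yields a common inventory like this, the orchestration layer can select peers and compose resources programmatically instead of through manual, per-box configuration.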
The result of software composability is that applications’ needs can cost-effectively be met through closer resource alignment. More importantly, as much as any technology can be, a composable infrastructure is future-proof. It allows you to support a variety of different environments: virtual servers, containers, bare-metal applications, etc., the choice is yours.
Round Table – Building a Next Generation Data Center
If I haven’t already muddied the waters with a bunch of technobabble and you would like to learn more, I invite you to join me at the upcoming Ecocast by ActualTech Media, Building a Next Generation Data Center. On April 24, multiple vendors will come together to discuss what these kinds of environments look like, how to support the workloads of today while preparing for the fluidity of tomorrow, and how to seamlessly connect data within this new data center paradigm.
I look forward to seeing you there!