
As You Accelerate on Your Cloud Journey, Make Sure Data Is in the Driver's Seat

05/03/2019

By: James Petter, VP International, Pure Storage

 

With the rise of machine learning and artificial intelligence (AI) technologies, data has shifted from informational asset to the core of innovation. According to a survey conducted by MIT Technology Review and commissioned by Pure Storage, an overwhelming 87% of leaders in the Middle East & Africa (MEA) region say that data is the foundation for making decisions, delivering results for customers and growing the business.

 

Yet the volume and variety of data are overwhelming. Transaction data, data from IoT devices and sensors, and video and audio data from a multitude of sources are generating massive data sets. Global IP traffic is now measured in zettabytes, and the deluge is rising fast. As a consequence, organizations are struggling to extract value from their data: 75% of enterprises in the MEA region report that they face challenges in digesting, analyzing and interpreting it all.

 

Data, Not Apps, Should Dictate Cloud Strategy

In this new world, application and data mobility are key to driving efficiency and business advantage. In the old days, specific applications dictated infrastructure needs, but in an era of exponential data growth, use of that data—and accessibility to it―should be the driving factor in any cloud strategy.

 

Modern businesses need real-time access to any and all data. This means making the most of business data in the reality of a multi-cloud environment, enabling applications to move freely between on-premises, private, and public cloud environments. The right strategy is on-prem and cloud, not either/or.

 

Why is this so important? According to IDG, 90% of companies will have a portion of their applications or infrastructure in the cloud this year. Among the many lessons learned in that migration over the last several years is the need to drive efficiency and cost savings while meeting strategic business needs. This means setting data free, but doing so in the most efficient manner possible, which in many cases is vastly different from operations today.

 

The bottom line: executives should be making strategic decisions about the appropriate environment based on the type of data and the applications making use of that data. For example, applications that run consistently day after day and week after week (mission-critical, steady-state apps that run all the time without a lot of variance) are better suited to an on-prem instance. It's simply less expensive than running such apps persistently in the cloud.

 

But workloads that must spin up or down with some frequency, and which require lots of compute, are better suited to the public cloud, where they can take advantage of cloud economics and incur costs only for the time they are actually being used.
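The cost trade-off between steady-state on-prem workloads and bursty cloud workloads can be sketched with some illustrative numbers. The hourly rates below are entirely hypothetical and are not real vendor pricing; the point is only the shape of the arithmetic.

```python
# Hypothetical rates -- assumptions for illustration, not real pricing.
ONPREM_HOURLY = 0.40   # amortized cost of owned hardware, per hour, paid even when idle
CLOUD_HOURLY = 1.20    # on-demand cloud instance, per hour, paid only while running

HOURS_PER_MONTH = 730


def monthly_cost(hourly_rate: float, hours_billed: float) -> float:
    """Cost of a workload billed for `hours_billed` hours in a month."""
    return hourly_rate * hours_billed


# Steady-state app: runs 24/7, so both options bill the full month -- on-prem wins.
steady_onprem = monthly_cost(ONPREM_HOURLY, HOURS_PER_MONTH)  # 292.0
steady_cloud = monthly_cost(CLOUD_HOURLY, HOURS_PER_MONTH)    # 876.0

# Bursty analytics job: only 40 compute-hours a month -- cloud wins,
# because on-prem capacity must be paid for even while it sits idle.
bursty_onprem = monthly_cost(ONPREM_HOURLY, HOURS_PER_MONTH)  # still 292.0
bursty_cloud = monthly_cost(CLOUD_HOURLY, 40)                 # 48.0

print(f"steady-state: on-prem ${steady_onprem:.0f} vs cloud ${steady_cloud:.0f}")
print(f"bursty:       on-prem ${bursty_onprem:.0f} vs cloud ${bursty_cloud:.0f}")
```

With these assumed rates, the always-on app costs roughly a third as much on-prem, while the bursty job costs roughly a sixth as much in the cloud; the break-even point shifts with the real rates, but the pattern holds.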

 

As executives begin the shift to making strategic decisions based on the type of data rather than the type of application, one way to accomplish this is to view the classic concept of application tiering in a new way.

 

A Data-Centric Approach to Tiering

Most enterprises run a mix of workloads. The concept of tiering has evolved with the advent of fast, flash-based storage. The idea that you place mission-critical Tier 1 applications on high-performance storage, Tier 2 applications on mid-performance storage and Tier 3 applications on cold storage, purely on the basis of economics, is out of date. Flash has democratized the data center, and enterprises realize there is value in all of their data. In other words, "cold data" is no longer part of our lexicon.

 

Modern data centricity means that application mobility, the ability to seamlessly move applications born in the cloud to an on-premises environment (or vice versa) as the data's needs dictate, is what is truly mission critical.

 

Data mobility across public and private cloud requires a common tier of shared data. An Oracle database, for example, might run on Tier 1 storage, but the data in the database might be leveraged in many places within the enterprise, depending on the use case.

 

Because such a database is business critical, it should run at all times on high-resiliency, low-latency storage on premises. Yet periodic analytics reports, or an intelligent algorithm that runs for end-of-month reporting, are good candidates for the cloud. The agility of the cloud means you don't need that massive compute on-premises; you can spin up the compute you need quickly, for exactly as long as you need it. Need a report faster? Spin up more compute. A common data layer makes data mobile and applications agile.

 

Data Centricity as a Bridge in the Multi-Cloud World

Today, a cloud divide persists. On one side sit the on-prem and hosted environments; on the other, the public cloud, with different management and consumption experiences, different application architectures, and different storage.

 

But what if you could bridge the two with seamless orchestration, bi-directional mobility, and common shared data services? Hybrid cloud requires a data-centric architecture: one built at its core to share data in real time and to make moving data and applications easy.

 

Consider the example of that Oracle OLTP database instance running on-premises. Being able to send copies of the data out to the public cloud gives enterprises a huge advantage of scale with an agile compute architecture. By using that same common data layer, you take the on-prem data-centric architecture and extend the experience into the public cloud.

 

Similarly, native cloud app deployments can be enhanced by data services that increase efficiency through data reduction, snapshots, and replication. Indeed, your data should even define your disaster recovery (DR) strategy. In a multi-cloud world, data centricity lets you use the cloud as a second data center: if the on-prem environment fails, spinning up a cloud-hosted backup environment makes recovery much less painful.
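The snapshot-and-replicate pattern behind that DR strategy can be sketched as follows. This is a minimal toy model, not any vendor's API: `SnapshotStore`, `take_snapshot`, and `replicate` are hypothetical stand-ins for whatever storage array or cloud service an enterprise actually uses.

```python
import datetime

class SnapshotStore:
    """Toy stand-in for a storage system that keeps named snapshots.

    Hypothetical for illustration -- real arrays and cloud object stores
    expose their own snapshot/replication APIs.
    """
    def __init__(self, name: str):
        self.name = name
        self.snapshots: list[str] = []

    def take_snapshot(self, volume: str) -> str:
        # Name the snapshot after the volume and a timestamp.
        snap_id = f"{volume}-{datetime.datetime.now():%Y%m%d%H%M%S}"
        self.snapshots.append(snap_id)
        return snap_id


def replicate(snap_id: str, source: SnapshotStore, target: SnapshotStore) -> None:
    """Copy a snapshot from the primary site to the DR site."""
    if snap_id in source.snapshots:
        target.snapshots.append(snap_id)


# On-prem array is the primary; the cloud acts as the second data center.
primary = SnapshotStore("on-prem-array")
dr_site = SnapshotStore("cloud-dr")

snap = primary.take_snapshot("oracle-oltp-vol")
replicate(snap, primary, dr_site)
print(f"{snap} replicated to {dr_site.name}")
```

On failover, the DR site already holds the latest replicated snapshot, so recovery becomes a matter of spinning up cloud compute against it rather than rebuilding from scratch.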

 

That's how you liberate applications, unify clouds, and manage data across the data center and the cloud, unlocking new use cases like AI and real-time analytics and ultimately giving your organization the data advantage.

 


Finance Derivative is a global financial and business analysis magazine, published by FM.Publishing. It is a yearly print and online magazine providing broad coverage and analysis of the financial industry, international business and the global economy.