Guest Column | June 11, 2014

The Challenges Of Moving Data To The Cloud

By John Gallagher, VP of Marketing, Quorum

Cloud is a hot topic right now. It seems most businesses are making the move to the cloud: servers are moving to the cloud, data is stored in the cloud, and applications and services are hosted from the cloud. Collectively, organizations have made the cloud “mission critical” for their overall business operations, including reliance on the cloud for disaster recovery. According to a recent study from Cloudability, 86 percent of companies currently use more than one type of cloud service.

Based on all of the buzz, your clients might think their businesses need to be in the cloud as well, and that if they’re not, they might be missing out on something. Before they rush off to the great big service in the sky, though, it will be vitally important to explain both the pros and the cons. There are challenges with moving data to the cloud for any business, as well as risks.

What Are The Challenges?

Managing Big Data

It’s no secret that data volumes are growing rapidly. As a result, companies need to pose the question: how do we manage such large amounts of data? A number of different factors drive the use and popularity of Big Data, including new ways of linking datasets, creative approaches to visualizing data, and improved statistical and computational methods, to name a few. Big Data also raises another question: when do you move the data to the processor (e.g., shipping data from the cloud into a data center), and when do you move the processor to the data?

Data Privacy, Security, And Compliance  

The second challenge is making sure there are processes and procedures in place to manage the privacy and security of customer data. As data grows more complex, businesses will also see (and have recently seen) a surge in compliance-related issues, including data breaches that are toxic to corporate reputations.

Retention Time

How long should your customers store their data? Not all data is equal. Because there is so much data, containing different kinds of information, companies have a hard time determining how long they should store and retain access to certain data, and when it’s okay to let that data “go” and erase it from storage.
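
Not every retention decision has to live in someone's head. As a purely illustrative sketch (the data categories and retention periods below are hypothetical; real values depend on your legal and compliance obligations), a retention policy becomes much easier to enforce once it is written down explicitly:

    from datetime import datetime, timedelta

    # Hypothetical retention periods by data category; actual values must
    # come from your industry's legal and compliance requirements.
    RETENTION = {
        "transaction_records": timedelta(days=7 * 365),
        "customer_emails":     timedelta(days=3 * 365),
        "web_server_logs":     timedelta(days=90),
    }

    def is_expired(category: str, created: datetime) -> bool:
        """Return True once data in this category is past its retention period."""
        return datetime.utcnow() - created > RETENTION[category]

    # Example: 100-day-old web logs are eligible for deletion.
    print(is_expired("web_server_logs", datetime.utcnow() - timedelta(days=100)))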

Fast Storage Is Expensive

The cost of data storage brings in the concept of “tiered storage,” which is effectively a set of data migration strategies. The issue is that “hot” data, the kind you need constant access to, must live on a very fast storage platform, which is also one of the most expensive types of storage available. But if you don’t need access to all of your data all of the time, you can move the rest to less expensive options.
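
In practice, tiering is often automated with a storage lifecycle policy. Here is a minimal sketch using boto3 against Amazon S3, assuming a bucket named "example-bucket" (the bucket name and day thresholds are placeholders): objects transition to progressively cheaper storage classes as they cool off.

    import boto3

    s3 = boto3.client("s3")

    # Tiered storage as a lifecycle policy: keep hot data on fast storage,
    # then migrate it to cheaper tiers as access becomes infrequent.
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-bucket",  # placeholder bucket name
        LifecycleConfiguration={
            "Rules": [{
                "ID": "tier-aging-data",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to every object
                "Transitions": [
                    {"Days": 30,  "StorageClass": "STANDARD_IA"},  # infrequent access
                    {"Days": 365, "StorageClass": "GLACIER"},      # archive tier
                ],
            }]
        },
    )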

Sharing And Distribution Challenges

Businesses that handle large amounts of data must also determine the best data migration strategy, one that balances the customer experience against the needs of the organization.

What Are The Risks?  

Impact Of IT Downtime

According to the Ponemon Institute’s December 2013 study, the cost of data center downtime across industries is approximately $7,900 per minute; at that rate, a single hour of downtime runs roughly $474,000. Whether it’s caused by a natural disaster, human error, or IT failure, how many organizations can really afford that kind of loss?

Reputation/Competitive Edge

The biggest risk to an organization’s reputation is failing to protect customer-facing data. Not all data is equal. Another high-risk area is when mission-critical data and apps lack a focus on business continuity and disaster recovery. If you are down and your competition is down less, or not at all, your reputation will likely suffer. The right criteria for recovery time and recovery point depend on the specific applications; you can’t have a one-size-fits-all answer.
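
As a purely illustrative sketch, per-application criteria can be captured as recovery time objectives (RTO) and recovery point objectives (RPO). The applications and numbers below are hypothetical; the right values should come out of a business impact analysis:

    # Hypothetical RTO/RPO targets per application; not a recommendation.
    RECOVERY_TARGETS = {
        "order_processing": {"rto_minutes": 5,    "rpo_minutes": 1},      # revenue-critical
        "internal_wiki":    {"rto_minutes": 480,  "rpo_minutes": 1440},   # can wait a day
        "archived_reports": {"rto_minutes": 2880, "rpo_minutes": 10080},  # rarely touched
    }

    def meets_rto(app: str, measured_recovery_minutes: int) -> bool:
        """Check whether a measured recovery time satisfies the app's RTO."""
        return measured_recovery_minutes <= RECOVERY_TARGETS[app]["rto_minutes"]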

Business Impact Analysis

Not doing a business impact analysis (BIA) creates risk; a BIA forces organizations to look at their infrastructure and identify which apps and data are truly critical. The risk is that, without one, the organization can’t answer that question. Oftentimes, companies have a hard time distinguishing between critical and non-critical systems.

How To Overcome These Challenges

Functions Over Content

Figure out the functions of the data, rather than focusing on the content of the data. Function shipping is the process of moving logical services to the physical areas of a system where they can be optimally run. This matters especially with the cloud: if all the data is already in the cloud, why would you ever move it? What you need to do is move the functions to the cloud so they can act on the data that is stored there. Moving large amounts of data to and from the cloud is a bad strategy; it’s hard to move, especially Big Data.
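
As a concrete sketch of function shipping, assume a large CSV object already sitting in an S3 bucket (the bucket name, key, and query are placeholders, and this relies on S3 Select where it is available): instead of downloading the entire dataset and filtering it locally, the filter travels to the storage service and only the matching rows come back.

    import boto3

    s3 = boto3.client("s3")

    # Ship the function (a SQL filter) to the data rather than shipping
    # gigabytes of data to the function.
    response = s3.select_object_content(
        Bucket="example-bucket",      # placeholder
        Key="logs/transactions.csv",  # placeholder
        ExpressionType="SQL",
        Expression="SELECT * FROM s3object s WHERE s._3 = 'FAILED'",
        InputSerialization={"CSV": {"FileHeaderInfo": "NONE"}},
        OutputSerialization={"CSV": {}},
    )

    # Only the rows matching the filter are streamed back to the client.
    for event in response["Payload"]:
        if "Records" in event:
            print(event["Records"]["Payload"].decode("utf-8"))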

For An Instant Recovery, Ship The Functions

This means having a DR (disaster recovery) solution that moves functions into the cloud along with your data for instant recovery. You want a recovery that allows for an automated physical-to-virtual (P2V) conversion, so that a physical data center server can be brought up as a virtual machine in the cloud. You also need testing technology in the cloud that demonstrates that the recovery flow in the cloud is exactly the same as the production flow.

Match Solution With Data Use

If it’s long-term data storage, then you would go in a different direction, such as Amazon Glacier, an extremely low-cost storage service that provides archiving and backup for infrequently accessed data, where slow retrieval times are acceptable. If you are trying to achieve instant recovery, you will need a very different storage platform.
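
The archival pattern is easy to see in code. Here is a minimal sketch with boto3, assuming a Glacier vault named "example-vault" (the vault and file names are placeholders): storing an archive is a single cheap call, but getting it back is an asynchronous job that can take hours, which is fine for archives and unacceptable for instant recovery.

    import boto3

    glacier = boto3.client("glacier")

    # Upload an archive: cheap to store, slow to retrieve.
    with open("backup-2014-06.tar.gz", "rb") as f:  # placeholder file
        archive = glacier.upload_archive(
            vaultName="example-vault",              # placeholder vault
            archiveDescription="Monthly backup",
            body=f,
        )
    print("Stored archive:", archive["archiveId"])

    # Retrieval is asynchronous: initiate a job, then wait (often hours)
    # before the archive can actually be downloaded.
    job = glacier.initiate_job(
        vaultName="example-vault",
        jobParameters={"Type": "archive-retrieval",
                       "ArchiveId": archive["archiveId"]},
    )
    print("Retrieval job started:", job["jobId"])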

Ensure Data Security

There are tons of standards around security, especially in the cloud, so make sure the vendors of the solutions you consider are PCI compliant. Also, don’t be shy: talk to auditors! The auditing function is there to bulletproof what comes next.

Always Be Testing

Do a lot of testing. Run operations in a number of situations, and test for security. You will need to test apps running in the cloud versus in the data center, and any situation you might theoretically encounter.
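
As a simple sketch of one such check (both endpoints below are hypothetical), an automated parity test can compare the cloud recovery environment's responses against production, so drift is caught by a routine test rather than by an actual disaster:

    from urllib.request import urlopen

    # Hypothetical endpoints: production and its cloud recovery replica.
    PRODUCTION = "https://app.example.com/health"
    RECOVERY = "https://dr.example-cloud.com/health"

    def recovery_matches_production() -> bool:
        """Verify the recovery flow answers exactly as production does."""
        prod = urlopen(PRODUCTION, timeout=10).read()
        dr = urlopen(RECOVERY, timeout=10).read()
        return prod == dr

    if __name__ == "__main__":
        assert recovery_matches_production(), "Recovery has drifted from production"
        print("Recovery flow matches production")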

John Gallagher runs marketing for Quorum, leveraging 20+ years of expertise in product marketing, pipeline management, and thought leadership on innovative technologies. Across his work in storage and semiconductors, he has embraced the role of technology evangelist by forming industry associations, writing editorial viewpoints, and speaking publicly. Prior to joining Quorum, Gallagher led the outbound marketing and inside sales teams at DataDirect Networks, a leader in high-performance storage systems. John has managed marketing teams at LSI, Cadence, and EMC/Isilon, as well as the startups Synplicity and Meru Networks.