Guest Column | January 17, 2013

Using Storage QoS To Accelerate, Control And Guarantee Performance


By Chris McCall, VP of Marketing, NexGen Storage

Last week I had the opportunity to meet with a customer who is considering deploying a virtual infrastructure. Their existing deployment consists of high-performance servers connected to direct-attached storage (DAS). Ensuring adequate application performance has always been the key driving factor for their infrastructure strategy, but with 26 servers and more coming online this year, footprint, power and cooling are becoming huge issues.

Enter server virtualization. The space, power, and cooling savings are obvious, so I asked the customer why they hadn’t done it sooner. His response was simple: “Performance.”

Consolidating multiple applications on a shared storage system meant that each application’s workload would impact every other application’s workload. Application performance needs could change at a moment’s notice, resulting in unpredictability and chaos. The existing DAS implementation avoids this mess by dedicating storage resources to one, and only one, application. So while DAS addressed the customer’s performance concerns, the challenge of managing upwards of 30 servers, each with its own storage, is pushing them toward a virtual infrastructure with shared storage.

Given the switch, how does this IT administrator guarantee the performance levels that are critical to his company’s productivity?  That’s where storage Quality of Service (QoS) comes in. Similar to the concept of QoS in the networking realm, storage QoS lets you accelerate, control and guarantee storage performance levels.

Managing Mission-Critical Application Performance

Today, IT professionals and VARs are constantly looking for ways to monitor and tune shared storage systems, and they often over-buy disk to accommodate performance spikes. The vast majority of SAN and NAS products on the market force the IT administrator to configure performance rather than manage it: the administrator estimates the workload, then sizes the storage system by the number of drives. They’re left with a single pool of performance that every application shares, with no way to assign resources or prioritize. The administrator is forced to buy resources they don’t need:

  • If the storage system is built on spinning disk drives, the administrator sizes the system by estimating the number of disk drives required to hit the performance requirement. This almost always results in excess capacity.
  • For systems based on solid-state technology, capacity tends to lag performance. The administrator has to purchase by the amount of capacity required, resulting in excess performance (a rough sizing sketch after this list illustrates both cases).
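
To make that trade-off concrete, here is a rough sizing sketch in Python. The per-drive figures (roughly 180 IOPS and 600 GB for a spinning drive, 25,000 IOPS and 400 GB for an SSD) and the workload targets are assumptions chosen for illustration, not numbers from the article or any particular product.

```python
# Rough sizing arithmetic only; drive specs and workload targets are assumptions.
import math

def size_by_performance(required_iops, iops_per_drive, gb_per_drive, required_gb):
    """Size a spinning-disk array by IOPS and report the excess capacity bought."""
    drives = math.ceil(required_iops / iops_per_drive)
    capacity_gb = drives * gb_per_drive
    return drives, capacity_gb, capacity_gb - required_gb

def size_by_capacity(required_gb, gb_per_drive, iops_per_drive, required_iops):
    """Size a solid-state array by capacity and report the excess performance bought."""
    drives = math.ceil(required_gb / gb_per_drive)
    iops = drives * iops_per_drive
    return drives, iops, iops - required_iops

# Spinning disk (assumed ~180 IOPS, 600 GB per drive) sized for 30,000 IOPS and 20 TB:
print(size_by_performance(30_000, 180, 600, 20_000))
# -> (167, 100200, 80200): ~100 TB raw to hit the IOPS target, ~80 TB of it unneeded.

# Solid state (assumed ~25,000 IOPS, 400 GB per drive) sized for 20 TB and 30,000 IOPS:
print(size_by_capacity(20_000, 400, 25_000, 30_000))
# -> (50, 1250000, 1220000): 1.25M IOPS to hit the capacity target, far beyond the need.
```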

Rather than continuing this decades-old paradigm, IT administrators can now use storage QoS to manage performance directly. Storage QoS allows them to guarantee performance to each application and isolate workloads from one another on the SAN.

For example, a company can assign 30,000 IOPS to their business intelligence app, 25,000 IOPS to their order database, and 5,000 IOPS to the marketing file shares. No matter what is going on in the system, each application will get, at a minimum, the targeted level of performance.
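
As a sketch of what such an assignment might look like, here is a minimal, hypothetical policy definition in Python. The class, fields, and volume names are illustrative assumptions, not NexGen Storage's actual interface.

```python
# Hypothetical QoS policy assignment; the class, fields, and names are
# illustrative, not an actual NexGen Storage (or any vendor's) API.
from dataclasses import dataclass

@dataclass
class QosPolicy:
    volume: str
    min_iops: int  # guaranteed performance floor for this volume

policies = [
    QosPolicy("business-intelligence", min_iops=30_000),
    QosPolicy("order-database",        min_iops=25_000),
    QosPolicy("marketing-file-shares", min_iops=5_000),
]

# The array must be able to deliver at least the sum of the floors
# (60,000 IOPS here); anything above that is shared headroom.
assert sum(p.min_iops for p in policies) == 60_000
```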

By setting these guaranteed minimum levels of performance, storage QoS eliminates resource contention within the shared storage system: when one application spikes, critical applications such as the order database or the business intelligence application never drop to unacceptable levels.
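
The sketch below shows one simple way a floor-based allocator could behave under contention: every volume first receives up to its guaranteed minimum, and only the leftover performance is shared among volumes that want more. It is a simplification under the assumption that the array can deliver at least the sum of the floors, not a description of any vendor's actual scheduler.

```python
# Minimal floor-based allocation sketch; assumes total_iops covers the sum of
# the guaranteed minimums. Not any vendor's actual scheduler.

def allocate(total_iops, demands, floors):
    """Grant each volume min(demand, floor), then split the leftover IOPS
    among volumes with unmet demand, proportionally to what they still want."""
    grant = {v: min(demands[v], floors[v]) for v in demands}
    spare = total_iops - sum(grant.values())
    unmet = {v: demands[v] - grant[v] for v in demands if demands[v] > grant[v]}
    total_unmet = sum(unmet.values())
    for v, want in unmet.items():
        grant[v] += min(want, int(spare * want / total_unmet))
    return grant

floors  = {"bi": 30_000, "orders": 25_000, "files": 5_000}
# The file shares spike to 80,000 IOPS, but the critical volumes keep their floors.
demands = {"bi": 30_000, "orders": 25_000, "files": 80_000}
print(allocate(100_000, demands, floors))
# -> {'bi': 30000, 'orders': 25000, 'files': 45000}
```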

Adding VDI to the Application Mix

In the example above, I referred to business intelligence and the order database, both business-critical applications that, before storage QoS, were typically not virtualized in order to avoid the likely performance contention. VDI is another application with extreme performance demands; most IT departments run it on dedicated infrastructure to isolate the impact of boot storms. With storage QoS, however, the administrator can control and guarantee performance for both mission-critical applications and VDI from a single storage solution, taking advantage of the consolidation benefits of shared storage.

Storage QoS – The Missing Link

The storage industry has spent the last few decades focused on managing capacity. By consolidating capacity resources into shared systems, customers saw overall cost per GB decrease and management become much more straightforward. However, sharing capacity also meant sharing performance. When x86 processing power exploded and virtualization allowed multiple applications to run on a single host, the concentrated workloads exposed a massive management gap: how to manage shared storage performance.

Storage QoS addresses this missing link, allowing IT administrators to accelerate, control and guarantee storage performance. This capability makes it realistic to support mixed workloads on one SAN, including mission-critical applications, without compromising performance.