A Shift In Balance: Virtualization Growth Drives Demand For Greater App Performance
By Atchison Frazer, KEMP Technologies
IT budgets have been on the move in recent years. The shift from central IT purchasing to line-of-business (LOB) responsibility has been spurred in part by C-level initiatives to drive greater departmental accountability without the problems of amortizing centralized IT overhead. In parallel, data center resourcing has migrated markedly from large, fixed-capacity compute and storage hardware systems to virtualized and cloud delivery models.
Gone are the days of “big iron” dominating the centralized IT enterprise; today’s corporate infrastructures include a dynamic mix of physical, virtual, and cloud deployments. Server virtualization has rapidly become the default platform for modern data centers, with some 50 percent of all new workloads deployed in a virtual machine (VM).
Two key attributes of server virtualization have accelerated this trend. First, server-to-server and resource-to-resource isolation allows core data center hardware to be shared dynamically while preventing peak demand from one LOB from degrading service quality for another’s workloads. Second, direct visibility into resource utilization enables a metered “utility” chargeback model, solving that difficult-to-amortize overhead problem. Even though autonomous LOB IT projects draw from a common pool of shared hardware infrastructure, they need no coordination as long as each runs in its own well-isolated virtual environment. It is critical that LOB users have predictable application performance, and that their apps are not arbitrarily starved in favor of other apps running in the same virtualization environment.
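The metered “utility” chargeback model described above can be sketched in a few lines: each LOB is billed for the VM resources it actually consumed rather than a flat share of amortized overhead. The rates, field names, and sample figures below are illustrative assumptions, not real pricing or any vendor’s API.

```python
# Hypothetical sketch of a metered chargeback model: bill each line of
# business (LOB) for measured resource consumption. Rates are assumed.
CPU_RATE_PER_HOUR = 0.05   # assumed $ per vCPU-hour
MEM_RATE_PER_HOUR = 0.01   # assumed $ per GB-hour of memory

def chargeback(usage_samples):
    """Aggregate metered usage into a per-LOB bill.

    usage_samples: iterable of dicts with keys
        'lob', 'vcpu_hours', 'mem_gb_hours'
    Returns a dict mapping LOB name -> total charge in dollars.
    """
    bills = {}
    for sample in usage_samples:
        cost = (sample["vcpu_hours"] * CPU_RATE_PER_HOUR
                + sample["mem_gb_hours"] * MEM_RATE_PER_HOUR)
        bills[sample["lob"]] = bills.get(sample["lob"], 0.0) + cost
    return bills

# Illustrative usage records for two LOBs sharing the same hardware pool.
samples = [
    {"lob": "marketing", "vcpu_hours": 100, "mem_gb_hours": 400},
    {"lob": "finance",   "vcpu_hours": 250, "mem_gb_hours": 800},
    {"lob": "marketing", "vcpu_hours": 50,  "mem_gb_hours": 200},
]
print(chargeback(samples))
```

Because charges track measured consumption, a spike in one LOB’s usage shows up on that LOB’s bill alone, which is exactly the accountability the departmental model is after.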