By Steve Van Till, President and CEO, Brivo Systems, LLC
The Cloud Comes to Security
The cloud computing market is exploding. Worth over $100 billion in 2012 and projected to double within four years, cloud computing is here to stay, and the physical security market is no exception. With a growing range of Security-as-a-Service offerings, from access control to video surveillance, recurring revenue business models have grown far beyond their origins in alarm monitoring and now pervade every aspect of our industry.
Vendors have responded to this gold rush with product offerings designed to put integrators into the RMR business, while promising to save end users big up-front expenses and reduce total cost of ownership.
Unfortunately, every gold rush attracts opportunists making false, over-hyped claims. This is especially worrisome for the security industry, because inaccurate cloud claims endanger the safety of our customers and their property. In this article, we’ll examine a number of these claims and discuss their impact on the quality of security systems.
Real Data Center or the Integrator’s Network Closet?
You’d think that anything advertised as a “cloud” solution would be hosted at a real data center. Today, a number of security systems vying for cloud legitimacy are actually server-based products that integrators install at their own facilities and operate on behalf of customers.
This sounds fine in theory, except that few security integrators have the IT chops to pull it off. Many a “data center” in our industry turns out to be nothing more than an unsecured network closet in an industrial park office, with electrical backup from a 5 HP generator. Contrast this with a real data center that has a hardened perimeter, riot glass, 24/7 guards and staffing, biometric access control, redundant ISP connections, redundant power supplies and cooling equipment, fire suppression, earthquake proofing, network monitoring, and enough batteries and diesel generators to keep the whole thing running for days, if not weeks, in the event of a major emergency.
Why this matters: Nothing less than a real data center can provide the high availability and data security required for security services; anything else places end users at high risk.
Substandard Hosting: Just a Fancier Network Closet?
Substandard hosting is a close cousin to the network closet. It has become so easy to “spin up a virtual machine” with an Infrastructure-as-a-Service company that everyone is doing it and declaring “game over” for meeting the hosting requirements of a security application. Unfortunately, many end users suffer brand-name blindness in this scenario and can’t see past vendor assurances that their data is hosted at Amazon, Azure, or some other big company. Don’t get me wrong: these are great companies with great services. But the quality of an application service has as much to do with the way the software is written and managed as it does with where it is hosted.
For example, it does no good to be hosted at a big company unless you’re diversified across at least two of their data centers. A lot of Amazon Web Services customers learned that lesson the hard way during recent outages in one of their primary availability zones. Ditto for Microsoft’s European cloud customers.
Taking it a step further, it does no good to be diversified across multiple data centers unless you have real-time database replication built into your product architecture. And it does no good to have replicated data unless you also have global traffic management to immediately switch between facilities. The lesson here is that it’s the whole solution that matters. A brand-name hosting service doesn’t guarantee quality unless the application provider has taken all the proper architectural and operational steps.
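The failover decision described above can be sketched in a few lines. This is a deliberately minimal illustration, not a real global traffic manager (which would act at the DNS or load-balancer level); the data-center names and the health-probe mechanism are hypothetical.

```python
# Minimal sketch of the failover decision a global traffic manager makes:
# probe each facility in priority order and route to the first healthy one.
# Endpoint names and the probe are illustrative assumptions.

def pick_endpoint(endpoints, probe):
    """Return the first endpoint whose health probe succeeds,
    checking the primary data center first."""
    for url in endpoints:
        if probe(url):
            return url
    raise RuntimeError("no healthy data center available")

# Simulate the primary facility going dark during an outage.
status = {"dc-east.example.com": False, "dc-west.example.com": True}
chosen = pick_endpoint(
    ["dc-east.example.com", "dc-west.example.com"],
    probe=status.get,
)
print(chosen)  # dc-west.example.com
```

Note that this switch is only useful if the secondary facility already holds a replicated copy of the data; routing traffic to an empty data center is no failover at all.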
Why this matters: High availability and top-tier hosting matter because no one knows when security violations will occur, and you cannot afford to be down when they do.
What, No Multi-tenancy?
Software multi-tenancy is the fundamental design approach that allows Software-as-a-Service systems to operate securely and efficiently. Yet many of the systems touted as “cloud” solutions don’t have it. They are simply the same single-tenant designs as before, with a web browser tacked onto the front end. Yes, the web browser is a welcome improvement over thick clients, but customers and vendors should care about what’s behind the browser.
So, what is multi-tenancy? To quote Salesforce.com, a leading authority on the subject, “Whereas a traditional single-tenant application requires a dedicated set of resources to fulfill the needs of just one organization, a multi-tenant application can satisfy the needs of multiple tenants … using the hardware resources and staff needed to manage just a single software instance.”
Multi-tenancy matters to security integrators and end users for two reasons: economics and security.
In terms of economics, multi-tenancy allows major SaaS providers like Salesforce, Google, Netsuite, and many other familiar names to operate their services at massive scale and low cost. It does so through a software design that enables thousands, even millions, of unrelated customers to safely share the same underlying hardware resources. While the cost savings on hardware are obvious, there are equally considerable savings in energy, maintenance, and staffing expenses. Without multi-tenancy, the expense of running applications is virtually the same as traditional IT, and the whole cost-benefit argument for cloud services collapses.
By the same token, supporting millions of customers on a single, highly scalable instance can only be accomplished if the security provisions were designed into the software from the start. Here a real estate analogy is illustrative. Ever lived in a single-family home that was subdivided to support multiple renters? Doesn’t work so well. Not nearly as well as an apartment building designed from the outset to support multiple tenants. The same holds true for software.
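The apartment-building analogy maps directly onto a common shared-schema database design: one instance, one table, many tenants, with every row tagged by tenant and every query scoped to that tag. The sketch below uses SQLite only for illustration; the table and column names are hypothetical, and a production system would enforce the same isolation at every layer.

```python
import sqlite3

# Shared-schema multi-tenancy sketch: unrelated customers share one
# database, but every row carries a tenant_id and every read is scoped
# to it. Names here are illustrative, not any vendor's actual schema.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE access_events (
                  tenant_id TEXT NOT NULL,
                  door      TEXT,
                  badge     TEXT)""")

def log_event(tenant_id, door, badge):
    db.execute("INSERT INTO access_events VALUES (?, ?, ?)",
               (tenant_id, door, badge))

def events_for(tenant_id):
    # The tenant filter is mandatory, not optional: isolation lives in
    # the data-access layer by design, not bolted on afterward.
    cur = db.execute(
        "SELECT door, badge FROM access_events WHERE tenant_id = ?",
        (tenant_id,))
    return cur.fetchall()

log_event("acme", "lobby", "badge-17")
log_event("globex", "lab", "badge-42")
print(events_for("acme"))  # [('lobby', 'badge-17')]
```

The point of the sketch is the design discipline: when the tenant boundary is built into every query path from day one, thousands of customers can share hardware safely, which is exactly what a subdivided single-tenant application cannot guarantee.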
Why this matters: Customers must look for multi-tenancy if they expect to achieve the promised cloud savings over the long term. Without it, repurposed legacy applications also lack an adequate data security model.
The “Private Cloud Ready” Deception
Read a stack of recent sales literature, and you’ll come across the terms “private cloud ready” or “suitable for private cloud deployment.” Vendors often apply such terms to security appliances and server architectures, both real and virtualized. Sounds good, but what does it actually mean? Not much.
According to the oft-cited NIST definition, a private cloud is an architecture where “the infrastructure is provisioned for exclusive use by a single organization”. This means dedicated servers, storage, network connections, and staff to take care of the whole thing. Sound familiar? It should: it’s exactly the same as the traditional security software delivery model. It might have been moved offsite to another data center, and it might have been virtualized for a little hardware efficiency, but at its core it offers the same features as the dedicated client-server model of the past several decades.
That’s why I favor the more descriptive definition that a private cloud “is a marketing term for a proprietary computing architecture that provides hosted services to a limited number of people behind a firewall”. That’s probably not what you thought you were buying when the vendor told you it’s a “private cloud ready” product.
Why this matters: When security buyers are trying to free themselves of all the hassles of dedicated server equipment, the single-user “private cloud” fiction leaves customers right where they’ve always been.
The “Hide the Server” Deception
The private cloud claim is closely related to a practice we call “Hide the Server.” This amounts to taking the end user’s current security applications and moving them offsite, so that customers no longer see them running on their own computers.
Does moving an old application architecture to a new server 1,000 miles away give it any of the characteristics of cloud computing? Of course not. It won’t magically support thousands of end-user organizations or suddenly be any faster for new users to provision. The truth is that the service provider will need to move every new customer’s computers to a new location, just like they did with yours. What do you think that will cost them? What will it cost you?
Why this matters: As a technology that promises to lower total cost of ownership, real cloud computing must deliver savings. “Hide the server” can never do that.
The “Cloud-Based Protocols” Deception
Many old-line software systems vendors are desperate to shoehorn “cloud” into their marketing literature. You can’t blame them. If it was 100 years ago and I had to sell wagons against automobiles, you can be sure I’d find a way to use the term “horseless carriage” in my pitch.
In one of the most egregious abuses of the term, there are systems vendors who are covered by the media as cloud companies because they claim to use “cloud-based protocols.” You might as well claim to be an electric company because your products use electricity.
I applaud their PR agency for working “cloud” into their press release, but it turns out this is just a case of old-fashioned remote access.
Why this matters: Citing “cloud-based protocols” leads users to a situation that sums up everything we’ve outlined so far: single-tenant applications that are usually hidden remotely as “private clouds” in a data center that has not been qualified or audited.
How to Recognize a Real Cloud
So, how do you recognize the real thing? Let’s go back to the impartial definition NIST wrote several years ago. NIST names five essential characteristics of cloud computing: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service. A vendor that can’t demonstrate all five isn’t selling a cloud service, whatever the brochure says.
Conclusion: The cloud is here to stay, and it offers security buyers numerous advantages over traditional solutions — but only when it’s the real cloud.