Jan 27 ’15
Using the Cloud: Abusing the Cloud
You cannot throw a stick in any direction without hitting a cloud-based application or service. The many offerings from so many vendors blur the lines that differentiate providers and, ultimately, responsibilities.
Consumers appear to be the most trusting, and from recent experiences, embrace cloud providers willingly and without much investigation. Sadly, consumers have as much to lose as any corporation, yet due to the ubiquitous nature of consumer clouds, they continually fail to take responsibility for their own protection.
While that statement sounds like a condemnation of the consumer, it really shows how immature the entry-level cloud can be for this user community. The blind optimism many consumers have about data protection and access credentialing seems to be a holdover from the good old days when clear-text transmission of financial data and PII (Personally Identifiable Information) was commonplace and accepted. Sadly, cost is normally the key factor when consumers search for providers, with features a secondary selection factor, while several crucial elements are disregarded. The real selection criteria for consumers as well as enterprises should be security first, accessibility second, followed by the appropriate focus on features and cost.
Before you dismiss this introduction as a setup for a consumer-focused article, allow me to allay your fears. I am merely setting a foundation for the need to change attitudes about how we view and use the cloud. Sometimes bad habits learned from being consumers influence our business decisions, often with disastrous results. Complacency is too common and is not our friend.
The “cloud” is touted as offering many models: software as a service (SaaS), infrastructure as a service (IaaS) and platform as a service (PaaS), with definitions based on NIST Special Publication 800-145. Deployment can be further designated as private cloud, public cloud or hybrid cloud, or as newer categories with finer granulations such as community cloud, distributed cloud, intercloud and multicloud.
One of the thorniest issues with cloud implementations has been the diverse culture surrounding duties and responsibilities; this has become a hot topic and a source of great disappointment for many customers. Sadly, the cloud is not a panacea for all applications. The same issues pertaining to duties, responsibilities and implementations faced in traditional data centers still exist. Yet there appears to be an overwhelming urge to move everything to the cloud regardless of value. The promise of major cost savings from various cloud providers has driven the business model to a frenzy of outsourcing activity, but motivations may be skewed, based on hype and hard-to-quantify promises.
Have you developed a business justification for moving to a cloud? Have you looked at all costs as well as duties and responsibilities? Absolutely nothing is for free; not in life, and most certainly not within the cloud!
Rule Number 1: If you put data into the cloud, you must ensure it is secure and protected from unauthorized access and alteration. No matter how you look at it, this requirement is the one that has so many large corporations up against the ropes. The news media is awash in stories of stolen data (Target, Benesse Holdings Inc., Neiman Marcus, P.F. Chang’s, eBay, etc.), yet we within the IT community continue to resist the solution: encryption of all data in-flight and at rest. The issue has always been the cost of such an implementation, but what of the cost for failing to protect data? Unfortunately, this affects both the enterprise as well as the consumer; a double whammy with worldwide implications.
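The cost objection aside, tamper protection need not be exotic. As a minimal sketch (in Python, with illustrative names; a real deployment would pair this with authenticated encryption such as AES-GCM rather than rely on integrity checking alone), an HMAC tag makes unauthorized alteration detectable:

```python
import hashlib
import hmac

def protect(data: bytes, key: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so any alteration is detectable."""
    tag = hmac.new(key, data, hashlib.sha256).digest()
    return data + tag

def verify(blob: bytes, key: bytes) -> bytes:
    """Return the original data, or raise if the blob was altered."""
    data, tag = blob[:-32], blob[-32:]
    expected = hmac.new(key, data, hashlib.sha256).digest()
    # compare_digest avoids timing side channels during comparison
    if not hmac.compare_digest(tag, expected):
        raise ValueError("data was altered or the key is wrong")
    return data

key = b"a-secret-shared-with-no-one-else"   # hypothetical key material
blob = protect(b"customer record 42", key)
assert verify(blob, key) == b"customer record 42"
```

The point is not the particular primitive but the posture: anything stored outside your walls should carry cryptographic proof that it came back unmodified.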
Several large corporations and federal agencies have jumped on the cloud bandwagon by outsourcing common applications such as email. Their thinking is that by moving this type of standardized function to a bulk service provider, costs will be significantly lower. Email in the cloud offers several compelling benefits such as availability (always accessible from anywhere), elastic data storage (it can grow as required; i.e., the more you use, the more you pay) and offloading operational duties.
If you are bad at managing your internal IT infrastructure, the cloud may offer some respite from the inability to protect your own assets. Just ask the IRS about email retention and recoverability. The agency was arguably incapable of managing email internally, so perhaps the cloud would have been a better implementation, albeit with caveats.
Rule Number 2: If you put data into the cloud, you must ensure it is recoverable. Backups are a must, and similarly, testing the backup methodology and process is a necessity. If you fail to cover contingent liabilities with an all-encompassing plan, the cloud will be no better than your self-run data center. If you kept backups of your data for six months in your own data center, then implemented the same retention term for your cloud app, you have done little more than divorce yourself from the day-to-day activities.
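Testing the backup process can be automated. A minimal sketch, using local directories as hypothetical stand-ins for production and cloud backup storage: record a checksum at backup time, then actually perform a restore and compare against it.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical stand-ins for "production" and "cloud backup" storage.
work = Path(tempfile.mkdtemp())
source = work / "prod" / "orders.db"
source.parent.mkdir()
source.write_bytes(b"order data")

# The backup step: copy out and record a checksum at backup time.
backup = work / "backup" / "orders.db"
backup.parent.mkdir()
shutil.copy2(source, backup)
checksum_at_backup = sha256_of(source)

# The restore test: actually bring the copy back and compare checksums.
restored = work / "restore" / "orders.db"
restored.parent.mkdir()
shutil.copy2(backup, restored)
assert sha256_of(restored) == checksum_at_backup, "restore test failed"
```

A backup that has never been restored is a hope, not a plan; the restore step is the part most shops skip.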
But for some reason, we are not following these two basic rules; rather, we act like teenagers who store files through a service and then ignore the fact that they have lost all control.
Who’s to Blame?
Blame for what, you ask? Simply put, we are not learning from our mistakes, and from a doom-and-gloom perspective, the stage is set for serious data loss as the cloud paradigm is adopted by more and more users. On the surface, this may sound antagonistic or confrontational, but shifting responsibility from an internal organization to an external one does not guarantee success and security.
Every cloud user must be completely involved with the definition of what the cloud will be for the organization; what services it will provide; who will monitor and provide oversight. The same requirements imposed when everything was in-house remain when you outsource to the cloud. This means that some functions, such as vendor management and oversight, must still be a budget line item and must still be part of a review process.
The cloud can be a safe haven for corporate data only if you take the time to implement proper controls and oversight. The cloud will not protect you from yourself unless you establish guidelines upfront to allow it to protect you. The cloud, whatever cloud you choose, is not in itself perfect, and security has yet to be tested on a broad scale. You cannot ignore safeguards of your assets, be they music files for your MP3 player, emails for your office or customer databases. Outsourcing expectations should not be viewed differently from those for your own data center. If anything, you should expect— and demand—more and better.
The cloud does offer agility. The ability to react rapidly to changes in requirements is one of the best value-adds that cloud providers offer. Agility is great but security is better and more important.
Rule Number 3: Take full responsibility for your cloud implementation. Designate someone in your organization to monitor application operation and interface directly with the cloud provider. Treat the cloud provider as a vendor and require them to sign a service level agreement specifically detailing everything required by your corporate or federal standards for IT, such as, at a minimum:
• Establish a point of contact for all issues, along with a process for problem escalation and a maximum time limit for corrective action.
• Ensure compliance with legislative and regulatory requirements such as HIPAA, FedRAMP or PII protection, with certified reports proving compliance annually.
• Embrace a rigid backup and recovery procedure and policy.
• Encourage an application update strategy and cycle.
• Enforce clear understanding of who owns data and what happens to it once deleted.
Proactive Cloud Control
The cloud service industry has enjoyed rapid growth and relatively few issues involving data breach attempts or loss of data. Eventually, hackers will develop formidable tools to break into lucrative repositories and feast on financial information or intellectual property. Most likely, the initial methods employed will be suspiciously similar to current ones, such as social engineering to steal credentials.
The best solution to this threat is multifactor authentication. Costs are greater, but the protection afforded is genuinely better and provides a first line of defense against intrusion. Individual security tokens with a PIN, along with the traditional userid/password combination, make a wise implementation for a group of any size.
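Time-based one-time passwords are the most common form such tokens take. A minimal sketch of the RFC 6238 TOTP algorithm, using only the Python standard library (the function name and parameters are illustrative):

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, at=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    # The moving factor is the number of 30-second steps since the epoch.
    counter = int(time.time() if at is None else at) // step
    msg = struct.pack(">Q", counter)
    mac = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset from the MAC.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: this secret at time 59 yields ...287082
assert totp(b"12345678901234567890", at=59) == "287082"
```

The server and the token share only the secret and a clock; no password database breach can replay a code that expired 30 seconds ago.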
Combining the secure authentication model with full data encryption (in-flight and at rest) starts you down the path to resilient security that is demonstrably more reliable than other, more passive methods. Yes, it will cost more to implement. Data encrypted at rest requires decryption during application access, and this is a major concern, but it can be done. IBM System z environments provide this capability natively, using hardware acceleration to maintain application performance, making the platform a perfect candidate for cloud implementations.
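For the in-flight half of that model, most platforms already ship secure defaults; the work is mostly in refusing to weaken them. A brief sketch using Python's standard ssl module (client-side, not tied to any particular provider):

```python
import ssl

# A client-side TLS context with secure defaults: certificate validation,
# hostname checking and system trust roots are all enabled.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse legacy protocols

assert ctx.check_hostname is True
assert ctx.verify_mode == ssl.CERT_REQUIRED

# Any socket wrapped with ctx.wrap_socket(sock, server_hostname=...) now
# carries application data encrypted in flight.
```

The at-rest half is where platform support matters most, which is exactly the hardware-acceleration advantage noted above.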
By implementing these two controls, even a data breach can be less stressful with the knowledge that data decryption will be impractical for most thieves.
Stop the Abuse
All clouds are not created equal; that is a given. If you develop your own private cloud, it may provide all features you desire and give you complete control, but not automatically or without planning. Many organizations believe that if they have complete control over the cloud, as with their own private cloud implementation, they will have less to worry about. If you don’t build security into your cloud design, then you have achieved nothing of value. The recommendations do not change; use multifactor authentication and encrypt all data at all times. This security mantra must be accepted because it works; it is simple, straightforward and secure.
Cloud management from the customer perspective must include all elements of a traditional data center:
• Operational control such as monitoring, logging, failure analysis and periodic review
• Content control to implement a security model over assets (applications and data)
• Recovery control to plan for disasters
• Administration control to manage access for users and applications
Above all, create a strong management team that takes control of the cloud and unites all users, departments and functional business units that use the enterprise cloud. Failing to exert influence and control sets the organization up for failure. Do not fall into the complacency trap; do not blindly accept assurances from the cloud provider regarding the protection of your data or the services provided. Actively be part of the solution, not a passive observer on the sidelines. After all, this is your system, your assets and yours to screw up.
I find it humorous that after all these years we are talking about cloud computing as a new technology or a capability just invented. For years, many of us have been using mature virtualization found within z/VM (or VM/ESA, VM/SP, etc.) to bring many of these “new” cloud features to organizations we supported. At one point, it was called distributed processing or decentralized processing; now, the moniker is “the cloud.” Elements comprising this type of operation are familiar to the VM community and I suspect that many successful implementations of corporate clouds are thanks in part to old VM graybeards leading the project. The world would be more secure if z/VM was used everywhere.
Regardless of whence you came or where you are going, the same generally accepted business practices and procedures that we have followed for years must continue to be implemented as we move forward. None of it should be assumed to be automatic; security certainly is not. We must manage projects and systems the same way whether using an outsourced cloud, a self-supported cloud or a traditional data center. Nothing changes.