On the CA/Ponemon Security of Cloud Computing Providers Study…
Last Updated on Monday, 12 March 2012 01:27 Written by Celframe Security Team Tuesday, 11 September 2012 06:26
CA recently sponsored the second in a series of Ponemon Institute cloud computing security surveys.
The first, released in May 2010, focused on responses from practitioners: “Security of Cloud Computing Users – A Study of Practitioners in the US & Europe.”
The latest, titled “Security of Cloud Computing Providers Study (PDF)” and released this week, examines cloud computing providers’ perspectives on the same questions. You can find the intro here.
While the study breaks down the survey in detail in Appendix 1, I would kill to see the respondent list so I could use the responses from some of these “cloud providers” to quickly build a short list of those not to engage with.
I suppose it’s not hard to believe that security is not a primary concern, but given all the hype surrounding claims of “cloud is more secure than the enterprise,” it’s rather shocking to think that this sort of behavior is reflective of cloud providers.
Let’s see why.
The study qualifies its respondents as follows:
We surveyed 103 cloud service providers in the US and 24 in six European countries for a total of 127 separate providers. Respondents from cloud provider organizations say SaaS (55 percent) is the most frequently offered cloud service, followed by IaaS (34 percent) and PaaS (11 percent). Sixty-five percent of cloud providers in this study deploy their IT resources in the public cloud environment, 18 percent deploy in the private cloud and 18 percent are hybrid.
…and offers these most “salient” findings:

- The majority of cloud computing providers surveyed do not believe their organization views the security of their cloud services as a competitive advantage. Further, they do not consider cloud computing security one of their most important responsibilities and do not believe their products or services substantially protect and secure the confidential or sensitive information of their customers.
- The majority of cloud providers believe it is their customers’ responsibility to secure the cloud, not their own. They also say their systems and applications are not always evaluated for security threats prior to deployment to customers.
- Buyer beware: on average, providers of cloud computing technologies allocate 10 percent or less of their operational resources to security, and most do not have confidence that customers’ security requirements are being met.
- Cloud providers in our study say the primary reasons customers purchase cloud resources are lower cost and faster deployment of applications. In contrast, improved security or compliance with regulations is viewed as an unlikely reason for choosing cloud services. The majority of cloud providers in our study admit they do not have dedicated security personnel to oversee the security of cloud applications, infrastructure or platforms.
- Providers of private cloud resources appear to attach more importance to, and have a higher level of confidence in, their organization’s ability to meet security objectives than providers of public and hybrid cloud solutions.
- While security as a “true” service from the cloud is rarely offered to customers today, about one-third of the cloud providers in our study are considering such solutions as a new source of revenue sometime in the next two years.
Ultimately, CA summarized the findings as such:
“The focus on reduced cost and faster deployment may be sufficient for cloud providers now, but as organizations reach the point where increasingly sensitive data and applications are all that remains to migrate to the cloud, they will quickly reach an impasse,” said Mike Denning, general manager, Security, CA Technologies. “If the risk of breach outweighs potential cost savings and agility, we may reach a point of ‘cloud stall’ where cloud adoption slows or stops until organizations believe cloud security is as good as or better than enterprise security.”
I have so much I’d like to say with respect to these summary findings and the details within the report, but much of it I have already said elsewhere. I don’t think these findings are reflective of the larger cloud providers I interact with, which is another reason I would love to see who these “cloud providers” were beyond the breakdown of their service offerings that was presented.
In the meantime, I’d like to refer you to these posts I wrote for reflection on this very topic:
Cloud Computing, Open* and the Integrator’s Dilemma
Last Updated on Monday, 12 March 2012 01:27 Written by Celframe Security Team Wednesday, 5 September 2012 02:37
My esteemed co-tormentor of Twitter, Christian Reilly (@reillyusa), did a fantastic job of describing the impact — or more specifically the potential lack thereof — of Facebook’s OpenCompute initiative on the typical enterprise as compared to the real target audience: service providers and the manufacturers of equipment for service providers:
…I genuinely believe that for traditional service providers who are making investments in new areas and offerings, XaaS providers, OEM hardware vendors and those with plans to become giants in the next generation(s) of Systems Integrators that the OpenCompute project is a huge step forward and will be a fantastic success story over the next few years as the community and its innovations grow and tangible benefits emerge.
I think Christian has it dead on; the trickle-down effect — large service providers leveraging innovation in facilities and compute construction to squeeze maximum cost efficiencies (based on power, density, cooling, and space efficiency) from their services — will be good for everyone. But it’s quite important to recognize why and how:
…consider that today’s public cloud services and co-location providers are today’s equivalent of commercial airlines, providing their own multi-tenant services, price structures and user experiences on top of just a handful of airframe and engine manufacturers. OpenCompute has the potential to influence the efficiency and effectiveness of those manufacturers by helping to contribute towards ideas and potentially standards that can be adopted across the industry.
Specific to the adoption of OpenCompute as an enterprise blueprint, he widened the bifurcation between “private clouds operated by service providers as public clouds” (my words) and “private clouds operated by enterprises for their own use” with a telling analog:
Bottom line? For today’s large corporate IT shops (those who either have, or will continue to operate, on-premise or co-located “private cloud” environments), the excitement levels around the OpenCompute project (if anyone actually hears of it at all) will be all-too-familiarly low. Sadly, to wake some of these sleeping giants, it will take more than a poke from the very same company whose website their IT teams are trying to prevent employees from accessing.
This is the point of departure for OpenCompute — it’s not framed for or designed for enterprise consumption. In an altogether fascinating description of why Facebook open-sourced its data center design, the Huffington Post summarized it thus:
“[The Open Compute Project] really is a big deal because it constitutes a general shift in how we look at technology as a competitive advantage,” O’Grady said. “For Facebook, the evidence is piling up that they don’t consider technology to be a competitive advantage. They view their competitive advantage in the marketplace to be their users.”
Here we see the general abstraction of technology in line with Nick Carr’s premise that “IT Doesn’t Matter”:
“Sharing its blueprints may gain Facebook not only free manpower, but cheaper equipment. The company’s bet, analysts say, is that giving away intellectual property will help it foster an ecosystem of competing vendors that will drive down the cost of parts.”
With that in mind, I am just as worried about the fate of OpenStack and its enterprise versus service provider audience and how it’s being perceived as they watch the mad scramble by tech companies to add value and get a seat at the table.
Each of these well-intentioned projects is curated by public cloud operators and technology vendors and is indirectly positioned for the benefit of enterprises, but not really meant for their consumption — at least not unless enterprises are willing to end up right back where cloud computing was supposed to let them escape from in the first place: the integrator’s dilemma.
If you look at the underlying premise of OpenStack — its modularity, flexibility and open design — what you get is the ability to craft a solution finely tuned to an operating environment of your own design. Integrate solutions into the stack as you see fit. Contribute code. Develop an ecosystem. Integrate, manage, maintain…
This is as much a problem as it is a solution for an enterprise. It is why, in many cases, enterprises choose a single vendor with a single neck to choke in order to avoid having to act as an integrator in the first place, or simply outsource to one or more public cloud providers and sidestep the problem entirely.
Chances are, most are realistically caught somewhere in the nether-regions between the two.
I wish to make it clear that I am very much a proponent of Open*, but I recognize that the lack of direct enterprise involvement in standards bodies and “open” initiatives — along with a reluctance to share information and experience for fear of losing competitive advantage — is what drives enterprises to Closed* in the first place. They want to lessen their development and integration burdens, and the Lego erector-set approach in many ways scares conservative, risk-averse CxOs away from projects like this.
I think this is where we’ll see more of these “clouds in a box” being paired with managed services to keep it all humming, regardless of where it lives. [See infrastructure solutions from Dell, VCE, HP, Oracle, etc., paired with “Cloud” distributions layered atop.]
Let’s hope we see enterprise success stories built on leveraging OpenCompute and OpenStack…it will be good for all of us.
Update: I just saw that my colleague, James Urquhart, wrote a blog post titled “Cloud disrupts, creates channel opportunities” in which he details the channel’s role in this integration challenge. Spot on.
FYI: New NIST Cloud Computing Reference Architecture
Last Updated on Monday, 12 March 2012 01:20 Written by Celframe Security Team Saturday, 21 July 2012 01:52
In case you weren’t aware, NIST has a wiki for collaboration on cloud computing. You can find it here.
They also have a draft of their v1.0 Cloud Computing Reference Architecture, which builds upon the prior definitional work we’ve seen before and has pretty graphics. You can find it here (dated 3/30/2011).