This week a fellow blogger on the cloud posed some interesting questions around compliance that highlighted how poorly this area is understood when it comes to the cloud and virtualization - across desktops, apps, and to some degree servers.
Compliance is an interesting element in its own right, with many twists and turns depending on the industry (healthcare, financial services, manufacturing, etc.), the type of company, what technology is in place, whether that technology is actually used in a way that adheres to COBIT, and - for outsourcing - what controls the outsourcer has put in place and whether those controls would pass a SAS 70 audit.
Yes - SOX does say that the CXO will go to jail if they do not adhere to proper controls and conform to the standards identified by NIST. Truth be told, very few have actually gone to jail: although several companies (527 in the first year, according to the IT Governance Institute) have reported material weaknesses, their CXOs have not seen much in the way of jail time. The real teeth of SOX lie in having to disclose in a public place like the Wall Street Journal - the impact on the stock price is a much bigger driver. Companies typically have time to clean up their act and fix the material weakness. The act itself is very ambiguous and doesn't actually define all the components; it leaves that to NIST and COBIT (not to mention additional flexibility for auditors) to deem whether a company is in compliance. It is the system - manual or automated - that enables compliance, not the technology.
Having said that, who has ownership, and how do you determine compliance in the cloud? Many of the compliance factors - whether SOX, HIPAA, PCI, GLB, etc. - have already been factored into MSP and outsourcing models and are part of SAS 70 audit controls, at least for physical systems. Otherwise, companies like Salesforce.com, Amazon, etc. would have a difficult time maintaining their services given the sensitive data they handle.
The real gap that needs to be thought through for the cloud is what the newer technologies that enable it - like virtualization - do to the traditional controls used to maintain compliance, and how the lack of understanding of those technologies impacts companies' ability to deploy them fully. At my previous company, ITPI and I worked on research in this area across several different companies - interviewing everyone from CXOs to operations staff to really understand the gaps.
We recorded an introductory webcast on this topic:
ITPI is targeted to release the overall study - Kurt Milne, copied here, can provide more insight on the details. I must say it is a real eye opener, and a significant area where quite a bit of work needs to be done. www.itpi.org
The real concern is that standards such as COBIT and the Common Information Model (CIM, SMASH, DASH, etc.) are based on the physical world and were created without virtualization in mind. The DMTF is adding virtualization to CIM, but there is still quite a bit to be done from a backend systems perspective around virtual apps, desktops, and servers to ensure compliance is maintained.
In some ways virtualization poses more risk to existing controls - particularly around security - and in other ways it makes new controls possible. The key is understanding what those risks are, the architecture (not all are created equal), the ways to work around them, and what can be deployed versus what cannot based on the application, the oversight required, etc. Companies work around this today - so it is also possible in the cloud.
The key here is that while everyone is trying to define this new market, it is critical to understand the current physical paradigm - the processes, the controls - and how we impact them before creating the solution. Clearly, as with all new paradigms and markets, there is quite a bit for all of us to define, educate each other on, and understand before jumping in.