Jun 23 ’09

IT Sense: Get Ready for the Regulators

By Editor, z/Journal

You don’t need a political science degree or an MBA from Harvard to know that additional regulations, particularly in the financial industry but also for all publicly traded firms, will shortly pass the legislatures of most developed countries. Accountability and transparency are in; abstractions and derivatives are out. And, as usual, the systems and networks that process companies’ data will shoulder most of the burden of change.

There’s nothing wrong with regulation, provided goals and objectives are clearly understood and business isn’t hamstrung in performing acceptable, aboveboard transactions. It’s widely accepted that better regulation and oversight might have helped prevent some of the behavior that led to our current problems. The problems with regulations usually lie in their implementation.

Staying on top of the latest regs is, in itself, a painful thing. If the experience of HIPAA and Gramm-Leach-Bliley (GLB) in the late ’90s, or of Sarbanes-Oxley a few years later, provides any useful guidance, we can be sure that laws will be written with sufficient blurriness to mandate change without providing many specifics about what exactly must be done or how to do it.

Making the confusion worse, on the heels of the next round of regs you will doubtless discover dozens of authoritative-sounding Websites offering insights and advice from consultants, lawyers, risk managers, auditors, and pundits who have turned regulatory interpretation into a cottage industry. As in the past, these authorities will differ in their interpretations of the rules and will contradict each other on nearly every point. This, in turn, will make the development of a compliance strategy nothing short of an exercise in hair-pulling, assuming you have any hair left from the previous regulatory wave.

Confusing national laws will doubtless be augmented by equally confusing state and local laws. HIPAA, for example, requires healthcare service providers to retain patient data for seven to 10 years, but certain states require that data be retained for as long as the patient is alive. GLB required companies to stand on a soapbox and publicly flog themselves while confessing that a bunch of tapes containing private customer data fell off the back of the Iron Mountain truck, the thought being that fear of embarrassment would compel companies to secure the data entrusted to them. California decided that a bigger stick was needed and passed a law of its own forcing companies to contact and personally apologize to each Californian whose individual data was disclosed. The latter dramatically increased the cost of disclosure beyond any measure dreamed up by Phil Gramm, Jim Leach, or Thomas Bliley.

And you don’t need a crystal ball to predict that the new regulatory requirements will demand two things of IT: tighter classification of data and tighter controls over data access. The former goes well beyond the relatively non-granular data classification scheme we’ve used in System Managed Storage (SMS) these past years. Data classification can no longer be regarded as a simple tag that identifies what data needs to be provisioned for how long on which DASD volume before it can be migrated to tape or optical. And storage class is no longer just a tag that identifies which box of spinning rust should be targeted by a write from an application or end user. In the same vein, Hierarchical Storage Management (HSM) is a capacity management activity rather than a compliance activity, and the days of treating data as collections of anonymous bits are rapidly coming to an end.
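To make the contrast concrete, here is a minimal sketch of the difference between a coarse, SMS-style storage tag and the richer, business-focused classification this column anticipates. All names here are hypothetical illustrations, not any vendor’s actual product or API:

```python
from dataclasses import dataclass, field
from datetime import date

# The old model: a coarse tag that mostly answers "where does this data
# live, and when may HSM migrate it down the hierarchy?"
@dataclass
class LegacyStorageClass:
    name: str                 # e.g., "SCPROD01"
    device_tier: str          # which box of spinning rust takes the write
    migrate_after_days: int   # when the data may move to tape or optical

# The emerging model: business-focused attributes attached at the point
# of creation, which downstream services can read and act on.
@dataclass
class GovernanceClassification:
    business_unit: str                                    # who owns the data
    regulations: list[str] = field(default_factory=list)  # e.g., ["SEC 17a-4"]
    retain_until: date | None = None                      # retention or legal hold
    allow_dedup: bool = True          # may capacity optimization touch it?
    access_logging: bool = False      # must every access be logged?
```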

The next frontier will be the application of information governance policies to data at point of creation, involving highly granular, business-focused classification techniques. These will need to map to services provided in the IT infrastructure, whether in the form of third-party software, as functions of an operating system, as network-hosted processes, or as capabilities embedded on storage array controllers. Case in point: One of my financial clients needs to retain data for business reasons, but doesn’t want to expose certain data required by the SEC to deduplication services for fear it will conflict with SEC rules about providing only “full and unaltered copies” of data. This client will require a way to identify which data to segregate from the deduplication functionality provided on his storage arrays and a policy-based mechanism for routing data paths around this service.
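A minimal sketch of what such a policy-based bypass could look like, continuing the hypothetical GovernanceClassification above. The write_raw and write_deduped paths are illustrative stand-ins, not any array vendor’s actual interface:

```python
def route_write(data: bytes, classification: GovernanceClassification, array) -> None:
    """Route a write around deduplication when policy forbids it.

    `array` stands in for a storage target exposing two hypothetical
    paths: a capacity-optimized (deduplicating) pool and a raw pool
    that preserves full, unaltered copies.
    """
    # Segregate SEC-regulated data entirely from the dedup service.
    must_bypass = (not classification.allow_dedup) or any(
        reg.startswith("SEC") for reg in classification.regulations
    )
    if must_bypass:
        array.write_raw(data)       # full-fidelity copy, untouched by dedup
    else:
        array.write_deduped(data)   # normal capacity-optimized path
```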

As for tighter management controls on data access, expect access event logging to become commonplace. Most leading mainframe software vendors are pursuing technologies in this space today, and a few are shipping first-generation products. Basically, you will need a way to ensure that data paths through services are compliant and that accesses made to data anywhere along the path are copacetic with the compliance rules; that means event logging, reporting, and alerting. Prepare for this additional workload in your shop: It’s much closer than clouds and other current panaceas.
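By way of rough illustration only, access-event logging and alerting might boil down to something like the sketch below, again reusing the hypothetical classification from earlier. The field names and the alert condition are assumptions, not any shipping product’s format:

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("access-audit")

def record_access(user: str, dataset: str, action: str,
                  classification: GovernanceClassification) -> None:
    """Log each access along the data path and alert on policy violations."""
    event = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "dataset": dataset,
        "action": action,
    }
    audit.info("access event: %s", event)

    # Alerting: a write against data under an active retention hold is
    # exactly the kind of event compliance rules care about.
    today = datetime.now(timezone.utc).date()
    if action == "write" and classification.retain_until and \
            classification.retain_until > today:
        audit.warning("ALERT: write to retained dataset %s by %s", dataset, user)
```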