IT Management

Hardly a day goes by when I don’t receive five to 10 messages from vendors, analysts, and news publications pushing the latest wares for “clouds,” “server virtualization,” and “Big Data.” My reactions to these “new technology memes” range from a sense of déjà vu to one of fatigue at having to separate the kernels of truth from the chaff of disinformation.

Take, for example, the idea of a cloud. Clouds, when originally promoted by IBM a few years ago, referred to temporary locations in cyberspace where geographically disparate individuals could meet to share resources, knowledge, and applications. Sort of GoToMeeting on steroids. When a project was complete, its resources were returned to a pool awaiting reallocation to the next one.

Since then, clouds have taken on a life of their own. The terminology and acronyms of cloud speak are fuzzy, but the concepts are strangely familiar. They’re part service bureau computing (an idea that mostly failed in the late ’80s), part Application Service Provider, or ASP (an idea that mostly failed in the late ’90s), and now mainly refer to services or infrastructure resources stood up on a network to augment or replace data center operations.

Based on past experience, I expect the cloud craze will pass, leaving behind one or two success stories, just as service bureau computing left behind payroll check processing services and the ASP phenomenon left Salesforce.com as one of its few remnants. Both successes reflected a truth I hold as unalterable: While you can outsource a routine task or workload, you’re much less likely to be satisfied with the results of outsourcing a problem task or workload.

When it comes to server virtualization, again I see an old idea recast as something new. Architecturally, it might make sense to abstract software away from hardware as hardware becomes commoditized. I get that, but the problem with x86 hypervisor computing is the lack of attention paid to the systemic consequences of consolidating many servers and their workloads onto a much smaller amount of hardware.

We’re starting to see the fallout from planners’ failure to anticipate the impact of guest machine consolidation on LAN and storage I/O patterns, and in turn on application performance. The resulting poor application performance (and high cost) helps explain the technology’s weak penetration (only 17 to 20 percent of servers are virtualized today, well off vendors’ original target market projections) and the backsliding (current surveys report that companies are abandoning server virtualization initiatives when they are less than 20 percent complete).
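
To see why consolidation bites, consider a back-of-envelope sketch in Python. Every number in it is hypothetical (the server count, per-server I/O rate, burst multiplier, and the shared path’s capacity are assumptions for illustration, not figures from any survey), but it shows how modest per-server I/O adds up once twenty guests share one host’s storage path:

    # Back-of-envelope sketch: aggregate storage I/O after consolidating
    # many lightly loaded physical servers onto one virtualization host.
    # All numbers are hypothetical assumptions, chosen for illustration.
    PHYSICAL_SERVERS = 20               # guests consolidated onto a single host
    AVG_IOPS_PER_SERVER = 400           # modest steady-state I/O per server
    PEAK_MULTIPLIER = 3                 # workloads tend to burst together (backups, logons)
    SHARED_PATH_CAPACITY_IOPS = 12_000  # assumed capacity of the host's shared storage path

    steady_iops = PHYSICAL_SERVERS * AVG_IOPS_PER_SERVER
    peak_iops = steady_iops * PEAK_MULTIPLIER

    print(f"Steady-state aggregate demand: {steady_iops} IOPS")
    print(f"Correlated peak demand:        {peak_iops} IOPS")
    print(f"Shared path capacity:          {SHARED_PATH_CAPACITY_IOPS} IOPS")

    if peak_iops > SHARED_PATH_CAPACITY_IOPS:
        ratio = peak_iops / SHARED_PATH_CAPACITY_IOPS
        print(f"Peak demand is {ratio:.1f}x the shared path: queuing, latency, unhappy users")

On those assumed numbers, twenty unremarkable servers overwhelm the shared path the moment their peaks line up, which is exactly the systemic consequence the capacity planners missed.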

Is server virtualization ready for prime time, or even a wise approach given the next computing innovation—application on demand—that seems to be coming down the pike? Early adopters confide they’re worried about all the money they’re spending on server virtualization projects. Will unstable and insecure hypervisor technology lead to a computing debacle? Will it already be a dinosaur technology by the time it’s deployed?

What would happen if companies spent the money they’re throwing at virtualization and clouds on improving the manageability of their physical infrastructure instead? Unfortunately, multi-million-dollar marketing campaigns aimed at non-technical decision-makers have succeeded in exploiting a rift that has opened up in too many companies between senior management and IT management. The CFO or CEO reads an article in Forbes about the miracle of clouds and asks the IT bosses when they’re going to unplug their hardware and lay off staff.

I don’t need a Big Data application to tell me this is a wrong-headed approach. In fact, I’m not really sure I understand what Big Data is. To some, it refers to the data deluge. To others, it’s shorthand for a magical analytical process used to find tiny information needles in a hugely complex, multi-dataset haystack. None of the examples provided by Big Data advocates corresponds to any requirement I have in my shop. Do I really need to spend millions on technology to find hidden demographics that will help me sell five more boxes of doughnuts? I want a cost/benefit justification for the technology; otherwise, it looks suspiciously like the business intelligence apps peddled to me over the last decade.
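
For what it’s worth, the justification I keep asking for doesn’t require anything exotic. A minimal sketch, using entirely made-up figures (the platform cost, running cost, and claimed incremental margin are all assumptions, not vendor quotes), is enough to show whether a proposal ever pays for itself:

    # Hypothetical break-even sketch for a proposed analytics platform.
    # Every figure is an assumption for illustration, not a real quote.
    platform_cost = 2_000_000        # licenses, hardware, integration
    annual_run_cost = 400_000        # staff, support, storage growth
    claimed_annual_margin = 600_000  # vendor's promised extra profit

    annual_net_benefit = claimed_annual_margin - annual_run_cost
    if annual_net_benefit <= 0:
        print("At these numbers, the project never pays back.")
    else:
        payback_years = platform_cost / annual_net_benefit
        print(f"Payback period: {payback_years:.1f} years")

At those made-up numbers the payback runs a decade, which is the kind of arithmetic I’d want to see before the doughnut demographics project gets funded.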

When we implement new technology, isn’t it supposed to solve a problem—preferably different from one created by the new technology itself? Otherwise, we risk engaging in a cycle of changing dogs just to meet new fleas.