IT Management

IT Sense: Critical Convergence in 2010

This has been an interesting year. In the business world in 2009, downsizing and rightsizing produced high “GDP per worker” numbers, suggesting that companies have shed all but the most productive people and are successfully running on much leaner operational models. As a corollary, mainframes have enjoyed a renaissance in companies that have largely deferred plans to abandon Big Iron in favor of presumably less expensive x86 platforms. In the x86 world, virtualization has been seized upon by a growing number of firms to shrink infrastructure, enable more applications to be managed by fewer people, and reduce energy consumption. The next evolution of virtualization, touted to be just around the corner, is “cloud computing,” which, depending on the vendor literature you read, will dramatically reduce IT costs and deliver broader benefits, ranging from solving world hunger to alleviating climate change. This all sounds pretty good, except ...

Worker productivity numbers conceal more than they reveal. In the IT shops I’ve recently visited, the fear of Friday pink slips may have abated, but an increase in stress levels is palpable as single individuals are carrying the workload once shared across five sets of shoulders. There’s a sense this isn’t a temporary situation, but the “new normal.”

Under the circumstances, such things as the H1N1 flu are particularly scary. There’s a profound sense that business IT operations have been stretched so thin that the absence of anyone for a week or two could create major havoc for the organization. Adding to the fragility of the situation, corporate disaster recovery planning staffs were among the first to be cut in many companies, despite surveys claiming that disaster recovery is a top priority for organizations. As a result, contingency plans are generally in poor condition and largely untested.

Further adding to the stress level is concern that “clouds”—this year’s conceptual darling of the IT analyst community and trade press—are gathering on the horizon. The technology professionals I talk to, many of whom have been using Google apps for several years and know their capabilities and limitations well, believe that clouds still aren’t ready for prime time. Lacking interoperability standards, real security guarantees, and enforceable service level agreements, contemporary clouds hardly seem to have resolved the key issues that led to the failure of Application Service Providers (ASPs) only a few years ago.

The bottom line: “Cloud architecture” remains vaporous. But that hasn’t stopped advocates from portraying clouds as the next big thing, and ultimately as a way for businesses to eliminate internal IT investments altogether. IT pros fear that the constant bombardment of senior managers with cloud woo will eventually make the idea a fixture in the minds of the Flashing 12s of the front office, who are already fond of stating that “IT isn’t a core competency of their business.” “Flashing 12” is an expression coined in Redmond a few years back to describe business managers who lack sufficient technical acumen even to program the clocks on their VCRs, which subsequently flash “12:00” forever, but who insist, nonetheless, on making deals directly with tech vendors without consulting their internal IT experts. It reflects a deeper divide that has been forming between business and IT managers in many companies for at least a decade, one that has led to the deployment of infrastructure that 1) doesn’t meet the needs of the business, 2) largely comprises a set of unmanageable stovepipes, and 3) in turn requires a larger administrative staff. Clouds are just the latest manifestation of this problem. Vendors are making a simplistic pitch to technically non-savvy business decision-makers that feeds their pre-existing view that IT no longer matters.

Perhaps the most telling critique of cloud computing can be derived from a brief conflict earlier in 2009 between Microsoft and IBM over the latter vendor’s “cloud manifesto,” which Big Blue described as an open standards-based approach to building cloud computing. Watching IBM and Microsoft argue over whose cloud was “more open” was the closest this industry ever came to realizing a skit from “Monty Python’s Flying Circus.” Absent open standards, clouds can’t be used in concert with one another; they are simply bigger stovepipes that can’t be managed in any sort of unified way. The IT folks I talk to worry that this bit of wisdom may not be understood by the front-office until it’s too late.

The new year will see a critical convergence of trends and ideas that will set the stage for the transformation of IT going forward. While I hope that in 2010 the front- and back-office will form a working alliance for the betterment of IT decision-making and company welfare, I worry about a different result: a convergence of marketecture and ignorance that will result in a lot of unfortunate business IT decisions that may well translate into non-recoverable failures for many companies.

The one bright spot, for now, is the mainframe. While IBM is talking about including mainframe computing in a cloud architecture, this will take time. Meanwhile, companies whose mainframes anchor them to a real computing architecture will likely be more stable than those that entrust their mission-critical IT to an architectural model that shifts with the breeze.