Virtualization continues to be a buzzword on the computing scene, and will remain so as its usefulness and economies continue to be proven in the enterprise data center. Last year, market analyst firm Gartner predicted that virtualization would be the highest-impact trend changing infrastructure and operations through 2012. According to Gartner, virtualization “transforms how IT is managed as well as how it is bought, deployed, planned, and charged.” It has already impacted competition among infrastructure vendors.
According to Gartner, the leading edge of this change is server virtualization, which promises to unlock much of the underutilized capacity of existing server architectures. It is also reshaping the traditional role of the operating system.
“Essentially, virtualization creates a fork in the road for operating systems,” says Thomas Bittman, vice president and distinguished analyst at Gartner. “Traditionally, the operating system has been the center of gravity for client and server computing, but new technologies, new modes of computing, and infrastructure virtualization and automation are changing the architecture and role of the operating system. The days of the monolithic, general-purpose operating system will soon be over.”
Virtualization Not New
Science fiction writer Robert A. Heinlein once wrote, “When it’s time to railroad, you railroad.” This glib statement means that as a technology evolves, there comes a point where exploiting it becomes the obvious way to operate. Virtualization has reached that point.
For the mainframe community, virtualization isn’t a new concept. One of the early works in the field was a paper by Christopher Strachey titled “Time Sharing in Large, Fast Computers,” presented in June 1959 at a United Nations Educational, Scientific and Cultural Organization (UNESCO) conference. IBM began exploring virtualization in the mid-1960s with its CP-40, M44/44X, and CP/67 research systems. These led to the commercial deployment of the virtual machine concept in the VM/370 product.
In the ’80s and early ’90s, the industry moved from leveraging monolithic mainframes to running collections of servers, and the virtualization concept became less prominent as a result. That changed in 1999 with VMware’s introduction of VMware Workstation. This was followed by VMware’s ESX Server, which runs on bare metal and doesn’t require a host operating system.
“Virtualization is hardly a new concept; storage has already been virtualized—albeit primarily within the scope of individual vendor architectures—and networking also is virtualized,” says Philip Dawson, vice president and distinguished analyst at Gartner. “. . . traditional IT infrastructure orthodoxy is being challenged and is changing the way business works with IT.”
A 2009 survey conducted for SHARE, the world’s largest association of corporate users of enterprise information technology, found that most respondents recognize the advantages enterprise virtualization can deliver and are preparing to move forward with it. The SHARE survey of 388 professionals, drawn from both the technical staff and management sides of companies, focused on z/OS (85 percent of respondents), with the rest divided among Windows, Linux, and other UNIX flavors.