Operating Systems

“Hindsight is 20/20” is typically one of those things we say with a sigh or a shoulder shrug. We zigged when we should have zagged. Period. There’s no intelligence quotient—no Mensa—when it comes to hindsight.

Making bold predictions on a public platform can make for a gut-churning, “in-hindsight-maybe-I-shouldn’t-have-said-that” realization—or an outright mea culpa. Take outspoken columnists who like to prognosticate, for example. Mind you, it’s good to be controversial, as it gets people talking and gives your message a longer life. That’s especially important in today’s world, where the average human attention span is about eight seconds—one second shorter than a goldfish’s. No joke. But sometimes columnists can be completely off base for any number of reasons, from misreading the proverbial tea leaves to letting their own biases cloud their perspective.

The latter appears to be the case with Stewart Alsop. In 1991, he predicted the last mainframe would be unplugged in 1996. His mea culpa came when 1996 arrived and mainframes were still humming away, supporting the world’s leading organizations. It turned out that it wasn’t the technology he disliked so much as the culture of Information Systems (IS).

IT analyst and InfoWorld columnist Cheryl Currid gave the mainframe a bit more time than Alsop did, predicting the last one would be decommissioned on New Year’s Eve 1999. Labeling COBOL programming as “dead-end,” she advised developers to shelve their years of experience programming in COBOL and learn fourth-generation programming languages (4GLs) such as PowerBuilder. She even encouraged IS managers to tell their local colleges and universities to stop teaching COBOL altogether.

A few years later, USA TODAY printed the story, “Mainframe? We don’t need no stinking mainframe.” The article didn’t predict when the mainframe would become obsolete per se, but it did talk about a new idea called utility computing, which was to—you guessed it—replace the mainframe someday.

Utility computing harnessed many computers to work together on a single problem. The idea was that one or more computers could be taken offline without affecting the work being done. Jobs come and go, the work gets done, and should something bad happen to one machine, the rest carry on unaffected. These are the very attributes that have given the mainframe its reputation for being reliable and highly available.

As for utility computing, the most famous example is the SETI@home project, which processes data collected by radio telescopes on volunteers’ Internet-connected computers. This, according to the article, was to be the next big thing in computing. Microsoft even invested a whopping $1 million in the idea, which ultimately would become “the cloud.” Amazon first introduced the cloud as we know it in 2006 as a way to sell its spare computing capacity. The cloud hasn’t replaced the mainframe; in fact, the mainframe has become a part of the cloud.

Alsop and Currid clearly never dreamed the mainframe would celebrate its 50th anniversary, and that IBM and other companies, mine included, would be working with colleges and universities to train the next generation on z/OS, COBOL and mainframe technologies. The mainframe isn’t only alive, it’s thriving—worldwide. Young people are also opting for mainframe careers. Having worked with our Chinese customers, my first observation was how young their mainframe staff is compared to their U.S. counterparts.

The last commentary I will discuss is a 2012 blog post by NASA’s then CIO, Linda Cureton, in which she proudly proclaimed they had unplugged their last mainframe. The mainframe at NASA was used to put Neil Armstrong on the moon and return Apollo 13 safely after a catastrophic accident. Shuttle missions also depended on it. (Today, NASA relies on Russia for rocket engines as well as getting astronauts to and from the International Space Station. I hope this isn’t considered progress.)

The CIO was a former mainframe systems programmer and was generally complimentary toward the mainframe. However, there was one comment I took exception to: “The end-user interfaces are clunky and somewhat inflexible.” In fact, IBM and other mainframe software vendors, mine included, are putting graphical interfaces on mainframe tooling. You no longer need a 3270 emulator (i.e., green screen) to debug your COBOL program. Debuggers built on Eclipse allow users to edit and debug their code without ever having to see a green screen. Access to mainframe data is often through a distributed front end, where an MQ message, Simple Object Access Protocol (SOAP) message or Java Database Connectivity (JDBC) call to the mainframe returns the data requested by the end user.
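To make that last point concrete, here is a minimal sketch of what such a JDBC call pattern looks like. The host name, port and location name are hypothetical placeholders, and the actual connection is shown only as a commented template, since it would require the IBM JDBC driver and a live DB2 for z/OS subsystem:

```java
// Sketch: reaching mainframe data (DB2 for z/OS) through JDBC.
// Host, port and location below are made-up placeholders, not real endpoints.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MainframeQuery {

    // Assemble a type-4 JDBC URL of the form jdbc:db2://host:port/location.
    static String db2Url(String host, int port, String location) {
        return "jdbc:db2://" + host + ":" + port + "/" + location;
    }

    public static void main(String[] args) {
        String url = db2Url("zos.example.com", 446, "DSNLOC1"); // hypothetical subsystem
        System.out.println(url);

        // With the IBM Data Server Driver on the classpath, the call pattern is:
        //
        // try (Connection c = DriverManager.getConnection(url, user, password);
        //      PreparedStatement ps = c.prepareStatement(
        //          "SELECT NAME FROM SYSIBM.SYSTABLES FETCH FIRST 5 ROWS ONLY");
        //      ResultSet rs = ps.executeQuery()) {
        //     while (rs.next()) {
        //         System.out.println(rs.getString(1)); // data returned to the front end
        //     }
        // }
    }
}
```

From the end user’s perspective, none of this looks like a green screen: the distributed front end issues the call and simply renders the rows that come back.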

There will always be critics and those who prognosticate for sport, but people are now less likely to make bold proclamations about the death of the mainframe. A great deal has transpired since Alsop’s infamous statement. Many technologies—and companies—have come and gone, but in hindsight, the one constant has been the mainframe. The discussion shouldn’t be about how to get off the mainframe, but about how the mainframe can better integrate into a company’s existing applications.