Recently, I gave a talk on “hybrid mainframing,” which raised some eyebrows. A few folks were confused by my use of the term hybrid.
Was I borrowing the term from trendy automobile commercials? You know the ones, where electric car owners smugly explain how they’re reducing their carbon footprint by abandoning gasoline in favor of dirty electrical power (40 percent of electricity is generated by dirty coal, after all). Was I talking about some sort of new application of chimeric gene splicing: taking strands of DNA from different plants or animals and combining them to produce a Frankenstein chicken that lays more eggs, or a Syfy Channel strawberry plant monster that produces much more fruit while inadvertently turning insects into behemoths?
I tried to assuage their concerns by explaining what I meant this way: First, everything changes. When I started my career, we called what we did “data processing,” not “information technology.” The former connotes a useful work process that’s immediately understood; we crunch numbers to produce information. The latter describes a domain of interest; there’s no such thing as information “technologizing.” IT was a hybrid term that described less what we did than what we did it with. That was an application of hybridization gone terribly wrong. No wonder senior management can’t figure out whether IT matters!
My meaning was different. It referred to a need to innovate to keep mainframing viable and relevant. I pointed to the four problems that always seem to creep into discussions and articles about mainframes:
- The mainframer confronts a proliferation of software utilities with too little time to master any of them and too few fingers to operate them all.
- Mainframe staff sizes have shrunk, and domain boundaries that used to exist have blurred substantially.
- New workloads from the distributed side of the house are finding their way into mainframe LPARs or are being connected for centralized mainframe management and control on zEnterprise blade servers.
- New mainframers aren’t being produced in numbers sufficient to replace the current crop of aging Big Iron sysprogs, DBAs, and admins.
To cope with these challenges, we need to start innovating—which, of course, translates to borrowing concepts, processes, and technologies from anywhere we can to improve the mainframe experience. I’m not just talking about bolting on a pretty graphical user interface over a bland, text-based user interface that’s “just so day before yesterday.” This has been tried before.
What’s really needed is a workspace where we can pull together the disparate and non-integrated utility apps we’re using to handle both traditional and new workloads. Also, we need to capture what we do into structured workflows that will help us document our new “blended roles” more readily and transfer knowledge more effectively to the new mainframer. A lot of folks are working on this right now—from IBM to CA Technologies, BMC Software, and a lot of the other advertisers in this magazine.
The good news is there seems to be a growing recognition among mainframe software vendors I talk to that hybridization is the only way. We can disparage distributed and mobile computing all we want, but just look at iPhone and iPad sales over the past year! Clearly, there’s something to simplifying the technology experience overall, and nowhere is it needed more than in the mainframe world.
To mention one example, I recently had a briefing on CA Technologies’ latest version of Mainframe Chorus. One thing that jumped out at me was an easy-to-overlook feature that operators use to organize the workflows associated with the tasks they perform. Behind this innocuous tab will ultimately reside a new definition of the role of the DBA (and later the storage administrator, the security administrator, etc.), customized for the company where the person works.
Personal computing meets the mainframe. Now there’s a hybrid mainframe experience!