Operating Systems

The Development of z/OS System Integrity

5 Pages

When was the last time you heard of a virus, worm, or other malware infecting z/OS? Chances are you’ve experienced far fewer security problems with System z machines than with any other platform you’ve worked with. But you’ve probably also heard it said that if z/OS were as popular as Windows and exposed to the Internet, it would have just as many security issues. Those who make such statements would have you believe “Patch Tuesday” is just the price one pays for success.

While it’s true the Internet is a hostile environment that poses special challenges for system designers, the superior stability and security of z/OS isn’t marketing fluff; it’s real. Ethical hacking experiments have confirmed that z/OS is highly resistant to malicious attacks. Consider the unique technology provided by mainframes and how IBM’s developers used it to avoid the problems that plague other platforms. The secret lies in how z/OS interacts with System z hardware to provide system integrity.

Integrity vs. Security

System integrity differs from system security. Security is concerned with who gets a key to the lock. Integrity is concerned with ensuring there’s no way to bypass the lock. For IBM mainframes, system integrity became a major design goal in the early ’70s, long before the Internet was developed. In those days, hacking was confined to local terminals and a few dial-up connections, and viruses were spread by floppy disk. Although mainframes were largely unaffected by all this, storm clouds were gathering on the horizon. Anticipating the hostile world to come, IBM’s developers abandoned their old approach of “let’s prevent honest mistakes and accidents” and adopted a new, far more ambitious goal of “let’s protect the system from hostile users and malicious attacks.” This represented an astonishing, unprecedented change in corporate culture.

Research was conducted to determine how operating system integrity exposures occurred. As W. S. McPhee reported in his landmark 1974 IBM Systems Journal article, “Operating System Integrity in OS/VS2,” the developers found they could classify exposures into these major categories:

  • System data in the user area
  • Non-unique identification of system resources
  • System violation of storage protection
  • User data passed as system data
  • User-supplied address of protected control blocks
  • Concurrent use of serial resources
  • Uncontrolled sensitive system resources

The design practices and procedures that might result in one of these exposures were identified and eliminated. Then a massive effort was launched to find and fix all known problems. Can you appreciate the difficulty of that little word “all”? Some exposures were timing-dependent; others were so obscure that the likelihood of exploitation was vanishingly small. The decision to spend time and money correcting something that had almost no chance of ever happening had to be difficult to make. Yet the developers successfully argued that, unless all exposures were corrected, no one could be confident that today’s obscure exposure wouldn’t become tomorrow’s gaping hole.

An historic decision was made to fix everything IBM knew about, without regard to the probability of its occurrence. The word came down,

“Just do it … ”

The IBM Integrity Statement

Then the other shoe dropped: IBM also agreed to try to fix anything anybody else could find! In Software Announcement P73-17, dated Feb. 1, 1973, for VS2 Release 2, IBM formally defined system integrity and committed the company to accepting Authorized Program Analysis Reports (APARs) for integrity exposures found by customers. The famous “IBM integrity statement” was, and still is to my knowledge, unprecedented in the industry. It also speaks volumes about the confidence IBM has in its people and products. What that means in 2006 is that just about every conceivable problem in core operating system functionality has been found and fixed. Of course, z/OS and System z continue to evolve and change as new features are added, but 30 years of a “zero defects” system integrity policy have given developers an unusually robust foundation upon which to build new things.
