The term “modernization” has again attained buzzword status in the IT industry. Nearly every major software vendor uses it in its marketing, and major analyst firms such as Gartner and Forrester are devoting more ink to it. It’s no wonder. CIOs surveyed by Gartner in 2008 reportedly ranked “legacy application modernization” fourth overall among their technology priorities. But like any other term with billions of dollars swimming around it, modernization, especially legacy modernization, has taken on some unexpected connotations.
No organization is concerned with modernization unless it believes it has IT systems that have been in service long enough to need modernizing. Vendors and stakeholders pushing for major change often call these “legacy systems,” but many of the people who operate the systems in question consider the term pejorative. It is more accurate to call them what they often are: the core systems of an organization.
Core systems are those custom-built systems that support the vital, high-volume data processes unique to a particular enterprise. Examples include a large financial institution’s massive stock-trading system, an airline’s system for coordinating all its flight crew schedules in real time, or a system that manages cargo at a busy port. These systems often comprise millions of lines of code and can support billions of transactions per day. Off-the-shelf software packages have fallen seriously short when attempting to support these unique, critical functions.
While it might surprise some people, most of these core systems run on mainframe computers. It is a commonly accepted estimate that between 70 and 80 percent of the world’s data resides on the mainframe. Entire national governments are run on mainframe systems.
With few exceptions, no technology can match the combined performance, reliability, security, and efficiency the mainframe can provide. But there’s a more fundamental reason these custom-coded core systems haven’t been replaced. Packaged software works best for a function that doesn’t differentiate the organization. In areas such as human resources or accounting, it’s fine if one’s competitor has purchased the exact same system from the same vendor. These packaged systems are standard across a given industry. Core systems, on the other hand, are one of a kind and often give companies a competitive advantage.
Proprietary systems are part of an organization’s DNA. Within core systems, which have been added to and modified for years or even decades, is data that represents the history of how the organization has evolved. These systems embody the unique combination of data and business rules that differentiate an organization and help maintain its competitive edge.
But DNA isn’t perfect. Defects and flaws creep in over time. It isn’t enough to simply preserve the core systems; they must be enhanced to help the organization adapt to new challenges.
There are three areas where custom-built core applications typically fall short:
1. Green-screen interfaces
2. Data silos
3. System inflexibility
Part of the secret to core systems’ longevity is that they are reliable, high-performing, and secure. These traits, however, are often lost on business users, who typically see only the screen interface. The technology behind core system interfaces has often been around as long as the systems themselves, leaving these interfaces many generations behind those of PC and distributed applications.