Automation vs. Quality in Application Modernization

As applications of the ’70s and ’80s become obsolete, more IT shops have queued application modernization projects. These projects are complex, costly, and risky, which makes some degree of automation in the transformation process attractive. Looking only at the costs, rewriting the application from scratch in a new environment, without any automation, would entail at least as much effort and cost as the original development. Such costs are prohibitive, and the extended development time may render the new incarnation of the application obsolete even before it reaches production. Automation is one way to address these difficulties.

Where can automated transformation play a role? Ideally, automation would be a black box that takes the legacy application written in the ’70s as input and produces the new, modern application as output. In practice, even such an extreme scenario isn’t desirable, because there’s usually also an effort to change the technical platform and some of the functionality, adding new features and making the application more flexible. So we may settle for less than 100 percent automation of the transformation process, while keeping 100 percent as a measuring stick.

The Role of Preliminary Steps

Various automation methods may be employed with varying degrees of success. As a common denominator, all require the preliminary step of analysis, which helps determine the best strategies and gathers the information needed for transformation. An application clean-up is also useful; it simplifies the application and eliminates the dead wood that doesn’t need to be moved to the new platform.

Analysis: Analysis may be a vague concept, but in the context of automated legacy transformation, it must take more precise forms. A crucial factor is that the results of analysis must be captured in a form accessible to transformation software. If all the information is simply accumulated in a series of free-text documents and pictures, there’s little chance of using it as an input to the transformation process. Ideally, the results of analysis should be stored in a well-structured repository where they can be accessed programmatically.
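
To make “programmatically accessible” concrete, the sketch below shows one possible shape for such a repository. It is only an illustration, assuming SQLite and hypothetical table and column names, not the schema of any particular analysis tool:

    import sqlite3

    # A minimal sketch of a structured analysis repository, assuming SQLite and
    # hypothetical table names; real transformation tooling would use a richer
    # schema or a dedicated metadata store.
    def create_analysis_repository(path="analysis.db"):
        conn = sqlite3.connect(path)
        conn.executescript("""
            CREATE TABLE IF NOT EXISTS artifacts (   -- programs, jobs, screens, files
                name       TEXT PRIMARY KEY,
                kind       TEXT NOT NULL,            -- e.g. 'program', 'job', 'screen'
                loc        INTEGER,                  -- size in lines of code
                complexity INTEGER                   -- e.g. count of decision points
            );
            CREATE TABLE IF NOT EXISTS calls (       -- static call map: caller/callee pairs
                caller TEXT NOT NULL,
                callee TEXT NOT NULL
            );
            CREATE TABLE IF NOT EXISTS data_items (  -- record layouts used by programs
                program TEXT NOT NULL,
                item    TEXT NOT NULL,
                picture TEXT                         -- e.g. a COBOL PICTURE clause
            );
        """)
        conn.commit()
        return conn

Because the results live in ordinary tables rather than documents, later transformation steps can query them directly.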

A good analysis may reveal general information used to estimate the magnitude of the transformation project and to decide the best means to perform it. This includes: 

  • The size of the application and its individual components
  • The complexity of the programs
  • A classification of the artifacts among various categories
  • The interaction between the user interface, programs, and data. 

More detailed information could be used for the actual automation of the transformation process. Such detailed information may include: 

  • Data model, layouts, and data structures used in programs
  • Screen contents and layouts
  • Program syntax trees
  • Screen flows
  • Call maps. 

Having gathered all this data, the team in charge of transforming the application may choose the best transformation strategy. In addition, all the knowledge captured in a repository could be used in the next major step: code and data conversion. 

A great deal of analysis may be automated. There are tools capable of taking in the sources of the application and collecting all the data previously listed in an automated fashion. Such tools differ in their abilities to capture, store, and expose the information, so a legacy transformation team must choose the one best suited to the steps that follow.
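
To give a flavor of what such automated collection does, the sketch below scans COBOL-style sources for static CALL statements and rough size metrics and records them in the repository sketched earlier. The file extension, regular expressions, and metrics are simplifying assumptions; real analysis tools build full syntax trees and also handle dynamic calls, copybooks, JCL, and screen definitions:

    import re
    from pathlib import Path

    # Simplified, hypothetical source scanner: collects program size, a crude
    # complexity proxy, and static CALL relationships. Not a real product.
    CALL_PATTERN = re.compile(r"CALL\s+'([A-Z0-9-]+)'", re.IGNORECASE)
    DECISION_PATTERN = re.compile(r"\b(IF|EVALUATE|UNTIL|WHEN)\b", re.IGNORECASE)

    def collect(source_dir, conn):
        for src in Path(source_dir).glob("*.cbl"):    # assumed source file layout
            program = src.stem.upper()
            lines = src.read_text(errors="ignore").splitlines()
            complexity = sum(1 for line in lines if DECISION_PATTERN.search(line))
            conn.execute(
                "INSERT OR REPLACE INTO artifacts (name, kind, loc, complexity) "
                "VALUES (?, 'program', ?, ?)",
                (program, len(lines), complexity),
            )
            for line in lines:
                for callee in CALL_PATTERN.findall(line):
                    conn.execute(
                        "INSERT INTO calls (caller, callee) VALUES (?, ?)",
                        (program, callee.upper()),
                    )
        conn.commit()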

Clean-up: A legacy application that has grown organically for many years inevitably accumulates superfluous artifacts. These may include programs, jobs, or files that are no longer used, or even dead code inside programs that the application still uses. Most of the clean-up operation can be fully automated, saving time and resources in transformation projects.
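
As a small illustration of how part of this clean-up can be automated, the hypothetical report below reuses the repository sketched above to list programs that no other program calls and that are not known entry points (batch jobs, online transactions). Such candidates would still be reviewed before being excluded from the migration:

    # Hypothetical dead-artifact report built on the repository sketched earlier:
    # flags programs that appear in no call relationship and are not entry points.
    def unreferenced_programs(conn, entry_points):
        rows = conn.execute(
            "SELECT name FROM artifacts "
            "WHERE kind = 'program' AND name NOT IN (SELECT callee FROM calls)"
        ).fetchall()
        return [name for (name,) in rows if name not in entry_points]

    # Example usage (entry-point names are invented for illustration):
    # candidates = unreferenced_programs(conn, entry_points={"MAINMENU", "NIGHTBAT"})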
