Jul 6 ’09

Ensuring Secure Data Transfers in the Mainframe World

by Editor in z/Journal

We all know the power and significance of the mainframe in the corporate and public sectors. Mainframes—secure, scalable, reliable and efficient—host more than 70 percent of the world’s critical business data and help solve many of today’s enterprise computing challenges.

This brings us to integration. Mainframes need to coexist with other platforms, including Windows, Linux, and UNIX, but how do companies tie their mainframes into all their distributed platforms? How does data get exchanged among the various systems and applications?

Demand for Enterprise Integration

Several factors drive the demand for mainframe integration:

• The requirement to link different business operations together by integrating various applications with systems such as Enterprise Resource Planning (ERP), Customer Relationship Management (CRM) and Supply Chain Management (SCM)

• The need to conduct business over the Web and support the transactions with legacy applications

• Regulatory compliance and corporate governance mandates

• Mergers and acquisitions that increase the need to bring together diverse operations.

Most companies have recognized the importance of integration, but their ability to adopt more advanced technologies and address their myriad file transfer challenges has been severely hampered by legacy issues. Several entrenched legacy technologies have led to hesitation or inaction when it comes to implementing more modern integration approaches.


By all accounts, the most widely used integration methods deployed today are based on File Transfer Protocol (FTP), a protocol that dates back to the early 1970s, with its current specification (RFC 959) published in 1985. For many years, organizations (or business units within organizations) have built custom integration solutions using FTP as the data transport mechanism. Unfortunately, FTP carries security risks and is somewhat lacking when it comes to automation, management, and control functionality. The result can be an integration solution that’s weak and inflexible, prone to security risks, and expensive to develop and maintain. More specifically, FTP:

• Lacks a reliable mechanism to confirm that a transfer completed successfully. In most cases, failed data transfers can’t automatically restart, leaving your developers writing scripts and applications to compensate for this lack of guaranteed delivery.

• Lacks automation capabilities and provides only a manual interface. Developers are left to their own devices to schedule or trigger file transfers. This kind of scripting requires expertise and can become costly and complex.

• Is difficult to integrate with other applications. Because there’s only a manual interface, developers need to do significant custom programming.

• Has weak control mechanisms, making it difficult to track and audit operations, identify problems, or generate reports.

• Suffers from weak security, with passwords and data sent in clear text.

• Can’t compress data, resulting in network bandwidth and performance degradation—especially with today’s larger file sizes.
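The compensating scripts mentioned above usually amount to a hand-rolled retry loop around an FTP client. A minimal sketch in Python, assuming hypothetical host and credential values, of the kind of wrapper developers end up writing and maintaining:

```python
import time
from ftplib import FTP, all_errors

def with_retries(action, attempts=3, delay=2.0):
    """Retry an unreliable operation -- the glue code FTP forces you to write."""
    last_exc = None
    for attempt in range(1, attempts + 1):
        try:
            return action()
        except all_errors as exc:
            last_exc = exc
            time.sleep(delay * attempt)  # crude backoff between attempts
    raise last_exc

def upload(host, user, password, local_path, remote_name):
    """Upload one file; even wrapped in retries, there is still no
    end-to-end delivery guarantee."""
    def _do():
        with FTP(host) as ftp:          # credentials travel in clear text
            ftp.login(user, password)
            with open(local_path, "rb") as fh:
                ftp.storbinary(f"STOR {remote_name}", fh)
    return with_retries(_do)
```

Even with the retries, nothing here confirms the file arrived intact; that check is yet another script the team has to write.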

Increases in Data Volumes

Volume is another major issue. As more systems are installed across the enterprise, the quantity and frequency of data exchanged between systems grows. For even a medium-size company, this can involve terabytes of data on a daily basis. These increases create complex challenges. FTP solutions lack the automation and control functionality required to manage these data processes in any meaningful manner. Companies are finding that:

• Data transfer scenarios have become so complex they must be automated to be manageable.

• IT staff requires the flexibility to manage by exception with the ability to resolve problems when they occur. In unattended situations, the staff must be automatically notified via pager, cell phone, or mobile device if and when failures occur.

• If errors do occur, automatic retries are critical—preferably from the point of failure.

• IT staff must be able to remotely monitor and manage data movement operations.

• IT staff requires the ability to track data transfer activities, and must be able to confirm that files were delivered.
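One of the few building blocks FTP itself provides for restarting from the point of failure is the REST command, exposed in Python’s ftplib as the rest argument to retrbinary. A hedged sketch, with hypothetical server details, of resuming a download at the byte count already on disk:

```python
import os
from ftplib import FTP

def resume_offset(local_path):
    """Bytes already downloaded -- the offset to pass to FTP's REST command."""
    return os.path.getsize(local_path) if os.path.exists(local_path) else 0

def resume_download(host, user, password, remote_name, local_path):
    """Continue from the point of failure rather than re-sending the file."""
    offset = resume_offset(local_path)
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "ab") as fh:   # append, keeping partial data
            ftp.retrbinary(f"RETR {remote_name}", fh.write, rest=offset)
```

A real solution still has to wrap this in retry, verification, and notification logic, which is exactly the functionality MFT products centralize.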

Although FTP has been a useful utility over the years, it can’t deliver on mainframe-centric integration requirements without significant, costly customization. Even with customization, these systems tend to be inefficient. When you add reliability and security concerns, it seems fair to wonder how FTP ever came to be so widely used.

Message-Oriented Middleware

Message-Oriented Middleware (MOM) is a robust, mature technology that lets an organization efficiently exchange information. Messaging, however, isn’t always a viable approach. Message-oriented technologies can’t effectively solve all business integration scenarios, and come up particularly short with regard to the batch-oriented systems found in many mainframe shops, or when the volumes of information (or file sizes) being exchanged are too large.

The following identifies some common issues not addressed by message-based systems:

Support for legacy mainframe applications: Many mainframes still rely on legacy applications that have been in use for many years. These applications were created before the advent of Service-Oriented Architecture (SOA) and MOM. Modifying them is risky, expensive, and complex because the underlying design is often no longer well understood.

Support for batch-oriented systems and large volumes of data: Many applications were designed to operate in batch mode, which requires them to deal with a large volume of transactions in one operation. Messaging systems, by their nature, were designed to handle individual transactions or events. While messaging applications can handle batch processing, this mode of operation typically isn’t the most effective solution.
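The mismatch is easy to quantify. Assuming an illustrative broker limit of 4MB per message (IBM MQ’s default maximum message length, for example), a single batch file fans out into thousands of messages, each carrying its own per-message overhead:

```python
import math

def messages_needed(file_size_bytes, max_message_bytes=4 * 1024 * 1024):
    """How many individual messages a single batch file fans out into."""
    return math.ceil(file_size_bytes / max_message_bytes)

# A 10GB nightly batch extract at an assumed 4MB-per-message limit:
print(messages_needed(10 * 1024**3))  # 2560 messages, each with its own overhead
```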

Applications that span company boundaries: When a business application spans company boundaries, it’s often unrealistic to expect that a business partner will modify the way it conducts business to work with another organization. In the real world, most industries have adopted standard practices on how to exchange information; these practices typically involve the exchange of electronic files rather than the tight integration offered by messaging systems.

Many integration vendors that leverage message-based systems offer little or no way to incorporate file-based integration. These vendors typically see batch processing or Managed File Transfer (MFT) as being outside the scope of their product offerings. Too often, they simply leverage FTP, which offers no added value and carries significant limitations. Unfortunately, many organizations don’t become aware of these limitations until their critical integration projects are under way. Then, they’re left scrambling to seek viable alternatives that support their needs.

Achieving Better Mainframe Integration

Message-based systems don’t offer sufficient ways to tie file-based integration into an integration strategy, so what can organizations do? A multi-platform MFT solution can easily be deployed alongside your messaging technologies (see Figure 1), offering a complete solution for any integration project. MFT offers enterprise-strength file transfer with the reliability, security, and auditing required to effectively manage your mainframe and distributed environment.  


Deploying an MFT application as part of your overall integration strategy lets you choose the right tool for each specific project. An MFT solution goes far beyond simply solving the security problems around file transfer. It offers centralized management for all file transfers and uses open standards to make itself available as an interoperable service in your multi-platform or SOA environment. An MFT solution also delivers:

Control and management: Secure, centralized control and management of all transfer servers, regardless of platform or location. This must include letting an organization manage users across and beyond the enterprise, log all file transfer activity, and produce detailed audit and activity reports in real time. Alerts and event-driven notifications also should be part of the core MFT solution. Typically, all this will be accessed from a single, unified interface.

Guaranteed delivery: This is critical in an MFT solution and means the data is reliably delivered to its intended destination and that the file arrives in its entirety and on time. This becomes critical in environments governed by service-level guarantees. Guaranteed delivery is possible only in advanced MFT systems, through proprietary protocols that provide error detection and notification as part of the protocol, as well as queuing, automated restarts, and exception alerts.
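The integrity half of guaranteed delivery is usually verified by comparing a digest of the file on both sides of the transfer. A minimal sketch, using SHA-256 as an illustrative choice (a real MFT product builds this check into its transfer protocol):

```python
import hashlib

def file_digest(path, algorithm="sha256", chunk_size=1 << 16):
    """Hash a file in chunks so large batch files never need to fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def delivered_intact(source_path, destination_path):
    """True only if the destination copy is byte-for-byte identical."""
    return file_digest(source_path) == file_digest(destination_path)
```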

Automation: This is a key feature. A true end-to-end solution will automate many important steps and processes and provide event-driven transfers for real-time initiation of data movement between your mainframe and all other distributed platforms. This is critical to supporting the unattended “lights-out” requirements at many data centers. Look for the ability to automatically:

• Transfer files

• Process files once delivered

• Recover from errors without human intervention.
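A simple directory-sweep sketch illustrates those three automated steps; the transfer callable is a stand-in for whatever actually moves the file:

```python
import shutil
from pathlib import Path

def sweep(inbox, done, failed, transfer):
    """One pass of an unattended sweep: attempt each new file, then file it
    under done/ or failed/ so errors are handled without human intervention."""
    inbox, done, failed = Path(inbox), Path(done), Path(failed)
    for target in (done, failed):
        target.mkdir(parents=True, exist_ok=True)
    results = {}
    for path in sorted(inbox.iterdir()):
        if not path.is_file():
            continue
        try:
            transfer(path)                  # e.g. hand off to the MFT client
            shutil.move(str(path), str(done / path.name))
            results[path.name] = "done"
        except Exception:
            shutil.move(str(path), str(failed / path.name))
            results[path.name] = "failed"   # a real system would also alert
    return results
```

A production scheduler would run this on a timer or a file-system event, with the failed/ directory feeding the automatic-retry and notification machinery.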

Security: This must be engineered into every aspect of the solution. In an MFT solution, it must start with comprehensive authentication and authorization for all users, servers, and clients in the network. This should be tied into the native operating system security, whether that is RACF, ACF2, Top Secret, or Active Directory. In addition, each server should contain its own authentication and authorization schemes so access can be restricted at a granular level. Encryption of all file transfers is essential, as is delegated administration, which ensures system administrators have powers consistent with their organizational role and security status.
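Delegated administration and granular authorization boil down to mapping roles to permitted actions. A toy sketch, with hypothetical role and action names, of the kind of check a transfer server applies before honoring a request:

```python
# Hypothetical role table illustrating delegated administration: each
# administrator's powers are scoped to an organizational role, not the
# whole system.
ROLE_PERMISSIONS = {
    "transfer_operator": {"submit", "monitor"},
    "auditor": {"monitor", "report"},
    "security_admin": {"manage_users", "monitor", "report"},
}

def authorized(role, action):
    """Granular check: an action is allowed only if the role grants it."""
    return action in ROLE_PERMISSIONS.get(role, set())
```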

Typically, advanced MFT solutions will coexist with your integration system right out of the box. They can either run within the existing integration system’s environment or leverage one of the open protocols the integration system supports. This provides a stable, scalable way to pass data between the file-based and message-based worlds.


For many mainframe shops, the best approach is to leverage a multi-platform MFT system in combination with a messaging system. This may seem like a tough sell in today’s climate, but getting budget approval may not be as difficult as you think. A solid ROI can be developed to justify a move away from FTP or your legacy file transfer solutions, which tend to be rudimentary in nature and often require extensive customization to meet organizational needs. This customization can be resource-intensive and costly. Moreover, the necessary levels of customization make the deployment difficult to modify as business needs evolve.

Standardizing on an MFT solution can help rationalize an organization’s file transfer processes across all platforms. An advanced MFT system will deliver centralized control and management, maximized security, and improved tracking and audit support. By leveraging a modern MFT capability, your two systems can be complementary and provide a robust, scalable integration solution for both messages and files.