Feb 1 ’05

Business Integration: Making On-Demand Capabilities a Reality

by Editor in z/Journal

The time is right for companies to implement business integration techniques to automate processes. In today’s volatile environment, meaningful benefits can accrue rapidly for organizations that use a workflow engine for flexible, dynamic orchestration of application programs. This article examines some of the many reasons the zSeries provides the right platform to drive these activities.
Business Integration Overview
Your site probably has a mix of application programs that have evolved or been acquired over the years. These programs probably run on multiple, dissimilar platforms. Today, you seek to open up your systems to the Web and facilitate integration. Tools that can help here include Java programming and standards, Web services, portal servers, XML, and WebSphere MQ. Most application software products have bridges and Application Programming Interfaces (APIs) that support these techniques.
We now have the building blocks for a process engine that captures input data from a customer’s browser or client, or from a Web services request, and that passes data and runs programs automatically across a heterogeneous environment. The benefits of such automation often include greater:
- Consistency and effectiveness in managing and using existing infrastructure and applications
- Productivity through process re-engineering and automation
- Flexibility to meet customer requirements
- Ability to reduce costs and take full advantage of IT assets
- Focus on the business
- Ease in implementing new applications.
All the components are now available to implement the on-demand capabilities that today’s environment dictates.
To re-engineer your processes, you must have the right management and organizational structure in place. The right structure lets planners and IT professionals work together to model processes that, for example, service an insurance claim.
Once modeled correctly, a process can be verified and implemented. A process typically is a long-running activity that can be interrupted and restarted while retaining state data. Its execution could use whatever mixture of processing activities the company has:
- Legacy systems
- Application packages
- Databases
- Business-to-Business (B2B) exchanges
- Websites
- Document management systems
- Outsourced activities
- Partner companies
- e-Mail or other messaging systems.
A process control engine can now choreograph all these activities.
Managers can refer to online statistics showing a variety of data views reflecting execution over a selected period, which may be real-time or historical.
For an insurance claim, a set of applications may need to run in a certain order, with data passed at the right time in the right format. Some activities might involve employees working on data at their desktops. Such a process is typically long-running; it can be interrupted and restarted while retaining its state data. Smaller, shorter-running processes, such as updating customer name and address data, can also be developed and reused within larger processes. Runtime capabilities allow use by multiple customers at any time; each customer’s request can be at a different stage, using its own data.
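The key property of such a process, interruption and resumption with state retained, can be sketched in a few lines. This is an illustrative stand-in, not any IBM product API; the class and step names are invented for the example.

```java
import java.util.List;

// Hypothetical sketch of a long-running claim process that retains state
// so it can be interrupted and resumed. In a real workflow engine the
// state would be persisted to a database between runs.
public class ClaimProcess {
    private final List<String> steps = List.of(
        "validateClaim", "checkPolicy", "assessDamage", "approvePayment");
    private int nextStep = 0;              // state: where to resume from
    private final StringBuilder log = new StringBuilder();

    // Run at most `count` steps, then stop (simulating an interruption).
    public void run(int count) {
        for (int i = 0; i < count && nextStep < steps.size(); i++) {
            log.append(steps.get(nextStep)).append(';');
            nextStep++;                    // state survives between calls
        }
    }

    public boolean finished() { return nextStep == steps.size(); }
    public String log() { return log.toString(); }
}
```

Each customer’s claim would get its own `ClaimProcess` instance, so many requests can be in flight at once, each at a different stage with its own data.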
Communication Methods
Consider the components that will likely be included in a process. Standards and techniques let programs communicate easily and participate as activities in a process.
Some useful communication methods:
- WebSphere MQ (a.k.a. MQSeries), which provides asynchronous point-to-point messaging with API calls that let a program put messages on, or get messages from, a queue. WebSphere MQ runs on about 40 platforms, so it’s ideal for moving data around, inside and outside your company.
- Application adapters, which provide a way to get data out of an application suite. An adapter monitors the activity of an external program, retrieves the required data when something of interest happens, and creates a message from it.
With adapters, there are no application coding requirements, just adapter configuration. For example, if a name and address is updated in a packaged application, an adapter running in another address space can sense that update and either capture the data or start a process to retrieve it. The reverse can also happen: an MQ message can trigger a process in the adapter, which then requests an execution or an update in the application or database.
An adapter is highly dependent on the application suite and operating platform. Adapters are available from IBM and other vendors for most packages and databases on various platforms. Technology adapters can also be used to drive Electronic Data Interchange (EDI) systems, B2B exchanges, or a format such as XML or Java Message Service (JMS).
If your environment doesn’t have a pre-made adapter, you can either build your own with the adapter tooling or pay a vendor to build it for you. JMS allows information to be exchanged program-to-program according to Java 2 Enterprise Edition (J2EE) standards.
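The adapter-plus-queue pattern behind these methods can be sketched with a standard-library queue standing in for the MQ queue manager. Everything here is illustrative: the class and method names are invented, and real WebSphere MQ code would use MQPUT/MQGET or JMS send/receive rather than a `BlockingQueue`.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Illustrative stand-in for MQ-style point-to-point messaging: an adapter
// senses an application update, turns it into a message, and puts it on a
// queue; a downstream program removes it later, asynchronously.
public class AdapterSketch {
    static final BlockingQueue<String> queue = new ArrayBlockingQueue<>(16);

    // Called when the adapter detects a change in the monitored application.
    static void onRecordUpdated(String name, String address) {
        // Capture the data and create a message on the queue.
        queue.offer("CUSTOMER-UPDATE|" + name + "|" + address);
    }

    // A downstream program gets the message whenever it is ready.
    static String receive() {
        return queue.poll();    // null if nothing is waiting
    }
}
```

Because the queue decouples producer from consumer, the updating application never needs to know which programs consume the message, or on which platform they run.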
Message Brokers
The powerful functions of message brokers let us work on data while it is “in flight.” Several vendors offer message brokers; for the zSeries, be sure to consider WebSphere. In addition to picking up messages from MQ queues, WebSphere Business Integration Message Broker Version 5 can receive input from telemetry devices using SCADA (Supervisory Control and Data Acquisition) protocols and small wireless messages through WebSphere MQ Everyplace. The broker executes a flow of “nodes.” The designed flow understands the MQ message type, structure, and format, so when a message is put on a queue, the broker can recognize it and start an execution flow through the designed nodes, each performing a predefined function on the message. This user-designed process can do many things with messages, including:
- Perform arithmetic
- Reformat messages
- Make new messages
- Update databases
- Use Web services
- Store messages in DB2
- Use eXtensible Stylesheet Language (XSL) templates.
The execution through the designed nodes starts with the receipt of a suitable message on a defined queue, then runs to completion. This is often regarded as a “micro flow,” a small architected process.
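A micro flow can be pictured as a chain of functions applied to the message in sequence. The sketch below is only an analogy: real Message Broker nodes are configured graphically in the tooling, not coded this way, and the node contents here are invented.

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Sketch of a broker "micro flow": a message arrives and passes through a
// chain of nodes, each applying one predefined function, until the flow
// runs to completion.
public class MicroFlow {
    private final List<UnaryOperator<String>> nodes;

    MicroFlow(List<UnaryOperator<String>> nodes) { this.nodes = nodes; }

    // Receipt of a message starts the flow through every node in order.
    String run(String message) {
        for (UnaryOperator<String> node : nodes) {
            message = node.apply(message);
        }
        return message;
    }
}
```

A flow built from a reformat node, a transform node, and an enrichment node would then rewrite each incoming message in three predefined steps, exactly the kind of in-flight work listed above.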
There are other brokers on distributed platforms from IBM and other vendors; most can exchange MQ messages, so messages can flow to the zSeries. IBM also offers the InterChange Server, which evolved from the CrossWorlds product IBM acquired a few years ago, as well as the Rational products; both can play in this area.
Workflow Engines
In terms of workflow engines that drive the processes, the IBM products are WebSphere MQ Workflow 3.5 and the new WebSphere Business Integration Server Foundation. Both come with a modeling capability to make workflow processes. The online part allows a view of all the business activities in flight and provides a comprehensive reporting capability. The monitoring and modeling products can be ordered separately.
IBM has provided workflow products since 1998; the latest is WebSphere MQ Workflow 3.5, which lets a modeled process be generated in Flow Definition Language (FDL). This format is widely used, but it isn’t an open standard. The OASIS standards body, in collaboration with the major software providers, has developed a new open standard: Business Process Execution Language for Web Services (BPEL4WS).
The modeled processes have different activities in their flowchart of execution, along with a description of the inputs and outputs each activity accepts. Your imagination is the limit to what you can model. Workflow, or Server Foundation, manages orchestration of the activities and keeps state for every business activity that starts, managing these long-running tasks from start to finish. Hundreds or even thousands of activities could be active at any time.
Server Foundation is now Java-based, but still allows connectivity to all environments.
The Server Foundation runs in WebSphere Application Server (WAS) and is an ideal place to manage input from portals or Web-based clients, and then execute DB2 updates or CICS programs or interact with application packages.
IBM has provided a migration support pack to enable conversion of FDL format processes to BPEL and aid migration to the Server Foundation. The run-time monitoring is a key part of the execution of these processes. Bottlenecks can be identified and corrected, work reallocated, and a run-time view of strategic data displayed through customizable views.
The tooling is almost all Eclipse plug-ins. The only part still outstanding is the monitor for Server Foundation, and IBM has indicated it will be available in 2005. The Eclipse framework provides a good way of managing software development and deployment with customizable access controls for all kinds of products in the WebSphere environment.
Getting Started
Organizations seeking to take advantage of these tools should start by making an inventory of all the applications in the key business areas, what they do, and how they communicate. WebSphere Asset Analyzer can help here. The inventory should include data formats. The inventory lays the groundwork for a Service-Oriented Architecture (SOA), in which each application or activity can be regarded as a service and used as such by a modeled process.
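Treating each inventoried application as a service amounts to putting everything behind one common interface. The sketch below shows the idea with an invented registry; in practice the “services” would be CICS transactions, packaged applications, or database updates reached through the adapters and queues described earlier.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of an SOA-style registry: each inventoried application is
// registered behind a common interface, so a modeled process can invoke
// it by name without knowing how or where it runs. Names are illustrative.
public class ServiceRegistry {
    interface Service { String invoke(String input); }

    private final Map<String, Service> services = new HashMap<>();

    void register(String name, Service s) { services.put(name, s); }

    String call(String name, String input) {
        Service s = services.get(name);
        if (s == null) throw new IllegalArgumentException("unknown service: " + name);
        return s.invoke(input);
    }
}
```

Once applications sit behind such an interface, a modeled process can mix and match them freely, which is what makes the processes easy to reuse, change, and grow.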
The process being modeled could be large or small. As you implement modeled processes, you’ll gather a collection of big and small processes that can be expanded, changed, and evolved easily to respond to a changing environment. Processes and services are now positioned for easy reuse, change, and growth.
Companies should develop an architectural view of their systems that shows a tiered model with the preferred ways to connect the tiers. By carefully planning an overall architecture, future development can fit into a clearly defined, well-supported infrastructure. This allows easy deployment and expansion of the operating environment, ensuring autonomous areas fit into the overall structure.
Architectures should contain these tiers:
- Input: Web clients, portals, data entry via call centers, mobile, telemetry
- Web management: where Websites, clients, data, and portals are managed by an application server
- Process control: a flow engine and broker activity
- Application execution: packages, legacy, programs, or databases.
You should include in the definitions of these tiers the communication methods connecting them and any external interfaces used. The logical tiers need not map one-to-one onto physical tiers.
Business integration and modeling your company’s business activities gets you to that world of on-demand computing in a structured way. Imagine having consistent, automatic, self-documenting company procedures that are flexible and changeable for tomorrow’s business needs.
Why zSeries?
The systems we’ve discussed can run on many different platforms, so where is the best place to put them? Long-running processes that might have tens, hundreds, or thousands of instances in flight at any time, each keeping state and storing data, are clearly a candidate for the zSeries. Many of the applications you want to execute (e.g., CICS, DB2, WAS, brokers) will be on the zSeries, so putting the workflow engine on the same computer makes sense.
The workflow and broker products are mostly written in Java or C++, so they can run in the latest Java Virtual Machines (JVMs), executing on zAAP processors. That’s a cost-effective approach. Also, remember that zAAP processor capacity isn’t counted in software charges; this represents another cost savings. It’s also worth analyzing the new IBM software licensing initiatives, which are designed to make the zSeries cost-competitive with other platforms.
If you have a distributed strategy for Web-based or broker activities, it’s worthwhile to consider making the zSeries the driver for these activities.
Business integration has much to offer and the products and Web standards you need are available. The time is right to start this initiative!