IT Management

Real-Time Data Enrichment for Business Intelligence

Most progressive organizations recognize the benefits of using advanced Business Intelligence (BI) technologies to better understand the precise health of their businesses and to ease planning and decision-making.

Maximizing the effectiveness of BI requires access to all relevant organizational data points. For the Global 2000 and government agencies, this means access to the 70 percent of operational data that remains resident on IBM mainframes. Only by making this information available can BI professionals ensure they’re working with a credible, accurate, meaningful representation of the state of the business. However, it can be difficult to access information stored in the popular legacy database systems: IMS/DB, DB2, Adabas, IDMS and VSAM.

Furthermore, the information is rarely structured in a meaningful way for BI. Consequently, many organizations have deployed Extract, Transform and Load (ETL) procedures to build and synchronize copies of the operational data. These copies, often referred to as data warehouses or data marts, are structured to accurately reflect the demands of the BI professional and reside in more contemporary database environments.

ETL technologies have evolved to make the manipulation, summarization and creation of data marts inexpensive and straightforward. However, data extraction and synchronization with vast quantities of mainframe operational data remain challenging. Evolving business demands are exacerbating the problem.

Traditional methods of making this data available in an appropriate form can no longer support the needs of today’s enterprises due to:

  • The need for reduced latency in decision-making: Waiting 24 hours to have data available to support key business decisions is unacceptable for many businesses in today’s competitive environment.
  • Shrinking operational windows due to globalization: With mainframes being continuously online in support of global businesses, the opportunities to run effective, full replications of operational information are shrinking.
  • Total Cost of Ownership (TCO) issues: Mainframe integration is costly, and adding ETL-specific data synchronization tools creates additional maintenance and support costs (beyond the fundamental inefficiencies of traditional ETL approaches).
  • A steady rise in mainframe transactional volumes: Legacy mainframe applications increasingly support new, composite application development. This has generated more mainframe transaction activity.

This article explores the impact of these trends on traditional approaches to making mainframe data available to BI. It defines the broad set of requirements for effective BI where mainframe data access is essential and provides criteria to identify viable mainframe data synchronization solutions to deploy in conjunction with today’s leading ETL, data integration and Business Activity Monitoring (BAM) technologies.

Business Intelligence Platform

BI has evolved from a simple software product that provides easy-to-use query and reporting facilities to a complete platform for creating, changing and analyzing the volume of data an organization collects and stores. Specifically, a comprehensive BI platform encompasses:

Operational data extraction: This involves developing queries and requests you can run periodically against operational data to create the information subset that forms the basis of the BI data. Since a great deal of mainframe data exists in non-relational database systems (e.g., IMS/DB, Adabas, IDMS and VSAM) that support transactional workloads, operational data extraction isn’t a straightforward proposition. With shrinking batch windows, this process must be either extremely fast or capable of running alongside the online systems without degrading those services. In the IBM mainframe world, this requires the extraction technology to implement query governors specific to each database and to delegate workload prioritization decisions to the mainframe Workload Manager (WLM). Furthermore, it should enable the BI platform to use a consistent Application Program Interface (API) for data extraction, regardless of which database is accessed. Naturally, this implies SQL, and therefore non-relational to relational mapping, within the data extraction component.
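
As a rough illustration of that last point, the Java sketch below extracts a subset of operational data through a standard SQL interface (JDBC). The connection URL, credentials and the ORDERS_V view (a relational mapping over a non-relational source) are hypothetical placeholders, not references to any specific product.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Sketch only: the JDBC URL, credentials and the ORDERS_V view (a relational
// mapping over a non-relational source such as IMS/DB or VSAM) are hypothetical.
public class OperationalExtract {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mainframe-access://zoshost:5001/PRODPLEX"; // placeholder URL

        try (Connection conn = DriverManager.getConnection(url, "biuser", "secret");
             PreparedStatement stmt = conn.prepareStatement(
                 "SELECT ORDER_ID, CUSTOMER_ID, ORDER_TOTAL, ORDER_DATE " +
                 "FROM ORDERS_V WHERE ORDER_DATE >= ?")) {

            // Pull only the last day of activity to keep the extract small and fast.
            stmt.setDate(1, new java.sql.Date(System.currentTimeMillis() - 24L * 60 * 60 * 1000));

            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    // In practice each row would be staged for transformation
                    // rather than written to standard output.
                    System.out.printf("%s,%s,%s,%s%n",
                        rs.getString("ORDER_ID"),
                        rs.getString("CUSTOMER_ID"),
                        rs.getBigDecimal("ORDER_TOTAL"),
                        rs.getDate("ORDER_DATE"));
                }
            }
        }
    }
}

Because the mapping layer absorbs the differences between the underlying databases, the same statement could in principle be pointed at IMS/DB, Adabas, IDMS or VSAM data without change.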

Transformation and load: Once extracted, operational data must be cleansed and transformed into the structures that are more meaningful to planning and decision support. Once the data has been prepared for decision support analysis, it must be loaded into database systems appropriate for access by the query, reporting and analytic tools.
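
The Java sketch below illustrates this step under simple assumptions: the operational rows have already been extracted into plain value objects, and the warehouse connection, the SALES_FACT table and the incoming field formats are hypothetical placeholders chosen only to show the cleanse, transform and batch-load pattern.

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.Date;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

// Sketch only: the warehouse URL and the SALES_FACT table are hypothetical,
// and the incoming field formats are assumed for illustration.
public class TransformAndLoad {

    // Simple carrier for one extracted operational record.
    record OrderRecord(String orderId, String customerId, String rawAmount, String rawDate) {}

    public static void load(List<OrderRecord> records) throws Exception {
        String warehouseUrl = "jdbc:postgresql://warehouse-host:5432/bi_mart"; // placeholder

        try (Connection conn = DriverManager.getConnection(warehouseUrl, "etl", "secret");
             PreparedStatement insert = conn.prepareStatement(
                 "INSERT INTO SALES_FACT (order_id, customer_id, amount, order_date) " +
                 "VALUES (?, ?, ?, ?)")) {

            conn.setAutoCommit(false);

            for (OrderRecord r : records) {
                // Cleanse and transform: trim identifiers, normalize the amount,
                // and convert the operational date string (assumed yyyy-MM-dd) to SQL DATE.
                insert.setString(1, r.orderId().trim());
                insert.setString(2, r.customerId().trim());
                insert.setBigDecimal(3, new BigDecimal(r.rawAmount().trim()));
                insert.setDate(4, Date.valueOf(r.rawDate().trim()));
                insert.addBatch();
            }

            insert.executeBatch(); // load the prepared rows as one batch
            conn.commit();
        }
    }
}

Batching the inserts and committing them as a single unit keeps the load efficient and leaves the data mart in a consistent state if the job fails partway through.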
