IT Management

Scanning the trade press today, it would be easy to conclude that business IT managers have worked out all the challenges of traditional computing and are now ready to devote more effort to realizing the benefits of disruptive trends such as Big Data analysis. The potential of burgeoning technology aside, the reality of the situation may be quite different.

In most businesses today, the volume of data is growing at an amazing rate, amassing in “live” databases and rarely finding its way into archival storage, because the growing value of data analysis keeps it online. Operationally, this means that effective database administration translates into keeping searches efficient across ever-expanding data pools and optimizing the use of already purchased hardware resources, all in the face of increasingly constrained capital expenditure (CAPEX) and operational expenditure (OPEX) budgets. For most organizations, Big Data projects remain in the testing phase, awaiting the development of standards and technologies as well as cost-effective implementation methods. The big push is to optimize and manage the performance and infrastructure we already have, in order to respond more quickly to business needs.

Improving Operational Performance

Hardware-wise, organizations are seeking to preserve their existing investments. In the mainframe space, they’re aided by specialty processing engines, such as System z Integrated Information Processors (zIIPs) and System z Application Assist Processors (zAAPs), which take on eligible workloads and help reduce the cost of running that work on general-purpose processors. Provided that database, application and utility software are designed to leverage these hardware enhancements, they can make a serious dent in the CAPEX cost curve while improving operational performance.

Performance improvements can also be realized by tuning databases and applications and by automating and streamlining how maintenance is performed in the database environment. Inefficient SQL is, according to some experts, responsible for up to 70 percent of service-level agreement response-time shortfalls. Whether poorly written by application designers or generated by packaged applications or third-party tools, SQL statements that don’t use the most efficient paths to access data place an undue burden on system resources and cost organizations real money.
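
To make the problem concrete, consider a minimal sketch in SQL; the ORDERS table, its columns and the index on ORDER_DATE are hypothetical, invented purely for illustration. Wrapping an indexed column in a function typically prevents DB2 from using the index, while an equivalent range predicate returns the same rows through index access:

    -- Hypothetical ORDERS table with an index on ORDER_DATE.
    -- The function on the column typically blocks index access,
    -- so every row must be examined.
    SELECT ORDER_ID, CUSTOMER_ID, TOTAL_AMOUNT
      FROM ORDERS
     WHERE YEAR(ORDER_DATE) = 2013;

    -- The same result written as a range predicate the optimizer
    -- can resolve through the ORDER_DATE index.
    SELECT ORDER_ID, CUSTOMER_ID, TOTAL_AMOUNT
      FROM ORDERS
     WHERE ORDER_DATE BETWEEN '2013-01-01' AND '2013-12-31';

The two statements are logically equivalent, yet their costs on a large table can differ dramatically, and that gap is exactly what tuning is meant to close.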

Coping with this problem requires the right staff skills (to avoid many problems in the first place) and tools (to analyze and guide the remediation of inefficiencies that already exist). The challenge with the first requirement, training database administrators and application designers, lies in the staffing shortages and skills gaps that many companies confront.

The workforce of the future is hybrid in experience and skill set, much like the hybrid environment it will be tasked with managing. The success of the IT department depends on your ability to leverage your existing workforce to manage both distributed and mainframe solutions with agility. To accomplish this, you must capture the knowledge of experienced hands so that knowledge can be transferred more readily to, and between, other team members.

The second challenge is to find an efficient way to remediate inefficiencies that already exist in current databases and applications. Aside from the less-than-optimal SQL that too often finds its way into applications at the design phase, inefficiencies can develop over time as the unintended or unforeseen consequences of change.

Consider a report originally designed for occasional production and use by a small group of users but ultimately scaled up to meet the needs of a broader community. The volume of data was small at first, but through acquisition it grew exponentially. This can lead to resource conflicts and other performance issues. So, too, can the growth of databases or warehouses that creep beyond their original designs in scope and complexity; the impact on machine performance can change over time, requiring ongoing review and vigilance. With analysis, you may decide to add indexes or make other physical database changes to improve performance and reduce the cost of generating the report.
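
To illustrate, again with invented names, suppose the report now filters millions of SALES_HISTORY rows by region and date. A composite index that matches those predicates is the kind of physical change such analysis might suggest:

    -- Hypothetical: the report repeatedly scans SALES_HISTORY
    -- for one region and date range. A composite index on the
    -- filtering columns lets DB2 locate those rows directly.
    CREATE INDEX IX_SALES_REGION_DATE
        ON SALES_HISTORY (REGION_CODE, SALE_DATE);

    -- After creating the index, gather fresh statistics (RUNSTATS)
    -- and rebind affected packages so the optimizer can consider it.

The trade-off, of course, is that every additional index adds overhead to inserts and updates, which is why this kind of change deserves the analysis described above.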

Database Optimization Through Automation

In theory, the remedy is straightforward: analyze the SQL, choose optimal data access paths and adjust the databases accordingly. In practice, doing this by hand can take longer and incur more cost than simply living with the inefficiencies. What we need are tools that can automate much of this work: analyzing current SQL, understanding the available routes to data, identifying more optimal routes and observing the impact on overall performance.
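
DB2 itself supplies much of the raw material for this analysis. As a minimal sketch, assuming a PLAN_TABLE exists under the current authorization ID and reusing the hypothetical ORDERS query from earlier, EXPLAIN records the access path a statement will use so that a tool, or a person, can spot the scans worth fixing:

    -- Record the access path DB2 chooses for the suspect statement.
    EXPLAIN PLAN SET QUERYNO = 101 FOR
    SELECT ORDER_ID, CUSTOMER_ID, TOTAL_AMOUNT
      FROM ORDERS
     WHERE YEAR(ORDER_DATE) = 2013;

    -- Inspect the result: ACCESSTYPE 'R' indicates a table-space
    -- scan, while 'I' plus a populated ACCESSNAME indicates index access.
    SELECT QUERYNO, TNAME, ACCESSTYPE, ACCESSNAME, MATCHCOLS
      FROM PLAN_TABLE
     WHERE QUERYNO = 101
     ORDER BY QBLOCKNO, PLANNO;

Doing this for one statement is easy; doing it continuously, across thousands of statements, and acting on the results is precisely the work worth handing to an automated tool.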

Fortunately, such tools exist, and the savings can be substantial. A large telecommunications firm reported cost savings of more than $2M per year from tuning dynamic SQL with a database performance management tool. Over a six-year period, its targeted attack on performance issues yielded a dramatic improvement totaling in excess of $9M.

Based on the simple resolution of existing inefficiencies in SQL, the cost savings accrued from using the latest generation of database analysis and optimization tools may astound even the most jaded planners. Moreover, as optimization tools become more automated, we’re reaching a point where many of the tasks that currently burden DB2 administrators and application designers can be delegated to the machine, freeing them to focus on other creative tasks, such as Big Data analytics!

It’s important to note that the more automated DB2 analysis and optimization tools become, the more they develop into permanent applications, providing ongoing services in the system environment. They must be aware of technologies such as zIIP and zAAP processing engines and of how those engines are being used by installed instances of DB2 and by applications. Moreover, as applications themselves, DB2 analysis and optimization tools must, where appropriate, be enabled to use specialty processors to reduce their own load on the core CPU and operating system.

Conclusion

In the final analysis, tuning and optimizing your current database and application environment not only can yield unexpected savings during these lean budget years but also sets the stage for next-generation projects such as Big Data. It may not seem terribly sexy to have a tool that lets you determine the actual cost of a single report, but maintenance is the measurable long-term cost of any database, and budget requests are much easier to rationalize when they’re supported by facts.

Plus, what you do today will have an impact on the success of any Big Data initiatives you have in mind. One customer engaged in a Big Data project recently commented that Big Data is still a Wild West environment, with precious few standards, a lot of hardware, huge management limitations and almost viral organic growth in expense and infrastructure. Frankly, he wanted to see every tool he could find that would help him visualize his environment, gather performance data in a granular way and automate optimization processes. That’s the stuff of effective database management today, and a prerequisite for Big Data analytics in the future.