Operating Systems

Cloud has become a critical part of many IT infrastructures, and as such, it needs a robust platform on which to run. System z can provide that platform in a cost-effective, secure and robust fashion, specifically through the use of Linux on System z. Here we discuss the pros and cons of this platform as well as considerations for choosing Linux on System z. …

Read Full Article →

The power of High Performance Computing (HPC) is well-established, whether it’s being applied to weather forecasting and climate analysis, genome modeling and analysis of neuron function, or material modeling for new generations of semiconductors. But when we think of HPC, we think of supercomputers, colleges, universities and research institutes. IBM has always been a major player in this space, which is why it came as no surprise when Sequoia, the IBM Blue Gene/Q system installed at the Lawrence Livermore National Laboratory in Livermore, CA, for the Department of Energy, was ranked the number one supercomputer in the world.

Of more immediate interest to enterprises, however, is the fact that IBM, with the finalization of its 2012 acquisition of Platform Computing, has moved aggressively to create an HPC solution capable of scaling to the needs and budgets of enterprises. Earlier in 2012, Helene Armitage, general manager of IBM Systems Software, said: “The acquisition of Platform Computing will help accelerate IBM's growth in smarter computing, a key initiative in IBM's Smarter Planet strategy, by extending the reach of our HPC offerings into the high growth segment of technical computing ... . Our intent is to enable clients to uncover insights from growing volumes of data so they can take actions that optimize business results.”…

Read Full Article →

I’m not a fan of clouds. I bristle when I hear the term used. It has become a fetish in business circles where non-technical managers speak of clouds to mask their lack of knowledge of the black arts of IT, and it’s a fixture in vendor marketing brochures, where it satisfies the dual criteria of marketecture: It sounds technical but means whatever a vendor says it means…

Read Full Article →

Many large organizations are implementing private clouds to enable on-demand network access to a shared pool of computing resources they can easily and quickly configure and provision. Although the main focus of cloud computing has been on distributed infrastructure, organizations have learned that the zEnterprise also works well for implementing a private cloud using Platform as a Service (PaaS)…

Read Full Article →

Ask people about their IT strategy and you often get a long answer. I’ve looked around a bit, and the definition that appeals to me most is one I found on the CioIndex website at www.cioindex.com: IT strategy is an iterative process to align IT capability with business requirements. The author adds a few extras to clarify this succinct definition:…

Read Full Article →

There’s more than one way to save on CPU consumption. Moving workloads to System z Integrated Information Processor (zIIP) engines has been a popular option. However, many organizations are also exploring IBM’s DB2 Analytics Accelerator (IDAA) as an option for lowering costs and optimizing mainframe performance. IDAA is a specialty appliance that delivers faster and more predictable response times for long-running, unpredictable queries by off-loading query workloads from the mainframe, which also reduces database tuning effort. If you’re looking into moving workloads to IDAA, here are some important factors to consider…
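
For context on the off-load path mentioned above: in DB2 for z/OS, whether a query may be routed to an attached accelerator is governed by the CURRENT QUERY ACCELERATION special register. The JDBC sketch below is a minimal illustration, assuming a DB2 for z/OS subsystem paired with an accelerator and the IBM JDBC driver on the classpath; the connection URL, credentials and the SALES_HISTORY table are placeholders for illustration, not details taken from the article.

```java
// Minimal sketch, not a production example: routes an analytic query toward an
// attached DB2 Analytics Accelerator via the CURRENT QUERY ACCELERATION
// special register. The URL, credentials and table name are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class AcceleratedQuerySketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:db2://zhost.example.com:446/DB2LOCN", "dbuser", "dbpassword");
             Statement stmt = conn.createStatement()) {

            // Ask DB2 to send eligible queries to the accelerator, and to fall
            // back to native execution if the accelerator cannot run them.
            stmt.execute("SET CURRENT QUERY ACCELERATION = ENABLE WITH FAILBACK");

            // A long-running, ad hoc aggregation is the typical off-load candidate.
            // SALES_HISTORY is a hypothetical table used only for illustration.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT REGION, SUM(REVENUE) FROM SALES_HISTORY GROUP BY REGION")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + ": " + rs.getBigDecimal(2));
                }
            }
        }
    }
}
```

ENABLE WITH FAILBACK is often the cautious first setting: if the accelerator cannot process the statement, DB2 runs it natively rather than failing the query.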

Read Full Article →

As companies continue to rely on the cloud to deliver more of their business-critical systems, they must consider how much they can trust this new paradigm. For a cloud solution to be robust enough to host the most important systems, it must be built on top of a virtualization stack that can deliver the necessary service levels…

Read Full Article →