In mid-November, I attended IBM’s second Storage Innovation Summit in Carlsbad, CA. This summit was no less grand than its East Coast counterpart, which I attended in May at the Museum of Natural History in New York City.
There was considerable overlap between the two events in terms of messaging, and many of the same faces appeared on stage. That was to be expected, given the apparent goal of the events: to tell the story of IBM storage and make up for a long silence in this space over the past several years.
The two events did the job, I suppose. IBM trotted out the usual analysts to talk about the huge challenges confronting consumers in terms of data growth and capacity demand. They were followed by IBM product managers and other line-of-business managers who talked about specific products, sharing stage time with a few hand-picked customers who described how product XYZ solved their problems. Again, nothing out of the ordinary.
Next, Jeff Jonas, IBM’s Big Data guy, told his compelling Big Data tale once again to an always appreciative audience (including me). Jonas, a truly bright guy and natural comic speaker, makes the data deluge sound like a good thing. At least, he doesn’t perpetuate the Debbie Downer meme advanced in a vendor-funded study by IDC, describing a data explosion as a “tsunami” that will inevitably drown the companies that don’t buy enough storage capacity to contain it.
I had a few key questions that were front of mind. First, I wanted to know how IBM was positioning its XIV array, which seems to be showing up in quite a few mainframe shops I’ve visited recently. Maybe I’m a purist, but proffering “enhanced” storage arrays with lots of embedded, value-add software seems to detract from traditional mainframe infrastructure concepts. In the mainframe shops where I cut my teeth, value-add functionality was system-managed and centrally hosted, extensible, and delivered on demand to “clean” and “simple” DASD as required to support workload. XIV puts all this value-add software stuff on the array controller, creating the potential for isolated islands of functionality that are already the bane of the distributed world from a design and management standpoint.
The IBM official’s response to my question can be distilled down to this: XIV was intended to provide a bridge between the midrange equipment offered by IBM and the DS8000. It provided a set of capabilities that are very desirable in distributed computing environments, but its performance (which has improved threefold over the previous generation, the fellow observed, pointing to the latest press release) was earning XIV a home in the mainframe shop, too.
He proceeded to give me a laundry list of sexy features on the rig, but I complained that this wasn’t an answer to my question. The fellow shrugged off my complaint and got to the heart of his real thinking. Conversations with consumers about mainframe and distributed systems storage, he noted, have evolved over the last decade from simple storage capacity into discussions of capabilities. Summarizing his words to the best of my ability, he seemed to say that the XIV software acquisition, and its subsequent productization as a value-add feature set on a hardware array product, was a response to that shift.
That led to a really interesting observation. The IBM manager argued that the storage conversation has shifted once again—this time to a discussion of efficiency … hence, clouds. (IBM presented a cloud architecture leveraging Scale Out Network Attached Storage (SONAS) at the event.)
He provided additional context for his observation: “The data center was supposed to last a decade. Now, it’s obsolete in about five years.” Add in the cost of utility power, burgeoning data growth and other factors, and the idea of cloud-based storage services sounds much more efficient than a roll-your-own storage infrastructure.
While it certainly comes as no surprise to hear a leading tech vendor embracing clouds as the next evolution of storage, it was nevertheless illuminating to understand how customer-driven this direction seems to be, at least from IBM’s perspective. When I’ve discussed clouds with any other vendor, it was quickly apparent that the cloud was the vendor’s idea—just another product they wanted to sell to their customers. Maybe there are some at IBM who view clouds in a similar vein, but not the folks I spoke to in Carlsbad at the summit: They say customers are telling them to deliver it.
I, for one, will wait to see what they come up with in response.