Johnny likes to run. He has been running for years and thinks he's really fast. His entire neighborhood runs, and everyone is on the elite running team. They race each other all the time and have watched their individual race times improve, so they think they're the best runners. The only problem is they've never had the chance to compare themselves to runners in other neighborhoods. They don't even want to see what the best times are elsewhere. They believe their training is sufficient, and they're unaware of any better training method. Deep down, they're afraid of bringing in ideas from outside. Everyone was impressed with the dramatic improvement over the years, so they were shocked when someone from outside told them that nearby neighborhoods were running 50 percent faster. How could that be?
This is what it’s like for many data centers. They have internal people who have managed software assets for years. They’ve developed a core Procurement, Vendor, or Asset Management department that has specialized in managing assets. They believe they’ve created the ideal team of legal, IT, and vendor management people who work together, eat together, and even see movies together. They see themselves as a great team, yet their costs are 50 percent higher than their best-in-class peers. How can that happen?
Research reveals that software costs vary dramatically across similar data centers in similar industries. What causes the dramatic difference?
Negotiated pricing: Historically, many large IT consulting companies taught that to get to best in class, you should adhere to the "Big Stick" policy made famous by Theodore Roosevelt in his speech at the Minnesota State Fair on Sept. 2, 1901, when he said "speak softly and carry a big stick." The idea was to negotiate peacefully while making the consequences of failure apparent and imminent to the other side. Corporations often use as their big stick such phrases as "the lawyers won't allow it," "corporate finance won't approve it," or "the CIO will eliminate it." In return, vendors gave them bigger "perceived" (the key word) discounts.
Discounts are what most data centers use as the measurement for a "good deal." Every data center truly believes it has best-in-class negotiators; after all, what negotiator ever walks away from a purchase believing they got a bad deal? Yet even for some data centers that obtain 90 percent discounts from major Independent Software Vendors (ISVs), cost structures remain extremely high. Those negotiating the software deals fail to tell senior management that the same vendor offered an identical Enterprise License Agreement (ELA) to a smaller company for millions less. Instead, they boast only of the 90 percent discount.
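The flaw in using discount percentage as the yardstick is simple arithmetic. A minimal sketch (all list prices and discounts below are hypothetical, not actual vendor figures) shows how a deeper discount off a higher list price can still cost millions more:

```python
def net_price(list_price, discount_pct):
    """Return the price actually paid after a percentage discount."""
    return list_price * (100 - discount_pct) / 100

# Hypothetical quotes for an identical ELA from the same vendor.
# Data center A brags about a 90 percent discount off a $50M list price.
deal_a = net_price(50_000_000, 90)  # pays $5,000,000
# A smaller company gets "only" 60 percent off a $5M list price.
deal_b = net_price(5_000_000, 60)   # pays $2,000,000

# The headline discount says A won; the checks written say otherwise.
print(f"Deal A: 90% discount, pays ${deal_a:,.0f}")
print(f"Deal B: 60% discount, pays ${deal_b:,.0f}")
```

The point is not the specific numbers; it is that a discount percentage is meaningless without knowing the list price it is taken from and the net prices other customers are paying.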
What makes a good deal? It takes a seller who knows what the buyer is willing to pay and a buyer who holds a broad base of information, including the best price offered to other customers, to make an informed decision. Without both pieces, best-in-class deals aren't possible. There are regions in the U.S. that consistently pay substantially more for their software assets than other regions. There are other regions where every deal from a certain software vendor will beat any pricing structure found anywhere else.
The reason for the fluctuations is that although many data centers keep their deals confidential, employees move from company to company within a close geographic area and share the details of good deals received at their previous employer. But if those deals are poor compared to other regions, then every company in that area ends up sharing the same bad pricing. There also are vendor sales representatives who don't know their company's best deal, so they don't offer it to their customers. Mastering the complexity of a software vendor's pricing is rarely a sales rep's primary objective; they want to sell new software and renew existing agreements. They have no incentive to get a customer to best-in-class pricing, so when a data center asks if that's their best deal, they'll naturally answer "yes" because that's all they know, or choose to know.
Takeaway: Negotiated pricing affects a data center's total software costs by less than 10 percent.
Product placement: During the years leading up to Y2K, data centers were growing at 25 to 35 percent annually. Corporate mergers and data center consolidations were occurring at a rapid rate. A byproduct was that products were being licensed for 100 percent of data center capacity. CIOs and data center managers wanted the ability to run workloads everywhere and to have every piece of software available on every workload. As a result, software costs averaged far more than they needed to be: a product utilizing only 5 percent of total capacity was being purchased for 100 percent of the capacity. The problem was exacerbated when hardware vendors promoted larger boxes and small workloads were consolidated onto them, creating many workload environments with software charges for capacity far in excess of what was actually used.
IBM introduced workload capacity pricing in October 2000. The new pricing model was a response to complaints from customers who wanted to pay less than full capacity for software. It sounded ingenious, and for many data centers it actually lowered costs. However, those lower software costs often yielded a false sense of accomplishment: costs could have been lowered even more with other pricing models.
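The gap between full-capacity and workload-based charging can be sketched with a few lines of arithmetic (the capacities and per-MSU rates below are invented for illustration and are not real vendor pricing):

```python
def full_capacity_charge(machine_msus, rate_per_msu):
    """Charge based on the whole machine, regardless of product usage."""
    return machine_msus * rate_per_msu

def workload_charge(workload_msus, rate_per_msu):
    """Charge based only on the capacity the product's workload consumes."""
    return workload_msus * rate_per_msu

# Hypothetical: a product using 50 MSUs of a 1,000-MSU consolidated box,
# i.e., 5 percent of total capacity.
rate = 100  # dollars per MSU per month (invented figure)
full = full_capacity_charge(1000, rate)
used = workload_charge(50, rate)

print(f"Full-capacity charge: ${full:,} per month")
print(f"Workload-based charge: ${used:,} per month")
print(f"Paying {full // used}x what the workload actually requires")
```

Under these assumed numbers, the 5-percent workload pays twenty times its actual consumption when licensed at full capacity, which is exactly the situation consolidation onto larger boxes created.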