Another 100-degree day in Boulder, CO made my topic easy to choose: heat and energy consumption. Electricity is mission-critical to most businesses; without it, there's no IT industry. Heat is the enemy of electronic technology, especially computers. Events of the past few years have drawn increasing attention to electricity as the lifeline of the IT sector, and a growing wave of "green computing" initiatives has appeared in response. The faster microprocessors become, the more heat they generate and the more cooling they require. The faster a disk spins, the more energy and cooling it requires; the same is true for tape drives and other motor-driven devices. Today, servers account for approximately 40 percent of a data center's overall power consumption, and storage takes just over 35 percent, on average. Network switching gear, servers, blades, and disk drives are the largest IT electricity consumers and have been getting considerable attention lately.
Storage suppliers are using packaging schemes that pack more boxes ever more tightly into data center racks, sharply increasing heat density and cooling requirements. These products all use fans, but fans only move heat around; they don't remove it. Air conditioning normally removes the heat, but it significantly increases overall IT costs. Chilled-water and liquid-refrigerant cooling first appeared with the System/370 Model 168 mainframes in the '70s and later gave way to air-cooled systems as businesses grew concerned about leaks, moisture, and plumbing. Nonetheless, expect liquid cooling to regain momentum, since it removes heat more efficiently than air cooling. Spot liquid-cooling solutions are expected to appear first, providing targeted cooling for specific products such as server racks, blades, and dense disk arrays.
The numerous hurricanes along the Gulf Coast, fires in Southern California, floods, earthquakes, and tsunamis have all caused frequent and lasting power outages. On Aug. 14, 2003, a large section of the North American power grid failed, leaving 50 million people in the U.S. and Canada without power. In Cleveland, pumping stations had no electricity and had to shut down, leaving people without water. In Toronto, subways were out of commission for several days. Power outages then struck Europe as well, including major blackouts in southern Sweden and Italy. Today, uninterrupted electricity is no longer a given, as many power grids are antiquated and susceptible to outage.
Wind and Solar Power
In recent years, alternative sources of electricity, such as wind and solar power, have gained some interest, though progress is relatively slow. A recent U.S. Department of Energy report found that the U.S. is the fastest-growing market for wind power, though Europe remains far ahead in total electricity generated from wind. The U.S. has far more wind resources available to develop, and its wind power capacity jumped 27 percent in 2006. Denmark leads the world with 21.4 percent of its energy derived from wind power, and Spain ranks second with 8.8 percent. Wind power is the fastest-growing energy source in the world; the wind in North Dakota alone could produce a third of America's electricity.
Solar energy is abundant year-round in many parts of the world, but even after many years of development, conversion costs remain too high for widespread use. The amount of solar energy that strikes Earth's surface per year is about 29,000 times greater than all the energy used in the U.S. The solar energy falling on Wisconsin each year is roughly 844 quadrillion BTU, about 550 times the amount of energy the state uses. Although the amount of solar energy reaching Earth's surface is immense, it's spread over a large area, and there are limits to how efficiently it can be collected, converted into electricity, and stored. These factors, along with the high cost of conversion, continue to limit how much solar energy can actually be used.
As much as 30 percent or more of all electricity consumed in the U.S. by the year 2010 will be consumed by information technologies and the increasing number of personal digital appliances. Data centers often use more than 100 watts per square foot—more than 10 times that of the average household. The cost of energy is now increasing at 20 to 30 percent annually in many geographic locations, making energy consumption one of the largest factors in the Total Cost of Ownership (TCO) of computing equipment and in planning a new data center.
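To see why 20 to 30 percent annual rate increases matter so much to TCO, it helps to compound them over a typical equipment lifetime. The sketch below uses the midpoint of the cited range and a hypothetical $10,000 first-year electricity bill (that dollar figure is an assumption for illustration, not from the article):

```python
# Illustrative sketch: a 25% annual increase in electricity rates
# (midpoint of the cited 20-30% range) compounded over a 4-year
# equipment lifetime. The $10,000 starting bill is hypothetical.
annual_growth = 0.25
base_annual_cost = 10_000  # hypothetical first-year electricity bill, USD

total = 0.0
for year in range(4):
    cost = base_annual_cost * (1 + annual_growth) ** year
    total += cost
    print(f"Year {year + 1}: ${cost:,.2f}")

print(f"4-year energy cost: ${total:,.2f}")  # roughly $57,656
```

At that growth rate, the fourth year's bill is nearly double the first year's, and the four-year energy total approaches six times the starting annual cost—large enough to rival the purchase price of the equipment itself.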
Servers are large consumers of electricity. Before the year 2000, servers drew about 50 watts of electricity; now they draw up to 250 watts, and there are more of them. Blade server technology best illustrates the density trend: as many as 60 blade servers can be packed into a standard-height 42U rack (1U = 1.75 inches), and the typical power-and-cooling demand for such a configuration exceeds 4,000 watts.
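The rack figures above can be checked with back-of-envelope arithmetic, dividing the cited rack demand by the blade count and comparing against the earlier 250-watt standalone-server figure (the per-blade number is derived from the article's figures, not independently measured):

```python
# Back-of-envelope check of the rack density figures cited above.
blades_per_rack = 60          # blades in a standard 42U rack
rack_demand_watts = 4_000     # cited power-and-cooling demand for the rack

watts_per_blade = rack_demand_watts / blades_per_rack
print(f"Implied draw per blade: {watts_per_blade:.0f} W")   # about 67 W

# Sixty standalone servers at the 250 W figure cited earlier would
# demand far more than a blade rack:
standalone_total = blades_per_rack * 250
print(f"60 standalone servers: {standalone_total:,} W")     # 15,000 W
```

The comparison makes the density trade-off concrete: blades cut per-server draw several-fold, yet concentrating 60 of them in one rack still produces a 4 kW hot spot that the room's cooling must absorb.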