Advances in engineering are usually born from the potential limitations of the present technology. Companies will try to work around the problem for a while, inventing solutions that push the current limits a little further. But then, there'll be a breakthrough. One of the possible solutions pays off and raises the bar once more.
A major recent breakthrough in the commercial marketplace has been in storage capacity. We saw commercial computers leap from megabytes to gigabytes, and then quickly on to terabytes.
What limits are there on present data center technology and how are innovators seeking to overcome them?
Currently, one of the key limits on data center technology is power. New high-performance machines need more energy than ever to operate and produce far more heat. The machines need cooling to operate optimally, which in turn draws still more power. A balance therefore has to be struck between cooling and infrastructure power, and an adequate energy source found.
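That balance between IT load and facility overhead is commonly summarized by the industry metric Power Usage Effectiveness (PUE): total facility power divided by the power reaching the IT equipment itself. A minimal sketch, with purely illustrative figures (the numbers below are hypothetical, not measurements from any real facility):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A perfectly efficient facility would score 1.0; everything above that
    is overhead such as cooling and power distribution.
    """
    return total_facility_kw / it_equipment_kw

# Hypothetical facility drawing 1500 kW in total, 1000 kW of it for servers:
print(pue(1500.0, 1000.0))  # 1.5 -> half a watt of overhead per watt of IT load
```

The lower the cooling burden, the closer PUE gets to 1.0, which is why natural-cooling experiments like the one described below are attractive.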
As you might expect, among the companies innovating at the forefront of this problem is Google. At their data center in Belgium, Google has been experimenting with natural cooling options. By building their servers in individual cabins and siting them outside, they've harnessed natural airflow for cooling.
Some of you may be skeptical of this approach; after all, if it were simply a case of using natural airflow to keep servers cool, wouldn't everybody have done it? Well, the difference at Google is that they're experimenting with running their servers at a significantly higher temperature than most data centers. At peak times, when the temperatures in their server huts are too hot for humans to work comfortably, staff take 'excursion hours' away from physical maintenance to get on with other work, leaving the servers to operate at around 30-40 °C.
However, doesn't operating at high temperatures dramatically reduce server reliability?
Interestingly enough, not as much as you may think. Researchers at the University of Toronto have found that the error rate for servers with respect to temperature is not the exponential curve they initially expected. Rather, it grows only linearly until around 50 °C, meaning centers could run entirely smoothly at much higher temperatures than is currently thought acceptable. In fact, they calculate that data centers could save 4 percent of their energy for every degree hotter they allow their servers to operate (degrees Fahrenheit, that is). This suggests the Google data center in Belgium may be among their most energy-efficient yet.
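To see what that 4-percent-per-degree figure means in practice, here is a minimal back-of-the-envelope sketch. It applies the saving linearly, as the cited finding describes; the baseline consumption figure is a hypothetical example, not data from any real facility:

```python
def estimated_energy(baseline_kwh: float, degrees_f_raised: float,
                     saving_per_degree: float = 0.04) -> float:
    """Estimate energy use after raising the operating temperature.

    Applies a linear saving of `saving_per_degree` (default 4%) for each
    degree Fahrenheit the allowed server temperature is raised.
    """
    return baseline_kwh * (1 - saving_per_degree * degrees_f_raised)

# Hypothetical: raising the setpoint by 5 degrees F on a facility that
# consumes 1,000,000 kWh per month:
print(estimated_energy(1_000_000, 5))  # 800000.0, i.e. a 20% saving
```

Even a modest temperature increase compounds into a substantial saving at data-center scale, which is the economic logic behind Google's experiment.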
Google has set an inspiring example. But while the Internet giant has raised the bar, its approach need not represent the be-all and end-all of innovative thinking.
Keeping up with new technologies may seem like an obvious benefit, but companies should only invest in system upgrades suited to the size and function of their business. The most important thing for every company is that their data center provider is reliable. After all, there's absolutely no point in having a brilliantly innovative data center if you can't reliably access it.