Would You Heat Your Home with a Cloud Data Furnace?
Are you frustrated by the high cost of heating your home? With the winter weather arriving in many parts and furnaces kicking into high gear, once again we can look forward to exorbitant bills for oil or natural gas.
If you can't justify the hefty investment in solar panels or other alternative energy sources, would you consider replacing that furnace with a cabinet full of servers, storage and network gear?
That's what researchers at Microsoft and the University of Virginia are proposing. They have introduced the concept of the Data Furnace, or DF for short, to heat homes and office buildings, while at the same time reducing operational costs for cloud providers by offloading at least some of the expense of running servers in large datacenters, which consume huge amounts of energy and require substantial cooling facilities.
Built from the same servers that power cloud computing infrastructures, these DFs can generate enough heat to act as a primary heating system in a home or building, the researchers proposed in a paper presented back in June at the annual USENIX Workshop on Hot Topics in Cloud Computing, held in Portland, Ore.
Though the paper got little attention at the time, New York Times columnist Randall Stross wrote about it in his popular Digital Domain column Sunday, thereby exposing the idea to a mass audience.
The authors defined three primary benefits to cloud providers deploying DFs in homes and office buildings: a reduced carbon footprint, lower total cost of ownership per server, and the ability to bring compute and storage closer to users (by further caching data, for example). The DF has a footprint similar to that of a typical furnace: a metal cabinet linked to ducts or hot water pipes.
"DFs create new opportunities for both lower cost and improved quality of service, if cloud computing applications can exploit the differences in the cost structure and resource profile between Data Furnaces and conventional data centers," the authors wrote. "Energy efficiency is not only important to reduce operational costs, but is also a matter of social responsibility for the entire IT industry."
A typical server farm generates exhaust heat ranging from 104 to 122 degrees Fahrenheit (40 to 50 degrees Celsius). While not hot enough to generate electricity efficiently, that range is ideal for powering heating systems, clothes dryers and water heaters, the authors wrote.
Cheaper servers, improved network connectivity and advances in systems management also make this a practical notion, thanks to the ability to remotely re-image or reboot a server.
Still, there are some obstacles. Residential electricity to power a DF can cost anywhere from 10 to 50 percent more than the industrial rates cloud providers pay. Network bandwidth to the home could also be more costly. And maintaining geographically dispersed systems becomes more complex and expensive.
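To put that electricity premium in rough perspective, here's a back-of-the-envelope sketch in Python. The server wattage, cabinet size and rates below are illustrative assumptions of mine, not figures from the paper; only the 10 to 50 percent premium comes from the researchers.

```python
# Back-of-the-envelope estimate of the residential electricity premium
# for a Data Furnace cabinet. All numbers are illustrative assumptions,
# not figures from the paper.

SERVER_POWER_KW = 0.3          # assumed average draw per server
NUM_SERVERS = 40               # assumed servers per furnace-sized cabinet
HOURS_PER_YEAR = 24 * 365

INDUSTRIAL_RATE = 0.07         # assumed $/kWh a large datacenter pays
RESIDENTIAL_PREMIUM = 0.30     # mid-range of the 10-50% premium cited

def annual_cost(rate_per_kwh):
    """Yearly electricity cost for the whole cabinet at a given rate."""
    return SERVER_POWER_KW * NUM_SERVERS * HOURS_PER_YEAR * rate_per_kwh

industrial = annual_cost(INDUSTRIAL_RATE)
residential = annual_cost(INDUSTRIAL_RATE * (1 + RESIDENTIAL_PREMIUM))

print(f"datacenter: ${industrial:,.0f}/yr")
print(f"home DF:    ${residential:,.0f}/yr")
print(f"premium:    ${residential - industrial:,.0f}/yr")
```

On these made-up numbers the premium runs to a couple of thousand dollars a year, though the whole point of the proposal is that the cabinet's waste heat offsets a heating bill the homeowner would pay anyway.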
Co-author Kamin Whitehouse, an assistant professor of computer science at the University of Virginia, told Stross that he has received more response than he typically gets when publishing a scientific paper. In fact, he said he has heard from some who are already heating their homes with servers, "which shows that it works."
While it may work, I'd like to see some cloud providers try this out, find out how well it works in a home or building, and see what the total economics are. It seems reasonable that the industry should seriously evaluate this concept.
Posted by Jeffrey Schwartz on November 30, 2011 at 11:59 AM