I often wonder why large datacenters are not put in colder places, using the temperature difference between inside and outside to generate electricity (Stirling engines, thermoelectric generators) to help offset the electricity used to generate that heat.
Perhaps it's the relative inefficiency of the heat engines versus the cost of implementing them? Still, you would think retrieving some electricity after generating all that heat would be useful.
Probably because those devices cost more than the electricity they make.
The trouble is that the heat is diffuse - it's not concentrated in one spot, and collecting it from an entire building is not practical.
(Just in case you are wondering: if you have active cooling then the heat is concentrated, but capturing it will raise your cooling cost, and it will raise it by more than what you gain from capturing it.)
I think the major problem is that you need a large temperature difference for efficient power generation. Assuming an outside (cold-sink) temperature of 0 °C and a maximum (hot-side) temperature of 100 °C, you have a theoretical maximum efficiency of about 27% (1 − 273 K / 373 K), and in practice much less than that. (See Carnot cycle.)
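A quick back-of-the-envelope sketch of that Carnot limit, using the example temperatures above (the function name and values are just for illustration):

    # Carnot efficiency: 1 - T_cold / T_hot, with temperatures in kelvin.
    def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
        t_hot_k = t_hot_c + 273.15
        t_cold_k = t_cold_c + 273.15
        return 1.0 - t_cold_k / t_hot_k

    # Example from the comment above: 100 C hot side, 0 C outside air.
    print(carnot_efficiency(100.0, 0.0))  # ~0.268, i.e. about 27% at best

And that is only the theoretical ceiling; real low-grade waste-heat engines recover a fraction of it.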
"Facebook is to build a new server farm on the edge of the Arctic Circle – its first outside the United States – to improve performance for European users, officials of the social networking site said Thursday." - http://huff.to/vHOqne
Two years ago at our student conference we had a guest lecturer from a Dutch company called "Kyoto Cooling"; they install these kinds of systems in data centers. The guy said this kind of cooling works when the outside temperature is below 22 degrees Celsius.