The consequence is that power consumption by IT (servers) has increased dramatically over the last years. Ars Technica writes that US servers now consume more power than color TVs nationwide, and that this energy consumption is doubling every five years (!). Meanwhile, even companies like Google have recognized these facts and have initiated research in the field of renewable energies.
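Just to put that doubling claim into perspective, here is a minimal back-of-the-envelope sketch; the five-year doubling period is the only figure taken from the article, the implied annual rate is my own arithmetic:

\[
P(t) = P_0 \cdot 2^{t/5}, \qquad r = 2^{1/5} - 1 \approx 0.15
\]

In other words, a doubling every five years corresponds to roughly 15% more server power drawn every single year.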
And the latter point in particular matters: these days we tend to think of operational costs, CO2 emissions, and energy use during operation. However, most of the energy is consumed before the device, the server, ever goes into operation, namely in manufacturing it. So whenever we can avoid a new server, we should!
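Written as a simple lifecycle balance (my own formalization of the argument, not a figure from any study), the total energy attributable to a server is

\[
E_{\text{total}} = E_{\text{manufacturing}} + P_{\text{operation}} \cdot T_{\text{lifetime}},
\]

and the claim above is that \(E_{\text{manufacturing}}\) is the larger term. If that holds, avoiding a new machine, or keeping an existing one in service longer so its manufacturing energy is amortized over more years, saves more than any tweak to the operational term.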
I must confess that I have no more ideas at the moment, but I wanted to get this issue out there, hoping for some interesting ideas and reactions from readers.