Ok, I confess, I did not find a better title for this blog entry. Now the question is: what is the connection between global warming and software engineering? Actually, this is exactly one of the things I am trying to find out these days.
We all face a severe challenge. Global warming is real, and so is the fact that resources, particularly energy resources, are limited. IT is often seen as a "cleaner" way to do things; supposedly, doing things in the virtual world is less resource intensive than doing them in the real world. For example, if four people have a Skype meeting, this should be less resource intensive than having three of them travel by plane to the meeting location.
Yet the IT industry, and now I come to the point, particularly the software industry, has not been very interested in efficiency over the last decade. We develop software that somehow runs on current hardware, because with the next generation of hardware it will be fine. This is actually embarrassing. Consider software engineering practice: there is a lot of talk about clustering and putting more iron into the backend if the application is slow, but who is really skilled in analysing an application, figuring out where the hot spots are, and optimising those? Don't worry, just add another server, that will do it.
The consequence is that power consumption by IT (servers) has increased dramatically over the last years. Ars Technica writes that US servers meanwhile consume more power nationwide than color TVs, and that this energy consumption is doubling every 5 years (!). Meanwhile even companies like Google have realised these facts and are initiating research in the field of renewable energies.
Some other examples struck me recently. I compared three recent game consoles: the Playstation 3 and the XBox 360 consume approx. 200 Watts while playing (some sites even quote numbers up to 300), the Nintendo Wii approximately 20 Watts. So that is roughly a factor of 1:10. The Playstation 2 takes approx. 50 Watts, the Gamecube approx. 20 Watts.
Second example: the XO laptop from the OLPC project consumes about 2 W during regular work, a conventional laptop about 10-45 W; again we have a factor of roughly 1:10, maybe more.
Now, it is clear that the Nintendo Wii is not as powerful as a Playstation 3, and the XO laptop is not as powerful as a Macbook Pro. Yet, is the difference 1:10? That is the question. I was playing a rather new Playstation 2 game and was astonished at how much the quality of the graphics can still improve compared with games from 5 years ago. The same observation holds for the old C64: compare the quality of the games from the early 80s with the later ones. Clearly, developers learn how to operate the device and, over time, get the best out of the box; and it can be astonishing what is in these devices.
The point I want to make is this: up to now, energy consumption has hardly been an issue for us software engineers, or so it seems. The result is that inefficient programming wastes a lot of hardware capacity because we just do not care. The OLPC project was very important because it showed us what can be done with a laptop using 2 W! Now, I do not care whether the XO is the next big thing or the Asus EEE, Christoph wrote a good article about this issue, but the point is that we as software engineers should also start thinking about resources much more than we have done so far. It is just embarrassing when a PS3 consumes 10 times the energy of a Wii (which is probably also not as optimised as it could be), or when a laptop consumes 10 times the amount of an XO while the user is just typing plain text.
Now what can we do? How can we incorporate this issue into teaching and training young engineers? One thing that comes to my mind immediately is the use of profiling tools. Listen for example to the presentation by Rasmus Lerdorf (yes, PHP, I know *g*, still...) on IT Conversations. I was not so interested in PHP, but what he said about profiling and optimising web applications was very interesting. It seems that in many, if not most, web applications there is again easily the potential for a factor of 10 in efficiency gains. What does that mean? Not only is our application faster, during regular operation it consumes less energy and needs fewer servers, hence it consumes less hardware for doing the same work as before.
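To make that concrete, here is a minimal sketch of my own (not taken from the talk), assuming a hypothetical page-rendering function called render_page; Python's built-in cProfile is enough to see where the hot spots are before we start throwing servers at the problem:

```python
import cProfile
import pstats

def render_page():
    # stand-in for a real request handler; replace with your own code
    rows = [str(i) for i in range(100000)]
    return ",".join(rows)

# profile a single request
profiler = cProfile.Profile()
profiler.enable()
render_page()
profiler.disable()

# print the ten most expensive calls, sorted by cumulative time
pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```

The point is not this particular tool, whatever profiler fits your stack will do, but the habit of measuring before optimising.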
And particularly the latter is important: these days we tend to think of operational costs, CO2 emissions and energy use during operation. However, most of the energy is consumed before the device, the server, even goes into operation, namely in manufacturing it. So whenever we can avoid a new server, we should!
Maybe there are other things to be done. For example, consider a typical server where several applications run in parallel. As in energy consumption and power plant/grid planning: when all apps want to do resource-intensive things at the same time, we have to provide a server that can handle the peak load. Could applications communicate with, e.g., the task scheduler about the current "computing cost" on the machine, and postpone or "nice" the planned activity if the cost is too high? See the sketch below.
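Just as a thought experiment, a purely hypothetical sketch (Unix-only, crudely using the 1-minute load average as the "computing cost", with a made-up threshold): a batch job could ask the machine how busy it is, wait while the cost is too high, and lower its own priority before running:

```python
import os
import time

LOAD_THRESHOLD = 2.0   # assumed "cost" limit; would need tuning per machine
CHECK_INTERVAL = 60    # seconds between checks

def run_when_cheap(batch_job):
    """Postpone a resource-intensive task while the machine is busy,
    then run it at reduced scheduling priority."""
    while os.getloadavg()[0] > LOAD_THRESHOLD:
        time.sleep(CHECK_INTERVAL)
    os.nice(10)  # "nice" ourselves so interactive work still gets the CPU
    batch_job()

# example: a heavy maintenance task that is not time critical
run_when_cheap(lambda: sum(i * i for i in range(10_000_000)))
```

A real solution would of course have to coordinate between applications instead of letting each one poll on its own, but it illustrates the direction.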
I must confess that I have no more ideas at the moment, but I wanted to get this issue out there, hoping for some interesting ideas and reactions from readers.
1 comment:
Thank you for this interesting post. In my opinion you came up with a topic that seems to be completely ignored by the IT community: that throw-away IT infrastructure, high bandwidth consumption, inefficient programs, etc. each accelerate the process of global warming a little bit.
In my opinion this inefficiency also correlates with the myth of cheap hardware. In every second lecture at university you hear how fast computers will be in a year and how cheap hardware and processing power are compared to a few years ago.
What you normally don't hear at university, and what Michael T. Nygard points out in his excellent book "Release It!", is that things get pretty expensive pretty fast when developing production-ready software. Take a look at a typical web page: the amount of unnecessary whitespace is often incredible and can make up a few kB. Consider a high-traffic web page with millions of hits per day and calculate how much bandwidth is consumed just by transferring whitespace!
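Just to illustrate with numbers I am assuming here (5 kB of whitespace per page, 10 million hits per day):

```python
# back-of-the-envelope estimate with assumed numbers
whitespace_per_page_kb = 5      # needless whitespace per page
hits_per_day = 10_000_000       # a busy site

wasted_gb_per_day = whitespace_per_page_kb * hits_per_day / 1_000_000
print(f"~{wasted_gb_per_day:.0f} GB per day spent on whitespace alone")  # ~50 GB
```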
In my opinion creating efficient software is a slightly forgotten discipline and it is time to bring these topics back into developers' minds. It might be only a small step towards more energy-efficient IT, but many small drops can cool down a hot stone (I know - that translation was brutal...)