
SEP 13


Computer Energy Efficiency Doubles Every 18 Months

A new study by an engineering professor at Stanford University has found that the energy efficiency of computers doubles roughly every 18 months, and has done so since the very first general-purpose computer, the Electronic Numerical Integrator and Computer (ENIAC), which was completed in 1946.

With help from Intel and Microsoft, Professor Jonathan Koomey was able to gather information about computing devices from 1946 until now, and with this new finding Koomey is revising and extending Moore's law -- the observation that computer processing power doubles roughly every 18 months.  Fortunately, the changes that drive that performance improvement (shrinking components, reducing their capacitance, and shortening the communication time between them) also increase energy efficiency.

This finding has great implications for the future of computers and battery-powered devices.  As we constantly increase the performance of computers and gadgets, we'll be improving their energy efficiency as well -- a much needed trend as we become more reliant on our portable devices.

Also, theoretically, we're far from the limit of how much electricity we can save.  In 1985, physicists projected that we could improve computer energy efficiency by a factor of 100 billion and since then we've only hit a factor of about 40,000.
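At an 18-month doubling, those two figures suggest how much headroom remains; here's a back-of-the-envelope sketch (my own illustration, not a calculation from the study -- the only inputs are the doubling period and the factors quoted above):

```python
import math

DOUBLING_YEARS = 1.5  # efficiency doubles roughly every 18 months

def efficiency_gain(years):
    """Factor by which energy efficiency grows over a span of years."""
    return 2 ** (years / DOUBLING_YEARS)

# Remaining theoretical headroom: 100 billion possible vs ~40,000 achieved
remaining = 100e9 / 40e3               # factor still available: 2.5 million
doublings_left = math.log2(remaining)  # ~21.3 doublings
years_left = doublings_left * DOUBLING_YEARS  # ~32 years at the current pace

print(f"remaining factor: {remaining:,.0f}")
print(f"doublings left: {doublings_left:.1f}")
print(f"years at current pace: {years_left:.0f}")
```

In other words, if the trend holds, the theoretical ceiling from that 1985 projection wouldn't be reached for roughly another three decades.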

via MIT Tech Review

Comments (7)
Not exactly true
written by chiton, September 15, 2011
Moore's so-called "law" stopped working a few years ago. As a workaround, multi-core microprocessors have been deployed. Multi-core CPUs introduced another problem which hasn't been solved yet - nearly all application software is incapable of using the multiple cores because the programming languages (C, C++, C#, Java etc) are inherently not suitable for multi-core programming. Functional languages (e.g. Scala, Erlang etc) will save the day, but it will be quite a long time before consumer applications are developed in a functional language.

Power consumption in desktop PCs has been steadily INCREASING over the years, not decreasing. This is largely the result of the need for powerful Graphical Processing Units to render the demands of gaming software. Don't believe me? Examine the wattage of the average desktop computer's power supply. These days 500W is considered the barest minimum; commonly 750W power supplies are used. In the 1990s, a 250W power supply was considered sufficient.
Actual power consumption
written by Charles, September 21, 2011

Power supply sizes may well have gone up to ~500 W, but a typical office computer here uses 100 W. I am confident about that figure because our grid power is unreliable, so we depend on UPSes and use/measure that figure when choosing capacity and analysing faults.
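Charles's point is worth making concrete: a power supply's rating is a maximum, not what the machine actually draws. A quick sketch of what the gap means for annual consumption (the 8-hour workday and 260 workdays a year are assumptions for illustration only):

```python
def annual_kwh(watts, hours_per_day=8, days=260):
    """Energy used per year, in kWh, for a machine drawing `watts` while on."""
    return watts * hours_per_day * days / 1000

# Measured ~100 W office PC vs assuming the full 500 W supply rating
print(annual_kwh(100))  # 208.0 kWh/year
print(annual_kwh(500))  # 1040.0 kWh/year
```

Sizing estimates from the nameplate rating would overstate actual consumption here by a factor of five.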
written by fredm, September 25, 2011
Of course, there are so many more computers being sold that it would take a large drop in power consumption per machine to reduce net power requirements for computing. Ask Google, who carefully locate their server farms where power is plentiful and cheap (relatively speaking)
Moore's Law is Faster Than That
written by Derek, September 25, 2011
Back in the 1960s Moore's Law meant a doubling of processing power per dollar every 18 months. Contrary to Chiton's claim that Moore's Law has stopped working, it's not just alive and well, but thriving. The doubling now occurs every 12 months in some areas and it should last until around 2020. Quantum computing will likely take over where current computing leaves off and continue Moore's Law into the future...but it's too early to say for sure. But one thing's for sure: Moore's Law ain't dead.
written by electronics recycling, February 07, 2012
There has been a steady decline in the internal voltages CPUs operate on. As this voltage drops, and the distance between components drops, so has power consumption. On the other hand, more powerful processors have required larger heatsinks. Heat generation in the CPU is the main cause of power loss, but the commercial pressure to increase CPU speed also finances ways to make them more thermally efficient.
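The voltage effect this commenter describes follows from the standard CMOS dynamic-power relation, P ≈ C · V² · f: power scales with the square of the supply voltage. A minimal illustration (the capacitance and clock figures are made up purely for the example):

```python
def dynamic_power(capacitance_f, voltage_v, freq_hz):
    """Approximate dynamic (switching) power of CMOS logic: P ~= C * V^2 * f."""
    return capacitance_f * voltage_v**2 * freq_hz

# Halving the core voltage cuts switching power to a quarter at the same clock
p_high = dynamic_power(1e-9, 3.3, 1e9)   # illustrative 3.3 V part
p_low  = dynamic_power(1e-9, 1.65, 1e9)  # same part at half the voltage
print(p_low / p_high)  # 0.25
```

This quadratic dependence is why voltage reduction has been such a large lever for efficiency, even as clock speeds climbed.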
Computer Energy Consumption
written by Calgary computers, October 03, 2012
Well, after reading this and the comments I'm a bit confused about whether computers are using less energy or not. I'd say the newer desktops with huge power supplies (650 watts or more) would be using more electricity.
More energy or not?
written by Calgary computers, October 03, 2012
It would seem to me that the newer computers, the desktops at least, with their 650-watt power supplies or more would be using more energy than those with lower wattage. And the laptops maybe less?
