
SEP 13


Computer Energy Efficiency Doubles Every 18 Months


A new study by an engineering professor at Stanford University has found that the energy efficiency of computers doubles roughly every 18 months, and has done so since the very first general-purpose computer, the Electronic Numerical Integrator and Computer (ENIAC), built in 1946.

With help from Intel and Microsoft, Professor Jonathan Koomey was able to gather information about computing devices from 1946 until now, and with this new finding Koomey is refining Moore's law -- the observation that computer processing power doubles every 18 months.  Fortunately, the things that contribute to that power improvement (reducing component size, capacitance and the communication time between components) also increase energy efficiency.
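The doubling relationship described above is just compound growth. A minimal sketch of the arithmetic (not code from the study itself; the 18-month period is the figure quoted in the article):

```python
def efficiency_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Cumulative improvement factor after `years` of steady doubling
    every `doubling_period_years` (18 months = 1.5 years)."""
    return 2 ** (years / doubling_period_years)

# Example: cumulative efficiency gain from ENIAC (1946) to the study (2011)
# works out to roughly 43 doublings -- a factor in the tens of trillions.
print(f"{efficiency_factor(2011 - 1946):.2e}")
```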

This finding has great implications for the future of computers and battery-powered devices.  As we constantly increase the performance power of computers and gadgets, we'll be improving their energy efficiency as well -- a much needed trend as we become more reliant on our portable devices.

Also, theoretically, we're far from the limit of how much electricity we can save.  In 1985, physicists projected that we could improve computer energy efficiency by a factor of 100 billion, and since then we've only hit a factor of about 40,000.
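Taking the article's two figures at face value, the remaining headroom can be sketched directly: dividing the theoretical limit by the progress so far and counting doublings at 18 months apiece suggests roughly three more decades of improvement at the historical pace (a back-of-the-envelope estimate, not a claim from the study):

```python
import math

# Figures quoted in the article above:
theoretical_limit = 100e9  # projected possible improvement factor (1985)
achieved = 40e3            # improvement factor reached so far

remaining_doublings = math.log2(theoretical_limit / achieved)
years_left = remaining_doublings * 1.5  # one doubling per 18 months

print(f"~{remaining_doublings:.1f} doublings, ~{years_left:.0f} years of headroom")
```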

via MIT Tech Review

Comments (7)
Not exactly true
written by chiton, September 15, 2011
Moore's so-called "law" stopped working a few years ago. As a workaround, multi-core microprocessors have been deployed. Multi-core CPUs introduced another problem which hasn't been solved yet - nearly all application software is incapable of using the multiple cores because the programming languages (C, C++, C#, Java, etc.) are inherently not suitable for multi-core programming. Functional languages (e.g. Scala, Erlang) will save the day, but it will be quite a long time before consumer applications are developed in a functional language.

Power consumption in desktop PCs has been steadily INCREASING over the years, not decreasing. This is largely the result of the need for powerful graphics processing units to render the demands of gaming software. Don't believe me? Examine the wattage of the average desktop computer's power supply. These days 500W is considered the barest minimum; commonly 750W power supplies are used. In the 1990s, a 250W power supply was considered sufficient.
Actual power consumption
written by Charles, September 21, 2011
Hello

Power supply sizes may well have gone up to ~500 W, but a typical office computer here uses 100 W. I am confident about that figure because our grid power is unreliable, so we depend on UPSes and use/measure that figure when choosing capacity and analysing faults.
...
written by fredm, September 25, 2011
Of course, there are so many more computers being sold that it would take a large drop in power consumption per machine to reduce net power requirements for computing. Ask Google, who carefully locate their server farms where power is plentiful and cheap (relatively speaking).
Moore's Law is Faster Than That
written by Derek, September 25, 2011
Back in the 1960s Moore's Law meant a doubling of processing power per dollar every 18 months. Contrary to Chiton's claim that Moore's Law has stopped working, it's not just alive and well, but thriving. The doubling now occurs every 12 months in some areas and it should last until around 2020. Quantum computing will likely take over where current computing leaves off and continue Moore's Law into the future...but it's too early to say for sure. But one thing's for sure: Moore's Law ain't dead.

http://en.wikipedia.org/wiki/File:Transistor_Count_and_Moore's_Law_-_2011.svg
...
written by electronics recycling, February 07, 2012
There has been a steady decline in the internal voltages CPUs operate on. As this voltage drops, and the distance between components drops, so has power consumption. On the other hand, more powerful processors have required larger heatsinks. Heat generation in the CPU is the main cause of power loss, but the commercial pressure to increase CPU speed also finances ways to make chips more thermally efficient.
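The voltage effect this commenter describes matches the standard CMOS dynamic-power relationship, P ≈ C·V²·f (switched capacitance × voltage squared × clock frequency) -- power falls with the square of the supply voltage. A quick sketch with made-up illustrative values (not measurements from any real chip):

```python
def dynamic_power(capacitance_f: float, voltage_v: float, frequency_hz: float) -> float:
    """CMOS dynamic switching power in watts: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

# Same capacitance and clock; only the core voltage changes.
p_old = dynamic_power(1e-9, 5.0, 100e6)  # older 5 V logic
p_new = dynamic_power(1e-9, 1.2, 100e6)  # modern ~1.2 V core

print(f"voltage drop alone cuts dynamic power ~{p_old / p_new:.1f}x")
```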
Computer Energy Consumption
written by Calgary computers, October 03, 2012
Well, after reading this and the comments I'm a bit confused about whether computers are using less energy or not. I'd say the newer desktops with the huge power supplies (650 watts or more) would be using more electricity.
More energy or not?
written by Calgary computers, October 03, 2012
It would seem to me that the newer computers, the desktops at least, with their 650 watt power supplies or more, would be using more energy than those with lower wattage. And the laptops maybe less?
