How many watts does a computer use?

Updated July 19, 2017

Computers, although essential, cost money and energy to operate. Your choice of computer and peripherals plays a big part in your annual energy bill, and newer technology and energy-saving features can cut those costs.


A desktop computer uses between 60 and 250 watts, depending on whether it is idle or under load. A 17-inch cathode-ray tube (CRT) monitor uses about 80 watts. A laptop consumes 15 to 45 watts, and an LCD monitor averages about 35 watts.


To estimate annual energy use, multiply 270 (the wattage of a typical computer plus monitor) by daily hours of use, then by 365 days. A computer and monitor on four hours a day would use about 394 kilowatt-hours yearly.
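The calculation above can be sketched in a few lines, using the article's figures of 270 watts and four hours a day:

```python
# Annual energy use for a 270-watt computer-plus-monitor setup
# running 4 hours a day (figures from the article).
watts = 270          # typical desktop plus monitor
hours_per_day = 4
days_per_year = 365

watt_hours = watts * hours_per_day * days_per_year
kilowatt_hours = watt_hours / 1000   # convert watt-hours to kWh
print(f"{kilowatt_hours:.1f} kWh per year")  # 394.2 kWh per year
```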


Using a computer's power management features can save one to five watts per computer, according to the EPA. A "sleeping" computer uses about a third of its "awake" energy.


Calculating your energy costs requires just three numbers and a simple formula. Add the watts consumed by your computer and monitor, multiply by the hours used per day and the days used per year, then divide by 1,000 to get kilowatt-hours. Multiply that figure by your electricity cost per kilowatt-hour (found on your utility bill). Consuming 270 watts for four hours each day, at 8.5 cents per kilowatt-hour, means an annual energy bill of about $33.50.
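The full cost formula can be wrapped in a small helper; `annual_cost` is a name chosen for this sketch, not a standard function:

```python
def annual_cost(watts, hours_per_day, cents_per_kwh, days=365):
    """Annual electricity cost in dollars for a device."""
    kwh = watts * hours_per_day * days / 1000   # kilowatt-hours per year
    return kwh * cents_per_kwh / 100            # cents -> dollars

# The article's example: 270 W, 4 hours a day, 8.5 cents per kWh.
print(f"${annual_cost(270, 4, 8.5):.2f}")  # $33.51
```

Plugging in your own wattage, daily hours, and utility rate gives a quick estimate for any device.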


To receive the EPA's Energy Star rating, desktop computers must use two watts or less in standby mode and four watts or less in sleep mode. Laptops must use one watt or less in standby and 1.7 watts or less in sleep mode. On-mode Energy Star ratings are based on a formula that ties allowable wattage to a monitor's resolution in megapixels.


About the Author

E.D. Strong has worked in journalism since 1980. He covers the technology sector, including companies such as Google, Apple and Microsoft. Strong's work has appeared on GigaOm, Cult of Mac and many other online and print publications in the United States.