Computers, although essential, cost money and energy to operate. Your choice of peripherals plays a big part in your annual energy bill. Newer technology and energy-saving techniques will cut those costs.
A desktop computer uses between 60 and 250 watts, depending on whether it is idle. A 17-inch cathode-ray monitor uses about 80 watts. A laptop consumes 15 to 45 watts; an LCD monitor uses an average of 35 watts.
To determine watts consumed annually, multiply 270 (wattage of a typical computer plus monitor) by daily usage, then by 365 days. A computer and monitor on four hours a day would use about 394 kilowatt-hours yearly.
Using a computer's power management features can save one to five watts per computer, according to the EPA. A "sleeping" computer uses about a third of its "awake" energy.
Calculating your energy costs requires just three numbers and a simple formula. Add the watts consumed by your computer and monitor, multiply by the hours used per day and the days per year, then divide by 1,000 to get kilowatt-hours. Multiply that figure by the electricity cost per kilowatt-hour (found on your utility bill). Consuming 270 watts for four hours each day, at 8.5 cents per kilowatt-hour, works out to an annual energy bill of about $33.50.
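The formula above can be sketched as a short script. The figures plugged in (270 watts combined draw, four hours a day, 8.5 cents per kilowatt-hour) are the article's example numbers; substitute your own wattage and rate from your utility bill.

```python
def annual_energy_cost(watts, hours_per_day, cents_per_kwh, days=365):
    """Return (kWh per year, cost in dollars per year)."""
    kwh = watts * hours_per_day * days / 1000  # divide by 1,000 for kilowatt-hours
    cost = kwh * cents_per_kwh / 100           # convert cents to dollars
    return kwh, cost

# Example figures from the article: 270 W, 4 hours/day, 8.5 cents/kWh
kwh, cost = annual_energy_cost(270, 4, 8.5)
print(f"{kwh:.1f} kWh per year, about ${cost:.2f} per year")
# -> 394.2 kWh per year, about $33.51 per year
```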
To receive the EPA's EnergyStar rating, desktop computers must use two watts or less in standby mode and four watts or less in sleep mode. Laptops must use one watt or less in standby and 1.7 watts or less in sleep mode. On-mode EnergyStar ratings are based on a formula that calculates allowable watts from a monitor's megapixels.