Social Question


How much electricity does a computer use?

Asked by Dutchess_III (47126points) July 16th, 2010

My son was saying that since his roommate got a computer and the internet, which he's on all day, his utility bill has gone up $50. It doesn't use that much, does it?


8 Answers

LuckyGuy

A typical PC takes 100 watts. This varies a lot; it can be as little as 10 W or as much as 200 W.
Figure 24 hours per day × 100 W = 2,400 watt-hours = 2.4 kWh. At $0.12 per kWh, that's 2.4 × 0.12 = $0.288 per day × 30 days = $8.64 per month for a 100-watt PC. Read the specs on the PC and see what it really takes. That number is pretty close.

So the PC accounts for about $8.64 per month. You'll have to look elsewhere for the other $46. A/C? Extra hot water?
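LuckyGuy's arithmetic generalizes to any device if you know its wattage, hours of use, and your electric rate. Here is a small sketch of that calculation; the 100 W / $0.12 per kWh figures are the example values from his answer, not measurements of any particular PC.

```python
# Monthly electricity cost of a device, following the arithmetic above:
# watts -> kWh per day -> dollars per month.

def monthly_cost(watts: float, hours_per_day: float = 24.0,
                 rate_per_kwh: float = 0.12, days: int = 30) -> float:
    """Return the approximate monthly cost in dollars of running a load."""
    kwh_per_day = watts * hours_per_day / 1000.0  # watt-hours -> kilowatt-hours
    return kwh_per_day * rate_per_kwh * days

# A 100 W PC left on 24/7 at $0.12/kWh:
print(round(monthly_cost(100), 2))  # 8.64
```

Plugging in 10 W or 200 W (the low and high ends LuckyGuy mentions) gives roughly $0.86 to $17.28 per month, so even a power-hungry PC is nowhere near $50.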

Dutchess_III

Most excellent @worriedguy! Thank you very much.

LuckyGuy

You’re welcome. It is simple math.
You can also get a device called a Kill A Watt that plugs into an outlet; you then plug in the electrical device you want to measure, and it will keep track for you.

Dutchess_III

I know it’s simple math. I once computed the monthly cost of running a washing machine, because my landlord was trying to rip me off by saying it cost X amount a month. He paid the water, but according to his strange logic, the water for running the washer had to be paid by me, and he figured it at some outrageous amount each month. Anyway, I proved he was full of BS.

But thank you! And thanks for kill o watt link…I’ll keep my eyes open for one.

jerv

The “Kill-o-watt” is the only way to really tell since computers vary widely.

The CPU alone can draw anywhere from under 3 W to 135 W under full load, depending on which one you have. Some computers have multiple graphics cards that draw almost 300 W each, while some make do with integrated graphics that draw next to nothing. And how many hard drives does it have? A RAID array (multiple drives acting like one) draws more power than a single drive.

And then there are monitors. Old-school CRT monitors are energy hogs, especially if you have a 20” like I used to before switching to a flat-panel LCD.

So without knowing more about the computer, it really is impossible to say. I have one that draws 14.7 W and another that can draw over 400 W, and the latter isn’t even especially high-powered compared to most gaming rigs. Either give us a complete parts list, or just do the easy thing and try a Kill A Watt (or similar) and measure it directly.
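jerv's point is that total draw is just the sum of the components, and those vary enormously between builds. A rough power-budget sketch makes that concrete; every wattage below is an illustrative guess based on the ranges mentioned in the thread, not a measurement of any real machine, which is exactly why a meter is the only reliable answer.

```python
# Illustrative power budgets for two hypothetical builds. All wattages are
# assumptions for the sake of the example, taken loosely from the ranges
# discussed above (CPU 3-135 W, GPU up to ~300 W each, drives a few watts each).

build_gaming = {
    "CPU": 135,
    "graphics card": 300,
    "RAID drives (3x)": 24,
    "motherboard/fans": 40,
}
build_budget = {
    "CPU": 35,
    "integrated graphics": 0,
    "single drive": 8,
    "motherboard/fans": 15,
}

for name, parts in [("gaming rig", build_gaming), ("econobox", build_budget)]:
    total = sum(parts.values())
    print(f"{name}: ~{total} W")
```

The spread between the two totals is nearly an order of magnitude, which matches jerv's 14.7 W vs. 400+ W machines and explains why no single "a computer uses X watts" answer exists.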

I will say that my roommate leaves his computer on 24/7 and the other desktop is on at least six hours a day and our total electric bill (including electric heat and the electric stove) hasn’t hit $140/month yet even in the winter and averaged closer to $75.

john65pennington

The electricity for my computer averages about $5.00 a month. It’s not on all day and night; it averages about 6 hours a day and is off overnight.

jerv

@john65pennington I think it’s safe to assume that your experience is about average. Your usage pattern is about average, and I can’t see you with either a high-end power-sucking gaming rig or a low-powered econobox, so I assume you have a mid-range, mainstream, average computer as well.
As for whether the OP’s situation is average… well that is another matter entirely. However, I am inclined to think that there is something else happening here, like leaving all the lights on or doing more cooking with a microwave and/or electric stove. I know for a fact that the electric bill on my apartment is notably higher now that there are three people in it instead of just one, and it isn’t all computer-related.

mattbrowne

The real suckers are the servers. Cooling them down is a bitch. Every Google search for example consumes electricity, not only on your PC.
