The Cost of Power
Summary: Some musings comparing the work accomplished by my computer to my out-of-pocket cost for the electricity that feeds it 24 hours a day - something I dare say very few home computer users ever look at.
Last month I upgraded a Celeron D 2.53GHz media server to a Core 2 Duo 1.8GHz. When not using the server for "media" (watching DVDs or recording broadcast TV), I run BOINC/Rosetta distributed science jobs on it. Since the Celeron D was still functional, I moved it into an old chassis and updated its power supply - both machines now have efficient, after-market power supplies.
As a hobby (and part of what my mother would call our inherited Scots blood) I enjoy using an AC power meter to evaluate the cost of running appliances. My meter is from http://www.brandelectronics.com/ and it shows some interesting facts, such as that my Cox digital cable box consumes 24 watts when powered "ON" ... and 23 watts when turned "OFF" :-)
Obviously, the Core 2 Duo - running 2 jobs at once - contributes more credits to BOINC projects than the Celeron D. But I was interested in comparing what I gain against the monthly cost of keeping my now-unnecessary Celeron D running.
Computer Summary:
Core 2 Duo: 1.8GHz, 1GB DDR2-800 RAM, 320GB SATA drive, nVidia 7100 (fanless), 400W power supply
- Rosetta Benchmarks: fp=1744, int=3656 (since it's dual-core, that maybe means fp=3488, int=7312 in total)
- When Idle: CPU temp = 70 DegF, AC power usage = 105 watts
- When both cores at 100%: CPU temp = 100 DegF, AC power usage = 129 watts
Celeron D: 2.53GHz (the original server, now in an old chassis with its own efficient after-market power supply)
- Rosetta Benchmarks: fp=764, int=1677
- When Idle: CPU temp = 100 DegF, AC power usage = 98 watts
- When sole CPU at 100%: CPU temp = 125 DegF, AC power usage = 134 watts
Since both systems eat about the same power, rounding the wattage to 130 watts burned 24 hours per day amounts to roughly $7.50 to $13.00 per month. That range reflects my Minnesota rate of about $0.08 per kWh and my California rate of about $0.14 per kWh. I wonder how many people understand they pay that much per month to run their computer 24 hours a day? Over a year that totals $90 to $160 per computer - and this is JUST the computer. I'm not including the wattage used by monitors, printers, Ethernet switches, or the DSL/cable router hardware. Plus, with the computers running in a cool Minnesota basement, I don't have to include the extra air-conditioning load they'd create in a hot climate like my Southern California home.
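For anyone who wants to check my arithmetic, here's a minimal sketch in Python (the 130-watt figure and the two per-kWh rates are the ones quoted above - plug in your own):

# Back-of-the-envelope monthly electricity cost for one always-on computer.
watts = 130                              # rounded full-load draw from my meter
kwh_per_month = watts * 24 * 30 / 1000   # ~93.6 kWh

for label, rate in [("Minnesota @ $0.08/kWh", 0.08), ("California @ $0.14/kWh", 0.14)]:
    print(f"{label}: ${kwh_per_month * rate:.2f} per month")

# Prints roughly $7.49 (MN) and $13.10 (CA) - the $7.50-$13.00 range above.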
So now for the true "musing" - if I average the last 10 Rosetta jobs handled by each computer:
- Core 2 Duo: average 10594 seconds and 36.87 credits granted per job
- Celeron D: average 10406 seconds and 22.75 credits granted per job
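To put those averages on a common footing, here's a rough sketch of credits earned per day (assuming each machine crunches around the clock, with the Core 2 Duo running two jobs at once and the Celeron D one):

# Approximate Rosetta credits per day from the 10-job averages above.
machines = {
    "Core 2 Duo": {"sec_per_job": 10594, "credits_per_job": 36.87, "concurrent": 2},
    "Celeron D":  {"sec_per_job": 10406, "credits_per_job": 22.75, "concurrent": 1},
}

for name, m in machines.items():
    jobs_per_day = 86400 / m["sec_per_job"] * m["concurrent"]
    print(f"{name}: ~{jobs_per_day * m['credits_per_job']:.0f} credits/day")

# Works out to roughly 600 credits/day for the Core 2 Duo
# versus roughly 190 credits/day for the Celeron D.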
So should I still run the Celeron D? Should I upgrade it to something closer to the Core 2 Duo? The upgrade cost me close to $450 once you count the CPU, the new motherboard, and the new DDR2 RAM. It's an interesting question without a simple answer ... yes, running the old Celeron D doesn't cost me anything more from a hardware standpoint ... but I am paying good money out of my pocket for the power.
So what is the real cost of power?
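One way to put a number on it: divide each machine's daily credit output by its daily electricity cost at the two rates above. A sketch, using my measured full-load wattages and the credits-per-day estimates from the previous snippet:

# Rosetta credits earned per dollar of electricity, full-load wattage assumed 24/7.
machines = {
    "Core 2 Duo": {"watts": 129, "credits_per_day": 601},
    "Celeron D":  {"watts": 134, "credits_per_day": 189},
}

for rate_label, rate in [("MN $0.08/kWh", 0.08), ("CA $0.14/kWh", 0.14)]:
    for name, m in machines.items():
        dollars_per_day = m["watts"] * 24 / 1000 * rate
        print(f"{name} ({rate_label}): ~{m['credits_per_day'] / dollars_per_day:.0f} credits per dollar")

# Roughly 2400 (MN) / 1400 (CA) credits per dollar for the Core 2 Duo,
# versus roughly 730 (MN) / 420 (CA) for the Celeron D.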
4 Comments:
Awesome analysis, man - I have recently started running BOINC. I think the setup for BOINC does say that it is for x86, so how come your Core 2 executes 2 projects simultaneously?
Is BOINC designed to exploit multicores, or is it just the processor that is doing the trick?
Just a guess, but BOINC likely just detects the 2 cores and allows 2 jobs to be scheduled at once ... it's probably just the OS naturally splitting the 2 CPU-intensive jobs across the 2 cores.
So I'd hope a quad-core runs 4 at once. Just be aware of the larger amount of RAM it may need. I've seen several people say that "leaving swapped-out jobs in RAM" greatly reduces the probability of crashed BOINC jobs.
Actually, I now have 2 "working gals" running BOINC with Intel Q6600 quad-cores instead. As expected, they run 4 jobs at once and chew through them at an amazing rate.
Actually, I have four quad-cores running BOINC now. Given each does the work of four computers, even at 150+ watts maxed out that is "cheaper" energy-wise than four x 110-120 watts for single-cores running at 100% CPU utilization.
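For what it's worth, a quick sanity check of that comparison in the same back-of-the-envelope style, using the wattages mentioned in this thread:

# Watts of AC power per concurrently-running BOINC job, per the thread's figures.
quad_watts, quad_jobs = 150, 4        # one quad-core box, maxed out
single_watts, single_jobs = 115, 1    # a typical single-core box (110-120 W)

print(f"Quad-core:   ~{quad_watts / quad_jobs:.0f} W per job")
print(f"Single-core: ~{single_watts / single_jobs:.0f} W per job")

# About 38 W per job on the quad versus ~115 W per job on the single-cores.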