The Cost of Power
Summary: Some musings comparing the work accomplished by my computer to my personal out-of-pocket cost for the electricity that feeds it 24 hours a day - something I dare say very few home computer users ever look at.
Last month I upgraded a Celeron D 2.53GHz media server to a Core 2 Duo 1.8GHz. When not using the server for "media" (watching DVDs or recording broadcast TV), I run BOINC/Rosetta distributed science jobs on it. Since the Celeron D was still functional, I moved it to an old chassis and updated the power supply - both systems now have efficient, after-market power supplies.
As a hobby (and part of what my mother would call our inherited Scots blood), I enjoy using an AC power meter to evaluate the cost of running appliances. My meter is from http://www.brandelectronics.com/ and it reveals some interesting facts, such as that my Cox digital cable box consumes 24 watts when powered "ON" ... and 23 watts when turned "OFF" :-)
Obviously, the Core 2 Duo - running 2 jobs at once - contributes more credits to BOINC projects than the Celeron D. But I was interested in weighing what I gain against the monthly cost of keeping my now-unnecessary Celeron D running.
Computer Summary:
Core 2 Duo: 1.8GHz, 1GB DDR2-800 RAM, 320GB SATA drive, nVidia 7100 (fanless), 400W power supply
- Rosetta benchmarks: fp=1744, int=3656 (with both cores, effectively maybe fp=3488, int=7312)
- When idle: CPU temp = 70°F, AC power usage = 105 watts
- When both cores at 100%: CPU temp = 100°F, AC power usage = 129 watts
Celeron D: 2.53GHz
- Rosetta benchmarks: fp=764, int=1677
- When idle: CPU temp = 100°F, AC power usage = 98 watts
- When the sole CPU is at 100%: CPU temp = 125°F, AC power usage = 134 watts
Since both systems draw about the same power, rounding the wattage to 130 watts burned 24 hours per day works out to roughly $7.50 to $13.00 per month. That range reflects my Minnesota rate of about $0.08 per kWh and my California rate of about $0.14 per kWh, respectively. I wonder how many people realize they pay that much per month to run their computer 24 hours a day? Over a year that totals $90 to $160 per computer - and that is JUST the computer. I'm not including the wattage used by monitors, printers, Ethernet switches, or the DSL/cable router hardware. Plus, with the computers running in a cool Minnesota basement, I don't have to include the extra air-conditioning load they'd create in a hot climate like my Southern California home.
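For anyone who wants to check the arithmetic, here's a quick sketch (the 130-watt figure and the two electricity rates are the numbers above; the rest is just unit conversion, assuming 30-day months):

    # Rough electricity cost for a computer drawing ~130 W, 24 hours a day.
    WATTS = 130.0
    RATES = {"Minnesota": 0.08, "California": 0.14}  # dollars per kWh

    kwh_per_month = WATTS / 1000.0 * 24 * 30    # ~93.6 kWh
    kwh_per_year = WATTS / 1000.0 * 24 * 365    # ~1139 kWh

    for place, rate in RATES.items():
        print(f"{place}: ${kwh_per_month * rate:.2f}/month, "
              f"${kwh_per_year * rate:.0f}/year")
    # Prints roughly: Minnesota $7.49/month, $91/year;
    #                 California $13.10/month, $159/year.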
So now for the true "musing" - if I average the last 10 Rosetta jobs handled by each computer (a rough credits-per-dollar comparison follows the list):
- Core 2 Duo: average 10594 seconds and 36.87 credits granted per job
- Celeron D: average 10406 seconds and 22.75 credits granted per job
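To put those averages and the wattage readings on one scale, here's a back-of-the-envelope sketch of credits per day and credits per dollar of electricity (the factor of 2 for the Core 2 Duo reflects it running two jobs at once; the wattages are the full-load readings above):

    # Back-of-the-envelope: credits per day and per electricity dollar.
    SECONDS_PER_DAY = 86400
    machines = {
        # name: (simultaneous jobs, seconds/job, credits/job, watts at 100%)
        "Core 2 Duo": (2, 10594, 36.87, 129),
        "Celeron D": (1, 10406, 22.75, 134),
    }

    for name, (jobs, secs, credits, watts) in machines.items():
        credits_per_day = jobs * (SECONDS_PER_DAY / secs) * credits
        kwh_per_day = watts / 1000.0 * 24
        for rate in (0.08, 0.14):
            print(f"{name}: {credits_per_day:.0f} credits/day, "
                  f"{credits_per_day / (kwh_per_day * rate):.0f} credits/$ "
                  f"at ${rate}/kWh")

By that rough measure, the Core 2 Duo turns out a bit over three times the credits of the Celeron D (roughly 600 vs. 190 per day) for essentially the same power bill.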
So should I still run the Celeron D? Should I upgrade it to something closer to the Core 2 Duo? The upgrade cost me close to $450 once you add up the CPU, the new motherboard, and the new DDR2 RAM. It's an interesting question without a simple answer ... yes, running the old Celeron D doesn't cost me any more from a hardware standpoint ... but I am paying good money out of my pocket for the power.
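One more hedged back-of-the-envelope: how the Celeron D's electricity bill compares against that roughly $450 upgrade cost if it keeps crunching 24 hours a day at full load:

    # Years of Celeron D electricity (134 W, 24x7) that add up to the
    # ~$450 the Core 2 Duo upgrade cost.
    UPGRADE_COST = 450.0
    kwh_per_year = 134 / 1000.0 * 24 * 365    # ~1174 kWh

    for place, rate in (("Minnesota", 0.08), ("California", 0.14)):
        yearly = kwh_per_year * rate
        print(f"{place}: ~${yearly:.0f}/year in power, about "
              f"{UPGRADE_COST / yearly:.1f} years to equal the upgrade")

So just leaving the old box plugged in works out to the price of another upgrade every three to five years, depending on which house it sits in.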
So what is the real cost of power?