
I have two servers, a Dell PowerEdge R900 and an HP DL580 G5, each with 4 quad-core Xeons and 128 GB RAM, loaded with hard drives: 8 × 2.5″ in the DL580 and 5 × 3.5″ in the R900.

The R900 uses a 1570 W power supply; the DL580 uses a 1200 W power supply.

How much are they costing me to run 24 x 7?

I do understand that power usage changes with load, but I'm just looking for a rough worst-case estimate.

Please let me know what other information I can provide to help calculate this.

My electricity cost is $0.11 / KW

SeanClt
  • Use the DRAC/iLO and check the wattage used; the stat is there, then do the math. – yagmoth555 Feb 25 '16 at 14:35
  • @yagmoth555 I wonder if, at this scale, figuring out out-of-band access will take longer. But I've only had experience with IPMI/IMM. – aaaaa says reinstate Monica Feb 25 '16 at 14:41
  • @aaaaaa It's a simple webpage for the OP to check. – yagmoth555 Feb 25 '16 at 14:43
  • `My electricity cost is $0.11 / KW` That's about what AEP charges for electricity to my **home** as well. ServerFault is not really a CaaS (Calculator as a Service) platform, nor an appropriate place to ask about home server questions. – HopelessN00b Feb 25 '16 at 16:37
  • Please confirm: is the correct place Super User? – SeanClt Feb 25 '16 at 16:44
  • Yes, Super User is the right site for home server questions. That being said, they're not a calculator service either, and the answer you have here looks accurate, so there's not much point in moving or re-asking it now. – Desperatuss0ccus Feb 25 '16 at 16:51
  • I already got the answer; that can be said for every question ever asked, nobody is anybody's personal service. IMO it's really a moot point two hours after opening a question. Good fella aaaaa already answered it. But I'll stay away from ServerFault. My mistake, and apologies. – SeanClt Feb 25 '16 at 16:54
  • Please migrate to Super User if possible. – SeanClt Feb 25 '16 at 17:08

1 Answer


**Long answer and details**

First of all, power cost is usually billed in energy units of kWh (kilowatt-hours), not kW: at $N per kWh you pay $N for consuming 1 kW over 1 hour, and $24N for a full day. I guess your $0.11 / KW really means $0.11 per kWh, i.e. $2.64 per day for a constant 1 kW draw.
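As a quick sketch of that arithmetic in Python (the rate is just the figure from the question):

```python
RATE = 0.11  # $ per kWh, from the question

def daily_cost(power_kw: float, rate: float = RATE) -> float:
    """Cost in dollars of running a constant load for 24 hours."""
    return power_kw * 24 * rate

print(daily_cost(1.0))  # 1 kW around the clock -> 2.64
```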

Secondly, you can measure your servers' power consumption directly: google the Kill A Watt device, or get yourself a metered PDU; you can find more info about both online. You can also stress the machines to find the upper limit of their power consumption (google "PC stress test").

Thirdly, there are online power calculators: search for "power supply calculator" or the like, fill in the questionnaire, and you get a rough estimate of power consumption.

Finally, remember that your PSU's efficiency changes with utilization. A server drawing 950 W from a PSU with a 1000 W rated output might pull about 1100 W from the wall (≈86% efficiency), while one drawing only 500 W might pull about 600 W (≈83% efficiency). Match your PSU to your needs.
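A minimal sketch of that relationship (these efficiency figures are illustrative assumptions, not measurements of your PSUs):

```python
def wall_draw_watts(dc_output_w: float, efficiency: float) -> float:
    """Power pulled from the outlet: DC output divided by efficiency."""
    return dc_output_w / efficiency

print(wall_draw_watts(950, 0.86))  # ~1105 W from the wall
print(wall_draw_watts(500, 0.83))  # ~602 W from the wall
```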

One more note about your setup: do you use a UPS? If so, you have the option of running it with 208 V output. I might be wrong, but a UPS with 208 V output may run more efficiently than one with 110 V output, even if both are plugged into the same 110 V wall source.

**Short answer**

Worst case (100% utilization at 90% PSU efficiency) you will pay about $8 per day. Your two PSUs together can pull roughly 3 kW from the wall at most ((1570 W + 1200 W) / 0.9 ≈ 3.1 kW), so per day that is 3 kW × $0.11/kWh × 24 h ≈ $7.92.
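Putting it together, a worst-case sketch using the PSU ratings from the question (treating the nameplate ratings as maximum DC output and assuming 90% efficiency, both pessimistic simplifications):

```python
RATE = 0.11        # $ per kWh, from the question
EFFICIENCY = 0.90  # assumed full-load PSU efficiency

psu_ratings_w = {"R900": 1570, "DL580 G5": 1200}

# Worst case: both PSUs at rated output, wall draw = rating / efficiency
wall_kw = sum(psu_ratings_w.values()) / EFFICIENCY / 1000
print(f"Wall draw: {wall_kw:.2f} kW")            # ~3.08 kW
print(f"Per day:   ${wall_kw * 24 * RATE:.2f}")  # ~$8.13 (≈$7.92 if rounded down to 3 kW)
```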