We have a 600W APC UPS and a server with two power adapters (each 800W). Originally, both power adapters were connected to the UPS, and the UPS was connected to the power grid. Unfortunately, the room where our server is located has no air con, so on a hot day the server drew more power than the UPS could deliver and the system failed.
From the spec it's obvious that a 600W UPS is too weak to power a 2x800W server. But our IT supplier argues that under normal circumstances the server would consume well below 600W, probably around 300W, and that our room was simply too hot.
But as we are a school with limited resources, we can neither buy a new 1,600W UPS nor install air conditioning. So I am considering the following setup:
- 1 power adapter connected to the UPS, which is connected to the power grid
- 1 power adapter directly connected to the power grid, bypassing the UPS
Justification: From this question I learned that a server with two power adapters distributes its load more or less evenly between both power sources. That gives four scenarios:
- Cold day: 300W load from the server, 150W via the UPS, 150W directly from the grid
- Cold day + power cut: the UPS has to bear the full 300W (the directly connected adapter loses its supply)
- Hot day: >600W load from the server, >300W via the UPS, >300W directly from the grid
- Hot day + power cut: the UPS has to bear the full >600W, which overloads it
As only case 4 is a bad case, the setup reduces the risk of system failure. Or am I wrong? In other posts I have read about UPS bypasses for maintenance or for redundancy, but here it would be a permanent production setup. I should add that the main purpose of the UPS is to protect the server hardware from damage in case of a power cut, not to ensure 99.99% uptime; 99% uptime is also OK for us. Also, power cuts at our location are rare, maybe 1-2 per year, so case 4 should be even rarer.
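To sanity-check my arithmetic, here is a minimal Python sketch of the four cases. The exact 50/50 load split and the 650W "hot day" figure are assumptions on my part, not measured values, and `UPS_CAPACITY_W` / `ups_load` are just names I made up for the sketch:

```python
# Rough sanity check of the UPS load in each scenario.
# Assumption: the server splits its draw roughly 50/50 between both
# power adapters while both feeds are live (real load sharing is not
# guaranteed to be exactly even).

UPS_CAPACITY_W = 600  # rating of our APC UPS

def ups_load(server_draw_w: float, power_cut: bool) -> float:
    """Load the UPS has to carry for a given total server draw."""
    if power_cut:
        # The adapter fed directly from the grid loses power,
        # so the UPS carries the whole server by itself.
        return server_draw_w
    # Grid is up: both feeds share the load roughly evenly.
    return server_draw_w / 2

scenarios = [
    ("cold day, grid up",   300, False),
    ("cold day, power cut", 300, True),
    ("hot day, grid up",    650, False),  # 650W stands in for ">600W"
    ("hot day, power cut",  650, True),
]

for name, draw, cut in scenarios:
    load = ups_load(draw, cut)
    status = "OK" if load <= UPS_CAPACITY_W else "OVERLOAD"
    print(f"{name:22s} UPS load = {load:5.0f} W -> {status}")
```

Running this prints an overload only for the last case (hot day + power cut), which is what my reasoning above suggests.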