Last I checked with a Kill A Watt, I was drawing an average of 2.5kWh after a week of monitoring my whole rack. That was about three years ago, and the following was running in my rack:
R610, dual 1kW PSUs
Homebuilt server, Gigabyte 750W PSU
Homebuilt Asus gaming rig, 650W PSU
Homebuilt Asus retro (XP) gaming/testing rig, 350W PSU
HP laptop as dev env/warm site, ~200W PSU
Amcrest NVR, 80W (I guess?)
HP T610, 65W PSU
Terramaster F5-422, 90W PSU
TP-Link TL-SG2424P, 180W PSU
Brocade ICX6610-48P-E, dual 1kW PSUs
Misc routers, RPis, PoE APs, modems (cable & 5G), ~700W combined (cameras not included; the Brocade powers them directly)
I also have two battery systems split between high-priority and low-priority infrastructure.
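For scale, here's a quick sum of the nameplate PSU ratings from the list above. These are maximum capacities, not measured draw, so this only bounds the problem:

```python
# Nameplate PSU ratings from the rack list, in watts.
# These are capacity ceilings, not actual draw, but they set the scale.
psus = {
    "R610 (dual 1kW)": 2 * 1000,
    "Gigabyte server": 750,
    "Asus gaming rig": 650,
    "Asus retro rig": 350,
    "HP laptop": 200,
    "Amcrest NVR": 80,
    "HP T610": 65,
    "Terramaster F5-422": 90,
    "TP-Link TL-SG2424P": 180,
    "Brocade ICX6610 (dual 1kW)": 2 * 1000,
    "Misc routers/RPis/APs/modems": 700,
}
print(sum(psus.values()))  # 7065 W nameplate capacity for the whole rack
```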
I was drawing an average of 2.5kWh after a week of monitoring my whole rack
That doesn’t seem right; 2.5kWh over a 168-hour week is only ~15W. Each one of those systems alone will exceed that at idle running 24/7. I’d expect 1-2 orders of magnitude more.
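Back-of-the-envelope, assuming the kWh figure was the meter's cumulative energy counter over the week (the Kill A Watt's kWh display accumulates):

```python
# Average power = accumulated energy / elapsed time.
def average_watts(kwh: float, hours: float) -> float:
    return kwh / hours * 1000.0

# 2.5 kWh accumulated over a 168-hour week:
print(average_watts(2.5, 7 * 24))  # ~14.9 W -- implausibly low for a full rack
```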
IDK, after a week of runtime it told me 2.5kWh average. Could be average per hour?
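If it did mean average per hour, that's a sustained 2.5kW draw, which is at least in the right ballpark for that hardware. Rough numbers (the utility rate below is a hypothetical, not from this thread):

```python
# Reinterpreting "2.5kWh average" as average energy *per hour*,
# i.e. a sustained 2.5 kW draw:
avg_kw = 2.5
weekly_kwh = avg_kw * 7 * 24    # 420 kWh over the monitored week
monthly_kwh = avg_kw * 30 * 24  # 1800 kWh per month
rate = 0.30                     # $/kWh -- hypothetical rate for illustration
print(weekly_kwh, monthly_kwh, monthly_kwh * rate)  # 420.0 1800.0 540.0
```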
Highest power bill I ever saw was summer of 2022: $1800. Temps outside were in the 110-120°F range, and it was the hottest ever here.
Maybe I’ll hook it back up, but I’ve got different (newer) hardware now.
If it gives you kWh as a measure of power, you should toss it, because it was obviously made by someone who had no idea what they were doing.