PSUs are waaaaay more efficient when operating closer to their rated capacity. Pulling 200W through a 1kW power supply is like making a marathon runner breathe through a straw.
But it doesn't make that much of a difference. The efficiency swing is maybe 10%. An 80 Plus Bronze rated PSU, for example, has a minimum certified efficiency of 82% (85% at half load), but even at the 50% load mark it won't be over 90% efficient.
The main point (to me anyway) is that it's dumb to pay more for a power supply just so you can pay *more* on your power bill. If your idle load is 100W and your gaming load is 300W, you've got no reason to run more than a 600W PSU.
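Back-of-the-envelope on how small that swing actually is in dollars. The load, hours, electricity price, and the two efficiency figures below are all my assumptions, not anyone's measured numbers:

```python
def wall_watts(dc_load_w, efficiency):
    """AC power drawn at the wall for a given DC load and PSU efficiency."""
    return dc_load_w / efficiency

def annual_cost(dc_load_w, efficiency, hours_per_day, price_per_kwh):
    """Yearly electricity cost of running this load at this efficiency."""
    kwh_per_year = wall_watts(dc_load_w, efficiency) * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# Assumed: 300 W gaming load, 4 h/day, $0.15/kWh.
# ~87% = a right-sized PSU near its mid-load sweet spot,
# ~83% = an oversized PSU loafing along at low load.
cost_right_sized = annual_cost(300, 0.87, 4, 0.15)
cost_oversized = annual_cost(300, 0.83, 4, 0.15)
print(f"right-sized: ${cost_right_sized:.2f}/yr")
print(f"oversized:   ${cost_oversized:.2f}/yr")
print(f"difference:  ${cost_oversized - cost_right_sized:.2f}/yr")
```

With those numbers the gap works out to a few dollars a year, so the real argument against oversizing is the upfront price, not the power bill.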
I've got an 850W power supply, which I bought 2-3 years ago in anticipation of the RTX 4000 series. My usual load with a GTX 1080 was 150W, and now my entire system uses 520W fully loaded. Do I count? :)
I have a 4090 in my Ryzen 7700X system and a power meter; 850W is overkill for a 4090. My system never uses more than 650W. What matters more than the wattage rating is buying a high-tier PSU with good overcurrent protection, because the 4090 tends to have transient power spikes that even a good 750W PSU should be able to handle.
If you bought a PSU certified for PCIe 5.0, you're most likely fine. If you didn't have to use a squid adapter to plug in your GPU, you're more than likely good to go, so long as you didn't buy a shit-tier PSU.
While true, how much would it actually save you in electricity? If you upgrade every year, wouldn't it be cheaper to just buy the bigger PSU outright and pay the extra cost in electricity, so you don't have to buy another PSU when you get more power-hungry components?
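You can sanity-check that with a quick comparison. The prices, the yearly electricity penalty, and the upgrade horizon below are made-up illustrative numbers, not quotes:

```python
def cheaper_to_oversize(price_small, price_big, extra_elec_per_year, years_until_upgrade):
    """Compare two paths: buy the big PSU once and eat the efficiency penalty,
    vs. buy the small one now and replace it at upgrade time.
    Assumes the replacement PSU would cost the same as price_big."""
    big_path = price_big + extra_elec_per_year * years_until_upgrade
    small_path = price_small + price_big  # small now, big one later
    return big_path < small_path

# Assumed: $90 for a 650 W unit, $130 for a 1000 W unit,
# ~$4/yr extra electricity from running oversized, upgrade in 3 years.
print(cheaper_to_oversize(90, 130, 4.0, 3))  # True
```

With an electricity penalty that small, buying big once wins easily if you actually expect to need the headroom; it only flips if the efficiency gap costs you tens of dollars a year.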
The sweet spot is the 40-60% load.
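If you want to size for that band, a tiny helper makes the rule concrete. The list of retail wattages and the 60% target are my assumptions, just to show the idea:

```python
STANDARD_RATINGS = [450, 550, 650, 750, 850, 1000, 1200]  # common retail sizes (W)

def pick_psu(peak_load_w, target_fraction=0.6):
    """Smallest common rating where the peak system load stays at or below
    the target fraction of capacity (the top of the 40-60% sweet spot)."""
    for rating in STANDARD_RATINGS:
        if peak_load_w <= rating * target_fraction:
            return rating
    raise ValueError("load exceeds largest listed rating")

print(pick_psu(300))  # 300 W peak -> 550
```

So a 300W gaming load lands on a 550W unit, which matches the "no reason to run more than 600W" rule of thumb upthread.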