PcPer released an interesting article about increased power draw at high refresh rates.
http://www.pcper.com/news/Graphics-...raw-Increased-Refresh-Rates-using-ASUS-PG279Q
AFAIK, according to the article it's Nvidia specific. My clocks sit at a static 135MHz @ 144Hz, so I can't see how it would draw more power at those clocks myself. Has anyone tested this at 165Hz to see if their clocks do increase?
At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.
But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.
Interestingly we did find that the system would repeatedly jump to as much as 200+ watts of idle power draw for 30 seconds at a time and then drop back down to the 135-140 watt area for a few minutes. It was repeatable and very measurable.
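For anyone who wants to check this on their own card, one way is to leave nvidia-smi polling while switching the desktop between 60/120/144/165Hz and watch whether the core and memory clocks (and reported power) jump. A minimal Python sketch of that polling loop, assuming an Nvidia card with nvidia-smi on the PATH (the query fields are standard nvidia-smi ones, the 1-second interval is arbitrary):

import subprocess
import time

# Poll GPU core clock, memory clock and board power draw once a second.
FIELDS = "clocks.current.graphics,clocks.current.memory,power.draw"

while True:
    reading = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        text=True,
    ).strip()
    print(time.strftime("%H:%M:%S"), reading)  # e.g. "21:04:10 135 MHz, 405 MHz, 17.8 W"
    time.sleep(1)

If the clocks refuse to drop back to their idle values once the refresh rate goes to 144Hz or 165Hz, that would line up with the extra idle draw PcPer measured.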