I have a 980 Ti and dual monitors (144 Hz + 60 Hz) and have *never* been able to resolve the clocks being stuck high on the desktop. I eventually gave up and just lived with it. But now that I am pretty much certain to go for a 3080, I am wondering how much this is going to rack up in unnecessary power usage compared to my 980 Ti setup. Apparently any card uses more power when it sits at high clocks on the desktop, but I have only ever seen this described as "slightly more" or vague words to that effect, on the grounds that the card is not doing any actual work. Does anyone have a clue how much more it uses in real terms, comparing proper idle clocks against clocks stuck high on the desktop?
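For anyone who wants to measure this on their own machine rather than guess, here is a rough sketch that polls power draw and clock speeds through nvidia-smi (assuming it is on the PATH; the query fields and one-second interval are just what I would reach for, not anything official):

```python
import subprocess
import time

# nvidia-smi query fields: power draw in watts, graphics and memory clocks in MHz.
QUERY = "power.draw,clocks.gr,clocks.mem"

def sample(duration_s=60, interval_s=1.0):
    """Poll nvidia-smi for a while and return the average power draw in watts."""
    readings = []
    end = time.time() + duration_s
    while time.time() < end:
        out = subprocess.check_output(
            ["nvidia-smi",
             f"--query-gpu={QUERY}",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        power_w, gr_mhz, mem_mhz = [float(v) for v in out.split(", ")]
        readings.append(power_w)
        print(f"power={power_w:6.1f} W  core={gr_mhz:.0f} MHz  mem={mem_mhz:.0f} MHz")
        time.sleep(interval_s)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    # Run once with both monitors attached (clocks stuck high), then once with
    # the second monitor unplugged (proper idle), and compare the two averages.
    avg = sample()
    print(f"average draw over window: {avg:.1f} W")
```

Comparing the two averages gives the real-terms delta in watts rather than a hand-wavy percentage; watching the memory clock column is also a quick way to confirm whether the card has actually dropped to its idle state.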
If it truly is just a tad more, say 2-3% over a proper idle clock, then I would probably not bother trying to resolve it again and just live with it, assuming the same ratio carries over from a 980/1080 to a 3080. But with newer cards sucking up more and more juice, it makes me wonder how much power I will be chucking down the drain. It's things like this that would definitely make me think twice about a 3090, though luckily I am probably never going to need that much power with my current monitor setup. Would be nice if the thing just worked as advertised, of course.