NVIDIA ‘Ampere’ 8nm Graphics Cards

Soldato
Joined
6 Jan 2013
Posts
21,839
Location
Rollergirl

I don't believe there is any difference between 1080@144 and 1440@144. You're pushing the same frames, therefore the CPU has the same work to do. There's a misconception that higher resolution is less taxing on the CPU, but that's only because a) higher resolution is generally run at a lower refresh rate and b) you're generally GPU-limited at higher resolution, therefore not running at the maximum refresh.

Edit: Just to be clear, Ampere is likely to push 1440p closer to 144Hz, therefore a bottleneck is likely if the CPU is weak.
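To illustrate (a minimal Python sketch with made-up numbers, not benchmarks): model the achieved frame rate as the lower of a resolution-independent CPU cap and a resolution-dependent GPU cap, and the CPU only looks less loaded at higher resolution because the GPU cap drops below it.

# Toy model of where the frame-rate cap comes from. All figures are
# hypothetical, purely to illustrate the min() relationship.

def effective_fps(cpu_cap_fps, gpu_cap_fps):
    # The slower component sets the achieved frame rate.
    return min(cpu_cap_fps, gpu_cap_fps)

cpu_cap = 160      # hypothetical: frames/s this CPU can prepare in a given game
gpu_1080p = 200    # hypothetical: frames/s the GPU can render at 1080p
gpu_1440p = 115    # hypothetical: frames/s the same GPU can render at 1440p

print(effective_fps(cpu_cap, gpu_1080p))  # 160 -> CPU-limited at 1080p
print(effective_fps(cpu_cap, gpu_1440p))  # 115 -> GPU-limited; the CPU is never asked for more

A faster card raises the GPU-side numbers, which is exactly when the CPU cap starts to show at 1440p.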
 
Associate
Joined
7 Apr 2017
Posts
144
Location
London-Amstadam
I don't believe there is any difference between 1080@144 and 1440@144. You're pushing the same frames, therefore the CPU has the same work to do. There's a misconception that higher resolution is less taxing on the CPU, but that's only because a) higher resolution is generally run at a lower refresh rate and b) you're generally GPU-limited at higher resolution, therefore not running at the maximum refresh.

Edit: Just to be clear, Ampere is likely to push 1440p closer to 144Hz, therefore a bottleneck is likely if the CPU is weak.
So, presumably, a 1700 at 3.8GHz should be fine with a higher-end GPU in gaming and DaVinci Resolve; the GPU should take up the extra slack in modern titles and editing software, at least to the extent that there shouldn't be any major differences in gaming or even editing. As I said, I do plan to eventually upgrade to the 4000 series at some point next year.
 
Associate
Joined
14 Aug 2017
Posts
1,194
It's the CPU he's concerned about, not the GPU.

Right, but the conversation goes -
"A 3700X will bottleneck a high end Ampere at 1080p"
"But why run the biggest ampere at 1080p?"
"Refresh rate"

I was pointing out that even if you were aiming for a high refresh rate at 1080p, you're unlikely to need a "big" Ampere, and by extension the 3700X isn't really going to be a realistic bottleneck.

I'm actually kind of amazed at the original comment - in what world is one of the best processors currently on the market going to be a bottleneck?
 
Soldato
Joined
6 Jan 2013
Posts
21,839
Location
Rollergirl
Right, but the conversation goes -
"A 3700X will bottleneck a high end Ampere at 1080p"
"But why run the biggest ampere at 1080p?"
"Refresh rate"

I was pointing out that even if you were aiming for a high refresh rate at 1080p, you're unlikely to need a "big" Ampere, and by extension the 3700X isn't really going to be a realistic bottleneck.

I'm actually kind of amazed at the original comment - in what world is one of the best processors currently on the market going to be a bottleneck?

The original comment from Grim was theoretical, to emphasise that there are several factors involved in a bottleneck; it's not a given that CPU "a" will bottleneck GPU "b". That's why you see CPU benchmarks run at 720p.

Yes, it's unlikely that you'd need massive GPU power for 1080p gaming, even at higher refresh rates. But that's not what the guy is querying; he's already decided to buy top-end Ampere and he's concerned his 1700 will bottleneck it.
 
Soldato
Joined
6 Jan 2013
Posts
21,839
Location
Rollergirl
You don't need an Ampere to run 1080p on Ultra @ 144Hz; a 2080S/2080Ti or even a 1080Ti would accomplish that.

I'm not sure a 1080ti could hold 144hz with ultra settings. It's surprisingly taxing on the GPU even at that resolution. I'm sure I tested Firestrike previously with a 1080ti/6700k setup and the GPU couldn't keep up at 144.

Edit to add: even a 2080ti can struggle for 144 in AAA games.

https://www.techpowerup.com/review/...er-benchmark-test-performance-analysis/4.html

https://www.techpowerup.com/review/kingdom-come-deliverance-benchmark-performance-test/4.html

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,6.html
 
Last edited:

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
I don't believe there is any difference between 1080@144 and 1440@144. You're pushing the same frames,

Say what........?
You're not pushing the same anything: each 1080p frame is 2,073,600 pixels, while each 1440p frame is 3,686,400 pixels.

That's roughly 1.78× the work per frame.
If you don't believe there is any difference, then I would suggest you go read up on it a bit. ;)
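For reference, the per-frame arithmetic works out as below (a minimal Python sketch, using the standard 1920×1080 and 2560×1440 resolutions):

# Pixels per frame at each resolution. GPU-side work scales roughly with this
# count; CPU-side per-frame work (game logic, draw calls) largely doesn't.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_1440p = 2560 * 1440   # 3,686,400 pixels

print(px_1440p / px_1080p)   # ~1.78, i.e. roughly 78% more pixels per frame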
 
Soldato
Joined
6 Jan 2013
Posts
21,839
Location
Rollergirl
Say what........?
You're not pushing the same anything: each 1080p frame is 2,073,600 pixels, while each 1440p frame is 3,686,400 pixels.

That's roughly 1.78× the work per frame.
If you don't believe there is any difference, then I would suggest you go read up on it a bit. ;)

We're talking about CPU work load, not GPU work load. ;)
 
Associate
Joined
16 Jan 2010
Posts
1,415
Location
Earth
I'm not sure a 1080ti could hold 144hz with ultra settings. It's surprisingly taxing on the GPU even at that resolution. I'm sure I tested Firestrike previously with a 1080ti/6700k setup and the GPU couldn't keep up at 144.

Edit to add: even a 2080ti can struggle for 144 in AAA games.

https://www.techpowerup.com/review/...er-benchmark-test-performance-analysis/4.html

https://www.techpowerup.com/review/kingdom-come-deliverance-benchmark-performance-test/4.html

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,6.html
Yep, and that's with today's AAA games at Ultra settings. We'll soon see higher-end cards struggling at this resolution with RT if Nvidia/AMD keep drip-feeding performance improvements from an eye dropper.
 
Soldato
Joined
9 Dec 2006
Posts
9,230
Location
@ManCave
I'm not sure a 1080ti could hold 144hz with ultra settings. It's surprisingly taxing on the GPU even at that resolution. I'm sure I tested Firestrike previously with a 1080ti/6700k setup and the GPU couldn't keep up at 144.

Edit to add: even a 2080ti can struggle for 144 in AAA games.

https://www.techpowerup.com/review/...er-benchmark-test-performance-analysis/4.html

https://www.techpowerup.com/review/kingdom-come-deliverance-benchmark-performance-test/4.html

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,6.html

The sad thing is, a lot of people say a 1080Ti can do 144Hz at 1080p, and they're right if you only want 144FPS every 30 minutes or so.
  • Hitting 144FPS and maintaining 144FPS are two different things (see the sketch below).
  • Can you say you're taking advantage of your 144Hz monitor with a 1080Ti? Yes, in some games you can, Rocket League for example, but in AAA titles, no
    (if you exclude Doom and their perfect engine....)
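A minimal sketch of that distinction, with made-up frame times (purely illustrative, not measured):

# Average fps vs 1% lows from a hypothetical frame-time trace (milliseconds).
frame_times_ms = [6.9] * 95 + [14.0, 16.0, 18.0, 20.0, 22.0]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 1% low: average fps over the slowest 1% of frames (here, the single worst frame).
worst_1pct = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
low_1pct_fps = 1000.0 * len(worst_1pct) / sum(worst_1pct)

print(round(avg_fps))       # ~134: looks close to 144 "on average"
print(round(low_1pct_fps))  # ~45: nowhere near holding 144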
 
Soldato
Joined
23 Jul 2009
Posts
14,083
Location
Bath
So I've been putting 50 quid a month aside, and come September I'll have about 600 quid lined up, with a bit of topping up from the credit card available. Fingers crossed 700 quid nets me something decent. Looks like the 3080Ti is off the cards, but at this point anything that gives a solid jump from my 980Ti is welcome.
 