To 4K, or not to 4K

Soldato
Joined
16 Jul 2004
Posts
14,075
Hello!

I'm planning to make my once in an 8-10 year PC upgrade before Christmas, and I'm struggling to decide what to do about 4K vs 1440p.

I love the idea of having a high-refresh 4K panel: great for productivity on my PC or laptop, but also beautiful and buttery smooth for gaming. The problem is that it seems even an RTX 2080 Ti won't drive high enough FPS to get the full benefit of a high refresh rate. I've looked at SLI, but it seems not many games fully support it, and when they do the FPS upside is not great (or is even a downside!). I could run challenging games at 1080p, but on a 1440p panel that isn't a pixel-perfect upscale and would deliver a worse picture than a native 1080p display, and definitely an inferior one to a native 1440p display.
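A quick sketch of the scaling point above: an upscale is only pixel-perfect when the panel resolution is an integer multiple of the render resolution, so 1080p maps cleanly onto a 4K panel but not onto a 1440p one. (Minimal illustrative snippet, not from any particular game or driver.)

```python
# Pixel-perfect upscaling requires an integer scale factor:
# each rendered pixel must map to an exact NxN block of panel pixels.
def scale_factor(panel_height: int, render_height: int) -> float:
    return panel_height / render_height

# 1080p on a 4K panel: exact 2x2 blocks, no blending needed.
print(scale_factor(2160, 1080))  # 2.0 -> integer, pixel-perfect

# 1080p on a 1440p panel: 1.33x, so pixels get interpolated and look soft.
print(scale_factor(1440, 1080))
```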

Does anyone have experience or views of gaming on 1440p vs 4K, with or without SLI?

I will be playing a mix of titles, from older games, to strategy games like Civ VI and Anno, to WoW, to Star Wars Jedi: Fallen Order. I'm usually not into the big AAA games.
 
Soldato
Joined
6 Jun 2008
Posts
11,618
Location
Finland
2560x1440 is the sweet spot at the moment and will be for some time.
There aren't really any higher refresh rate 3840x2160 monitors yet!
We might get a few more decent ones released over the winter, but who knows whether availability will slip toward spring/summer.
Getting panels that were advertised years ago into actual production has been slipping badly.
Heck, we don't even have a 32" 2560x1440 144Hz IPS, despite LG originally advertising one as going into production a year ago.
The 27" variant, the LG 27GL850, actually has very good response times for an IPS...
(but that would be no upgrade from the 6-year-old 30" 2560x1600 I'm currently looking at)


And it will certainly be some years before graphics cards can push that resolution at high refresh rates in heavier games.
For its ridiculous price, the 2080 Ti can only hope to reach around the 60 fps level with full graphics settings.
And to show how "synced to butt" the pricing is: around three quarters of the performance can be had for under 40% of the price.
Nor can you assign much future-proofing value to ray tracing:
depending on settings, RTX cards lose 30-50% of their performance from enabling it.
So next-gen GPUs are very likely to make them look bad at ray tracing.
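The price/performance claim above is easy to sanity-check with a bit of arithmetic. The prices and performance figures here are illustrative assumptions, not actual benchmarks:

```python
# Hypothetical numbers illustrating the point: ~75% of flagship
# performance at under 40% of the price (all figures assumed).
flagship = {"price": 1100.0, "perf": 100.0}  # flagship-tier card (assumed)
midrange = {"price": 420.0, "perf": 75.0}    # upper-midrange card (assumed)

def perf_per_currency(card: dict) -> float:
    """Relative performance points per unit of currency spent."""
    return card["perf"] / card["price"]

print(round(perf_per_currency(flagship), 3))  # flagship value for money
print(round(perf_per_currency(midrange), 3))  # roughly double the value
```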

It's simply best to buy a good performance-per-buck GPU now (or keep your current one if it's decent) and then buy a higher-end card for ray tracing in a year's time...
when we should have GPUs that are properly ray tracing capable, and likely much better competition too.

Again, Crossfire/SLI have been completely dead in the water for years, so don't bother with them.
They always required the GPU maker to make game-specific driver tweaks to deliver any real performance gain (or to avoid negative scaling),
something GPU makers haven't really bothered with in years.
DirectX 12 also made utilizing multiple GPUs the responsibility of game developers,
so you can guess how much time they're going to spend on that.


As for the "base platform" for a PC kept that long, AM4 is the only one that makes any sense.
Next year's Zen 3 doesn't need to improve much to take the single-core performance crown from Intel in anything but the most Intel-favouring code.
Meanwhile Intel makes you pay luxury prices for yesteryear's hardware on a dead-end platform with no upgrade path, stuck at 8 cores, which will be mainstream level when the next-gen consoles arrive in a year.
 
Soldato
Joined
6 Jun 2008
Posts
11,618
Location
Finland
Here are 6...
G-Sync monitors only make sense if you like getting "synced to the rear" by Nvidia and Yakuza-Jen.
Nvidia has shown all the arrogance and greed of Intel, and then some. So who knows whether they can avoid AMD doing the same to them in a few years as AMD is now doing to Intel in CPUs.
Also, while Intel might not be able to compete at the high end at first, they've got a huge bank account to push forward with.
 
Associate
Joined
31 Jul 2019
Posts
515
If space and money were not an issue, I'd be wanting this: https://www.overclockers.co.uk/asus...-widescreen-led-gaming-monitor-mo-0a8-as.html

Whilst current GPUs might not be able to make full use of that monitor, if this is an 8-10 year investment I would want to plan for the future a bit, and I think GPU capabilities will move on relatively quickly. The PS5 is supposedly going to support 8K resolutions, which might prove a catalyst for some of this.

For the time being you could always, dare I say it, scale the resolution down to 1440p in game. At least you'd have the option there - invest in a 1440p monitor and that's the only option you'll have.
 
Soldato
OP
Joined
16 Jul 2004
Posts
14,075
Soldato
Joined
6 Jun 2008
Posts
11,618
Location
Finland
It'll be an i9-9900KS and RTX 2080 Ti build, it's just down to how many 2080's and which monitor :)
An 8-core isn't a future-proof CPU.
In a year, 8 cores will literally be mainstream; in two, they'll be starting to fall behind the high end.
The era of stagnation is over, and regardless of whether Intel wants to keep sitting with its hands in its own butt (and in buyers' butts), AMD keeps pushing progress forward.
A 12/16-core Zen 3 would be by far the best bet for getting half a dozen years of use. And it's something you can put into AM4 boards after a BIOS update.
Once game developers figure out ways and new things to utilize high core counts, getting benefits past 8 cores will be easier than the move from the "four cores is high end" Intel stagnation era to the use of eight.
Meanwhile Intel keeps having more and more vulnerabilities to plug with performance-nibbling patches.
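How much those extra cores actually help depends on how much of a game's per-frame work is parallelisable, which Amdahl's law captures. A minimal sketch, with the parallel fractions being assumed values rather than measurements of any real game:

```python
# Amdahl's law: speedup is limited by the serial fraction of the workload.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assumed parallel fractions; real games vary widely per engine and scene.
for p in (0.5, 0.8, 0.95):
    print(p, [round(amdahl_speedup(p, n), 2) for n in (4, 8, 16)])
```

Even at 80% parallel work, going from 8 to 16 cores only lifts the speedup from about 3.3x to 4x, which is why engines need to restructure work (not just spawn threads) to benefit past 8 cores.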


Crossfire and SLI have been about as alive as a coma patient on life support for a couple of years now, so don't bother:
https://thetechaltar.com/is-multi-gpu-dead/

And while the 2080 Ti is certainly the fastest graphics card at the moment, buying a G-Sync-only monitor is taking the whole unlubed caber.
I don't think anyone could have believed it possible for Intel to lose their huge advantage over AMD,
so don't go believing that Nvidia can stay unchallenged either.
You'll want to change graphics card after a few years to stay at the high end, and Nvidia could well not be the fastest by then.
Next-gen consoles will no doubt quickly increase the GPU power demands of games.
Besides, you'll want hardware whose performance doesn't nosedive through the floor from enabling ray tracing.


The 27GL850 is also likely the fastest-responding monitor outside of TNs, and it literally challenges even those, because TNs have a much wider spread of response times between their best and worst transitions.
https://www.overclockers.co.uk/lg-2...le-widescreen-led-gaming-monit-mo-155-lg.html

I don't think any 3840x2160 panels have response times that good.
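For context on why response times matter at high refresh rates: the pixel transition has to finish comfortably inside one frame time, or transitions smear into the next frame. A quick frame-budget calculation:

```python
# Frame time budget at common refresh rates: panel response times need to
# stay well under these figures to avoid visible smearing/ghosting.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 144, 165):
    print(hz, "Hz ->", round(frame_time_ms(hz), 2), "ms per frame")
```

At 144Hz the budget is under 7ms per frame, which is why fast IPS panels like the 27GL850 are notable: typical IPS transitions historically exceeded that.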
 
Man of Honour
Joined
13 Oct 2006
Posts
91,109
Everyone is going to be a bit different in what works best for them but I have the choice of 4K, 2560x1440 (144Hz) and a couple of different UW monitors (amongst other choices) and usually run a combination of two monitors but every time my main monitor is a 2560x1440 high refresh and I don't see myself changing that anytime soon. 4K is amazing for some things but 1440p just provides the best overall balance for various applications.
 
Soldato
Joined
28 Jan 2011
Posts
7,375
Everyone is going to be a bit different in what works best for them but I have the choice of 4K, 2560x1440 (144Hz) and a couple of different UW monitors (amongst other choices) and usually run a combination of two monitors but every time my main monitor is a 2560x1440 high refresh and I don't see myself changing that anytime soon. 4K is amazing for some things but 1440p just provides the best overall balance for various applications.

this.
 
Soldato
OP
Joined
16 Jul 2004
Posts
14,075
Once game developers figure out ways/new things to utilize high core counts
I would disagree here. We've had quad core CPUs for 13 years and yet we still don't have proper utilisation. I am confident we are a long way off fully utilising 8 cores, let alone needing 10 or 12 to run programs effectively.

You'll be wanting to change graphics card after few years to stay in high end and Nvidia could well be not the fastest by then.
I won't :) I've been on a GTX 580 since 2011 and that's worked fine!

Either way, all ordered now. Went for a single RTX 2080 Ti with the 1440p 165Hz screen :)
 
Soldato
Joined
6 Jun 2008
Posts
11,618
Location
Finland
I would disagree here. We've had quad core CPUs for 13 years and yet we still don't have proper utilisation.
PC Master Race: proudly looking to the past to buy standard hardware at luxury prices...

And why would game developers have spent resources trying to utilize more than two cores when Intel kept cheap mainstream PCs at 2c/4t?
Also, those four cores were never fully dedicated to games, with Windows and everything else eating into them.
Next-gen console games, by contrast, will likely have 7 cores/14 threads dedicated to them.
At least I don't see any reason to reserve more than one core for the OS.
 