
AMD will need a higher ROP count for Navi 2X GPUs, for 4K gaming

Discussion in 'Graphics Cards' started by g67575, 17 Jun 2020.

  1. g67575


    Joined: 30 Jun 2019

    Posts: 568

    ROPs (Render Output Units) are very important for rendering more pixels at higher resolutions.

    To my knowledge, there are only 2 consumer graphics cards with more than 64 ROPs: the GTX 1080 Ti and the RTX 2080 Ti.

    To double the pixel rate (GPU clock x ROP count) of my current graphics card (R9 390), I'd need to buy either a super-overclocked Navi RX 5700 XT, or (expensive) Nvidia cards such as the GTX 1080 Ti or RTX 2080 Ti.

    My old R9 390 has the same number of ROPs (64) as most modern GPUs.

    Alternatively, AMD could just use much higher GPU core clocks, which in my view, would be a worse option.

    EDIT - just noticed that the Xbox Series X GPU will have 80 ROPs, with a slightly lower clock than the Navi RX 5700 XT, so pixel rates of ~152,400 MPixel/s (~152.4 GPixel/s; 1905 MHz x 80 ROPs) should be achievable for Navi 2X GPUs.
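The pixel-rate figure above is just GPU clock times ROP count. A minimal sketch of that arithmetic, using only numbers quoted in this thread (the 1905 MHz / 80 ROP combination is the poster's hypothetical Navi 2X part, and the 980 Ti figures come from later posts):

```python
def pixel_rate_mpixels(clock_mhz: int, rops: int) -> int:
    """Peak theoretical pixel fill rate in MPixel/s = GPU clock (MHz) x ROP count."""
    return clock_mhz * rops

# Hypothetical Navi 2X part: 80 ROPs at a 5700 XT-class 1905 MHz clock.
print(pixel_rate_mpixels(1905, 80))  # 152400 MPixel/s, i.e. ~152.4 GPixel/s

# GTX 980 Ti, as discussed later in the thread: 96 ROPs at ~1000 MHz.
print(pixel_rate_mpixels(1000, 96))  # 96000 MPixel/s, i.e. ~96 GPixel/s
```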
    Last edited: 18 Jun 2020
  2. Finners


    Joined: 27 Mar 2009

    Posts: 2,593

    Apparently my Titan X (Pascal) has 96 ROPs, so I assume the Titan Xp has the same.

    Just looked, and even the 980 Ti has 96 ROPs, so I don't think it's the be-all and end-all of performance.
  3. g67575


    Joined: 30 Jun 2019

    Posts: 568

    Nope, but a lower pixel rate is a big disadvantage, and games will focus more and more on higher resolutions with the new console releases this year.

    Also, the 980 Ti has a fairly low GPU clock of 1000 MHz, which reduces the pixel rate quite a bit.
    Last edited: 18 Jun 2020
  4. Grim5


    Joined: 6 Feb 2019

    Posts: 4,474

    He is right, and it's the same reason I doubt next-gen consoles will have any 8K games.

    Even if you have the raw processing power, if the pipeline isn't built for it you can get a bottleneck.

    There are some 4K vs 8K 2080 Ti benchmarks out there, and in some games the performance drop at 8K is massive because the GPU pipeline is bottlenecked. What I mean by this is that there are games that can run at a solid 60 fps at 4K on the 2080 Ti, but when you change the resolution to 8K you get 15-20 fps, not the 30 fps that you would logically expect.

    edit: Here are some other examples:

    Far Cry 5 on a 2080 Ti NVLink, 4K High settings: 105 fps
    Same settings but now at 8K: 37 fps

    Shadow of the Tomb Raider on a 2080 Ti NVLink, 4K High settings: 126 fps
    Same settings but now at 8K: 40 fps

    And then take F1 2018, a game obviously less reliant on ROPs:
    F1 2018 on a 2080 Ti NVLink, 4K High settings: 140 fps
    Same settings but now at 8K: 80 fps


    So for F1 2018, it's very close to the 50% performance loss I would expect with perfect scaling of 4K to 8K pixels. But for Tomb Raider and Far Cry 5 the performance loss is close to 70% - indicating the GPU is suffering from a bottleneck in its pipeline.
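The loss percentages can be checked directly from the quoted frame rates (a rough sketch; the benchmark numbers are the ones given in this post):

```python
# 4K and 8K frame rates quoted in the post: (fps at 4K, fps at 8K).
benchmarks = {
    "Far Cry 5": (105, 37),
    "Shadow of the Tomb Raider": (126, 40),
    "F1 2018": (140, 80),
}

for game, (fps_4k, fps_8k) in benchmarks.items():
    loss = 1 - fps_8k / fps_4k  # fraction of 4K performance lost at 8K
    print(f"{game}: {loss:.0%} performance loss")
```

F1 2018 comes out at ~43%, while the other two are at ~65-68%, matching the "close to 50%" vs "close to 70%" split described above.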
    Last edited: 18 Jun 2020
  5. g67575


    Joined: 30 Jun 2019

    Posts: 568

    I suppose I'm not surprised: 8K resolution is 33.177 million pixels, while 4K resolution is 8.294 million pixels. Rendering 4x the number of pixels should logically need a GPU with about 4x the pixel rate.

    That's a lot of ROPs...

    I think that's why AMD has mentioned aiming for 4K in games, not 8K.
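Those pixel counts follow straight from the standard resolutions; a quick check of the 4x figure:

```python
# Standard consumer resolutions (width x height).
pixels_4k = 3840 * 2160  # 8,294,400 pixels (~8.294 million)
pixels_8k = 7680 * 4320  # 33,177,600 pixels (~33.177 million)

print(pixels_8k / pixels_4k)  # 4.0 -> 8K needs roughly 4x the pixel rate of 4K
```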
    Last edited: 18 Jun 2020
  6. KillBoY_UK


    Joined: 20 Apr 2004

    Posts: 4,150

    Location: Oxford

    ROPs were a real issue in the run-up to Polaris, but not so much any more. The lack of ROPs showed badly with Fiji, when cutting a load of shaders from the Fury X to the Fury had such a minor impact on performance. After that point, memory bandwidth was AMD's GPU bottleneck.
  7. HRL

    Wise Guy

    Joined: 22 Nov 2005

    Posts: 1,672

    Location: London

    There are no 8K games on the next gen consoles, just 8K video playback.
  8. SE-Lain


    Joined: 11 Nov 2003

    Posts: 124

    Location: Glasgow, Scotland

    Btw 8K is four times the pixels of 4K, not double.
  9. EsaT


    Joined: 6 Jun 2008

    Posts: 8,654

    Location: Finland

    Tell that to monitor panel makers...
    Guess they didn't get the memo and instead are just overclocking medieval 1920x1080.
  10. Kirby Wurm


    Joined: 31 Jan 2012

    Posts: 940

    Location: Wychbold

    Most variants of the card actually ran at boost clocks of 1200 MHz+, but it's still pretty low by current standards.

    80 ROPs on the console chips? That's interesting to know.
  11. g67575


    Joined: 30 Jun 2019

    Posts: 568

    Nah, unfortunately Sony / AMD have (reportedly) agreed on the 'standard' 64 ROPs for the PS5 GPU:

    Specs here:

    However, the important thing is that the overall pixel rate is very similar to the Xbox Series X GPU's (both over 140 GPixel/s), due to the PS5's much higher GPU clock rate of 2233 MHz.

    If you combine the above clock rate with 80 ROPs, an impressive pixel rate of 178,640 MPixel/s (~178.6 GPixel/s) is possible (maybe for higher-end Navi 2X GPUs).
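The comparison above can be sketched with the same clock x ROPs arithmetic (the 2233 MHz clock and the ROP counts are the reported/speculated figures from this thread, not confirmed specs):

```python
def pixel_rate_gpixels(clock_mhz: float, rops: int) -> float:
    """Peak pixel fill rate in GPixel/s: clock (MHz) x ROPs / 1000."""
    return clock_mhz * rops / 1000

# Reported PS5 GPU: the 'standard' 64 ROPs at 2233 MHz.
print(pixel_rate_gpixels(2233, 64))  # 142.912 -> "over 140 GPixel/s"

# Speculative higher-end Navi 2X: 80 ROPs at the same clock.
print(pixel_rate_gpixels(2233, 80))  # 178.64 GPixel/s
```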
    Last edited: 18 Jun 2020
  12. g67575


    Joined: 30 Jun 2019

    Posts: 568

    I'm on a 28" 4K monitor and was wondering: how does VSR / DSR on a 1080p monitor, rendering at 4K resolution, compare to native 4K resolution?

    Has anyone compared it? Does VSR / DSR @ 4K on a small 1080p monitor get rid of the aliasing / jagged edges in games?
  13. Grim5


    Joined: 6 Feb 2019

    Posts: 4,474

    It will help, but 4K resolution by itself is not enough to get rid of all jaggies; some form of anti-aliasing is still needed. Only at 8K resolution is the need for anti-aliasing removed from games.
  14. g67575


    Joined: 30 Jun 2019

    Posts: 568

    I tried 8K on my 4K monitor in GTA V (frame scaling enabled) and it looked even better than 8x MSAA anti-aliasing; surprisingly, I could get around 20 FPS even on an old GPU.

    It should be possible to run some games at 8K with next-gen cards, I think, if not with all settings at max.
    Last edited: 21 Jun 2020