
NVIDIA Next Generation ‘Ampere’ 8nm Graphics Cards

Discussion in 'Graphics Cards' started by LoadsaMoney, 4 Oct 2019.

  1. TT158

    Associate

    Joined: 15 May 2020

    Posts: 98

    I think mine uses around 9 to 9.5GB in AC Odyssey at 4K ultra.

    12gb on the new one does seem like a potential limiting factor in the nearish future at 4K with consoles about to start using much higher quality assets.

    Although I think I read something about the 3000 series maybe using less VRAM for the same scenario due to better compression tech or something? I could also be totally making that up. If untrue, then 12GB is very disappointing in my mind.
     
  2. opethdisciple

    Capodecina

    Joined: 18 May 2010

    Posts: 20,441

    Location: London

    The 2000 series has some new compression algorithm.
     
  3. TT158

    Associate

    Joined: 15 May 2020

    Posts: 98

  4. Chuk_Chuk

    Wise Guy

    Joined: 12 May 2014

    Posts: 1,630

    Asset quality is important to memory usage. If the next generation of games receives a bump in asset quality, then 12GB will not be enough regardless of resolution.
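
    To put rough numbers on the asset-quality point, here is an illustrative back-of-envelope sketch (not engine-accurate; real engines use block compression such as BC7, which cuts these figures to roughly a quarter):

    ```python
    # Rough VRAM estimate for uncompressed RGBA8 textures (illustrative only).

    def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
        """Approximate VRAM for one texture; a full mip chain adds ~1/3 extra."""
        base = width * height * bytes_per_pixel
        return int(base * 4 / 3) if mipmaps else base

    # A single 4096x4096 RGBA8 texture with mips:
    one_4k = texture_bytes(4096, 4096)
    # Doubling resolution to 8192x8192 quadruples the footprint:
    one_8k = texture_bytes(8192, 8192)

    print(f"{one_4k / 2**20:.0f} MiB, {one_8k / 2**20:.0f} MiB")  # → 85 MiB, 341 MiB
    ```

    So a few hundred higher-resolution textures resident at once can plausibly eat several extra gigabytes, which is why a console asset-quality bump worries people about 12GB cards.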
     
  5. CuriousTomCat

    Wise Guy

    Joined: 22 Nov 2018

    Posts: 1,262

    AMD would just bung 16GB on. Typical Nvidia holding back. I'm not surprised at all. I'm even expecting a repeat of last gen, where people had to RMA their cards due to artifacts.
     
  6. DanielSon

    Gangster

    Joined: 20 Oct 2002

    Posts: 474

    Location: Sussex

    VRAM compression and bandwidth saving is definitely a thing. Scaling up compute performance inside the GPU is lower cost and more efficient than just adding raw bandwidth, so having both bandwidth and compression is a performance win :). Look what the PS5 is doing with the SSD. I wouldn't be surprised if PC starts using the CPU chiplet to do similar stuff down the road (accelerating communication links).

    Dan
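
    The compression being discussed is lossless delta colour compression; the real scheme is proprietary hardware, but the principle can be sketched: store one anchor pixel per tile plus small per-pixel deltas, which need far fewer bits whenever neighbouring pixels are similar. A toy version (illustrative only, not NVIDIA's actual algorithm):

    ```python
    # Toy sketch of delta-based framebuffer compression. Real GPU delta colour
    # compression is proprietary hardware; this only illustrates the principle.

    def compress_tile(pixels):
        """Store the first pixel plus per-pixel deltas. If every delta fits in a
        signed byte, the tile shrinks to one 4-byte anchor + 1 byte per delta."""
        anchor = pixels[0]
        deltas = [p - anchor for p in pixels[1:]]
        if all(-128 <= d <= 127 for d in deltas):
            return anchor, deltas, 4 + len(deltas)   # compressed size in bytes
        return anchor, deltas, 4 * len(pixels)       # fall back to raw (4B/pixel)

    def decompress_tile(anchor, deltas):
        return [anchor] + [anchor + d for d in deltas]

    # A smooth gradient tile compresses well and round-trips losslessly:
    tile = [1000 + i for i in range(16)]
    anchor, deltas, size = compress_tile(tile)
    assert decompress_tile(anchor, deltas) == tile
    print(size, "bytes vs", 4 * len(tile), "raw")    # → 19 bytes vs 64 raw
    ```

    Note this saves bandwidth (fewer bytes moved per frame), not VRAM capacity: the full-size allocation still has to exist for the worst case, which is why compression doesn't make 8GB "act like" 12GB.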
     
  7. deuse

    Capodecina

    Joined: 17 Jul 2007

    Posts: 20,865

    Location: Solihull-Florida


    Well, if most people don't pay that kind of money, the prices will fall.
     
  8. TNA

    Capodecina

    Joined: 13 Mar 2008

    Posts: 16,287

    Location: London

    Well, I had a Titan XP, and games like Final Fantasy XV were using close to the whole 12GB. What will the new Final Fantasy VII Remake use, I wonder? What about other next-gen games?

    I am disappointed, as 12GB should be on the 3070/80, and 16GB on the 3080 Ti or whatever it will be called.

    What I don't like to see is a high price combined with skimping on RAM, so that instead of us getting more RAM the money goes towards even more profit. At least if these cards are priced very competitively due to AMD and the consoles, then I will be able to live with it.


    Did they not also say that about the Fury X? In the end, 4GB was 4GB. So I will believe it when I see it.
     
  9. TheRealDeal

    Sgarrista

    Joined: 28 May 2007

    Posts: 9,518

    Lol it's really only £100 but that's a 3 bedroom house so not bad.
     
  10. Richdog

    Capodecina

    Joined: 8 Sep 2005

    Posts: 24,939

    Location: Utopia

    I wonder if I am the only one getting sick of these incessant "omg a company is making profits at our expense" posts. It's like some of you have just crawled out from under a rock and learned how big businesses and corporations work.

    Nvidia are not going to intentionally cripple their top-end cards by providing too little VRAM for demanding games at high resolutions, and they are also not going to add more VRAM than needed, given the added expense. Considering how competitive this generation will be and how important it is that they retain the performance crown, I have no logical reason to doubt that they will put enough VRAM on the cards to do the job, in combination with the compression techniques they implement.
     
  11. TT158

    Associate

    Joined: 15 May 2020

    Posts: 98

    They won't put an amount of VRAM low enough to cripple them now, but will err to the lower end of what they can get away with to make future upgrades more likely. Planned obsolescence is a thing, and they know that people will look at benchmarks and relative performance in reviews first and be likely to forget about the life they'll get out of the card.
     
  12. HRL

    Wise Guy

    Joined: 22 Nov 2005

    Posts: 1,628

    Location: London


    I'll have a look at that myself tonight as that's the game I've just started playing. If it's going to be a problem next gen then I'd be pretty upset, but I've yet to see it cause problems and it's one of those things that always crops up when new cards are inbound.
     
  13. HRL

    Wise Guy

    Joined: 22 Nov 2005

    Posts: 1,628

    Location: London

    Fair point if that's true. Is it though? I love games but I don't know how they're made. :D
     
  14. Richdog

    Capodecina

    Joined: 8 Sep 2005

    Posts: 24,939

    Location: Utopia

    Planned obsolescence has always been a thing in the PC and electronics industry... it is nothing new or unexpected. Unless you have insider knowledge of Nvidia's new hardware design and compression techniques, I think it is safe to assume that the new high-end cards will last 2-3 years of demanding gaming at up to 4K.
     
  15. Chuk_Chuk

    Wise Guy

    Joined: 12 May 2014

    Posts: 1,630

    With the latest consoles going from 8GB RAM to 16GB RAM, I anticipate that an increase in model complexity and texture resolution will take up all that extra memory.
     
  16. TT158

    Associate

    Joined: 15 May 2020

    Posts: 98

    No mate, no insider knowledge. I agree that planned obsolescence has always been a thing, but given the rumours around the 3000 series' small/zero increase in VRAM and the expected increase in texture sizes driven by the new consoles, I'm feeling like it will be more aggressive than normal, and they will likely get away with it in the short term. Hope I'm wrong though :)
     
  17. TNA

    Capodecina

    Joined: 13 Mar 2008

    Posts: 16,287

    Location: London

    And I get tired of posts like yours pointing out the obvious. We all know that. It won't stop me from wanting more value for my money. Essentially it seems like you are saying STFU, be happy with whatever they put in front of you, and pay whatever they ask.

    Nah. I think I will continue to post what I am thinking, and that is that we should be getting more than 12GB on a top-end card that is essentially a 2021 card by the time it hits full availability.

    At the end of the day, if they are putting 12GB on a 3080 Ti, they are likely to put less on a 3070, which is not good enough unless they really have some magic under the hood that makes 8GB act like 12GB or more.

    The games I was playing early last year on my Titan XP gobbled up 10-12GB of VRAM (I game at 4K). Now the 3070 will surely offer much more grunt than that GPU, but it might end up with less RAM. So should I be happy about this?
     
  18. TNA

    Capodecina

    Joined: 13 Mar 2008

    Posts: 16,287

    Location: London

    Exactly, but you know, let's be happy with whatever nvidia put in front of us shall we :rolleyes:

    :D
     
  19. Besty

    Mobster

    Joined: 18 Oct 2002

    Posts: 3,540

    The 1080 Ti had 11GB of VRAM, and once the 3000 series launches it will be almost three and a half years since the 1080 Ti emerged.

    Unless there is some special sauce in here, I can't see people being happy with 8GB, but maybe some third-party boards will pack more.
     
  20. opethdisciple

    Capodecina

    Joined: 18 May 2010

    Posts: 20,441

    Location: London

    Whatever happened to the high-bandwidth cache the Vega cards had? It was used in like one game, and since then nothing at all.