NVIDIA ‘Ampere’ 8nm Graphics Cards

Soldato
Joined
24 Sep 2013
Posts
2,890
Location
Exmouth, Devon
Hi all, I'm starting to get tempted to pull the trigger on a second-hand 2080 Ti instead of waiting for a 3070.

So I'm asking the more informed here: what are the differences going to be between the two?

From what I've read about Ampere, the generational performance increase is obviously a big deal, but if performance is 'equal' between a Turing and an Ampere card, what features are missing? Any next-gen stuff (I won't need HDMI 2.1), or should they be the same?

(I have an ITX case so I'll be needing FE versions of either)


No one knows for sure until proper 3rd-party reviews are out. It seems like Ampere may be a massive generational leap for games that use DLSS, but pure rasterization performance comparisons are yet to be seen and it may only be a small uplift in performance. Everyone is waiting for reviews.
 
Associate
Joined
6 Oct 2010
Posts
60
Has this been shared yet? Now all we need is Goldilocks; we've got the three bears.

(Sorry, just seen it was posted 3 pages back.)

Why do they have no centre sticker? It looks like the 3070 has had it ripped off.


https://videocardz.com/newz/nvidia-geforce-rtx-30-series-pictured-in-a-single-photo

That photo really shows the difference in size between the 3080 and 3090. Bearing in mind there is only a 30W difference in power usage between them (350W for the 3090, 330W for the 3080), it makes me wonder how the 3080 cooler will cope :confused:
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Will we change our minds if you pluck a figure out of thin air and claim that the +VRAM variants must be +£350 at least?

Funnily enough, no :p

The 3080 Ti might be £999, but it won't just be a VRAM bump.

I wouldn't expect to see a 16GB 3070 Ti north of £600 either.

But what I'd prefer more than anything is for AMD to launch models with a decent VRAM upgrade, so I can give them my money instead.

I never said £999 or +£350, you're confusing me with someone else. I said a conservative estimate for +10GB of VRAM would be £117 at minimum just for the additional memory modules, and that was an under-estimate because the cost was based on the GDDR6 price per GB, not GDDR6X, as I couldn't find a source for the cost per GB of GDDR6X; we do know it's more expensive though. And then of course there are the costs of a changed/updated PCB layout, tooling up manufacturing processes for different configs, and all that. So I said a conservative +£150, which I think is reasonable. I wouldn't pay a £150 premium on top of an already expensive product to get no benefit; I'm just interested in whether others will.
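Just to show the arithmetic behind that estimate (a rough sketch only: the per-GB figure is simply my £117-for-10GB GDDR6 number worked backwards, and the board/tooling allowance is an assumption, not a quoted cost):

```python
# Rough sketch of the cost estimate above. The per-GB price is implied by the
# ~£117-for-10GB GDDR6 figure in the post; GDDR6X pricing and the PCB/tooling
# allowance are assumptions, not published numbers.
gddr6_price_per_gb = 117 / 10   # GBP per GB, worked back from the post
extra_vram_gb = 10              # 10GB card -> 20GB variant

memory_cost = extra_vram_gb * gddr6_price_per_gb   # ~£117 just for the modules
board_and_tooling = 33                             # assumed allowance for PCB/tooling changes

premium = memory_cost + board_and_tooling
print(f"Estimated minimum premium: ~£{premium:.0f}")   # ~£150
```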

If there's an update to the GPU in such a card then lovely, but first of all it would still never justify 20GB; that's way more memory than you'd need for gaming. Maybe if it leaned closer towards the speed of a 3090 you could probably justify 12GB.

I'm starting to see other people do deep dives on memory usage now as well, because of this very same question about how much VRAM we really need. The few titles we have where memory actually exceeds 8GB are being tested and looked at more closely to see if those games really need that VRAM or if the engine merely allocates/reserves it; most performance tools will only tell you the latter, and it's harder to get at real usage numbers.

For example, in tests on FS2020 the normal tools used to check VRAM usage show about 6GB being used at 3440x1440, but actual usage can be as low as 2.4GB, with 4K ultra actually closer to 9GB of usage, not the 12.5GB we've seen listed earlier in this thread. And again, when you push your card that hard with settings that high, the frame rate is something like 24fps; it's unplayable. So we're hitting GPU limits before we're hitting VRAM limits in modern titles. Avengers showed the same thing at 9.5GB allocated, and whatever that works out to under real use, my guess is probably less than 8GB. You hit GPU bottlenecks in performance long before you hit VRAM bottlenecks.
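If anyone wants to watch the allocation numbers for themselves, here's a minimal sketch (assuming an NVIDIA card and the pynvml package installed). Note it reports memory allocated on the device, the same kind of figure the usual overlays show, not what the engine genuinely needs:

```python
# Minimal VRAM polling sketch, assuming an NVIDIA GPU and `pip install pynvml`.
# NVML reports memory currently allocated on the device (the figure typical
# overlays display), not how much a game truly needs to run well.
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
try:
    for _ in range(60):                     # sample once a second for a minute
        info = nvmlDeviceGetMemoryInfo(handle)
        print(f"allocated: {info.used / 2**30:.2f} GB "
              f"of {info.total / 2**30:.2f} GB")
        time.sleep(1)
finally:
    nvmlShutdown()
```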

You have to have a really specific set of reasons to expect that future games will be significantly more aggressive in VRAM usage while at the same time being significantly less aggressive on GPU cycles... that's topsy-turvy world, it doesn't happen. That's the gamble you'd be making by buying a 20GB 3080 Ti over a 10GB 3080. I think when people see the real-world price difference, and see real-world benchmarks and performance graphs showing games nowhere close to VRAM saturation (and the ones that are being unplayable), they won't spend that extra cash.

I think 16GB is also overkill; it's probably overkill even for the 3090. Given how it's only realistically about 20% faster than a 3080, I think a 3090 could get away with 12GB. Right now, if you estimate the performance of these cards in something like FS2020 or Avengers based on relative speeds to other cards that have been benched, you do not achieve playable frame rates and you're still under 10GB of VRAM actually used.

If AMD release something about the speed of a 3080 but with 16GB of VRAM, then you're probably paying for 6GB of VRAM you can't use, which will drive the price of such a card up over what it otherwise could be. You've made this point a few times in the past, alluding to me talking specifically about Nvidia, when I never have been; this is a general principle that applies to any video card. It sort of sounds like you really want any excuse to buy AMD, and given all of your constant jabs about "leather jacket man" and Nvidia being "stingy" about whatever... I'd be interested to see what you do if AMD release a 3080-level card with 10GB of VRAM.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,518
Location
Greater London
That photo really shows the difference in size between the 3080 and 3090. Bearing in mind there is only a 30W difference in power usage between them (350W for the 3090, 330W for the 3080), it makes me wonder how the 3080 cooler will cope :confused:
Yes, I have to say I am slightly concerned about this myself. Wish the 3080 was closer to the size of the 3090 cooler.
 
Soldato
Joined
31 Dec 2007
Posts
13,616
Location
The TARDIS, Wakefield, UK
Yes, I have to say I am slightly concerned about this myself. Wish the 3080 was closer to the size of the 3090 cooler.

I think it's that bit in the middle; it looks a bit like a woman with a corset on sucking in the midriff. :)

There seems to be a heckuva lot of surface area over the end of the card, though, with nothing to blow air at it. I think that's why one of the cards has a rear fan.

The 3090 and 3070 look right in proportion; the 3080 looks a bit like the ugly one of the brood.
 
Soldato
Joined
21 Jun 2005
Posts
9,126
It's my birthday on the 18th and I'm out on the 17th, so I need to see how easy it is to order on a phone!

I do have a few questions though and maybe we won't know before the reviews come out...

Will I get a bottleneck with my current CPU, a 7700K overclocked to 4.9GHz, gaming at 1440p/165Hz while streaming, or do I need to think about a bigger upgrade? I kind of wanted to hold off on the big-bang upgrade until the next gen of Intel chips so I can upgrade to that.

Also, if you order an FE or AIB card from Overclockers, how do you know if you are one of the lucky few who will receive it on launch day?

Also, I know we don't have the reviews yet, but with a 750W PSU would I be safer going for an FE over an AIB card?

Thanks
 
Associate
Joined
8 Feb 2014
Posts
846
Location
Aberdeen
They look very bland and basic, which I guess is alright for a case with no side window. I like the glowing writing along the top of my 1080 Ti, however.
 
Caporegime
Joined
20 Jan 2005
Posts
45,677
Location
Co Durham
I think it's that bit in the middle; it looks a bit like a woman with a corset on sucking in the midriff. :)

There seems to be a heckuva lot of surface area over the end of the card, though, with nothing to blow air at it. I think that's why one of the cards has a rear fan.

.

Don't get what you mean? There doesn't seem to be that much not covered by the fan?

It's the fact that the 3090 has a cover on the second heatpipe bit that confuses/concerns me.
 
Associate
Joined
4 Aug 2014
Posts
1,111
No one knows for sure until proper 3rd-party reviews are out. It seems like Ampere may be a massive generational leap for games that use DLSS, but pure rasterization performance comparisons are yet to be seen and it may only be a small uplift in performance. Everyone is waiting for reviews.

Thanks. I'm trying to see if there has been any mention of certain tech being 3000-series only. I don't mind a card not being 'the fastest', but I'm curious whether some cool things might be unavailable on Turing cards.
 
Permabanned
Joined
8 Dec 2015
Posts
1,485
No way, he just used that misleading line in the video to boost 3090 sales.

If he had said there would be a Titan later, then some people would forget the 3090 and wait for it.
Ah, I get ya, nice tactic, so you were just joking then? Was that Jensen you were talking about? I've not seen any video, so I'm using my initiative here :)
 
Caporegime
Joined
20 Jan 2005
Posts
45,677
Location
Co Durham
I agree with Kaap. The 3090 isn't a full Titan, and there have been rumours of a Titan with all its cores and 48GB of RAM. I'd expect that Titan to cost £2,500 though, but it will be a monster.

And with the memory issues, if the 48GB is right we won't see a Titan until next year, but I am 100% confident we will see one.

Then Kaap will buy 4 ;)
 
Soldato
Joined
19 Jan 2006
Posts
4,531
Hi all, I'm starting to get tempted to pull the trigger on a second-hand 2080 Ti instead of waiting for a 3070.

(I have an ITX case so I'll be needing FE versions of either)

There are other two-fan models of the 3070 available which should be short enough, and they're roughly 2.2-slot cards. I'm personally liking the look of this as my card of choice for my ITX build: https://www.msi.com/Graphics-card/GeForce-RTX-3070-VENTUS-2X Would it fit for you?
 
Permabanned
Joined
8 Dec 2015
Posts
1,485
I agree with Kaap. The 3090 isn't a full Titan, and there have been rumours of a Titan with all its cores and 48GB of RAM. I'd expect that Titan to cost £2,500 though, but it will be a monster.

And with the memory issues, if the 48GB is right we won't see a Titan until next year, but I am 100% confident we will see one.

Then Kaap will buy 4 ;)
I see what you mean.
 