Will we change our minds if you pluck a figure out of thin air and claim that the +VRAM variants
must cost at least £350 more?
Funnily enough, no.
The 3080 Ti might be £999, but it won't just be a VRAM bump.
I wouldn't expect to see a 16GB 3070 Ti north of £600 either.
But what I'd prefer more than anything is for AMD to launch models with a decent VRAM upgrade, so I can give them my money instead.
I never said £999 or +£350; you're confusing me with someone else. I said a conservative estimate for +10GB of VRAM would be £117 at minimum just for the additional memory modules, and that was an underestimate because it was based on the price per GB of GDDR6, not GDDR6X; I couldn't find a per-GB cost for GDDR6X, but we know it's more expensive. On top of that there are the costs of a possible revised PCB layout, tooling up manufacturing processes for different configurations, and so on. So I said a conservative +£150, which I think is reasonable. I wouldn't pay a £150 premium on top of an already expensive product to get no benefit; I'm just interested in whether others will.
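For what it's worth, the back-of-envelope sum looks like this (a rough sketch; the ~£11.70/GB GDDR6 figure and the overhead allowance are my assumptions, and GDDR6X would only push it higher):

```python
# Back-of-envelope for the +£150 estimate. All figures are assumptions:
# ~£11.70/GB is the approximate GDDR6 price I worked from; GDDR6X costs more.
GDDR6_PRICE_PER_GB = 11.70   # GBP per GB, approximate
EXTRA_VRAM_GB = 10           # the hypothetical 10GB -> 20GB bump

module_cost = GDDR6_PRICE_PER_GB * EXTRA_VRAM_GB  # ~£117 for the modules alone
overhead = 33                # assumed allowance for PCB revision, tooling, etc.

print(f"Memory modules alone: £{module_cost:.0f}")             # £117
print(f"Conservative premium: £{module_cost + overhead:.0f}")  # £150
```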
If such a card also comes with a GPU upgrade then lovely, but first of all it would still never justify 20GB; that's far more memory than you'd need for gaming. Maybe if it leaned closer to the speed of a 3090 you could probably justify 12GB.
I'm starting to see other people do deep dives on memory usage now too, because of this very same question of how much VRAM we really need. The few titles we have where memory appears to exceed 8GB are being tested and examined more closely to see whether those games really need that VRAM or whether the engine merely allocates/reserves it. Most performance tools will only tell you the latter; it's harder to get at real usage numbers.
For example, in FS2020 the usual tools for checking VRAM usage show about 6GB in use at 3440x1440, but actual usage can be as low as 2.4GB, and at 4K ultra actual usage is closer to 9GB, not the 12.5GB listed earlier in this thread. And when you push your card that hard, with settings that high, the frame rate is around 24fps; it's unplayable. So we're hitting GPU limits before we're hitting VRAM limits in modern titles. Avengers showed the same thing: 9.5GB allocated, and whatever that works out to under real use, my guess is less than 8GB. You hit GPU bottlenecks in performance long before you hit VRAM bottlenecks.
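To show why the usual tools mislead here: the NVML counter that nvidia-smi and most overlays read reports memory reserved against the GPU, not what a game actually touches each frame. A minimal sketch, assuming you have an NVIDIA card and the pynvml package installed:

```python
# Minimal sketch using pynvml (pip install pynvml).
# NVML reports memory *reserved* on the device, not the working set a game
# actually touches per frame, so this is the "allocated" figure, which can
# sit well above real usage.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

print(f"Total VRAM:         {info.total / 1024**3:.1f} GB")
print(f"Reserved/allocated: {info.used / 1024**3:.1f} GB")  # not true usage
pynvml.nvmlShutdown()
```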
You'd need a really specific set of reasons to expect future games to be significantly more aggressive on VRAM while at the same time significantly less aggressive on GPU cycles... that's topsy-turvy world; it doesn't happen. That's the gamble you'd be making by buying a 20GB 3080 Ti over a 10GB 3080. I think when people see the real-world price difference, and real-world benchmarks and performance graphs showing games nowhere close to VRAM saturation, and the ones that are being unplayable anyway, they won't spend that extra cash.
I think 16GB is also overkill; it's probably overkill even for the 3090. Given that the 3090 is only realistically about 20% faster than a 3080, I think it could get away with 12GB. Right now, if you estimate the performance of these cards in something like FS2020 or Avengers from their speeds relative to cards that have been benched, you don't achieve playable frame rates and you're still under 10GB of VRAM actually used.
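As a rough worked example of that extrapolation (the 24fps baseline, the ~20% 3090 uplift, and the ~9GB real-usage figure are the same assumptions as above):

```python
# Rough extrapolation from relative speeds. Assumptions: ~24fps on a 3080
# at 4K ultra in FS2020, 3090 ~20% faster, ~9GB of VRAM actually used.
baseline_fps = 24        # assumed 3080 result at 4K ultra
speedup_3090 = 1.20      # 3090 roughly 20% faster than a 3080
actual_vram_gb = 9       # estimated *real* usage, not allocation

projected_fps = baseline_fps * speedup_3090
print(f"Projected 3090 fps: {projected_fps:.0f}")       # ~29fps, still unplayable
print(f"Actual VRAM in use: {actual_vram_gb} GB < 10")  # under the 3080's 10GB
```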
If AMD release something around the speed of a 3080 but with 16GB of VRAM, then you're probably paying for 6GB of VRAM you can't use, which will drive the price of such a card above what it otherwise could be. You've made this point a few times, alluding to me talking specifically about Nvidia, when I never have; it's a general principle that applies to any video card. It rather sounds like you want any excuse to buy AMD, and given all your constant jabs about "leather jacket man" and Nvidia being "stingy" about whatever... I'd be interested to see what you do if AMD release a 3080-level card with 10GB of VRAM.