
NVIDIA ‘Ampere’ 8nm Graphics Cards

Caporegime
Joined
13 Jan 2010
Posts
32,574
Location
Llaneirwg
I wouldn't worry. My 2080 with HLA does absolutely fine, and that's with some supersampling. There aren't many demanding games on the horizon. The HP Reverb might push the limits a bit, but reviews have shown a 2080 Ti more than cuts the mustard on those games, so a 3080 should blast through them.

As awesome as VR is, we don't have AAA graphical fidelity because we don't have AAA developers fully invested in the platform, so we don't have AAA demands on games. The biggest developer is Oculus, who produce for the weakest headsets. There is therefore no way they're going to push the boundaries of VR when they need their games to work on the bargain basement of PCVR headsets, the Rift S, and the portable Snapdragon-based Oculus Quest.

Really useful, thanks.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
An EVEN BIGGER RTX has been seen in the wild! WOW :eek:

 
Soldato
Joined
13 Aug 2012
Posts
4,277
It's £639, not £630, and for a tenner more the Zotac Trinity gets you a better-looking card (IMO), an extra THREE years' warranty, and a metal front plate and rear backplate.

You would have to be off your rocker to buy the Palit over the Zotac.

So the Zotac is the one to go for, then. Thanks
 
Soldato
Joined
12 May 2014
Posts
5,236
Hammer on box seem to think that bandwidth "can help alleviate capacity issues" (@14:16).

Here's the full quote, because I think a lot of context is missing from your trimmed-down version:
In some instances a substantial increase in bandwidth can help alleviate capacity issues. However, we also saw one instance where the 2080 was completely overwhelmed and as a result fell well behind the 1080 Ti, despite easily beating the Pascal-based GPU at 1440p.

His sentence after the one I've just transcribed is also interesting.
 
Associate
Joined
11 Dec 2016
Posts
2,023
Location
Oxford
I have a Seasonic 650W Platinum, but I don't think I'll risk it with a 3080.
The 3080's TDP is 70W higher than the standard 2080 Ti's.
And GN showed that a top-spec 10900K + 2080 Ti system peaked at 537W in an absolute CPU+GPU torture test; in gaming it stayed below 450W.
https://www.youtube.com/watch?v=X_wtoCBahhM

Plus remember, a PSU's 650W rating is watts delivered, not consumed. It is rated to deliver 650W DC to components after converting from AC at perhaps 90-95% efficiency, so it will happily consume 700W (and more, given the safety margin) from the wall. That's the consumption GN measures: at the wall.
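To put rough numbers on that, here's a minimal back-of-envelope sketch in Python. The 0.92 efficiency is my assumption, sitting between the 90-95% figures above; the 537W and 70W numbers come from the posts themselves:

```python
# A PSU's watt rating is DC power delivered to components; wall draw
# is higher by the conversion loss. GN's figures are measured at the wall.

def max_wall_draw(psu_rating_w: float, efficiency: float) -> float:
    """Rough AC wall draw when the PSU delivers its full DC rating."""
    return psu_rating_w / efficiency

def dc_load(wall_w: float, efficiency: float) -> float:
    """Rough DC load implied by a wall-side measurement."""
    return wall_w * efficiency

EFF = 0.92  # assumed, in the middle of the 90-95% range above

print(f"650W PSU can pull up to ~{max_wall_draw(650, EFF):.0f}W from the wall")

gn_peak_wall = 537  # 10900K + 2080 Ti torture-test peak, at the wall
print(f"GN's peak implies ~{dc_load(gn_peak_wall, EFF):.0f}W of DC load")

# The 3080's TDP is ~70W above a stock 2080 Ti's, so roughly:
est_3080_wall = gn_peak_wall + 70 / EFF
print(f"Estimated 3080 system peak at the wall: ~{est_3080_wall:.0f}W")
```

On those assumptions a 3080 system peaks somewhere around 615W at the wall, against the roughly 700W a 650W unit can draw, so the margin is thinner than it looks but not obviously blown.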
 
Associate
Joined
23 Oct 2013
Posts
1,237
Location
Surrey
Haven't posted in ages, but haha, I got the prices exactly right!

Where is my cookie? :p

( )))))

Have a few, champ :p

So many posts in this thread I can't be arsed to look back, but I figured somebody would have been close, even though most were expecting 2080/Ti levels of pricing silliness (and with good reason!).
 
Last edited:
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
This is something RTX IO might eventually fix, once games incorporate it.

We just don't know yet how this is going to pan out.

As I pointed out in the other thread, even on the consoles they need system RAM for the OS and for the game before you even start talking about VRAM.

So let's say this game uses 10.5GB of VRAM on a console too. That leaves 5.5GB of RAM for the OS and for the game itself. How do you think that will run? Sounds like a rubbish experience to me. These days Windows alone, with nothing open, uses 4GB of RAM.

So yes, PC-to-PC comparisons of VRAM are valid, but console-to-PC ones are not. It's not an apples-to-apples comparison.

Well this kinda confirms what FS2020 showed us: by the time you exceed 10GB of VRAM usage, your frame rates are already unplayable. 17fps on a 2080 Ti is not going to translate to playable on a 3080. The game isn't just taking a hit from swapping textures in and out of VRAM; it's taking a hit because asking a GPU to process those textures to construct the next frame is more work than a smaller/simpler texture file. People can get that "win" of exceeding 10GB of VRAM usage, like FS2020 at 4K Ultra at 25fps or Avengers at 4K Ultra at 17fps, but it's completely meaningless; no one games at 17fps.
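As a rough sanity check on that (the ~30% 3080-over-2080 Ti uplift here is purely my assumption for illustration, not a measured figure):

```python
# Even a sizeable generational uplift doesn't turn 17fps into playable.

base_fps = 17           # Avengers, 4K Ultra on a 2080 Ti (figure from above)
assumed_uplift = 0.30   # hypothetical 3080-over-2080 Ti gain

projected_fps = base_fps * (1 + assumed_uplift)
print(f"Projected 3080 frame rate: ~{projected_fps:.0f}fps")  # ~22fps

uplift_needed = 60 / base_fps - 1
print(f"Uplift needed to reach 60fps: ~{uplift_needed:.0%}")  # ~253%
```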

To drive that point home about the next-gen consoles: it won't matter for them. Their GPUs will not give playable frame rates with 10.5GB of VRAM in use. They KNOW that, which is why they provisioned the consoles with 16GB of memory: the GPU will never realistically be able to use more than 8GB, the system uses about 2GB (or a bit more), leaving 4-5GB for the game engine, which is about what games peak at today.
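A minimal sketch of that budget arithmetic (16GB is the announced console spec; the 2.5GB OS reservation is my assumption):

```python
# Consoles have one unified 16GB pool shared by the OS, game logic and
# graphics, so every GB used as "VRAM" comes out of the same budget.

TOTAL_GB = 16.0

def game_ram_left(vram_gb: float, os_gb: float) -> float:
    """RAM left for the game engine after graphics and the OS take their cut."""
    return TOTAL_GB - vram_gb - os_gb

# The hypothetical from above: a game using 10.5GB for graphics.
print(f"10.5GB VRAM, 2.5GB OS -> {game_ram_left(10.5, 2.5):.1f}GB for the game")

# The expectation argued here: graphics capped nearer 8GB.
print(f" 8.0GB VRAM, 2.5GB OS -> {game_ram_left(8.0, 2.5):.1f}GB for the game")
```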
 

HRL

Soldato
Joined
22 Nov 2005
Posts
3,028
Location
Devon
The main thing for me now is seeing the performance of the 3090 compared to the 3080. If it is significantly different, I may just get the 3090.

Same here. People are citing only a 20% performance uplift over the 3080, but that doesn't add up in my eyes, especially if they've purposely left a gap for a 3080 Ti at a later date.

The two cards would then be stepping on each other's toes. I genuinely expect the 3090 to be significantly more powerful than people think. There'd be no point in it otherwise, and it's a monster of a card size-wise.
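For what it's worth, the ~20% figure people cite lines up with the announced spec ratios; a quick check (launch spec numbers, and of course real-world scaling will differ):

```python
# Announced Ampere launch specs.
specs = {
    "RTX 3080": {"cuda_cores": 8704,  "mem_bw_gb_s": 760},
    "RTX 3090": {"cuda_cores": 10496, "mem_bw_gb_s": 936},
}

for key in ("cuda_cores", "mem_bw_gb_s"):
    ratio = specs["RTX 3090"][key] / specs["RTX 3080"][key]
    print(f"3090/3080 {key}: {ratio:.2f}x (+{ratio - 1:.0%})")
# cuda_cores: 1.21x (+21%); mem_bw_gb_s: 1.23x (+23%)
```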
 