NVIDIA ‘Ampere’ 8nm Graphics Cards

Soldato
Joined
20 Aug 2019
Posts
3,030
Location
SW Florida
The 2080Ti was a value champ compared to this monstrosity. Nvidia have fleeced consumers in a way they have never done before... completely shameless.

The 2080Ti offered less of an improvement over the previous generation's flagship card.
The 2080Ti had a larger price increase over the previous generation's flagship card.

The only thing making the 2080Ti look less bad is that the 2080 (non-Ti) was also terrible value, whereas this generation has a card with decent price/performance sitting at the $700 price point, illustrating how terrible the 3090's value is.
 
Last edited:
Soldato
Joined
18 Feb 2015
Posts
6,484
GeForce RTX 3070 to cost 520-680 EUR in Europe
According to Overclocking.com, the pricing of the RTX 3070 model is expected to be within 520 to 680 EUR. The prices actually come from a Spanish retailer ‘-’ that listed the cards three weeks in advance.
The cheapest custom RTX 3070 cards should retail at 519.90 EUR (with Spanish 21% VAT). According to the listing, the cheapest models come from PNY (RTX 3070 Dual Fan) and ZOTAC (RTX 3070 Twin Edge), both listed at 520 EUR. This is 20 EUR higher than the official MSRP from NVIDIA (source).
More advanced models, such as the ASUS ROG STRIX, will retail at 660 to 680 EUR (non-OC and OC models). Unfortunately, the retailer does not currently list other popular brands such as Gigabyte, MSI, or EVGA.
NVIDIA GeForce RTX 3070 launches on October 15th. Unlike the RTX 3080 and RTX 3090, this card features a different Ampere GPU: it is a GA104-based model equipped with 5888 CUDA cores and 8GB of GDDR6 non-X memory rated at 14 Gbps. NVIDIA claims that the card should offer better performance than the GeForce RTX 2080 Ti, although leaked AIB roadmaps do not seem to confirm this just yet.
GeForce RTX 3070 pricing in -
British retailer Overclockers.co.uk also lists RTX 3070 models, although the pricing is only shown on each product page:

  • ASUS RTX 3070 ROG STRIX OC: 660 GBP
  • ASUS RTX 3070 ROG STRIX OC: 630 GBP
  • PNY RTX 3070 XLR8 EPIC-X: 560 GBP
  • ASUS RTX 3070 DUAL OC: 549 GBP
  • PALIT RTX 3070 GAMEROCK OC: 549 GBP
  • PNY RTX 3070 XLR8 EPIC-X: 540 GBP
  • EVGA RTX 3070 ICX3: 530 GBP
  • INNO3D RTX 3070 ICHILL X4: 530 GBP
  • PALIT RTX 3070 GAMEROCK: 530 GBP
  • PALIT RTX 3070 GAMINGPRO OC: 530 GBP
  • INNO3D RTX 3070 ICHILL X3: 510 GBP
  • ASUS RTX 3070 DUAL OC: 500 GBP
  • INNO3D RTX 3070 TWIN X2: 500 GBP
  • KFA2 (GALAX) RTX 3070 SG: 500 GBP
  • PALIT RTX 3070 GAMINGPRO: 500 GBP
  • ZOTAC RTX 3070 TWIN EDGE: 500 GBP
The UK MSRP is 469 GBP.

https://videocardz.com/newz/nvidia-geforce-rtx-3070-european-pricing-revealed
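On the memory spec quoted above: the 14 Gbps GDDR6 figure works out to peak bandwidth as follows. A back-of-the-envelope sketch (my own, not from the article), assuming the widely reported 256-bit memory bus on the GA104-based 3070:

```python
# Peak memory bandwidth = per-pin data rate * bus width / 8 bits per byte.
data_rate_gbps = 14    # GDDR6 per-pin rate quoted in the article
bus_width_bits = 256   # widely reported for the RTX 3070 (assumption, not from the article)

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"Peak bandwidth: {bandwidth_gb_s:.0f} GB/s")  # -> 448 GB/s
```

For comparison, the GDDR6X-equipped 3080 sits at roughly 760 GB/s, so the 3070 gets by with plain GDDR6 at a much lower bandwidth.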

You gotta wonder: is the stock situation going to be better or worse for these "lower"-end cards? It can't be that the GDDR6X is the bottleneck, can it?
 
Associate
Joined
6 Feb 2013
Posts
135
Quick question: if I run 2 3090s in NVLINK with the first card running at PCIe x16 and the second card running at PCIe x8, will I lose a lot of performance versus a x16/x16 configuration?

I’m asking this because the 3090 NVLINK bridge is 4 slots long, and the fourth slot in my Rampage VI Apex motherboard runs at PCIe x8.

At the moment, I run 2 2080 Tis at PCIe x16/x16 in the first and third slots, but they are quite close to each other and one of the GPUs throttles due to high temperatures under heavy load.

What’s the better option?

1) Running both cards at PCIe x16/x16 with 1 card throttling under heavy load

2) Running 1 card at PCIe x8 and having better temperatures
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Quick question: if I run 2 3090s in NVLINK with the first card running at PCIe x16 and the second card running at PCIe x8, will I lose a lot of performance versus a x16/x16 configuration?

I’m asking this because the 3090 NVLINK bridge is 4 slots long, and the fourth slot in my Rampage VI Apex motherboard runs at PCIe x8.

At the moment, I run 2 2080 Tis at PCIe x16/x16 in the first and third slots, but they are quite close to each other and one of the GPUs throttles due to high temperatures under heavy load.

What’s the better option?

1) Running both cards at PCIe x16/x16 with 1 card throttling under heavy load

2) Running 1 card at PCIe x8 and having better temperatures
Oh god, you actually bought 2x 3090s? Why? :(
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
The 2080Ti offered less of an improvement over the previous generation's flagship card.
The 2080Ti had a larger price increase over the previous generation's flagship card.

The only thing making the 2080Ti look less bad is that the 2080 (non-Ti) was also terrible value, whereas this generation has a card with decent price/performance sitting at the $700 price point, illustrating how terrible the 3090's value is.

Sorry Twinz but I think your logic is flawed. In this case it's not about how much faster the 3090 is than the previous flagship card, but how much faster it is than the 3080... which is 10% more performance for double the price. The 3090 is almost completely redundant for gaming as the 3080 already fills so much of its performance bracket for half the price... that is why it is by far the worst value card ever released.

The 2080 Ti was 40% faster than a 2080 for double the price and was, depending on the resolution, 25-35% faster than a 1080Ti. Techspot recently re-tested the 2080Ti vs 1080Ti vs 980Ti and found up to 60% improvements https://www.techspot.com/review/2088-geforce-2080ti-vs-1080ti-vs-980ti/

The 2080 Ti was way better value than a 3090 and many owners who bought on release can attest to how happy they were with them... I wish I had bought one on release.
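To put rough numbers on that (an illustrative sketch using only the figures quoted in this thread, not independent benchmarks):

```python
# Using the figures quoted above: each flagship cost roughly 2x its sibling.
# How much of the cheaper card's performance-per-dollar does each retain?
flagships = [
    ("RTX 3090 vs RTX 3080",    1.10, 2.0),  # ~10% faster for ~2x the price
    ("RTX 2080 Ti vs RTX 2080", 1.40, 2.0),  # ~40% faster for ~2x the price
]
for label, perf_ratio, price_ratio in flagships:
    retained = perf_ratio / price_ratio
    print(f"{label}: {retained:.0%} of the cheaper card's perf per dollar")
```

By the thread's own numbers, the 3090 keeps barely half (55%) of the 3080's performance per dollar, while the 2080 Ti kept 70% of the 2080's, which is exactly the gap being argued here.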

Must be using it for work. SLI doesn't work properly for games anymore, and there's no support on the 3090, so it must be for some synthetic stuff like Blender or something.
True, Grim, at least I hope so anyway!
 
Soldato
Joined
6 Feb 2019
Posts
17,565
First Strix OC 3090 review is out and these numbers are mouth-watering.

* 10% faster than the FE 3090
* Average gaming power draw out of the box of just 335W - yes, the card uses less power than the FE 3090 while being 10% faster - the Strix OC has been superbly binned by Asus to be a monster
* For those who want to push it, the BIOS allows up to 480W
* Fantastic cooling performance - just 68°C under 100% load out of the box

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/
 
Man of Honour
Joined
13 Oct 2006
Posts
91,053
First Strix OC 3090 review is out and these numbers are mouth-watering.

* 10% faster than the FE 3090
* Average gaming power draw out of the box of just 335W - yes, the card uses less power than the FE 3090 while being 10% faster - the Strix OC has been superbly binned by Asus to be a monster
* For those who want to push it, the BIOS allows up to 480W
* Fantastic cooling performance - just 68°C under 100% load out of the box

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/

Seems to further confirm to me that there is a lot of variance in core quality, and anyone not rigorously binning cores for their OC models is probably going to have a percentage that are of marginal stability.
 
Soldato
Joined
17 Jul 2007
Posts
24,529
Location
Solihull-Florida
First Strix OC 3090 review is out and these numbers are mouth-watering.

* 10% faster than the FE 3090
* Average gaming power draw out of the box of just 335W - yes, the card uses less power than the FE 3090 while being 10% faster - the Strix OC has been superbly binned by Asus to be a monster
* For those who want to push it, the BIOS allows up to 480W
* Fantastic cooling performance - just 68°C under 100% load out of the box

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/


For twice the price of a 3080 it's not even good for gaming.
Gamers Nexus got it spot on.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
First Strix OC 3090 review is out and these numbers are mouth-watering.

* 10% faster than the FE 3090
* Average gaming power draw out of the box of just 335W - yes, the card uses less power than the FE 3090 while being 10% faster - the Strix OC has been superbly binned by Asus to be a monster
* For those who want to push it, the BIOS allows up to 480W
* Fantastic cooling performance - just 68°C under 100% load out of the box

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/
For $1800, 20% more expensive than the regular 3090 RRP of $1500, you would expect nothing less.

For twice the price of a 3080 it's not even good for gaming.
Gamers Nexus got it spot on.
And Hardware Canucks, they get credit for an honest review too. As for Linus Tech Tips... this guy is no longer credible and has sold out to corporate pressure.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Sorry if this makes some of you 3090 owners feel even worse, but Moore's Law Is Dead has shared insider info that there will soon be a new RTX Titan released: a full GA102 card with 48GB of GDDR6 (non-X) RAM and without any stripped features. THAT will be the card to get if you genuinely need something for workloads.


He also says that using GDDR6X on the 3080 was a big mistake: it would have been far better served with 12GB+ of regular GDDR6 on a wider bus, which would have given it more bandwidth while being cheaper.

My personal belief is that Jensen has made some very, very bad executive decisions for this generation, and when the dust has settled we will see him come under fire from within Nvidia (in addition to the media) for this release.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,053
Sorry if this makes some of you 3090 owners feel even worse, but Moore's Law Is Dead has shared insider info that there will soon be a new RTX Titan released: a full GA102 card with 48GB of GDDR6 (non-X) RAM and without any stripped features. THAT will be the card to get if you genuinely need something for workloads.


He also says that using GDDR6X on the 3080 was a big mistake: it would have been far better served with 12GB+ of regular GDDR6 on a wider bus, which would have given it more bandwidth while being cheaper.

My personal belief is that Jensen has made some very, very bad executive decisions for this generation, and when the dust has settled we will see him come under fire from within Nvidia (in addition to the media) for this release.

I think nVidia's hand (and plans) have been forced somewhat by the Samsung 8nm node and 7nm EUV situation.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Guys, if you want to see something that truly makes you question your faith in people's ability to act rationally, then check out this video of people queueing, and SLEEPING, outside a store for 3 DAMN DAYS in order to get a 3090... and not everyone even got one! :eek:


The. Mind. Boggles. :(

I think nVidia's hand (and plans) have been forced somewhat by the Samsung 8nm node and 7nm EUV situation.
Yeah, that certainly did not help... but again, this was Nvidia's own fault for trying to bully TSMC. Their arrogance and hubris (or rather, Jensen's) has now grown to the point where it has negatively impacted their business. They need to reduce his power and presence, because it's a prime example of one man and his personality wielding a level of power and decision-making within a company that should not be granted to any one person.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,053
Yeah, that certainly did not help... but again, this was Nvidia's own fault for trying to bully TSMC. Their arrogance has now grown to the point where it has negatively impacted their business.

I don't believe nVidia trying to bully TSMC has had a lot to do with it - a lot of that is people making up their own story. nVidia still has a lot of products being produced on TSMC 7nm and was able to buy up a lot of extra capacity. TSMC 7nm wafer prices have apparently increased quite a bit (reportedly now past $10K per wafer, versus the favourable rates nVidia is getting on Samsung 8nm - closer to half that), so I'm interested to see what AMD do there. And nVidia aren't the only ones who moved their larger, consumer (rather than industry) products away from TSMC 7nm - MediaTek did too, and they are being supported by TSMC, so it isn't as if AMD muscled them out on capacity.

They need to reduce his power and presence, because it's a prime example of one man and his personality wielding a level of power and decision-making within a company that should not be granted to any one person.

I don't think that is going to happen, quite frankly - the structure of nVidia doesn't really give anyone the power to do that outside of extraordinary circumstances (if he did something criminal, etc.). With the way it is set up, even shareholders have more limited influence than is typical for a company like this.
 
Last edited:
Associate
Joined
19 Jun 2017
Posts
1,029
Quick question: if I run 2 3090s in NVLINK with the first card running at PCIe x16 and the second card running at PCIe x8, will I lose a lot of performance versus a x16/x16 configuration?

I’m asking this because the 3090 NVLINK bridge is 4 slots long, and the fourth slot in my Rampage VI Apex motherboard runs at PCIe x8.

At the moment, I run 2 2080 Tis at PCIe x16/x16 in the first and third slots, but they are quite close to each other and one of the GPUs throttles due to high temperatures under heavy load.

What’s the better option?

1) Running both cards at PCIe x16/x16 with 1 card throttling under heavy load

2) Running 1 card at PCIe x8 and having better temperatures

SLI always works at x8/x8 (at least with PCIe 3.0)... that being said, you would have to look up the motherboard manual to check whether the x8 slot is SLI-capable. (There's a quick way to verify the negotiated link width in the sketch below.)

But as things stand right now, you have two things to worry about:

1. Nvidia has noted that the 3090 will not have implicit SLI support... now, we don't know what that means. If it's just missing profiles then there's the trusty Nvidia Inspector to fall back on... but if it's fully disabled then there's no more SLI in DX11 or earlier games. (I am yet to play Witcher 3, that's how huge my backlog is.)

2. Right now only the 4-slot bridge is available. I don't have 4-slot spacing on my motherboard so I am chilling out :D... but it's definitely going to be faster than the 2080 Ti setup; provided it works, it's a no-brainer.

Edit: it seems some motherboards can do x16/x16 as well with some additional onboard chip... so, again, the manual it is :)
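To check what each card actually negotiates (useful if one does end up in the x8 slot), here is a minimal sketch using the pynvml bindings for NVML - it assumes the nvidia-ml-py package is installed; `nvidia-smi -q` reports the same link info if you'd rather not script it:

```python
# Minimal sketch: report current vs. maximum PCIe link gen/width per GPU.
# Assumes the nvidia-ml-py package (pynvml) is installed: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older pynvml versions return bytes
            name = name.decode()
        cur_gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
        max_gen = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(handle)
        cur_w = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
        max_w = pynvml.nvmlDeviceGetMaxPcieLinkWidth(handle)
        print(f"GPU {i} ({name}): PCIe gen {cur_gen}/{max_gen}, width x{cur_w}/x{max_w}")
finally:
    pynvml.nvmlShutdown()
```

Bear in mind cards downshift the PCIe link at idle to save power, so check the values under load.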
 
Last edited:
Permabanned
Joined
15 Oct 2011
Posts
6,311
Location
Nottingham Carlton
[image attachment]

 
Associate
Joined
21 Apr 2007
Posts
2,485
The 2080 Ti was way better value than a 3090 and many owners who bought on release can attest to how happy they were with them... I wish I had bought one on release.

You could make that argument, but given the 3090 is such poor value for gaming, it's a bit like saying the 2080 Ti was terrible and the 3090 is worse. It's all just shades of bad value; it doesn't make the 2080 Ti good.
 