When is Vega 2 out?

Associate
Joined
7 Feb 2017
Posts
77
So I heard somewhere that the Vega 2 (Navi) was being worked on by the same team that worked on Ryzen. This is promising considering how poor the Vega 64 is. I realise that all the buzz is for the 2080 Ti right now, but I want my GFX card for 3D work, and the current Vega 64 smashes the 1080 Ti when rendering in Blender.

So any idea on the wait for Vega 2, or just don't go there? I mean, a year on, a Vega 64 still underperforms in games to the level of a 1070 (in some instances).
 
Soldato
Joined
19 Dec 2010
Posts
12,019
So I heard somewhere that the Vega 2 (Navi) was being worked on by the same team that worked on Ryzen. This is promising considering how poor the Vega 64 is. I realise that all the buzz is for the 2080 Ti right now, but I want my GFX card for 3D work, and the current Vega 64 smashes the 1080 Ti when rendering in Blender.

So any idea on the wait for Vega 2, or just don't go there? I mean, a year on, a Vega 64 still underperforms in games to the level of a 1070 (in some instances).

Sounds like you don't have your Vega 64 card configured right for gaming. Take a look in the owners' thread for some tweaks you can do.

Not sure when Vega 2 is coming; not sure it's coming at all, to be honest. I think the Vega cards coming next are all for professional work. Navi should be out sometime next year, though. No clue as to how it will perform.

Maybe an option for you would be to buy a good FreeSync monitor? A Vega 64 with a FreeSync monitor gives a great gaming experience.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
So I heard somewhere that the Vega 2 (Navi) was being worked on by the same team that worked on Ryzen. This is promising considering how poor the Vega 64 is. I realise that all the buzz is for the 2080 Ti right now, but I want my GFX card for 3D work, and the current Vega 64 smashes the 1080 Ti when rendering in Blender.

So any idea on the wait for Vega 2, or just don't go there? I mean, a year on, a Vega 64 still underperforms in games to the level of a 1070 (in some instances).

It outperforms the 1080, so it seems you haven't configured your card properly.
Have a look here at these settings. Let me know how it goes.

https://forums.overclockers.co.uk/posts/32057078/

Also, Navi is not Vega. It's a completely new GPU.
Vega 20 (a 20.9 TFLOP monstrosity) is the Vega 64 on a 7nm process, but AMD said it's only for data centres.
There is also a Vega 12 that we know nothing about.
 
Soldato
Joined
8 Mar 2010
Posts
4,967
Location
Aberdeenshire
So I heard somewhere that the Vega 2 (Navi) was being worked on by the same team that worked on Ryzen. This is promising considering how poor the Vega 64 is. I realise that all the buzz is for the 2080 Ti right now, but I want my GFX card for 3D work, and the current Vega 64 smashes the 1080 Ti when rendering in Blender.

So any idea on the wait for Vega 2, or just don't go there? I mean, a year on, a Vega 64 still underperforms in games to the level of a 1070 (in some instances).
If you really need it for work and it's the best for what you're going to work with, then just get one. For £440 I wouldn't expect 1080 Ti-beating performance in games, but it's not bad for the money either. As said above, do the tweaks and it will perform well.
 
Associate
OP
Joined
7 Feb 2017
Posts
77
Sorry for the confusion, guys, but I don't own a Vega 64. I'm looking to buy a new graphics card and am trying to choose. From the current crop, I was surprised to learn that the Vega 64 is nearly twice as fast as a 1080 Ti when rendering in Blender. However, after watching some videos by experts, reviewers, and owners of the card, it was explained that even today the drivers are still the cause of it lagging in FPS in modern games. So whilst I primarily want my GFX card for work, the work is game development, so it should be great for gaming too.

With Vega 2/Navi/12, or whatever the next consumer Radeon iteration is called, is it worth waiting for, or should I just buy the best now? Like most people, I don't want to drop £700 on a GFX card only to find three months later that the price drops to £400 because the latest thing has just been released.
 
Associate
Joined
6 Nov 2005
Posts
2,417
Rumours suggest that we will see pro-level "Vega 20"-based 7nm cards out before the end of the year, but it looks like there won't be gaming cards based on this. The gaming Navi cards should be out some time next year, though there's very little info at the moment.

Edit: Just to confirm, the current Vega cards are based on the Vega 10 architecture. It all gets confusing because the product names Vega 56 and Vega 64 refer to the number of CUs, whereas Vega 10 and Vega 20 are just the names of the architecture.
 
Soldato
Joined
23 Feb 2009
Posts
4,978
Location
South Wirral
With Vega 2/Navi/12, or whatever the next consumer Radeon iteration is called, is it worth waiting for, or should I just buy the best now? Like most people, I don't want to drop £700 on a GFX card only to find three months later that the price drops to £400 because the latest thing has just been released.

Judging from threads like this, I think the manufacturers have got a lot smarter about clearing the channel of old stock before releasing new cards. I really doubt there are warehouses full of "about to be old" stock. The mining craze meant stock levels never got that high anyway.

To my mind there are three schools of thought:

- Upgrade constantly and make a reasonable amount back on resale of the card
- Risk a secondhand buy
- Buy for longevity

I'm in the third camp, still rocking a 290 from June 2014. I'll only upgrade when it dies or there's something I really feel the urge to play that it can't handle.

In your shoes, I'd just buy the best I could within your budget and then not look at prices for a while.
 
Associate
OP
Joined
7 Feb 2017
Posts
77
Something else bothers me about the Vega 64 too: its power consumption. Under load, playing Battlefield 1, according to AnandTech a Vega 64 (on air) comes in at 459W, with the 1080 Ti at 379W. That's a difference of 80W. 80W for 12 hours is just under a kilowatt-hour, so call it one kWh per day. If a kWh of electricity costs 25p and I run my card full pelt for 12 hours a day, that's £1.75 per week. Translate that to 52 weeks per year and it's £91. I'd expect to keep my card for two years, so over the life of the card it would have cost me an extra £182 to run over a 1080 Ti.
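
To make the sums easy to check, here's the same calculation as a quick Python sketch. The 80W gap comes from the AnandTech figures above; the 25p/kWh tariff, 12-hour day and two-year lifetime are my own assumptions, not measured values:

```python
# Back-of-the-envelope running-cost difference: Vega 64 vs 1080 Ti.
# The 80 W gap is from the AnandTech load figures quoted above; the
# tariff, hours and lifetime are assumptions, not measurements.
EXTRA_WATTS = 80        # extra draw under load, Vega 64 vs 1080 Ti
HOURS_PER_DAY = 12      # assumed daily gaming/rendering load
PENCE_PER_KWH = 25.0    # assumed tariff (UK average is nearer 12-13p)
YEARS = 2               # expected ownership

kwh_per_day = EXTRA_WATTS * HOURS_PER_DAY / 1000          # 0.96 kWh/day
pounds_per_week = kwh_per_day * PENCE_PER_KWH * 7 / 100   # ~1.68
pounds_per_year = pounds_per_week * 52                    # ~87.36
print(f"£{pounds_per_week:.2f}/week, £{pounds_per_year:.2f}/year, "
      f"£{pounds_per_year * YEARS:.2f} over {YEARS} years")
# Rounding 0.96 kWh up to a full kWh/day gives the £1.75/£91/£182 above.
```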

And whilst graphics card prices go down, electricity prices are always going up!

Now this might seem unfair; after all, my computer will run idle at points throughout the day. But my PC is always on, so I've accounted for that. And no, I wouldn't play a game for a straight 12 hours a day, 365 days per year, but I do consider 12 hours of rendering time to be realistic. I've also not accounted for the fact that I would render at night, whilst asleep, using cheaper electricity on Economy 7.

I just wanted to show the stark reality of what an extra 80W of power actually costs when your parents aren't paying the bills.
 
Man of Honour
Joined
13 Oct 2006
Posts
90,805
If it bothers you, you can usually underclock Vega cards very slightly and drop the voltage down quite a bit, with a significant reduction in power draw for only a small performance decrease. If you have one of the better cards you might even manage that while pulling the clocks up a bit, as Vega seems to have a lot of voltage overhead on the average card, probably due to not-great yields, especially on earlier production.
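
For anyone wondering why a small voltage drop buys such a large power saving: to first order, dynamic power scales with frequency times voltage squared. A minimal sketch of that model, with purely illustrative numbers rather than measured Vega figures:

```python
# First-order dynamic power model: P is proportional to f * V^2.
# All numbers below are illustrative, not measured Vega data.
def scaled_power(p_stock, f_stock, v_stock, f_new, v_new):
    """Estimate power draw after a clock/voltage change."""
    return p_stock * (f_new / f_stock) * (v_new / v_stock) ** 2

# Hypothetical stock point: 295 W at 1536 MHz / 1.200 V.
# A mild underclock and undervolt: 1500 MHz / 1.050 V.
print(f"{scaled_power(295, 1536, 1.200, 1500, 1.050):.0f} W")
# ~221 W, i.e. roughly a quarter less power for ~2% fewer MHz.
```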

Though some of the power claims some posters on here make are somewhat exaggerated in my experience (likewise performance claims: over a wide range of games, the 1080 and Vega 64 average performance is usually within ~5% of each other).
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
Something else bothers me about the Vega 64 too: its power consumption. Under load, playing Battlefield 1, according to AnandTech a Vega 64 (on air) comes in at 459W, with the 1080 Ti at 379W. That's a difference of 80W. 80W for 12 hours is just under a kilowatt-hour, so call it one kWh per day. If a kWh of electricity costs 25p and I run my card full pelt for 12 hours a day, that's £1.75 per week. Translate that to 52 weeks per year and it's £91. I'd expect to keep my card for two years, so over the life of the card it would have cost me an extra £182 to run over a 1080 Ti.

a) The average cost in the UK is 12.5p per kWh, not 25p. If you pay that much, change power supplier.
b) True, the reference Vega 64 burns more power than the GTX 1080 Ti FE. However, the moment you go over the 1080 Ti FE clocks, the GTX 1080 Ti is just as power hungry. Those 1080 Ti Xtreme/FTW etc. easily burn 330W on their factory clocks.
c) The reference and Asus Strix Vega 64s are the worst samples to take as an indication of Vega 64 power consumption and performance.
The Nitro and Red Devil, with a bit of tweaking (see the settings above), burn 260-270W and beat the performance of the GTX 1080, which is the card they should be compared with.
d) In addition, the AMD drivers have 234W and 176W power profiles that cost around 10% performance versus the Turbo profile (276-300W depending on the card). Which is still poor value when you consider how much more you can get out of the card by spending three minutes applying the settings the community has created.
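
For a rough idea of what those profiles mean in money terms, here's the 12-hours-a-day assumption from earlier in the thread rerun at 12.5p/kWh. The wattages are the ones listed above; the profile names are just my labels:

```python
# Annual electricity cost per power profile at 12.5 p/kWh, assuming a
# 12 h/day load. Wattages from the post above; the names are labels only.
PENCE_PER_KWH = 12.5
HOURS_PER_DAY = 12

def pounds_per_year(watts):
    return watts / 1000 * HOURS_PER_DAY * 365 * PENCE_PER_KWH / 100

for name, watts in [("Turbo", 300), ("Mid profile", 234), ("Low profile", 176)]:
    print(f"{name:12s} {watts}W -> £{pounds_per_year(watts):.0f}/year")
# Roughly £164, £128 and £96 a year respectively under these assumptions.
```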
 
Associate
OP
Joined
7 Feb 2017
Posts
77
@peterwalkley, from your three schools of thought, I'm not sure I fit any. You see, I've just bought a 1700X because OCUK had them on sale for a silly £150. Before this offer I was considering the first-gen Threadripper, I think it was the 1920X, for £325 or something like that. But I want to get into game development, and I didn't feel I was yet at the level where my ability demanded that number of cores and threads. And then the 1700X came up at £150 and decided it all for me.

It's the same with the GFX card. Sure, I could splash out on a 2080 Ti, but I'm not really at the level where I'll be taking advantage of its processing power, so I'm better off waiting until I am, by which time the price will have dropped because something else will have been released.

So my school of thought is to buy a new 1080 Ti in October, after the 2080 Ti has been released, and hope the prices drop further. Then, when I want more performance, buy a second, used 1080 Ti to render using SLI. Once they're holding me back, I'd probably look at getting the latest 3080 Ti in 2020/21.
 
Associate
OP
Joined
7 Feb 2017
Posts
77
@Panos, I appreciate that your electricity supplier and the specific variant of GFX card will affect costs and power usage. I was just drawing a broad example. I suppose the point is that if you save £200 buying a Vega 64 over a 1080 Ti now, in the end you still end up paying the £200.

So am I right in understanding that, like AMD CPUs, the GPUs require a lot more voltage to eke out 2% more performance? Hence why super-overclocked cards like the Asus Strix are so power hungry?
 
Soldato
Joined
19 Dec 2010
Posts
12,019
@Panos, I appreciate that your electricity supplier and the specific variant of GFX card will affect costs and power usage. I was just drawing a broad example. I suppose the point is that if you save £200 buying a Vega 64 over a 1080 Ti now, in the end you still end up paying the £200.

So am I right in understanding that, like AMD CPUs, the GPUs require a lot more voltage to eke out 2% more performance? Hence why super-overclocked cards like the Asus Strix are so power hungry?

If you undervolt and slightly underclock Vega cards you actually get more performance, because the card stops running into its power limit. The power consumption gets exaggerated a lot, and most reviews etc. are done using stock settings, which are terrible on Vega cards. I used a 650W power supply for my Vega card and never had a problem with it.
 
Associate
Joined
9 Sep 2018
Posts
84
Location
Barcelona
Something else bothers me about the Vega 64 too: its power consumption. Under load, playing Battlefield 1, according to AnandTech a Vega 64 (on air) comes in at 459W, with the 1080 Ti at 379W. That's a difference of 80W. 80W for 12 hours is just under a kilowatt-hour, so call it one kWh per day. If a kWh of electricity costs 25p and I run my card full pelt for 12 hours a day, that's £1.75 per week. Translate that to 52 weeks per year and it's £91. I'd expect to keep my card for two years, so over the life of the card it would have cost me an extra £182 to run over a 1080 Ti.

And whilst graphics card prices go down, electricity prices are always going up!

Now this might seem unfair; after all, my computer will run idle at points throughout the day. But my PC is always on, so I've accounted for that. And no, I wouldn't play a game for a straight 12 hours a day, 365 days per year, but I do consider 12 hours of rendering time to be realistic. I've also not accounted for the fact that I would render at night, whilst asleep, using cheaper electricity on Economy 7.

I just wanted to show the stark reality of what an extra 80W of power actually costs when your parents aren't paying the bills.

I guess those numbers refer to the total system consumption?

Here's a video of my rig running the game; it doesn't even reach 350W with my CPU at 5GHz: https://youtu.be/J8ID11sVYu0

Cheers.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
So am I right in understanding that, like AMD CPUs, the GPUs require a lot more voltage to eke out 2% more performance? Hence why super-overclocked cards like the Asus Strix are so power hungry?
LOL.
I have a 5-hour drive to start and cannot go into more detail, but your understanding is VERY wrong, especially when it comes to CPUs.
An 8-core Ryzen CPU burns two-thirds the power of a 6-core Intel while providing superior multithreaded performance.
 
Soldato
Joined
28 May 2007
Posts
18,190
Something else bothers me about the Vega 64 too: its power consumption. Under load, playing Battlefield 1, according to AnandTech a Vega 64 (on air) comes in at 459W, with the 1080 Ti at 379W. That's a difference of 80W. 80W for 12 hours is just under a kilowatt-hour, so call it one kWh per day. If a kWh of electricity costs 25p and I run my card full pelt for 12 hours a day, that's £1.75 per week. Translate that to 52 weeks per year and it's £91. I'd expect to keep my card for two years, so over the life of the card it would have cost me an extra £182 to run over a 1080 Ti.

And whilst graphics card prices go down, electricity prices are always going up!

Now this might seem unfair; after all, my computer will run idle at points throughout the day. But my PC is always on, so I've accounted for that. And no, I wouldn't play a game for a straight 12 hours a day, 365 days per year, but I do consider 12 hours of rendering time to be realistic. I've also not accounted for the fact that I would render at night, whilst asleep, using cheaper electricity on Economy 7.

I just wanted to show the stark reality of what an extra 80W of power actually costs when your parents aren't paying the bills.

Look at an APU. It will save you a fortune. Use the money saved from the 1080 Ti (£550-600) and put it into 1kW of solar panels. The combined savings on upfront cost and electricity over their lifetimes will probably be a deposit on your own house.
 
Soldato
Joined
22 Nov 2006
Posts
23,299
I'm also running a Vega 64 on a 650W PSU, currently at 1050mV; in some games it won't even use all of that. The AMD cards are quite efficient once you tweak them a bit, and they scale a lot depending on what you want from them.

The way I have mine set up, it uses 260W at full load. That's still easily more than enough performance to run pretty much any game.
 
Associate
OP
Joined
7 Feb 2017
Posts
77
Yeah, well, why is it that the convention seems to be that you buy an Nvidia card and it just works, but you buy a Radeon card and have to tweak loads of settings to even make it work correctly? I mean, surely if the likes of Asus cannot build a card and set it up right from the outset, what chance do we laymen have?
 
Associate
OP
Joined
7 Feb 2017
Posts
77
@Panos, you say my understanding is wrong, but I always thought that overclockers who go for a world record cool with liquid nitrogen because of the heat generated by how much voltage they have to push through the CPU to get that extra 100MHz.

@jigger, unfortunately an APU doesn't have the core count I require for 3D work.

@Bloot, I just watched your video and Battlefield looks great. What PSU are you running?
 