GPU tribalism. Is it a thing?

Associate
Joined
3 Jul 2012
Posts
425
How does that affect performance? Not baiting, I honestly don't know.

Strictly on performance it doesn't matter but for certain applications CUDA is a requirement (typically AI related apps), unfortunately they tend to be ones I am interested in so I tend to be stuck with Nvidia.
 
Soldato
Joined
22 Apr 2016
Posts
3,427
I go for what’s best, and currently that’s Nvidia (by a country mile) for GPUs and AMD for CPUs!

So I have both vendors in my system but choose what they are best at!
 
Associate
Joined
17 Sep 2020
Posts
624
It's most certainly still a thing. Some people here would let Jen bend their mother and sisters over if it meant they had first dibs on a 3080.

Yeah, but once Jen is done, said mother and sister would report back that not all of Jen's equipment is working and the gains were smaller than claimed. ;)
 
Soldato
Joined
30 Jun 2019
Posts
7,875
Yup it's a thing. Once a brand name becomes successful and well known enough, people get lazy (and sometimes disillusioned in competitors) and don't want to / won't consider alternatives.

Another thing people sometimes do is convince themselves that what they buy must have a certain feature to be a viable option, e.g. Gsync or ray tracing.
 
Soldato
Joined
21 Jul 2005
Posts
20,020
Location
Officially least sunny location -Ronskistats
Why? Maybe ask if he has a specific requirement for CUDA before jumping in feet first?

edit: Seems he does.

Problem with AMD is it doesn't have CUDA, regardless of how good the card performs on the whole.

Maybe he could have expanded more in his OP instead of just saying this ^, then I (or anyone) wouldn't need to ask? In fact it makes no sense. It's Nvidia's tech, and proprietary at that; how on earth would AMD 'have' it?

Problem with Nvidia is it doesn't have Radeon Image Sharpening or Radeon Chill. You see what I mean.
 
Associate
Joined
3 Jul 2012
Posts
425
Maybe he could have expanded more in his OP instead of just saying this ^, then I (or anyone) wouldn't need to ask? In fact it makes no sense. It's Nvidia's tech, and proprietary at that; how on earth would AMD 'have' it?

Problem with Nvidia is it doesn't have Radeon Image Sharpening or Radeon Chill. You see what I mean.

The point is that AMD hasn't developed an equivalent that allows apps to be cross compatible.
 
Soldato
Joined
21 Jul 2005
Posts
20,020
Location
Officially least sunny location -Ronskistats
The point is that AMD hasn't developed an equivalent that allows apps to be cross compatible.

That may be so (it has a translator, HIP?). But of the two, AMD are renowned for working on open source or industry standards to allow for mass adoption. I would have to read into it again, but I know ROCm is open source. If you specialise in CUDA for parallel computing, then it's not really AMD's fault. It's like learning a programming language and wanting another one to be 'cross compatible', when you're better off just learning it and transferring your skills over.
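For anyone curious how a translator like HIP works in principle: AMD's "hipify" tooling rewrites CUDA source, mapping each cuda* runtime call to its hip* equivalent (e.g. cudaMalloc → hipMalloc), so the same code can target either vendor. Here's a toy Python sketch of that renaming idea — it is purely illustrative and not actual HIP tooling, though the API name pairs in the table are the real CUDA/HIP counterparts:

```python
# Toy illustration of a CUDA-to-HIP style translation layer.
# Real hipify tools rewrite CUDA source so each cuda* runtime call
# becomes its hip* equivalent; this sketch mimics that with a simple
# lookup-table rename over a line of source text.

API_MAP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def hipify_line(line: str) -> str:
    """Translate CUDA runtime API names in one line of source to HIP names."""
    for cuda_name, hip_name in API_MAP.items():
        line = line.replace(cuda_name, hip_name)
    return line

src = "cudaMalloc(&buf, n); cudaMemcpy(buf, host, n, kind); cudaFree(buf);"
print(hipify_line(src))
# → hipMalloc(&buf, n); hipMemcpy(buf, host, n, kind); hipFree(buf);
```

The real tools are of course far more involved (they parse the source and handle kernel launch syntax, not just names), but this is the basic "same code, two backends" trick being discussed.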
 
Associate
Joined
29 Jan 2018
Posts
100
I think it is. I always strive to get the best when I can, and when entering the PC gaming world I was told that Intel were the best for CPUs and Nvidia for GPUs, and I've always stuck with them, even when my wallet was saying it wasn't worth it.

Though this whole 3080 saga and waiting for it to arrive has been quite disappointing and disheartening and I'm actually considering looking at the new AMD card when it comes out.
 
Associate
Joined
3 Jul 2012
Posts
425
That may be so (it has a translator, HIP?). But of the two, AMD are renowned for working on open source or industry standards to allow for mass adoption. I would have to read into it again, but I know ROCm is open source. If you specialise in CUDA for parallel computing, then it's not really AMD's fault. It's like learning a programming language and wanting another one to be 'cross compatible', when you're better off just learning it and transferring your skills over.

Not just programming but professional apps that require CUDA and have no viable open source alternatives.

If AMD was competitive at the high end I suspect there would be more alternatives.
 
Man of Honour
Joined
25 Oct 2002
Posts
31,736
Location
Hampshire
I've had better experiences with 3dfx/Nvidia than ATI/AMD so I favour them slightly, but two of my last three cards have been AMD. Occasionally AMD put out decent mid-range parts (which is where I tend to buy) that make them a compelling proposition. The way I tend to operate is: if I think there is a good deal on, I will go for it regardless of manufacturer, but in the unlikely event I had options from either side with the same performance/price, I'd probably go green.
 
Associate
Joined
1 Feb 2017
Posts
1,052
I stick with Nvidia purely because I spent £600 on a G-Sync monitor a few years ago and it’s still working great. Now that they have opened up G-Sync, I’ll get a G-Sync compatible one so I can have the choice of Nvidia, AMD or Intel when they enter the market.
 

TNA


Caporegime
Joined
13 Mar 2008
Posts
27,524
Location
Greater London
It's been a thing for as long as I can remember, 3dfx/ati/nvidia/matrox etc

It's never going away, but it is amusing reading the blinkered opinions in this forum :D
especially at the moment :p
For me, I don't give a monkeys who makes the card; I've owned most types over the years, and enjoyed them.

I did have a soft spot for 3Dfx as that was the very first wow factor for me all those years ago, sweeping around the Unreal intro castle, just magic :cool:
+1

I never did get to buy a 3Dfx card though, was too young and couldn’t afford it, at least not the ones I used to see at PC World back then :p

My first expensive GPU, as I recall, was a Radeon 9700; I read the review in one of the PC magazines and saw it thrashing everything. Wanted to be ready for Half-Life 2. I got one and flashed it to a Pro. Saved about £100 in doing so :D
 
Don
Joined
20 Feb 2006
Posts
5,220
Location
Leeds
Problem with AMD is it doesn't have CUDA, regardless of how good the card performs on the whole.

Maybe he could have expanded more in his OP instead of just saying this ^ then I (or anyone) wouldn't need to ask? In fact it makes no sense. Its nvidia's tech proprietary at that how on earth would AMD 'have' it?

Problem with nvidia is it doesnt have Radeon Image Sharpening or Radeon Chill. You see what I mean.

It has nothing to do with RIS, DLSS etc. We have alternatives from each vendor, but he uses CUDA and AMD do not have it; what more can we say? 'I need CUDA.' 'It's okay, use image sharpening instead'?

As regards asking a question for clarity: yes, do. Engage in conversation if you are unsure.
 
Associate
Joined
23 Sep 2020
Posts
20
CUDA is important to me too, as I use video editing software which has accelerated processing because of it. However, we should remember that AMD has stream processors. The AMD equivalents can accelerate this sort of job the same way CUDA does. The problem is one of adoption: there need to be more AMD cards out there for software makers to make use of the hardware. Nvidia has done an excellent job of branding its tech, to the point that people want/need "CUDA" or "RTX" for jobs that can be done with other tech. I acknowledge that at the moment it's "CUDA" or nothing, but as a gamer, don't think the lack of "CUDA" in a graphics card is a minus, because it isn't; AMD do have an alternative.

Having a brand with this amount of loyalty and dominance of a market is not healthy and drives up prices. Ironically, I too have a pre-ordered 3080, so I am part of the problem. CUDA is nothing special but has brand awareness, and that is something we all need to be aware of.
 
Associate
Joined
3 Jul 2012
Posts
425
It's been a thing for as long as I can remember, 3dfx/ati/nvidia/matrox etc

It's never going away, but it is amusing reading the blinkered opinions in this forum :D
especially at the moment :p
For me, I don't give a monkeys who makes the card; I've owned most types over the years, and enjoyed them.

I did have a soft spot for 3Dfx as that was the very first wow factor for me all those years ago, sweeping around the Unreal intro castle, just magic :cool:

+1

I never did get to buy a 3Dfx card though, was too young and couldn’t afford it, at least not the ones I used to see at PC World back then :p

My first expensive GPU, as I recall, was a Radeon 9700; I read the review in one of the PC magazines and saw it thrashing everything. Wanted to be ready for Half-Life 2. I got one and flashed it to a Pro. Saved about £100 in doing so :D



If you want to experience Unreal again, you can get the DX11 plugin and HD texture pack.

https://www.youtube.com/watch?v=hrHPWPD54Is
 
Associate
Joined
5 Aug 2020
Posts
303
Can’t we all just get along? :D

I’m a little demanding when it comes to purchases, but ultimately I just want the best performance for my money, which doesn’t necessarily mean the best VFM. Tried for a 3090 at launch but failed to get one, though that may yet be a blessing in disguise.

Am I alone in not caring whether it’s Nvidia or AMD that manufactures your GPU of choice?

Or are there really people that will only buy from a specific manufacturer due to some misplaced brand loyalty?

Not looking to start an argument over which is best, just interested if it’s actually a thing.

Phew, managed to write that without using the dreaded F word once... :p
Haha, I don't really care to be honest. I've switched back and forth over the years, from a GTX 980 to an RX 5700 XT. I've literally just sold my RX 5700 XT and plan on becoming team green!
 