64-bit graphics?

Associate
Joined
12 Aug 2005
Posts
1,025
Location
Team 10
According to someone I was talking to on MSN, ATI cards 'lack' a certain set of instructions in their core for running 64-bit games efficiently. Just as it happens, nVidia cards DO have these instructions.

Anyone know anything about this?
 
Soldato
Joined
6 Jun 2005
Posts
22,598
I don't honestly know - but are there ANY 64-bit games around?

Are there likely to be?

I am only asking - the above isn't meant to be argumentative - but I don't know of many games that would utilise them even if they were present (and you'd need the x64 version of Windows installed, of course).

Dual core support is slowly getting better - I wouldn't expect much interest from developers in this until Vista?

Anyone have any other comments?
 
Soldato
Joined
27 Feb 2006
Posts
3,919
Location
Lincolnshire
Hi, as far as I know, Far Cry has an update for running on XP64, as does HL2. However, this is the first I've heard of a difference between ATI, NV and 64-bit games, and I've played them both - sure it's not an NV troll attempt?
ChrisC
 
Associate
Joined
26 Aug 2003
Posts
581
Location
Grimsby
Hi

As far as I'm aware, it's the CPU and OS that are 64-bit... the GPU isn't, and only the drivers that run it need to be 64-bit. I have two 64-bit rigs, one with an NV 6800GT and one with an ATI X1800XT, and although it's not a fair comparison I don't see any problems with the ATI card on x64 - benchies appear to be on par with 32-bit. Of course, I'm open to being proved wrong.
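[For anyone wondering what "64-bit" actually refers to here - editor: it's the CPU/OS pointer width, nothing on the card itself. A minimal C sketch; built as an x64 binary it prints 8 bytes, built 32-bit it prints 4:]

    #include <stdio.h>

    int main(void)
    {
        /* sizeof(void *) is the native pointer width: 8 bytes on a
           64-bit OS/CPU build, 4 bytes on a 32-bit build. */
        printf("pointer size: %zu bytes (%zu-bit)\n",
               sizeof(void *), sizeof(void *) * 8);
        return 0;
    }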
 
Soldato
Joined
6 Jun 2005
Posts
22,598
But from a technology point of view, why can't GPUs be 64-bit? OSes and CPUs are going / have gone that way, and PCI (and I guess PCIe) slots are regarded as 32-bit (with PCI-X on servers being 64-bit).

Seems a logical progression one day.
 
Man of Honour
Joined
15 Jan 2006
Posts
32,369
Location
Tosche Station
It's nothing to do with 64-bit "graphics"; it's to do with the CPU shifting textures around. Also, our PCs have MANY different bus widths running through them - GPUs are 256-bit to graphics RAM, IIRC.
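[To put a number on that 256-bit figure - editor: a rough C sketch of memory bandwidth. The 1550 MHz effective GDDR3 rate is the X1900 XTX's published spec; treat the arithmetic as illustrative only.]

    #include <stdio.h>

    int main(void)
    {
        /* A 256-bit bus moves 32 bytes per transfer. At 1550 MHz
           effective (X1900 XTX GDDR3) that is roughly 49.6 GB/s. */
        const double bus_bytes = 256.0 / 8.0;
        const double effective_hz = 1550e6;
        printf("~%.1f GB/s to graphics RAM\n",
               bus_bytes * effective_hz / 1e9);
        return 0;
    }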
 
Soldato
Joined
6 Jun 2005
Posts
22,598
Zefan said:
It's nothing to do with 64-bit "graphics"; it's to do with the CPU shifting textures around. Also, our PCs have MANY different bus widths running through them - GPUs are 256-bit to graphics RAM, IIRC.


I am right in thinking that the PCI-E slot is still only 32-bit though?

Please be kind, it's my birthday for the next hour or so lol
 
Man of Honour
Joined
15 Jan 2006
Posts
32,369
Location
Tosche Station
FrankJH said:
I am right in thinking that the PCI-E slot is still only 32-bit though?

Please be kind, it's my birthday for the next hour or so lol

No, the slots are 64-bit, so really the graphics cards can talk to the CPU and RAM without bottlenecking. I don't think there's really any point to it all though.

Happy Birthday :D
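[For context on the slot-width question above - editor: PCI Express is a serial link built from lanes rather than a parallel 32- or 64-bit bus, so the meaningful figure is bandwidth. A small C sketch of the published first-generation numbers:]

    #include <stdio.h>

    int main(void)
    {
        /* PCIe 1.x: 2.5 GT/s per lane, 8b/10b encoded, so 250 MB/s
           per lane per direction; a graphics slot is x16. */
        const double per_lane_mb = 2500.0 * 8.0 / 10.0 / 8.0;
        printf("x16 slot: %.0f MB/s each direction\n",
               per_lane_mb * 16.0);
        return 0;
    }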
 
Associate
OP
Joined
12 Aug 2005
Posts
1,025
Location
Team 10
Well, this guy seemed to know a lot, but he got a little quieter when I pointed out that both the nVidia and ATI sites advertise 64-bit HDR and textures and such on their top cards (7900GTX and X1900).
 

Soldato
Joined
18 Oct 2002
Posts
3,052
NightmareXX said:
According to someone I was talking to on MSN, ATI cards 'lack' a certain set of instructions in their core for running 64-bit games efficiently. Just as it happens, nVidia cards DO have these instructions.
Imagine how crazy it would be if people read that and word got around that it was true, when in fact it's absolute nonsense. :)
 
Man of Honour
Joined
15 Jan 2006
Posts
32,369
Location
Tosche Station
NightmareXX said:
Well, this guy seemed to know a lot, but he got a little quieter when I pointed out that both the nVidia and ATI sites advertise 64-bit HDR and textures and such on their top cards (7900GTX and X1900).

It's not actually anything to do with the 3D cards... you can run 64-bit textures on a 9800 Pro or whatever. What matters is where the CPU comes in to shift those textures around. I have no idea why they're boasting about it on their sites, as all cards will support it.
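[What "64-bit textures" means in practice - editor: four 16-bit float channels, i.e. 64 bits per texel. A hedged OpenGL/C sketch; the function name and pixel buffer are made up for illustration, and on 2006-era drivers the enums were the ARB_texture_float forms GL_RGBA16F_ARB / GL_HALF_FLOAT_ARB.]

    #include <GL/gl.h>
    /* GL_RGBA16F and GL_HALF_FLOAT need GL 3.0 headers or glext.h. */

    /* Upload a 64-bit-per-texel texture: RGBA, 16-bit float per
       channel. Assumes a current GL context; `pixels` holds
       w*h*4 half-floats. */
    void upload_fp16_texture(int w, int h, const void *pixels)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, w, h, 0,
                     GL_RGBA, GL_HALF_FLOAT, pixels);
    }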
 

Soldato
Joined
18 Oct 2002
Posts
3,052
Zefan said:
It's not actually anything to do with the 3D cards... you can run 64-bit textures on a 9800 Pro or whatever. What matters is where the CPU comes in to shift those textures around. I have no idea why they're boasting about it on their sites, as all cards will support it.
I'm having a hard time making sense of the above. Can you elaborate? :)

Also, here is a quote from Carmack explaining why 64-bit colour precision is a good thing:

We need more bits per color component in our 3D accelerators.

I have been pushing for a couple more bits of range for several years now, but I now extend that to wanting full 16 bit floating point colors throughout the graphics pipeline. A sign bit, ten bits of mantissa, and five bits of exponent (possibly trading a bit or two between the mantissa and exponent). Even that isn't all you could want, but it is the rational step.

There are other more subtle issues [due to limited precision - editor], like the loss of potential result values from repeated squarings of input values, and clamping issues when you sum up multiple incident lights before modulating down by a material. Range is even more clear cut. There are some values that have intrinsic ranges of 0.0 to 1.0, like factors of reflection and filtering. Normalized vectors have a range of -1.0 to 1.0. However, the most central quantity in rendering, light, is completely unbounded. We want a LOT more than a 0.0 to 1.0 range. Q3 hacks the gamma tables to sacrifice a bit of precision to get a 0.0 to 2.0 range, but I wanted more than that for even primitive rendering techniques. To accurately model the full human sensable range of light values, you would need more than even a five bit exponent.

64 bit pixels. It is The Right Thing to do. Hardware vendors: don't you be the company that is the last to make the transition.
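[The 1 sign / 5 exponent / 10 mantissa layout Carmack describes is what became IEEE 754 half precision. A small C decoder showing how those bits combine into a value - editor:]

    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Decode a 16-bit float: 1 sign bit, 5 exponent bits (bias 15),
       10 mantissa bits. */
    float half_to_float(uint16_t h)
    {
        int sign = (h >> 15) & 0x1;
        int exp  = (h >> 10) & 0x1F;
        int man  =  h        & 0x3FF;
        float v;

        if (exp == 0)        /* subnormal: no implicit leading 1 */
            v = ldexpf((float)man, -24);
        else if (exp == 31)  /* all-ones exponent: infinity or NaN */
            v = man ? NAN : INFINITY;
        else                 /* normal: implicit leading 1 */
            v = ldexpf((float)(man | 0x400), exp - 25);

        return sign ? -v : v;
    }

    int main(void)
    {
        /* 0x3C00 = 1.0, 0x4000 = 2.0, 0x7BFF = 65504 (largest normal):
           far more range than an 8-bit 0.0-1.0 channel. */
        printf("%g %g %g\n", half_to_float(0x3C00),
               half_to_float(0x4000), half_to_float(0x7BFF));
        return 0;
    }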
 
Man of Honour
Joined
15 Jan 2006
Posts
32,369
Location
Tosche Station
3D cards have been able to process 64-bit textures for a long time; it's nothing new. CPUs, however, have not - this is the real issue, and it's where the difference between running 32-bit and 64-bit textures lies.
 
Man of Honour
Joined
25 Oct 2002
Posts
31,707
Location
Hampshire
If we do get 64-bit 3D accelerators, it will be interesting to see what kind of performance they give (and the difference in visual quality, of course).

Thinking back to when Nvidia were pushing 32-bit colour (TNT), performance was pretty poor, so you were often better off running 16-bit anyway - it wasn't until the GeForce (or possibly the TNT2 Ultra) that 32-bit became feasible at high resolution in modern games.
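[Rough framebuffer arithmetic behind that 16-bit vs 32-bit trade-off - editor; illustrative numbers only:]

    #include <stdio.h>

    int main(void)
    {
        /* At 1024x768, doubling bytes per pixel doubles the bandwidth
           cost of every frame read and write. */
        const long px = 1024L * 768L;
        printf("16-bit frame: %ld KB\n", px * 2 / 1024); /* 1536 KB */
        printf("32-bit frame: %ld KB\n", px * 4 / 1024); /* 3072 KB */
        return 0;
    }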
 