So have PCs caught up?

Soldato
Joined
23 Nov 2004
Posts
8,024
Location
The Place To Be
PLEASE DO NOT turn this into a PC vs Console thread. I am purely wondering.

When the Xbox 360 was released, I think the 7800GTX was the best PC graphics solution, and, quite rightly, everyone said the 360's graphics chip was superior.

However, is this still the case with the X1900 XTX?
48 pixel shader processors is simply disgusting :D And from watching the X1900XT-rendered Crysis video, it looks like it sure packs a punch!

Thoughts?
 
Soldato
Joined
25 Oct 2005
Posts
13,779
Depending on your point of view, PCs have surpassed the Xbox 360 for some time. The Xbox 360 renders at 1280x720 and upscales for HD output, and that resolution is considered very low by today's standards in PC gaming.

I'm not being funny, but trying to get an Xbox 360 to play Oblivion on a Dell 30" LCD would be comedy if it couldn't upscale; it would grind to a complete halt. The only advantages it really has are its next-gen unified shader architecture, coming to a PC near you in 2006/2007, and the fact that it's got a uniform specification, whereas every PC is as unique as a person and the OS must cover all bases.
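To put some rough numbers on that (back-of-the-envelope sums of my own, assuming the 360 renders at its usual 1280x720 and the Dell 30" runs its native 2560x1600):

```python
# Back-of-the-envelope pixel counts: the 360's usual 1280x720
# render target vs. a Dell 30" LCD's native 2560x1600.
xbox_720p = 1280 * 720    # 921,600 pixels
dell_30in = 2560 * 1600   # 4,096,000 pixels

print(f"720p:      {xbox_720p:,} pixels")
print(f"2560x1600: {dell_30in:,} pixels")
print(f"ratio:     {dell_30in / xbox_720p:.1f}x the pixels to shade per frame")
```

That's roughly 4.4x the fill and shading work per frame, which is why running natively at that resolution isn't realistic for the console.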
 
Soldato
Joined
25 Oct 2005
Posts
13,779
Lynx24 said:
Can I ask where you saw the video of Crysis running on the X1900XT?
There was a Crysis interview with video released a short time ago; it was all done on current-gen hardware. I think all the Crysis videos were.
 
Soldato
Joined
29 Oct 2004
Posts
10,884
Exsomnis said:
Depending on your point of view, PCs have surpassed the Xbox 360 for some time. The Xbox 360 renders at 1280x720 and upscales for HD output, and that resolution is considered very low by today's standards in PC gaming.

I'm not being funny, but trying to get an Xbox 360 to play Oblivion on a Dell 30" LCD would be comedy if it couldn't upscale; it would grind to a complete halt. The only advantages it really has are its next-gen unified shader architecture, coming to a PC near you in 2006/2007, and the fact that it's got a uniform specification, whereas every PC is as unique as a person and the OS must cover all bases.

Good point; 1280x720 with 2xAA shouldn't be too challenging for any current high-end card.
 
Associate
Joined
8 Sep 2005
Posts
303
Location
Leixlip, Ireland
Yeah, the thinking around E3 last year was that the GPU going into the Xbox 360 wasn't as completely revolutionary as ATI kept claiming, and was just a beefed-up X1800 XTX with some new features. Given how quickly ATI got the X1900 out while much of their development team was tied up with the X1800's problems, it's reasonable to assume it was a spin-off of the Xbox 360 project. I don't think ATI have the resources to be working on three GPUs at once.

And I don't think the whole unified shader architecture does all that much; I just think it can help out in some odd, unbalanced situations where the current design doesn't do all that badly to begin with. The dedicated RAM for antialiasing is something unusual, though, but if it were so great you'd have thought ATI would be quick to stick it on their PC cards. I'm guessing that at 10MB it only works at relatively low resolutions (rough sums below). Also, I'd think the low clock speed of the GPU (thermal issues?) holds performance back a bit, given the speeds of other ATI GPUs.
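As a rough sanity check on that 10MB figure (my own estimate, assuming 4 bytes of colour plus 4 bytes of depth/stencil per sample; the actual buffer formats may differ):

```python
# Rough framebuffer sizes against the Xbox 360's 10MB of eDRAM,
# assuming 4 bytes of colour + 4 bytes of depth/stencil per sample
# (an assumption, not a confirmed spec).
BYTES_PER_SAMPLE = 4 + 4
EDRAM_MB = 10

def framebuffer_mb(width, height, aa_samples):
    return width * height * aa_samples * BYTES_PER_SAMPLE / (1024 * 1024)

for w, h, aa in [(1280, 720, 1), (1280, 720, 2), (1280, 720, 4), (1920, 1080, 1)]:
    mb = framebuffer_mb(w, h, aa)
    verdict = "fits" if mb <= EDRAM_MB else "too big for one pass"
    print(f"{w}x{h} at {aa}xAA: {mb:5.1f}MB -> {verdict}")
```

On those assumptions, 1280x720 with no AA (~7MB) squeezes in, but 2xAA (~14MB), 4xAA (~28MB) or anything above 720p already blows past 10MB, so the eDRAM really only covers low resolutions in a single pass.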

CPU-wise I don't know, but developers have said it isn't as good as an Athlon 64, so it has to be holding any top-notch graphics card back, especially at low resolutions.

I would think a dual-core Athlon/Opteron, 2GB of RAM and an X1900 XTX would wipe the floor with an Xbox 360. Just think, then, what a PC equipped with an FX-60, CrossFired X1900s (or 7900 GTXs) and 2GB of RAM, all overclocked, with some sort of RAID array could do. That would really put the Xbox 360 to shame. And with new things coming, like Conroe, DX10 cards, extremely fast DDR2 modules and hybrid hard drives, the gap will only widen.
 
Associate
Joined
8 Sep 2005
Posts
303
Location
Leixlip, Ireland
I wouldn't think Nvidia could work on three simultaneous GPU cores either...

I can kind of believe ATI was working on the X1800 and X1900 GPUs at the same time, but a third, completely different Xbox 360 GPU? That's pretty intense. More likely the Xbox 360 GPU is closely related to one of them.

Oh, and of course I'm ignoring laptop GPUs and mobile phone GPUs and all that, because I'm convinced they're 100% derived from desktop cores, or made by well-trained monkeys.
 
Associate
Joined
27 Dec 2005
Posts
201
Location
Edinburgh
Exsomnis said:
Thanks. :) Such a good interview; I love when he says "the multiplayer in Far Cry was... well... crap really." :D

Yeah, but I think some of the more important parts are his descriptions of how you can change settings on the fly, such as having 4 different settings for bullet effects, and how you can use strength/power in various scenarios.

This sounds far more advanced than Crytek's previous game (Far Cry) and any of the FPS games on the market just now.

The game is meant for DX10, but hopefully most of the advances will still be usable on current cards.
Roll on December :D
 