Overclocking the GTX 280 — anyone clocked it over 666, 1433 and 1200 and stable?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
LOL, what a load of crockery. The innards of the X2 aren't exactly comparable to the innards of the NV top-line card, are they? Foolish to even think they are.
That comment says more about you than anyone here, seeing as both sides are as bad as each other, and you become the very thing you're calling others by pointing out a particular side without any reasoned argument as to why.
By its very definition a fanboy cannot be reasoned with (as this thread illustrates beautifully). Put your GCSE psychology book away, lad.
So the innards have to be the same to provide a valid comparison? Let's just summarize this in easy-to-read language for the minority here who think they know better than professional reviewers and seem to lack some basic common sense:
1) The ATI 4870 X2 is ATI's fastest available enthusiast card in production, and is marketed as a single card despite its dual-GPU architecture.
2) The GTX 280 is NV's fastest available enthusiast card; they have nothing faster as yet.
ZOMG, reviewers are comparing the two fastest available cards from both manufacturers! That doesn't make sense... does it?
Jesus wept, what's wrong with some of you people? It's not rocket science. The only people I see having a problem with it are those with a bias towards NV. Anyone else with half a brain and some objectivity knows it's as valid a comparison as any. The future will be multi-core GPUs acting as one card... ATI just got there first. It really is that simple.
And CUDA....
Where's ATI's version of CUDA? /fail
Richdog telling it like it is
It's called Stream. Look into it, you might learn something.
At the moment, CUDA only benefits people who want PhysX and reasonably decent video encoding.
The point I was trying to make was that I hoped NVIDIA would try new alternatives rather than going for the same old solution to advance things, which I didn't think was that unreasonable, and I certainly didn't think it made me a fanboy. But then this is the OcUK gfx card forum, so I should have known better.
Wonder why this has the name "GTX285"? Up until now these gaffer-taped jobs have always been named GX2. So that's GTX260, GTX280 and now GTX285.
This isn't the 'gaffer-taped job'; that's the GTX 295 (still not GX2 for some reason). The 285 is just an overclocked 280 on a smaller process.
Funny thing is, I actually had 295 typed initially, then changed it to 285.
'Innovation' can only happen with new releases of DX. Any features not in DX will barely get any use (like the various forms of tessellation in the Radeons).
Sure, you can go down different paths with the hardware, but ultimately the end-user is isolated from this. Both SLI and CrossFire are a complete joke: little more than brute-force solutions requiring driver hacks, which result in the requisite bugs, issues and (sometimes / often?) poor user experience. The NV GX2 and ATI X2 series are, imho, hack solutions to brute-force FPS figures in benchmarks to sell cards and gain a halo product.
My X2 has been amazing. I don't need any driver hacks and it's far from a "poor user experience". The problems people seem to have stem from earlier driver releases, but I've never had one problem with my X2!
The X2 is a better attempt at dual GPUs than the GX2 IMO.