AMD's new offensive in the Dev Bribe Wars!

Soldato
OP
Joined
22 Aug 2008
Posts
8,338
I played Batman on 360 and haven't looked into it much myself, but if a simple hack is all that's needed to enable AA on ATI, and people at home report it works fine, why did the devs feel the need to "not block" ( :p ) it?

Surely they tested on ATI hardware and would've known how easy it would be to get it working. But I don't really have all the facts and technical business straight in my head so I guess I'm missing something.
 
Permabanned
Joined
8 Oct 2009
Posts
725
^^^
No, Rroff, because people would just disable AA and give Ati some earache.
Instead, Nvidia chose to get xxxxxx number of people up in arms for being anti-competitive.
 
Associate
Joined
6 Nov 2005
Posts
157
Which leads to the more important question of why ATI didn't include their own path... and it's equally possible that deals were done behind closed doors to prevent them, or that they were too lazy to bother...

Not this chestnut again. Of all the myriad reasons why ATi might have given some or maybe most developers (I don't know, neither do you) the cold shoulder, you always come out with "lazy", "can't be bothered".

Is it not possible they simply did not have enough money? Being hard up on cash affects businesses in many ways. But still you want to paint ATi as this emotional entity.
 
Permabanned
Joined
8 Oct 2009
Posts
725
If you're going to claim that nVidia purposefully locked out ATI cards to disadvantage them, then the subtle difference in that code means everything.

No, because locking out all competition and locking out Ati = the same thing, Rroff!!! Both are anti-competitive and Nvidia IS harming consumers!
 
Soldato
Joined
16 Jan 2003
Posts
10,565
Location
Nottingham
Why the game had no generic AA path - well, you should be blaming the developer/publisher for that... why ATI didn't provide their own path is a more interesting one... everything points to ATI being too lazy to do it, and that they would rather the nVidia one was enabled untested on their cards because it apparently appeared to work.

If they were going to prevent AA on non-Nvidia cards, did Nvidia really have to make it so other vendors' cards did all the work to apply AA without actually applying it? Meaning they got all the performance reduction from AA with no actual image quality increase, which hardly sounds like playing fair.
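To spell out what "doing the work without applying it" would mean in practice, here's a rough illustrative fragment - my own names and guesses, not anything taken from the actual game: the scene gets drawn into a multisampled surface, so the GPU pays the full MSAA cost, but the resolve that actually produces the smoothed image only happens when a vendor check passes.

```cpp
#include <d3d9.h>

// Purely illustrative sketch of the pattern described above - none of these
// names come from the game. The scene is rendered into a 4x MSAA surface
// (full multisample cost), but the resolve that produces the smoothed image
// is skipped unless the hypothetical vendor gate passes.
void RenderSceneWithOptionalResolve(IDirect3DDevice9* device,
                                    IDirect3DSurface9* backBuffer,
                                    UINT width, UINT height, bool isNvidia)
{
    IDirect3DSurface9* msaaRT = nullptr;
    device->CreateRenderTarget(width, height, D3DFMT_A8R8G8B8,
                               D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                               &msaaRT, nullptr);
    device->SetRenderTarget(0, msaaRT);
    // ... draw the whole scene into the multisampled surface here ...

    if (isNvidia)  // hypothetical vendor gate
        device->StretchRect(msaaRT, nullptr, backBuffer, nullptr, D3DTEXF_NONE);
    // If the resolve never runs, the back buffer keeps the aliased image:
    // you pay for AA without ever seeing it.

    msaaRT->Release();
}
```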
 
Soldato
Joined
6 Oct 2007
Posts
22,281
Location
North West
Of course they would have tested it on ATI hardware. FFS, the game wasn't developed just for consumers that have NV hardware. There is no defense, unless you worship Huang's hairy crack.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
I played Batman on 360 and haven't looked into it much myself, but if a simple hack is all that's needed to enable AA on ATI, and people at home report it works fine, why did the devs feel the need to "not block" ( :p ) it?

Surely they tested on ATI hardware and would've known how easy it would be to get it working. But I don't really have all the facts and technical business straight in my head so I guess I'm missing something.

Well, that's the thing. It DID work on ATI cards without the card ID rename trick... when the demo was released. But it was "not blocked" when the game was released.


I like the "not blocked" term, btw. I'm going to use that from now on :p


Oh, I have a question: has anybody looked at the code and confirmed how nVidia actually did it?
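I haven't seen the shipped code anywhere, but going purely by the card ID rename trick, my guess is it's nothing fancier than reading the adapter's vendor ID through D3D9 and only exposing the in-game AA option when it reports NVIDIA. Something like this - the function name is mine, and the real check could live anywhere in the engine or middleware:

```cpp
#include <d3d9.h>

// Speculative sketch only - not the game's actual code. The PCI vendor IDs
// are real though: 0x10DE is NVIDIA, 0x1002 is ATI/AMD. Spoofing/renaming the
// reported ID is exactly the kind of thing that would defeat a check like
// this, which fits what people reported.
bool AdapterReportsNvidia(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 ident = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
        return false;
    return ident.VendorId == 0x10DE;  // anything else: AA option greyed out
}
```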
 
Man of Honour
Joined
13 Oct 2006
Posts
91,047
^^^
No, Rroff, because people would just disable AA and give Ati some earache.
Instead, Nvidia chose to get xxxxxx number of people up in arms for being anti-competitive.

That would have just made things even worse.

If they were going to prevent AA on non-Nvidia cards, did Nvidia really have to make it so other vendors' cards did all the work to apply AA without actually applying it? Meaning they got all the performance reduction from AA with no actual image quality increase, which hardly sounds like playing fair.

There are several different conclusions you can draw from this... one being that the programmer thought ATI would include a solution before it shipped, another potentially being that nVidia were trying to slow down performance on ATI cards by making them do unnecessary work (though for a number of technical reasons that seems unlikely).

Of course they would have tested it on ATI hardware. FFS, the game wasn't developed just for consumers that have NV hardware. There is no defense, unless you worship Huang's hairy crack.

No need to be rude. AA and deferred shaders are notoriously tricky to get working together. It would seem the developer lacked the resources, so they wouldn't have had any AA path for any hardware at all if nVidia hadn't stepped in. Testing it would have required fairly detailed knowledge of how the ATI hardware handled it, which is beyond the scope of your average middleware developer.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
Re: News - AMD exec says NVIDIA neglecting gamers
The way I see it is that MSAA (As Huddy clearly states in his e-mail subject line) in DX10 mode should work because it's part of the damn specs of the unreal engine 3.5 (aka UE3 DX10) AFAIK and ARE FOR SURE PART OF DX10 SPEC. UE3 DX9 does not support AA with deferred shadows (DPR+AA...Or essentially deferred pixel/shadow HDR+AA) because of dx9 limitations (no DPR+AA in the API), but DX10 does, and UE3.5 supports DX10. Nvidia can tout their DX9 workaround that ATi doesn't have, but could support in hardware (since the X1000 series, nvidia 8000 series), that's fine and lovely. Good for them. IN DX9 mode.

Remember the x1000 series pushing HDR+AA that nVIDIA couldn't do in the 7000 series? Yeah, that's this essentially, as the deferred shadows in UE3 (that the DX9 API can't support simultaneously with AA) are HDR. Hence why this only works on 8000 series and up nvidia hardware. Huddy knows this. ATi should be able to push AA through catalyst, just like nvidia does with their essentially built-in force work-around they've used in the past (that Huddy also mentions). Granted, if to be supported in-game, ATi should provide or help write that code...For DX9.

This game runs in DX10 mode. ATi supports DX10. DX10 will do DPR+AA. ATi should be supported. End of conversation.

Like Huddy says in his e-mail, they are likely using the DX10 codepath for AA...Why are they locked out? Because nvidia enabled AA in DX9 mode through forcing hax and built it into the game GUI? This should not transfer over to DX10. That's just BS on all fronts, any way you cut it.

This is clearly what can only be described as TWIMTBP douchebaggery of EPIC proportions, or some very strange misunderstanding on EIDOS' and Rocksteady's part. You'd think devs would know better?

Huddy is confused as to what nVIDIA did, and so am I. I am willing to bet nVIDIA did jack **** to enable AA in DX10 mode, because DX10 supports deferred rendering + AA in the ****** SPEC API, where-as dx9 (and nvidia dx9 cards) does not. Nvidia did not create the dx10 spec, nor the engine's ability to support the dx10 API. Even further, why should they care if someone forces it through cat in dx9 mode? Obviously it wouldn't be supported, but to lock it out completely in both modes can only be described as a dev super fail.

Got it Eidos/Rocksteady? DX10 yes. DX9 if forced through catalyst or ATi helps with code for in-game (that could be supported back to X1000 series..although would prolly run very badly).

Posted on a forum I can't mention the name of here. Interesting...
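The technical point the quoted poster is leaning on is that D3D10, unlike D3D9, lets a multisampled render target also be bound as a shader resource, so a deferred renderer can read individual samples in the lighting pass and keep proper MSAA. Rough sketch of that capability below - the names and format are just illustrative, nothing here is from the game or UE3:

```cpp
#include <d3d10.h>

// Illustrative only: a 4x MSAA G-buffer plane that can also be read back in
// the lighting pass. D3D9 has no equivalent - you would have to resolve the
// surface first, throwing away the per-sample data a deferred renderer needs.
ID3D10Texture2D* CreateMsaaGBufferPlane(ID3D10Device* device,
                                        UINT width, UINT height)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT;  // HDR-friendly, assumed format
    desc.SampleDesc.Count = 4;                     // 4x MSAA
    desc.Usage = D3D10_USAGE_DEFAULT;
    desc.BindFlags = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* tex = nullptr;
    device->CreateTexture2D(&desc, nullptr, &tex);
    return tex;  // the lighting shader can then Load() each sample individually
}
```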

No need to be rude. AA and deferred shaders are notoriously tricky to get working together. It would seem the developer lacked the resources, so they wouldn't have had any AA path for any hardware at all if nVidia hadn't stepped in. Testing it would have required fairly detailed knowledge of how the ATI hardware handled it, which is beyond the scope of your average middleware developer.

First up, you seemed fine with insulting my intelligence, and second, you just, literally just, said the path is actually very close to a generic path, so it can't be notoriously difficult, can it? Because it wouldn't work otherwise...

I think that's an internetz checkmate.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,047
The licensed version of the UE3 engine as used in Batman AA has no AA path or edge filtering shader path in any version of DX - Epic's internal dev builds, yes; 3rd-party licensees, no*.



*Although some "preferred" studios appear to have access to it.
 