Anti-Aliasing on DX10 ATI cards



J Paul
4th Sep 2009, 17:23
Why is Anti-Aliasing restricted to only Nvidia GPUs?

Having played and tweaked the PC demo, I know that it is possible to use AA on ATI graphics cards under DirectX 10, either by forcing it via the CCC or by changing the VendorID of your video card. Absolutely no problems result from doing so, so it's obviously not disabled because of bugs that would impact the gameplay experience. I'm starting to believe that it is disabled due to immense and incomprehensible douchebaggery on the part of Eidos, and possibly Nvidia as well.
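To make that concrete, here is a minimal, hypothetical sketch of how the VendorID trick works: a proxy layer sits between the game and the real Direct3D runtime and rewrites the vendor the adapter reports. The function name is illustrative, not code from the game or from any actual tool:

#include <d3d9.h>

// Hypothetical sketch of the vendor-ID spoof used by wrapper tools:
// forward the adapter query to the real IDirect3D9 object, then rewrite
// the reported vendor before the game ever sees it.
HRESULT SpoofedGetAdapterIdentifier(IDirect3D9 *real, UINT adapter,
                                    D3DADAPTER_IDENTIFIER9 *out)
{
    HRESULT hr = real->GetAdapterIdentifier(adapter, 0, out);
    if (SUCCEEDED(hr) && out->VendorId == 0x1002)   // 0x1002 = ATI/AMD PCI vendor ID
        out->VendorId = 0x10DE;                     // 0x10DE = Nvidia, so vendor-gated features unlock
    return hr;
}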

This isn't like PhysX. Personally, I don't care a bit about PhysX, but I completely understand why it won't work on ATI cards; it's an Nvidia technology and is therefore only supported on capable Nvidia GPUs. However, Anti-Aliasing is not an Nvidia-specific technology, and works just fine on ATI cards after bypassing certain restrictions.

Why would you impose arbitrary restrictions upon customers who use an ATI GPU for no reason other than because they use an ATI GPU rather than an Nvidia GPU? What's the point in it? What could you possibly be looking to gain in doing so?

I absolutely loved the demo, and I've played a little bit of the full game on a friend's 360, and that was even better. This looks like it will be a terrific game, but sadly, I refuse to purchase it until this incredibly stupid, arbitrary restriction is removed. On principle, I do not support developers who do things like this to their customers. I understand that I could simply force it in the same way that I already have, but that's not the actual issue here. It's not really about the lack of an almost unnoticeable improvement in image quality. The issue here is treating paying customers with the respect they deserve.

***Since the full game isn't yet released, this entire thread may be null and void, as I do not know whether this restriction exists in the full game. Come release day, it may very well turn out that the restriction is specific to the demo. If that is the case, then I will gladly purchase the game and support Eidos for making such an excellent title. If it is not, then I humbly suggest that Eidos stop defecating on their customers (metaphorically speaking, of course).

Evilbaby
7th Sep 2009, 17:42
I agree with you, even as an Nvidia user. In fact, you don't even need DX10. People have successfully forced AA on Windows XP as well. Another Unreal Engine game, Mirror's Edge, does AA fine on ATI cards in DX9.

Anyone who has messed around with 3D Analyze to change the vendor ID will know that Eidos would have no problem supporting AA on ATI cards if they wanted to. By intentionally forcing something as basic as anti-aliasing (AA) off on ATI hardware, they are really making themselves look stupid.

PhysX? Don't make me laugh. Games like Crysis, Fallout 3 and DiRT have more impressive physics done on the CPU, and they don't run at 10 fps.

And no, this game is a no-buy for me. The gameplay is too dumbed down: go there, press this button, jump there, do that. Spoon-fed and linear. Not surprising that these are the people who suddenly decided we all need an Nvidia card to enable AA.

The Coca Cola Company
7th Sep 2009, 23:16
This is Rocksteady's answer:


The form of anti-aliasing implemented in Batman: Arkham Asylum uses features specific to NVIDIA cards. However, we are aware some users have managed to hack AA into the demo on ATI cards. We are speaking with ATI/AMD now to make sure it’s easier to enable some form of AA in the final game.


While it can be used with ATI cards, it hasn't been tested and it's not officially supported. Now that ATI have been contacted, it's up to them to provide a 100% working way of implementing an AA filter in the game for their cards.

Evilbaby
8th Sep 2009, 01:00
While it can be used with ATI cards, it hasn't been tested and it's not officially supported. Now that ATI have been contacted, it's up to them to provide a 100% working way of implementing an AA filter in the game for their cards.

Have you read my post? Rocksteady is intentionally locking out AA on ATI cards by using a vendorID check. That reply by RS is a lie to put the blame on ATI. There is nothing ATI can do to make AA work short of changing their cards' ID to Nvidia's. AA in Mirror's Edge worked on day one without problems.

The Coca Cola Company
8th Sep 2009, 06:31
Rocksteady is intentionally locking out AA on ATI cards by using a vendorID check.

Yes, since the method they have implemented is built to work on nvidia cards, the easiest way to check whether the system supports the method is by checking the vendorID. Unless you know of any other way to check which graphics card is being used, I see no issue with the way they check it.
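For what it's worth, a vendorID gate is about as simple as capability checks get. A minimal, hypothetical sketch of what such a gate could look like on the game's side (the function name is illustrative; this is not Rocksteady's actual code):

#include <d3d9.h>

// Hypothetical sketch: enable the AA option only if the GPU reports
// Nvidia's PCI vendor ID.
bool IsVendorAASupported(IDirect3D9 *d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;

    // 0x10DE is Nvidia's PCI vendor ID. Anything else fails the check,
    // which is also why spoofing the ID is enough to unlock the option.
    return id.VendorId == 0x10DE;
}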


AA on mirrors edge works on day one without problem.

That means nothing. Epic (creators of the Unreal Engine) DO NOT SUPPORT AA in dx9 due to the deferred shading technique their engine uses. In fact, both of their games on the PC that use the engine (UT III & Gears of War) have no AA whatsoever in dx9 mode. It's up to Rocksteady to think of a way to implement AA filtering, and Rocksteady decided to use a method that works with nvidia cards, while DICE, for Mirror's Edge, used a method that works with all cards. If you have noticed, Rocksteady's implementation doesn't work on cards older than the Geforce 8xxx series either, because it uses specific features of the latest generations of Geforce. It's up to Ati now to help them implement a solution that works flawlessly with their hardware.
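The reason dx10 matters here: Direct3D 10 lets you create a multisampled render target that can also be bound as a shader resource, so a deferred renderer can read the individual samples during its lighting pass (Texture2DMS.Load in HLSL). dx9 has no equivalent bind combination. A minimal, hypothetical C++ sketch of that setup (illustrative names, not UE3 or Rocksteady code):

#include <d3d10.h>

// Hypothetical sketch: create a 4x-multisampled G-buffer texture that the
// lighting pass can read per-sample. D3D9 offers no such combination,
// which is why UE3 offers MSAA only on DX10.
HRESULT CreateMsaaGBufferTarget(ID3D10Device *dev, UINT width, UINT height,
                                ID3D10Texture2D **tex)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;                 // MSAA textures have a single mip level
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 4;          // 4x MSAA
    desc.Usage = D3D10_USAGE_DEFAULT;
    desc.BindFlags = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;
    return dev->CreateTexture2D(&desc, NULL, tex);
}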


------------------------------
Just to be sure, can you provide any screenshots where you have changed the vendorID to nvidia and AA works?

Please make sure you don't rename the exe or force-enable AA in the Catalyst control panel. The only option you should use is in the launcher.

Evilbaby
8th Sep 2009, 11:34
Yes, since the method they have implemented is built to work on nvidia cards, the easiest way to check whether the system supports the method is by checking the vendorID. Unless you know of any other way to check which graphics card is being used, I see no issue with the way they check it.

This argument doesn't hold water. AA works without any problem on ATI hardware by spoofing the vendor ID. So there is no reason to disable it except to promote Nvidia hardware. Why else would there be "Nvidia 8000+ required" in the video options? Did you think the developers who spent millions of dollars and years developing this game had no access to testing hardware from ATI?



That means nothing. Epic (creators of the Unreal Engine) DO NOT SUPPORT AA in dx9 due to the deferred shading technique their engine uses. In fact, both of their games on the PC that use the engine (UT III & Gears of War) have no AA whatsoever in dx9 mode. It's up to Rocksteady to think of a way to implement AA filtering, and Rocksteady decided to use a method that works with nvidia cards, while DICE, for Mirror's Edge, used a method that works with all cards. If you have noticed, Rocksteady's implementation doesn't work on cards older than the Geforce 8xxx series either, because it uses specific features of the latest generations of Geforce. It's up to Ati now to help them implement a solution that works flawlessly with their hardware.

That means something, since Mirror's Edge uses the UE3 engine as well. And how is it up to ATI to implement a solution when ATI's drivers and hardware already support AA, with no driver tweaks needed? It is only up to Eidos/RS to enable it (remove the vendor ID check). See this reply from dmwhitedragon: http://www.tomshardware.com/forum/269812-33-when-nvidia



------------------------------
Just to be sure, can you provide any screenshots where you have changed the vendorID to nvidia and AA works?

Please make sure you don't rename the exe or force-enable AA in the Catalyst control panel. The only option you should use is in the launcher.

I can't provide any screenshots, and I can't be bothered to, since I am using Nvidia hardware and AA works just fine on my side. The OP can do this. But this has been discussed and proven on many forums before. No exe or CCC tweaks needed; just fool the software into thinking an Nvidia card is present. I'll dig up a link when I have time.

Evilbaby
8th Sep 2009, 11:44
To Eidos/ Rocksteady:

Thank you for making me feel good about my Nvidia purchase. But the problem is that ATI cards still get nice-looking graphics and smooth gameplay. So do a proper job next time and disable these on ATI hardware as well:

1) Shadows
2) High polygon characters
3) Anisotropic Filtering
4) Resolutions above 800x600
5) Bump maps and specular surfaces

The Coca Cola Company
8th Sep 2009, 16:40
This argument doesn't hold water. AA works without any problem on ATI hardware by spoofing the vendor ID. So there is no reason to disable it except to promote Nvidia hardware. Why else would there be "Nvidia 8000+ required" in the video options? Did you think the developers who spent millions of dollars and years developing this game had no access to testing hardware from ATI?

Why are you asking me why it is written?? Obviously because it works only with nvidia 8000+ cards!! Would you prefer to find out by yourself that AA doesn't work on Ati?

Of course they had access to ATI hardware; however, they chose to use an nvidia implementation of AA filtering. AA on engines using deferred shading doesn't work out of the box, and the company has to find its own way IF THEY WANT TO! Since nvidia provides its knowledge and support to all TWIMTBP partners, it was easy to find a way to make it work for nvidia hardware without wasting time and money on a feature that most games using deferred shading simply ignore. I'll name a few: Dead Space, S.T.A.L.K.E.R., Ghostbusters, Bionic Commando.

I'll use your words: "Did you think the developers who spent millions of dollars and years developing this game had no access to testing hardware from" nvidia and ati? :rolleyes: As you can see the implementation of AA on games that use such a technique IS NOT TRIVIAL.

My point: They didn't disable functionality for Ati, they ENABLED functionality for nvidia.





That means something, since Mirror's Edge uses the UE3 engine as well.

No. Unreal Engine 3, as given by Epic to licensees, DOES NOT SUPPORT AA on any card in dx9. The licensee has to find a way to implement AA for the graphics card models he's interested in. DICE has implemented AA with a solution that works on both cards. Rocksteady didn't. And you are forgetting one thing: Mirror's Edge uses a highly customized version of UE3 that has replaced the lighting system of Unreal Engine with that of Beast. http://www.illuminatelabs.com/products/beast

As you can see, the engines the two games use are very different.

To further prove my point, here's a quote by Keita Iida, Director of Content Management at nvidia:

http://www.nzone.com/object/nzone_darkvoid_interview.html


PhysX actually is integrated in a bunch of major game engines. So we’re in the Unreal engine; we’re in Gamebryo; we’re in Vision; we’re in Instincts; we’re in Trinigy; we’re in Diesel; we’re in Hero; we’re also in Big World. So pretty much every major game engine in the world has PhysX integrated into it.

And also keep in mind that a lot of the key games, just like MT framework from Capcom, a lot of games of major game developers, they use their own internally developed engines. And so with Mirror’s Edge from Dice that’s also their own proprietary engine that has a PhysX implementation.

See what he said there? Although he mentioned that PhysX is in Unreal Engine 3, he later says that it's also in Mirror's Edge, which uses a PROPRIETARY engine. Do you understand now that Mirror's Edge doesn't use the "standard" Unreal Engine 3 version that Batman: AA does?


And how is it up to ATI to implement a solution when ATI's drivers and hardware already support AA, with no driver tweaks needed?

Nah-ah. Wolfenstein (another game that uses a deferred shading technique) doesn't support AA at all. Nvidia's drivers don't support it either. But if you force the drivers to use a known workaround for enabling AA in S.T.A.L.K.E.R., the edges get antialiased.
Link: http://forums.guru3d.com/showpost.php?p=3249453&postcount=273

It's not perfect, the performance drop is higher than usual and stability issues might arise since you are not meant to do this.

On Ati, on the other hand, no matter what you do (change the vendorid, rename the exe, force it through Catalyst), it simply doesn't work at all.
Link: http://forums.guru3d.com/showpost.php?p=3260097&postcount=397

That means that nvidia's drivers must be doing something differently, since the effect (partly) works only on nvidia cards, right? Even if you force the Catalyst drivers to use S.T.A.L.K.E.R.'s profile, which uses a workaround ATI found to enable AA in that game, it simply doesn't work. Isn't it obvious that nvidia uses a different (more universal) method than ati?



It is only up to Eidos/RS to enable it. (remove vendor ID check) See this thread reply from dmwhitedaragon: http://www.tomshardware.com/forum/269812-33-when-nvidia

What he says:


DirectX 9 supports AA directly... where did you get the idea it didn't?

[...]

If the Unreal engine doesnt nativley support this then its an oversight... however i suspect you are quite off base and that it actually DOES and its more like the developers generally dont add an option to set it in their video options screen. (as all interfaces are done by the end-dev obviously)

Now let me prove that he has no idea what he's talking about and he's only spreading misinformation.

http://en.wikipedia.org/wiki/Deferred_shading


Another rather important disadvantage is, that due to separating the lighting stage from the geometric stage, hardware anti alias does not produce correct results any more: although the first pass used when rendering the basic properties (diffuse, normal etc.) can use anti alias, it's not until full lighting has been applied that anti alias is needed.

Here's what Tim Sweeney, the engineer of Unreal Engine 3, says about the engine that HE, HIMSELF, HAS CODED:

PCGH: You are using deferred shading. Will there be any chance to get FSAA working with DX9-Level-cards? What about DX10-cards?

Tim Sweeney: Unreal Engine 3 uses deferred shading to accelerate dynamic lighting and shadowing. Integrating this feature with multisampling requires lower-level control over FSAA than the DirectX9 API provides. In Gears of War on Xbox 360, we implemented multisampling using the platform's extended MSAA features. On PC, the solution is to support multisampling only on DirectX 10.

PCGH: Are there any plans at Epic to upgrade the engine for DX 10? Have you already made experience with Microsoft's new API?

Tim Sweeney: Yes, we'll ship Unreal Tournament 3 with full DirectX 10 support. Support for multisampling is the most visible benefit.

Do you see how dmwhitedragon doesn't know what he's talking about but will insist he somehow found an easy way to implement AA on engines using deferred shading techniques, something that people with much more knowledge than him still struggle with?
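To spell out what that Wikipedia passage and Sweeney's answer mean in practice, here is a short, hypothetical D3D10 fragment (illustrative names, not UE3 code) showing why the ordinary hardware AA path breaks under deferred shading:

#include <d3d10.h>

// Hypothetical illustration: letting the hardware resolve (average) a
// multisampled G-buffer BEFORE lighting averages normals and depths across
// silhouettes, and lighting those blended attributes gives wrong colors.
// The resolve must happen AFTER lighting, which needs per-sample reads (DX10).
void NaiveDeferredAA_Broken(ID3D10Device *dev,
                            ID3D10Texture2D *msaaGBuffer,
                            ID3D10Texture2D *resolvedGBuffer)
{
    // WRONG ORDER for deferred shading: this blends, e.g., a wall normal
    // with a sky normal along every silhouette edge.
    dev->ResolveSubresource(resolvedGBuffer, 0, msaaGBuffer, 0,
                            DXGI_FORMAT_R8G8B8A8_UNORM);

    // ...a lighting pass reading resolvedGBuffer would then shade corrupted
    // attribute values, so edges come out wrong instead of antialiased.
}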



I can't provide any screenshots and I can't be motivated to since I am using Nvidia hardware and AA works just fine on my side. OP can do this. But this has been discussed and proven on many forums before. No exe or CCC tweaks needed. Just fool software that Nvidia card is present. Will dig up a link when I have time.

It has already been debunked in at least one case that AA works correctly on Ati. As you can see, it does not work as it should when forced, and it might even break the game:

http://www.yougamers.com/forum/showpost.php?p=1171117&postcount=214

If you haven't properly researched it because you lack the resources, please stop presenting your opinion as a fact.

jaywalker2309
8th Sep 2009, 16:52
The whole nVidia vs AMD (ATI) debate is a lot like the Sony vs Microsoft debates. There will always be a VS somewhere in the gaming industry. Why should consoles be the only ones getting hardware-specific `features`? ;)

The `official` full game should hopefully address the ATI issue of not being able to get AA enabled, etc. I believe a driver update from ATI should provide the profile needed.

Inzane
8th Sep 2009, 17:29
I am just getting tired of all the garbage Nvidia has been pulling for the past few years. I have been to E3, watched the hardware demos and talked with the reps. A few years back DirectX 9.0c had a killer demo. It blew me away. There are still shaders and features hidden in the API that are not being used. Gaming should be far better than it is, and for years I have been wondering why it's been held back. So have others in the industry. Then a light dawned and it became clear who and why. Nvidia attempted to skip past DX10; well, that only lasted so long, and with the final 10.1 they refused to support it. And now, with the launch of DX11, Nvidia has no plans of releasing cards on launch day. ATI, on the other hand, is ready to go. The DX11 API is going to end the reign of terror Nvidia has over the gaming realm. ATI makes the standards for DX. THAT'S right, folks: ATI. Nvidia is the fat kid who can't run a race but tries to fool you with mind games into thinking they did indeed finish the race. This will become apparent in the years to come.

Closed formats are bad for the progress of ANY industry. In the end they hurt consumers, and then the company itself. Think of where our world would be if there were nothing but closed formats: for one, no telephones, cars, or my favorite, skyscrapers :D Guys, take a good long look at this and for once think for yourselves.

The Coca Cola Company
8th Sep 2009, 18:16
Unfortunately you need to do better research.


A few years back DirectX 9.0c had a killer demo. It blew me away. There are still shaders and features hidden in the API that are not being used. Gaming should be far better than it is.

nvidia supported dx9.0c long before ati (see below).


Then a light dawned and it became clear who and why. Nvidia attempted to skip past DX10; well, that only lasted so long.

No. Just no. nvidia supported dx10 long before ati (see below)

Also, the first dx9 game (Far Cry) and the first dx10 game (Lost Planet) were both made by partners of nvidia (Crytek and Capcom) in the TWIMTBP program. nvidia actually helped those developers create games that were technologically advanced and used the power of cutting-edge graphics cards.


and with the final 10.1 they refused to support it.

dx10.1 is a superset of dx10 which just makes sure that some features that a dx10 card may or may not support are there and fully functional. The thing is that since all dx10 graphics cards are very programmable, these features can be used even if the card isn't dx10.1 certified.

And in the case of nvidia, through the use of nvapi such features are available on dx10 cards and work (depending on the game) just about as well as on a native dx10.1 card, without compromising quality or speed. (That was impossible in the past, before dx10 hardware became available.) But I won't disagree with you that a native dx10.1 card is preferable.


And now, with the launch of DX11, Nvidia has no plans of releasing cards on launch day.

It's almost always ati who is late in adopting the newest dx api.

dx7: Geforce 256 Series: October 1999 - Radeon R100 Series: 2000
dx8: Geforce 3 Series: February 2001 - Radeon R200 Series: Late 2001
dx9: Geforce FX Series: January 2003 - Radeon R300 Series: August 2003
dx9.0c: Geforce 6 Series: April 2004 - Radeon R520 Series: October 2005
dx10: Geforce 8 Series: November 2006 - Radeon R600 Series: 2007

(All info comes from Wikipedia)

If anything, ati is the one who was holding back development all those years! (Though note that I do not actually believe that; I'm just showing you how wrong you are.)

And there's no dx11 game out there yet!


Closed formats are bad for the progress of ANY industry.

I agree, they should be avoided if an alternative exists.

Inzane
8th Sep 2009, 19:22
"The Coca Cola Company" sounds to me your a fanboi. I like to see your sources other than something fabricated up by Nvidia. And living it and seeing it first hand in the industry is a whole other ball game.Nvidia has lied countless times over and over again. they have streached the truth about their benchmarks,marketing, support products. this is all well documented. Nvidia can talk a good game and they are great at cheating and hiding their short comings. Nvidia is headed in the same direction as 3dfx. the same way nvidia bought out 3DFX is the same way its going to happen to them.while under contract Devs don't dare say anything bad about their products. Only when out of contract the truth comes out. I have had many contacts get screwed by Nvidia in the end. you really need to get your facts straight. ATI has always been Direct X launch partners. let me make this clearer for you Microsoft,Intel and ATI have been moving against Nvidia for years.Microsoft gave them a chance once to use their api when they made the chipsets for the XBox (original name was Direct X BOX). Nvidia had plans of their own and refused to use Parts of the API, also there were other issues. When Microsoft decide to build a new next gen system. they brought a trusted launch partner ATI in on it.
And finally Wikipedia
is not a reliable source. Why don't you find the white papers.Press releases. Interviews. I can hop into Wikipedia and change those dates myself. I blindly supported Nvidia for years. Until I started seeing it myself and many industry friends where unemployed because of their crappy ways. In the past 2 years almost all Nvidia launch partners went under Or came very close to it. Look this up yourself. I will give you one, Grin was so happy to please Nvidia, they focus on the physx part of the game, that never ever got released btw. Just keep reading the crap Nvidia posts on the web for its fanboi Army. as I said they have become great at streaching the truth, or making a mountain over a ant hill.

Ohh, and BTW, are you one of these? clicky (http://consumerist.com/152874/did-nvidia-hire-online-actors-to-promote-their-products) Very low, very low.

The Coca Cola Company
8th Sep 2009, 20:08
"The Coca Cola Company" sounds to me your a fanboi.

Good thinking. When a person disagrees with you and bothers to write a long response explaining why he believes you are wrong, then you should call him a fanboy. :thumb:

Please do not use such characterizations again, because you are violating Rules #5 & #6 of this forum.

As I haven't called you names but instead took the time to write a long post to debunk your statements, I expect you to do the same.


I'd like to see your sources, other than something fabricated by Nvidia.

Please read my posts more carefully. I haven't used as an (exclusive) source nvidia so far.

Also note that continuously calling nvidia a company that does not respect its customers, while recommending to stay as far away from them as possible, might be violating Rule #10 of the forum. http://forums.eidosgames.com/announcement.php?f=270&a=29

Please provide good reasoning when making such claims.


ATI has always been a DirectX launch partner. Let me make this clearer for you: Microsoft, Intel and ATI have been moving against Nvidia for years. Microsoft gave Nvidia a chance once to use their API when they made the chipsets for the Xbox (the original name was DirectX Box). Nvidia had plans of their own and refused to use parts of the API; there were also other issues. When Microsoft decided to build a new next-gen system, they brought a trusted launch partner, ATI, in on it.

I cannot see how this "inside" knowledge of yours has anything to do with Batman: AA, or with the fact that the first dx9 & dx10 games were sponsored by nvidia, or that nvidia was the one that launched the first dx7, 8, 9, 9.0c & 10 cards. These are facts; they are not information that only you are privy to and that we are supposed to believe just because you say so. BTW, intel, Microsoft & the Xbox have nothing to do with the discussion we are having.


And finally, Wikipedia is not a reliable source. Why don't you find the white papers, press releases and interviews?

I'm sorry, but you are the one who has to do that if you think my source is wrong. I won't do the work for you. :rolleyes: BTW, I know that all the release dates I posted are correct, because I've been closely following the PC gaming scene for many years, and the dates on Wikipedia are the ones I myself remember. If you can't remember them yourself, please research it and post back if you find a source that states differently. Do not ask me to do this for you.



In the past 2 years almost all Nvidia launch partners went under or came very close to it.

Which means?



I will give you one: Grin was so happy to please Nvidia that they focused on the PhysX part of the game, which never even got released, BTW.

Grin was a company that released 3 (mediocre) titles in only a year. You shouldn't be surprised that things didn't work out as they thought they would.

Oh, and it seems to me that the guys from Grin want to thank nvidia for their co-operation all those years: http://www.grin.se/

We would like to thank a few special companies and individuals that have meant a lot for us during these years:

Phil Scott & Phil Wright and crew at Nvidia

BTW, I don't see how it's nvidia's fault if Grin was so happy to please them. Logically, it can only be Grin's fault.


Just keep reading the crap Nvidia posts on the web for its fanboi army.

You know that using such statements will make it even harder for someone to listen to you, right? :scratch:

Inzane
8th Sep 2009, 22:21
When you spew nothing more than Nvidia rhetoric, how can any of us take you seriously? There once was a day when you could buy a game and it would run on any gfx card; those days are coming to an end. I don't like the viral marketing that Nvidia attempts. If you build something good, it should speak for itself. Cheating drivers and hiring actors, dodging questions from the press about underhanded activities: these things are not going to win me over. I made a choice to no longer support their company, and so have millions of other people. It's a free world, right? So instead of fixing the problem and embracing new tech (what Nvidia uses is old tech, 10 years old to be exact, and some of it from 3dfx), they waste time and resources waging an online war. Go figure.

The lies Nvidia has been caught in:
- Millions of HP notebook users got burned; Nvidia blames HP, when a flawed GPU was at fault
- CineFX = fail
- Renaming old cards to exploit users into buying new hardware
- Nvidia "lied" about Shader Model 3.0 when its hardware wasn't SM3.0 compliant
- Nvidia disables PhysX if a card other than an Nvidia card is present in the system, screwing users who run ATI for rendering and Nvidia for PhysX

That's a short list of their epic fails.
MORE LIES (http://www.theinquirer.net/inquirer/news/1137261/nvidia-spin-borders-truth)
More lying (http://vr-zone.com/forums/396330/nvidia-cuts-out-reviewers-who-speak-the-truth.html)

And if you're gonna pull the rules of the board out, you're in violation yourself. I am just defending those that you have slammed in your own posts, FRIEND. And to be exact, I in no way, shape or form told anyone not to buy Nvidia products. How could I? I am using one right now. I no longer support their methods.

J Paul
8th Sep 2009, 22:50
This thread isn't about Nvidia vs. ATI. Brand loyalty is the stupidest thing in which anyone could ever indulge, and I wholeheartedly suggest that anyone who cares about things as trivial as the name on the sticker should reconsider their concept of value. Purchase the product that does what you require of it at the price point you're comfortable with paying.

This thread is about the fact that AA does indeed work perfectly on ATI cards, yet the game doesn't normally allow one to enable it, due to arbitrary and nonsensical restrictions. It has been proven to work properly time and time again, and you can test it yourself. Hell, even transparency AA works.

I don't care if you have a stick up your ass over one brand or another because it isn't relevant to this thread.

Inzane
8th Sep 2009, 23:00
This thread isn't about Nvidia vs. ATI. Brand loyalty is the stupidest thing in which anyone could ever indulge, and I wholeheartedly suggest that anyone who cares about things as trivial as the name on the sticker should reconsider their concept of value. Purchase the product that does what you require of it at the price point you're comfortable with paying.

This thread is about the fact that AA does indeed work perfectly on ATI cards, yet the game doesn't normally allow one to enable it, due to arbitrary and nonsensical restrictions. It has been proven to work properly time and time again, and you can test it yourself. Hell, even transparency AA works.

I don't care if you have a stick up your ass over one brand or another because it isn't relevant to this thread.

Thank you for putting this back on topic. And since I do use an ATI card alongside my Nvidia one, this will come in handy. Thank you for finding this out for us. :D

The Coca Cola Company
8th Sep 2009, 23:45
This thread isn't about Nvidia vs. ATI. Brand loyalty is the stupidest thing in which anyone could ever indulge, and I wholeheartedly suggest that anyone who cares about things as trivial as the name on the sticker should reconsider their concept of value. Purchase the product that does what you require of it at the price point you're comfortable with paying.

I couldn't agree any more.


This thread is about the fact that AA does indeed work perfectly on ATI cards, yet the game doesn't normally allow one to do so due to arbitrary and nonsensical restrictions. It has been proven to work properly time and time again and you can test it yourself. Hell, even transparency AA works.

http://www.yougamers.com/forum/showpost.php?p=1171117&postcount=214

http://www.yougamers.com/forum/showpost.php?p=1171154&postcount=216

To me it seems it doesn't work properly, which means that there is work that has to be done and it's not the vendorid check that is restricting the proper use of AA.

If you can take screenshots which are as antialiased as on nvidia's hardware perhaps I will begin to think differently.

So far, the above thread is the only one I've found on the internet where users bothered to actually test how well it works, and it seems it doesn't work properly.

Inzane
9th Sep 2009, 00:30
Dude, leave him alone; stop trolling already. He is right, you are wrong, face it. There are a lot of things this card can do, like run PhysX with modded drivers. So he found something useful; I tested it myself and it works well. You just keep beating a dead horse.
http://www.masonicinfo.com/images/BeatDeadHorse.gif

The Coca Cola Company
9th Sep 2009, 01:09
I'm not trolling. I've found something useful too, as you'll see if you take the time to click on the links I've posted.

If you don't care, please don't post. :(

J Paul
9th Sep 2009, 02:59
I couldn't agree any more.



http://www.yougamers.com/forum/showpost.php?p=1171117&postcount=214

http://www.yougamers.com/forum/showpost.php?p=1171154&postcount=216

To me it seems it doesn't work properly, which means that there is work that has to be done and it's not the vendorid check that is restricting the proper use of AA.

If you can take screenshots which are as antialiased as on nvidia's hardware perhaps I will begin to think differently.

So far, the above thread is the only one I've found on the internet where users bothered to actually test how well it works, and it seems it doesn't work properly.

Since the game is going to be released soon and I've played the demo too much already, I've uninstalled it, therefore I cannot provide screenshots. I do know that it was working properly because I made sure to do all the necessary comparisons before making these claims. I know the difference between an image that has been anti-aliased and one that has not.

I understand that without screenshots, I have no credibility, and I understand completely if you refuse to believe me. I only have my word, and since this is the internet, that is worth nothing.

The Coca Cola Company
9th Sep 2009, 09:59
In the thread I posted, it took 23 pages for someone to realize that AA wasn't working; that's why I'm asking whether you can be absolutely sure that AA works as it should. The people in that thread seem to have more than enough technical knowledge (if you bother to read their posts), yet they didn't check the quality of the AA until 200+ posts in.

If it works, good.
If it doesn't, too bad.

I tried my best to explain some wrong assumptions: the thread from tomshardware claiming that AA in UE3 is a piece of cake, when Tim Sweeney ( http://en.wikipedia.org/wiki/Tim_Sweeney_%28game_developer%29 ) hasn't managed to make it work; how there are other games out there (Wolfenstein) that, through workarounds, can have antialiasing working, but only on nvidia hardware; and how wrong it is to say that just because developers have millions of dollars and years to make a game, the only reason they would not implement such a feature is an intentional wish to screw over their customers (S.T.A.L.K.E.R., anyone? It was in development for 6-7 years!).

My point is that the issue might be more complicated than most ati users think.. or less complicated if an updated profile is released that enables the proper use of AA.

LeoNatan
9th Sep 2009, 12:57
Coca Cola Company is a known troll from another forum. He is here to preach morals of "buy the game" (while he will probably pirate it the first day it comes out) and "nvidia rules, ati sucks". Don't pay attention to the idiot.

[Yes, it took me so long to see who you are, but this thread cleared it perfectly.]

The Coca Cola Company
9th Sep 2009, 14:07
I've already told you that I'm not a member of the warez-related forum that you moderate, mr. i. I'm reading it, though, because it's always up to date with the latest gaming news, just like I love reading Kotaku. YOU DO NOT KNOW ME.

None of my actions here in this forum can be classified as trolling and if you think differently feel free to report my posts to the moderators.

But please do not post here if you have nothing to add to the subject at hand, which is whether and why Rocksteady has disabled the AA functionality on ATi cards.

Evilbaby
9th Sep 2009, 15:14
No. Unreal Engine 3, as given by Epic to licensees, DOES NOT SUPPORT AA on any card in dx9. The licensee has to find a way to implement AA for the graphics card models he's interested in. DICE has implemented AA with a solution that works on both cards. Rocksteady didn't. And you are forgetting one thing: Mirror's Edge uses a highly customized version of UE3 that has replaced the lighting system of Unreal Engine with that of Beast. http://www.illuminatelabs.com/products/beast


You are just making things up without proof. You simply assumed that because the engine has Beast integrated, it automatically adds AA support? Beast is integrated into Mirror's Edge's UE3 engine to provide enhanced lighting, with effects such as global illumination and improved HDR. Where does it say that it adds AA support to UE3?



Nah-ah. Wolfenstein (another game that uses a deferred shading technique) doesn't support AA at all. Nvidia's drivers don't support it either. But if you force the drivers to use a known workaround for enabling AA in S.T.A.L.K.E.R., the edges get antialiased.
Link: http://forums.guru3d.com/showpost.php?p=3249453&postcount=273
On Ati, on the other hand, no matter what you do (change the vendorid, rename the exe, force it through Catalyst), it simply doesn't work at all.
Link: http://forums.guru3d.com/showpost.php?p=3260097&postcount=397


Since Wolfenstein uses a completely different engine, you are just comparing apples to oranges. Just because AA doesn't work with ATI cards in that game doesn't mean it does not work in UE3 engine games. Nice try, but it doesn't work that way.

All this just goes to show your lack of credibility.

jaywalker2309
9th Sep 2009, 15:26
Quit with the personal attacks. AA on ATI is being addressed between us and ATI; hopefully, by the time people can legally play the game, it should all be sorted.