Thread: PC Gamer Recommends ATI!?!?!

PC Gamer Recommends ATI!?!?!

  1. #1

    PC Gamer Recommends ATI!?!?!

    So I boot up Deus Ex, and hey, what do you know, it says it was created on an Nvidia GPU. I have an old Ti 4600, so I'm like, hey, it should work, and it does: everything at medium and it runs smooth. However, two of my friends, one with an FX 5950 and the other with an FX 5600 Ultra, can't play it, they just can't. I'm like, HEY, it said they used an Nvidia card to make it.

    So then last week I got my new PC Gamer in the mail, and in their review of DX2 (83%) they specifically recommend a 128 MB ATI card, and well, that blew me away. I know the patch fixed some stuff, but still: if anything, the game just shouldn't work with the ATI cards.

    Also, soon enough my last statement will be true. Nvidia and ATI will be like consoles: you'll have to have each card to play certain games, and soon enough motherboards will have two AGP 8X slots.

    Well, that's all for now.

  2. #2

    Re: PC Gamer Recommends ATI!?!?!

    Originally posted by gamingguy87
    if anything, the game just shouldn't work with the ATI cards.
    Why? Just because Nvidia paid for their logo to be on the game?

  3. #3
    Yeah, exactly, dude. NVIDIA has such a small rep that they feel the need to pay EVERY software developer to emblazon THEIR logo on their games. ATI makes enough money and has enough rep that they don't need to do this. Just because it says "Nvidia" doesn't mean it was CREATED to run best with Nvidia cards.

  4. #4
    Sam Roberts!


  5. #5
    Originally posted by Santa99
    ATI makes enough money and has enough rep that they don't need to do this.
    Both ATI and Nvidia have marketing campaigns to get their names on a particular game:

    http://www.ati.com/gitg/index.html

    ATI did it with Raven Shield and Half-Life 2.

    Your point about their logos on games not meaning much still stands though.

  6. #6
    Well, it works fine here on three computers with old GF4s and FX 5900s.

  7. #7
    OK, look, I was fully "there" when I wrote this.

    Also, after the patch my friends' games work great. The only thing I miss from the original Deus Ex was the internet mode thing (maybe that was a mod, or maybe I'm dreaming it). Also, for some reason when UT2003 came out, everyone was like, man, I need a powerful rig to run this with all graphics turned up. Well, with my trusty GF4 Ti 4600 I can get a whopping 80 fps with everything turned up. I think my card is like God; I must have bought it from the 'special store' where everything works better than it normally does.

    Moral of the story: don't smoke 'stuff' or you turn out like me. And who wants to be posting meaningless things on forums at 10 at night?

  8. #8
    gamingguy87 wrote:
    Also, soon enough my last statement will be true. Nvidia and ATI will be like consoles: you'll have to have each card to play certain games, and soon enough motherboards will have two AGP 8X slots.
    They actually once announced this as an April Fools' joke.

    BUT!!

    Why... would developers make something optimized for one card..
    and not just DirectX/OpenGL compatible..? And leave it to the hardware companies to follow these standards?

    Let's say ATI/Nvidia go 50/50 as far as PC market sales go...
    Making a game only for nVidia would mean 50% of the gamers cannot run your game....
    Make it any standard (DirectX or OpenGL) compatible and it'll run on any PC!!..
    enabling you to sell to 100% of the users...

    It's like selling, let's say.. bread.. that's not compatible with women or something..
    A crazy and completely useless idea if you ask me...
    Standards are there for a reason... and Microsoft will be kicking their * if they drop DirectX.

    So why risk losing customers by writing a game using direct driver calls?

    But heck...
    If you ask me... this time nVidia just FAILED to overcome their lack of quality with brute force... The FX series was a miss... except for
    the last two (FX 5900 and FX 5950), which were made to make up for their weird behavior...

    And what is this, btw... the MX series...
    It has the name of the new line... (like GeForce4) but all you get is cheap stripped-down cards...

    nVidia has been making some strange twists and turns the last few years...
    And I think it's time they get back to earth and start WORKING once again...

  9. #9

    Hey gamingguy: internet in DX1

    I think the "internet" thing you were thinking of was in the original System Shock game. You could "surf" the net. It was pretty low-tech if I remember correctly, but you plugged in and did battle with the computer to finish goals. I don't remember internet surfing in DX1. Anyone else?

  10. #10
    Originally posted by Etrigan the Demon
    When my dear old Voodoo 3 3000 was one of the finest cards on the market, and Nvidia was making cheap low-end ones, they threw a bunch of bucks at 3dfx to buy a successful company with quality products, leaving us without technical support or even driver upgrades, trying to force us to buy their lousy products.
    That's not how it happened. 3dfx started taking their direction in a strange way. BTW, they did release a Voodoo 5. And nVidia didn't just buy them: 3dfx was basically sunk, and nVidia purchased what was left. Why, I don't know, because I don't think they ever used the Voodoo name.

  11. #11
    Don't get me started on 3Dfx... I'd almost start crying...
    Once bought a TNT2 M64 to replace my Voodoo Banshee..
    Thought I'd get myself a nice value card... only to find out that the render quality was absolute crap...
    Even the geometry was wrong... compared to the flawless lines and shapes of the Banshee..
    and it needed (for my budget at the time) a hell of a PC to get normal use/performance out of the card...

    I couldn't sell that crappy card FAST enough..
    In the end... I sold it for 75% of the sale price at the time,
    and the guy who bought it was rather pleased with the deal... (happy to hear that...)

    But I went back to my trusty old Banshee.... (still working)
    'Cause it wasn't the best in framerate... it could only do 16-bit color...
    But it was a DAMN good card.. with unmatched image quality... like any Voodoo card...

    3Dfx... they died from lack of speed...
    but DAMN those were good times *sob*..

    See... now you got me all upset.. happy now?!
    I'll have to go and find comfort.. playing with my Radeon 9700 Pro now...
    Although I have to say... my old GeForce2 MX 400 was a good value card...
    GeForce2 had no shaders to begin with.. so nothing to strip, I guess, except speed....

  12. #12
    Originally posted by scottws
    That's not how it happened. 3dfx started taking their direction in a strange way. BTW, they did release a Voodoo 5. And nVidia didn't just buy them: 3dfx was basically sunk, and nVidia purchased what was left. Why, I don't know, because I don't think they ever used the Voodoo name.
    Indeed. 3dfx shot themselves in the foot when they bought STB and became a board manufacturer. This seriously upset their customers (now their competition), who then turned to Nvidia for chips. Their half-hearted DirectX support didn't help either. By the time Nvidia bought them, they were close to bankruptcy.

  13. #13

    Religion or facts?

    I think this is getting a little like religion instead of facts. I once had a 3dfx Voodoo 3 card and was very happy with it at first; the TNT2 card was very close in performance but had 32-bit color, which I saw no point in at the time. The Voodoo 3 performed about as well as the mid-range GeForce2 MX cards, but with much worse image quality, because the Voodoo 3 couldn't use textures bigger than 128x128 pixels (if I remember correctly), while the Nvidia cards could go higher.
    3dfx dug its own grave when they started relying on raw power (up to four processors per card) while Nvidia matched that strength with only one GPU; Nvidia won fair and square there. They bought 3dfx up for their factories and tech (the highest-end cards had some features Nvidia did not, like the T-buffer).
    After that, Nvidia was unrivaled until ATI started to crawl back into the gaming market after years of mediocrity. They mostly aimed at the budget market at first; their first decent attempt, around the time the GeForce2 was released, was the Radeon 7000 series. The lowest cards in that line were better than the GF2 MX cards but not as good as the GF2; the higher range was about as good.
    With the Radeon 8500 they exceeded Nvidia's GeForce3 cards, in my opinion, and supported a lot more DX effects, like Pixel Shader 1.4. Nvidia decided not to implement those features even in the GF4 series, fast as it was.

    Then, strictly out of the blue and with one month's notice, ATI announced the 9000 series: slightly faster but much more feature-rich budget cards, plus a faster card than anyone could have imagined, the Radeon 9700 Pro. Since then, Nvidia seems to have been acting in desperation; even their FX series failed miserably compared to the ATI cards, until the most recent 5900 series. It still seems that ATI has done a better job of following standards, and because their card was the first working DX9 one, the DX9 standard sits really close to ATI hardware, so DX9 code seems to run a lot faster on the Radeon cards. Meanwhile, Nvidia continues to bully benchmark creators, put their ads into a lot of games, make drivers that cheat benchmarks, and play a lot of bad politics.
    Because of these bad politics and my dislike of Nvidia's policies, I decided to buy a Radeon 9800 Pro instead of the FX 5900 Ultra. Now I'm seeing more and more that I made the right choice: DX9 code seems to need a lot of optimization on Nvidia cards, although some OpenGL code might run a bit faster.

    The bottom line is that Nvidia isn't good enough on standards and needs too much special programming, while ATI cards work well when code is written to the standards. This FX bug is most unfortunate, and I have no doubt that Nvidia or Eidos (or both) will fix it if there isn't a fix already, but I think it's Nvidia who isn't following the standards well enough. The only faults I see with ATI are that they should work more on their Linux support, and perhaps let owners of the 9700/9800 get some 3D-app-specific driver optimizations.

    That's all, folks.

  14. #14
    Originally posted by thegrommit
    Both ATI and Nvidia have marketing campaigns to get their names on a particular game:

    http://www.ati.com/gitg/index.html

    ATI did it with Raven Shield and Half-Life2.

    Your point about their logos on games not meaning much still stands though.
    ATI doesn't bang you over the head with it like Nvidia does, with an ad for its stuff around every corner in the game.

  15. #15

    Re: Religion or facts?

    Originally posted by Skrekkur
    I think this is getting a little like religion instead of facts.
    Who are you referring to?

    Ironically, it was just over three years ago that 3dfx threw in the towel:

    http://www.fool.com/news/2000/nvda001218.htm

    http://zdnet.com.com/2100-11-526472.html?legacy=zdnn

    How things have changed

  16. #16

    Re: PC Gamer Recommends ATI!?!?!

    "Ya so i boot up deus ex, and hey wadda u know it syas it was created on a nvidea GPU,"

    Actually, no it doesn't. This is something called an ADVERTISEMENT. Nvidia paid for it to be put in the game. (They have been doing this with several games.)

    It means nothing. The game is not specially optimized for Nvidia while shafting ATI, or anything of that nature. In fact, most people agree that the ATI cards run the game better than the Nvidia cards.

    Nvidia, however, is no doubt pissed they paid for an advert in such a shoddy console port.

  17. #17
    Isn't it true that companies sometimes get sponsored hardware to use and test their products on?

    If not, then only a developer can tell us what the game was actually written on (maybe it was multiple systems.. it's DirectX, after all).

    Or maybe it was Matrox, or XGI

  18. #18
    Originally posted by Wracky
    Isn't it true that companies sometimes get sponsored hardware to use and test their products on?

    If not, then only a developer can tell us what the game was actually written on (maybe it was multiple systems.. it's DirectX, after all).

    Or maybe it was Matrox, or XGI
    You obviously don't know much about code. What do you mean by "written on"? Do you mean the graphics card used in the systems the code was being written on? In that case, I trust you are aware they could just as easily have been using an ATI RAGE IIC 4 MB card? Source code isn't exactly graphics-intensive.

    The ONLY time the card matters - the ONLY time - is if they write calls specific to X video card. Generic OpenGL/D3D is what devs use 99% of the time, unless they want to throw in a special feature (TF or something along those lines).
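
    Here's a rough sketch of what I mean (hypothetical code, not from any actual game; the extension name is just one example of an NV-only feature, and it assumes a current OpenGL context already exists):

        #include <GL/gl.h>
        #include <cstdio>
        #include <cstring>

        // Hypothetical sketch: the only time the card brand matters is
        // when you branch on a vendor-specific extension like this.
        void setup_fast_path()
        {
            const char *ext = (const char *)glGetString(GL_EXTENSIONS);

            if (ext != NULL && strstr(ext, "GL_NV_vertex_array_range") != NULL) {
                // NV-only path: you'd fetch and call the NV extension
                // entry points here. Only NVIDIA drivers take this branch.
                printf("using the NVIDIA-specific path\n");
            } else {
                // Generic path: plain standard OpenGL that every compliant
                // card - ATI, Matrox, XGI, whatever - has to support.
                printf("using the standard OpenGL path\n");
            }
        }

    Everything outside that one branch is plain standard OpenGL, which is exactly why the card brand doesn't matter 99% of the time.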

  19. #19
    Yeah.. you're right...
    Wrong choice of words, I guess...
    "Written to work on" is what I meant to say.

    I'm writing a point'n'click adventure together with some friends,
    and I'm the one doing the DirectDraw stuff... It happens that I constantly test the program
    on my PC while coding (or any little test programs to try out some functions).

    I own a Radeon 9700 Pro, and I noticed that when we ran the same program on my friend's PC, who
    owned an nVidia GeForce2 MX 400 at the time, the results were very different.

    Loads of those "errors" didn't appear until we ran the game for a longer time
    (color differences... buffers being flushed without reason, etc.)
    I had to find some other ways that gave me more control over the drawing process to make it work well on both cards... (see the sketch below)

    I can imagine that if you test a game in development mainly on one card, you'll overlook some 'flaws'
    or incompatibilities in your code...
    or won't notice them until it's played by others on a variety of PCs.
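
    To give an idea of the kind of "more control" I mean, here's a rough sketch (not our actual code; it assumes the surfaces were created elsewhere) of the defensive blitting I ended up doing:

        #include <ddraw.h>

        // Rough sketch: don't trust the driver to keep buffer contents
        // alive between frames - check for lost surfaces and restore
        // them yourself before blitting.
        HRESULT SafeBlit(IDirectDrawSurface7 *dest, IDirectDrawSurface7 *src,
                         RECT *destRect, RECT *srcRect)
        {
            HRESULT hr = dest->Blt(destRect, src, srcRect, DDBLT_WAIT, NULL);

            if (hr == DDERR_SURFACELOST) {
                // The driver threw the surface memory away (mode switch,
                // another app, or seemingly "without reason"): reclaim it
                // and retry once.
                if (SUCCEEDED(dest->Restore()) && SUCCEEDED(src->Restore())) {
                    // NOTE: Restore() brings back the memory, NOT the
                    // pixels - the caller still has to redraw src.
                    hr = dest->Blt(destRect, src, srcRect, DDBLT_WAIT, NULL);
                }
            }
            return hr;
        }

    Restore() only reclaims the memory, not the image, so if you forget to redraw afterwards you get exactly the kind of random-looking differences between cards I described.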

  20. #20
    Originally posted by Trek234

    The ONLY time the card matters - the ONLY time - is if they write calls specific to X video card. Generic OpenGL/D3D is what devs use 99% of the time, unless they want to throw in a special feature (TF or something along those lines).
    Very true, and that is why Nvidia's cards were stinkin' up the joint: they were trying the old-school trick (3dfx and Glide, LOL) of having specialized hardware that, with specific calls/hooks and optimizations, could supposedly shine at the cost of other cards (or the devs would have to code for both).

    The idea is to make generic calls to the D3D/OpenGL API, which handles the calls/hooks and passes them along from driver -> hardware, where they get handled naturally.

    Program_Call -> API -> Driver -> Hardware.

    That is why the new Nvidia drivers have an equivilent JIT compiler that can optimize calls as they come in off the handler. They should have done that in the first place instead of trying to use their influence and position to basically say, "Hey, unless you code and optimize specifically for our hardware, then your games are gonna run slower," and expect devs to go, "oh, ok!!!!" When a nice, generically coded (well as generic as it can get) routine with standardized calls should let the hardware shine on its own.