
ATI "Shader Days" Event - HL2 Benchmark

  • 11-09-2003 1:43pm
    #1
    Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭


    I was waiting for some crap like this. Ugh, this pi$$es me off so much. Apparently the 9800 Pro runs HL2 twice as fast as a GF FX 5900 Ultra - wtf?

    Radeon 9800 Pro - 60 FPS
    GF FX 5900 Ultra - 30 FPS

    Could it have to do with this: "The company recently signed a deal allowing ATI to distribute Half-Life 2 with its Radeon cards"?

    This sucks. The websites will be getting the tech demo to try out for themselves tonight, so I'm going to hold off killing someone until I hear the real story.

    They'd hardly say ATI cards suck at an ATI event!



    :mad:

    CombatCow


Comments

  • Registered Users, Registered Users 2 Posts: 915 ✭✭✭logistic


    Where did you get your information from??


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    Main Headline:

    http://www.bluesnews.com/

    3 or 4 different websites have an article on it.

    CombatCow


  • Registered Users, Registered Users 2 Posts: 696 ✭✭✭Kevok


    It has nothing to do with a 'special' relationship.

    Microsoft releases the DX9 spec to hardware manufacturers. ATi goes ahead and implements it in the Radeon 9700 Pro. Meanwhile nVidia, in its infinite wisdom, decides to build a customised rendering approach for DX9 into its FX line of cards.

    Result? Developers have to spend more time optimising engines for the NV3x. The Pixel Shader 2.0 support in the FX line is a joke. It's fine with Doom 3 because there are relatively few shaders in use at any one time (lighting and shadows only, I believe), whereas in HL2, a fully DX9-compliant game, there are multiple shaders working at once.

    As a result, Valve had to create a new codepath for FX cards which uses a hybrid DX8/9 rendering path so those cards can perform favourably (a rough sketch of this kind of path selection follows below).

    This isn't the first time this has come up. The FX line has always struggled with the DX9 benchmark tests in 3DMark03, especially the PS 2.0 test.
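
    A minimal sketch of the kind of path selection being described, assuming Direct3D 9 and already-created interface/device objects; the enum and function names are invented for illustration, and this is not Valve's actual code:

    ```cpp
    // Sketch only (not Valve's code): deciding between a full DX9 shader path
    // and a DX8/9 "mixed" path.  Assumes d3d9.h is available and that the
    // IDirect3D9 / IDirect3DDevice9 objects have already been created.
    #include <d3d9.h>

    enum RenderPath { PATH_DX9_FULL, PATH_DX8_9_MIXED };

    RenderPath ChooseRenderPath(IDirect3D9* d3d, IDirect3DDevice9* device)
    {
        D3DCAPS9 caps;
        device->GetDeviceCaps(&caps);

        // No Pixel Shader 2.0 at all -> the mixed path is the only option.
        if (caps.PixelShaderVersion < D3DPS_VERSION(2, 0))
            return PATH_DX8_9_MIXED;

        // The FX series *does* report PS 2.0, so a caps check alone isn't
        // enough; the post above describes Valve additionally special-casing
        // that hardware for performance.  0x10DE is NVIDIA's PCI vendor ID.
        D3DADAPTER_IDENTIFIER9 id;
        d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
        if (id.VendorId == 0x10DE /* and the GPU is NV3x-class */)
            return PATH_DX8_9_MIXED;   // the default Valve reportedly chose

        return PATH_DX9_FULL;
    }
    ```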


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    3DMark03 is a frikken joke

    A P3 600 with a Radeon 9500 scored higher than a GeForce FX 5900 with a P4 processor. I mean, come on, that's bull**** - just because Nvidia pulled out of testing with Futuremark. Then Futuremark had the nerve to release a patch which "fixed" some of the low-score problems with Nvidia cards. All this game companies supporting a certain make of card is a load of arse, imo.

    CombatCow


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    It appears you haven't quite grasped the complexities involved here.
    The FX line has many fine traits; PS performance isn't one of them. This is a simple truth.

    This is a problem that some developers will/can work around, and others won't. It has nothing to do with "special" relationships. It's in SW vendors' interests to ensure games run well on the majority of systems. At a guess, I would imagine there is actually more NV hardware out there than ATi.

    Perhaps id Software and the Doom 3 dev team are also involved in this little conspiracy you are trying to invent:
    GD: John, we've found that NVIDIA hardware seems to come to a crawl whenever Pixel Shader's are involved, namely PS 2.0..

    Have you witnessed any of this while testing under the Doom3 environment?

    "Yes. NV30 class hardware can run the ARB2 path that uses ARB_fragment_program, but it is very slow, which is why I have a separate NV30 back end that uses NV_fragment_program to specify most of the operations as 12 or 16 bit instead of 32 bit."

    John Carmack

    http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm

    Opinions on ATi, Valve and 3DMark aren't worth much unless you actually understand and can back up what you type. (A rough sketch of the ARB2/NV30 path split Carmack describes follows below.)


    Matt
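
    A rough sketch of the ARB2 / NV30 split Carmack mentions, assuming a current OpenGL context; the helper and enum names are invented for the example, and this is not Doom 3 source:

    ```cpp
    // Illustration only: choosing a fragment path by checking which OpenGL
    // extensions the driver exposes.  NV_fragment_program allows the 12/16-bit
    // precision Carmack refers to; ARB_fragment_program runs at full 32-bit.
    #include <cstring>
    #include <GL/gl.h>

    enum FragmentPath { FRAG_NV30, FRAG_ARB2, FRAG_FIXED_FUNCTION };

    static bool HasExtension(const char* name)
    {
        // Simplistic substring match; fine for a sketch, not for production.
        const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return ext != nullptr && std::strstr(ext, name) != nullptr;
    }

    FragmentPath ChooseFragmentPath()
    {
        if (HasExtension("GL_NV_fragment_program"))
            return FRAG_NV30;          // vendor path with reduced-precision hints
        if (HasExtension("GL_ARB_fragment_program"))
            return FRAG_ARB2;          // generic path, full precision
        return FRAG_FIXED_FUNCTION;    // pre-DX9-class hardware
    }
    ```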


  • Registered Users, Registered Users 2 Posts: 16,812 ✭✭✭✭astrofool


    The FX line of cards was really a year ahead of its time tech-wise. They really should have just taken the Ti series and made it DX9-compliant, along with updating the memory algorithms, adding texture/vertex units, and a 256-bit bus.

    After all, that's basically what ATI did to become the dominant power. You have to wonder if the 3dfx people at Nvidia influenced what happened with their latest line-up. Fortunately (from a competitive point of view), Nvidia aren't ones to rest when something like this happens, so the next round of cards will probably see things swing the other way, especially as they get their heads around implementing the full floating-point setup of the FX line-up properly.


  • Registered Users, Registered Users 2 Posts: 1,068 ✭✭✭Magic Monkey


    Originally posted by Matt Simis
    Its in SW vendors interests to ensure games run well on the majority of systems. At a guess I would imagine there is actually more NV hardware out there than ATi.

    Yes, but in this thread, it seems there are quite a few people upgrading their systems (and, obviously, their graphics cards) in preparation for the release of Half-Life 2.

    So, assuming that a lot of people will be upgrading, and regardless of the amount of Nvidia vs. ATI hardware currently out there (as we're still assuming Joe Soap will be splashing out on a new graphics card), the benchmarks Valve released may just sway people into purchasing an ATI card over a Nvidia.

    Also, in a Half-Life 2 tech video, you can plainly hear one of the Valve people recommending people to upgrade their systems for Half-Life 2. Obviously he doesn't say to go for ATI over Nvidia, but after those benchmarks, what do you think most people will go for?

    And it doesn't hurt Valve, nor ATI, that Half-Life 2 is shipping with Radeons...


  • Registered Users, Registered Users 2 Posts: 3,312 ✭✭✭mr_angry


    Also, in a Half-Life 2 tech video, you can plainly hear one of the Valve people recommending people to upgrade their systems for Half-Life 2. Obviously he doesn't say to go for ATI over Nvidia, but after those benchmarks, what do you think most people will go for?

    And it doesn't hurt Valve, nor ATI, that Half-Life 2 is shipping with Radeons...

    That is just pure paranoia. The only reason, and I mean ONLY, that Radeons perform faster in Half-Life 2 is that NVidia's implementation of DX9 on their cards is non-compliant with the original standard. NVidia have nobody to blame but themselves. This is not a Valve-ATI conspiracy.

    Believe me, Valve have put a lot of effort into just getting the 5900 to run at 30 FPS - it's not like they've deliberately avoided optimising for it. In fact, they've spent far more time optimising for the NV30, because of the problems described in previous posts.


  • Closed Accounts Posts: 975 ✭✭✭j0e9o


    Sweet! I just got me a 9800 Pro.


  • Registered Users, Registered Users 2 Posts: 2,005 ✭✭✭CivilServant


    The tech demo at E3 was run on an ATi Radeon 9800 Pro, and we all know how good that looked.


  • Registered Users, Registered Users 2 Posts: 1,068 ✭✭✭Magic Monkey


    mr_angry, if that is the case, then ATI will have it in the bag when it comes to the graphics card people choose when upgrading specifically for Half-Life 2.

    I'd be annoyed if I bought an ATI now for HL2, then found out later, when Doom 3 comes out, that it's significantly better on an Nvidia... :rolleyes:


  • Registered Users, Registered Users 2 Posts: 2,005 ✭✭✭CivilServant


    HL2 is the first full DX9 game and as such will be the best benchmark for DX9 games in the future. If a graphics card runs HL2 well, then it's fair to say future DX9 games will scale in a similar pattern. I have a feeling that ATi will provide the goods for Doom 3 as well, and let's not forget the 9800 XT just around the corner.


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    From: Blues News

    NVIDIA & Half-Life 2 [September 11, 2003, 4:59 PM EDT] - 42 Comments
    Following up on some comments from Gabe Newell and the Half-Life 2 benchmarks presented at yesterday's ATI festivities (story), NVIDIA has issued a statement in response, which has been posted on 3DGPU and Gamers Depot. The statement questions the use of older drivers for benchmark comparisons, offers their understanding that there is only one Half-Life 2 bug remaining in their newest (as yet unreleased) drivers, and concludes like this:

    The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

    In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

    We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.



    :cool:
    CombatCow


  • Registered Users, Registered Users 2 Posts: 803 ✭✭✭Shamo


    As long as they don't degrade the image quality to get the extra boost they need!


  • Registered Users, Registered Users 2 Posts: 696 ✭✭✭Kevok


    Originally posted by nVidia (Combatcow)
    The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

    Yes, the codepaths are different. One is the default DX9 path that SHOULD work with ALL DX9-compliant cards. Yet nVidia wasn't content and made it difficult.

    The nVidia codepath is not a true DX9 codepath. The simple truth is that nVidia's cards are not fully compliant, so the codepath used for the NV3x will be a DX8/9 hybrid, which as far as I'm aware means that if you use an FX card you will not get the eye candy you would get on an ATi card (see the sketch below for what that kind of fallback can look like).

    Not ATi's fault, not Valve's, but nVidia's. You cannot take the word of Futuremark (who were using the DX9 specs released by MS), John Carmack (using OpenGL and still having problems with PS 2.0) and Gabe Newell... and arrive at the conclusion that there is a mass conspiracy to take nVidia down. It simply is not true.
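
    To make the "hybrid" point concrete, a hedged illustration of what a mixed DX8/9 path can mean in practice: the same effect compiled against a lower pixel-shader profile, with the fallback entry point dropping whatever PS 2.0-only work the full version does. The file name and entry-point names here are hypothetical; only the D3DX compile call is the standard DirectX 9 one:

    ```cpp
    // Sketch: compile a PS 2.0 version of an effect on the full DX9 path, or a
    // PS 1.4 (DX8.1-class) fallback on the hybrid path.  "water.fx" and the
    // entry-point names are made up for this example.
    #include <d3dx9.h>

    LPD3DXBUFFER CompileWaterShader(bool fullDx9Path)
    {
        LPD3DXBUFFER code = NULL;
        LPD3DXBUFFER errors = NULL;

        const char* profile = fullDx9Path ? "ps_2_0"    : "ps_1_4";
        const char* entry   = fullDx9Path ? "WaterPS20" : "WaterPS14";

        // ps_1_4 has far tighter instruction and precision limits than ps_2_0,
        // so the fallback shader has to drop or fake some of the eye candy.
        D3DXCompileShaderFromFile("water.fx", NULL, NULL, entry, profile,
                                  0, &code, &errors, NULL);
        return code;  // error handling omitted for brevity
    }
    ```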


  • Registered Users, Registered Users 2 Posts: 1,068 ✭✭✭Magic Monkey


    No-one mentioned a "mass conspiracy to take nVidia down". After reading about it, I accept that nVidia are responsible, by having been irresponsible. More benchmarks would be interesting though, on different drivers - it's always best to get a second opinion.

    I don't think Combatcow is strictly annoyed about any alleged back-handed dealings between Valve & ATi; it's more about the stuff we consumers put up with. By that I mean buying a new graphics card every 8-12 months to try to keep up with the latest games, and cards optimised for certain games. Ideally, a "one size fits all" solution would be best: nVidia and ATi cards would run equally well for HL2, we wouldn't be having this discussion, and everyone would be happy.

    But it seems nVidia messed up, so anyone who jumped ahead of the benchmark results and went out and purchased a shiny new GeForce FX is going to be pretty disappointed right now, along with those who favour and are loyal to the nVidia brand.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by Magic Monkey
    By that I mean buying a new graphics card every 8-12 months to try to keep up with the latest games, and cards optimised for certain games. Ideally, a "one size fits all" solution would be best: nVidia and ATi cards would run equally well for HL2, we wouldn't be having this discussion, and everyone would be happy.


    What a boring, boring world that would be.



    Matt


  • Registered Users, Registered Users 2 Posts: 1,068 ✭✭✭Magic Monkey


    Quite. I'd rather be bored than have to upgrade the graphics card nearly twice a year, and switch brands for specific game titles :rolleyes:


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    Or maybe Matt likes splashing out on a new gfx card every few months. That's not boring, just sad.

    CombatCow


  • Closed Accounts Posts: 1,502 ✭✭✭MrPinK


    Would you rather the industry just said "OK, I think everything is fast enough as it is"? If that were the case, then games like HL2 would still be years away.


  • Registered Users, Registered Users 2 Posts: 1,068 ✭✭✭Magic Monkey


    Hey Mick :)

    No industry will rest on its laurels, especially not the IT industry; it's all about being ahead of, or trying to keep up with, the competition. It took ATi a good while to catch up with nVidia in 3D.

    I remember you yourself saying you'd never touch an ATi card again after the poor driver support you had with the Rage 128. It looks like the 9700/9800 series has changed all that, but there are still people fiercely loyal to nVidia, especially those who have had similar experiences to yours. Get stung badly by a particular brand once and it takes a lot to go back to using it again.

    I'm not too fussed what graphics card I have; as long as it plays Commandos 3 ok, then HL2 can go to pot :p


  • Registered Users, Registered Users 2 Posts: 771 ✭✭✭Sir Random


    Well, Gabe Newell's comments were totally inappropriate for someone in his position. He was either ill-advised or heavily rewarded.
    However, that doesn't change the fact that the nVidia FX series is a dead duck.
    Apart from the FX5900 Ultra's appalling DX9 performance, the FX5200 can't even match a GeForce3.
    I'm so glad I've waited until I actually need a DX9 card (which could be months yet) as I can see even more developments before HL2 arrives. The 9800XT has to be the card of choice atm, but even that could change...


  • Closed Accounts Posts: 975 ✭✭✭j0e9o


    *** Random, if you're wanting a bargain, go for a 9800 Pro once the newer version comes out, as there is only the smallest difference in clock speed.


  • Registered Users, Registered Users 2 Posts: 771 ✭✭✭Sir Random


    The smallest difference in clock speed? I bet you heard that from a 9800 Pro owner.

    Comparative clock speeds, core/RAM (MHz):
    9700 Pro: 325/310
    9800 Pro: 380/325
    9800XT: 425/350

    The difference is almost the same as the jump from 9700 Pro to 9800 Pro - just not enough to warrant a 9900 label (quick percentages below).
    I know the argument about o/c'ing a 9800 Pro to XT speeds, same as o/c'ing a 9800 to Pro speeds, etc... but I'd rather be o/c'ing an XT ;)
    The 9800 XT will also have other features, including a GPU temperature sensor, and it comes with HL2 bundled :)

    Anyway, I'm leaving my options open until HL2 actually arrives. If it's delayed until November/December, there may be an even newer card on the horizon, the 9800XT Ultra De-luxe MKII?
    I don't want to blow my cash until I have a reason to buy a DX9 card. Any card I buy now would have its PS 2.0 shaders collecting dust for weeks unless I ran benchmarks over and over.
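
    The quick percentages behind that comparison, as a throwaway calculation using the clock figures quoted in the post above:

    ```cpp
    // Relative core/memory clock jumps for the figures quoted in the post.
    #include <cstdio>

    int main()
    {
        const double core[] = {325, 380, 425};  // 9700 Pro, 9800 Pro, 9800 XT
        const double mem[]  = {310, 325, 350};
        const char* step[]  = {"9700 Pro -> 9800 Pro", "9800 Pro -> 9800 XT"};

        for (int i = 1; i < 3; ++i)
            std::printf("%s: core +%.1f%%, memory +%.1f%%\n", step[i - 1],
                        100.0 * (core[i] - core[i - 1]) / core[i - 1],
                        100.0 * (mem[i]  - mem[i - 1])  / mem[i - 1]);
        return 0;
    }
    // Prints: 9700 Pro -> 9800 Pro: core +16.9%, memory +4.8%
    //         9800 Pro -> 9800 XT:  core +11.8%, memory +7.7%
    ```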

