
Doom 3 - brought to you by... Nvidia


Comments

  • Closed Accounts Posts: 29,130 ✭✭✭✭Karl Hungus


    Aye Deadwing, I'm well aware of that.

Which is why I gave an example of a game that has the Nvidia branding on the box running perfectly fine on an ATI card, hence reassuring you that you will indeed get the best out of your card, despite the plugs by Nvidia. Personally, I shall be shelling out on a new GFX card myself, more than likely a 9800pro, and I have no worries that it'll run everything perfectly fine.

    Anyway, sorry for quoting you in my own defence, but Matt Simis was taking me up completely wrong.


  • Closed Accounts Posts: 2,918 ✭✭✭Deadwing


I just noticed Deus Ex 2 had a 'the way it's meant to be played' logo on the box. Runs absolutely perfectly on the 9800pro :D


  • Registered Users Posts: 15,815 ✭✭✭✭po0k


    Originally posted by Matt Simis
There is no sane business model that supports such an approach. These companies aren't idiots, so you don't need to worry about that happening any time soon.


    Matt

    3Dfx anyone??

If some company comes up with some killer tech patent which makes coding lovely fast, smooth, hi-res graphics engines easy as pie for developers, and markets it to the audience, do you think another monopoly could start?

    Unlikely, but "possible" :)


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


Well, 3DFX initially provided Glide as there was no generic API available at the time. They also had a hardware monopoly (due to lack of competition) at the same time as the software one they stumbled into. By the time the "3Dfx Enhanced" (or whatever they called it) campaign started, DirectX was established and Glide had become a marketing tool. Mind you, it was good, real good. They never stopped anyone making games that supported both DX and Glide, however; they had no need to, as the Glide version genuinely looked and played better.

Sure, anything is possible, but I would say it's far more likely we'll see an escalation of the Doom3 "NV30" codepath approach: it will run on any card, but has special ways of dealing with certain hardware. Funnily enough, the D3 NV30 codepath is as much a kludge to handle nVidia's (mis-)interpretation of a standard as it is a performance enhancement.

Let's not forget, NV30+ nVidia hardware has accelerated methods of dealing with shadows; to use that, the game needs to be written with NV in mind (AFAIK).


    Matt


  • Registered Users Posts: 3,312 ✭✭✭mr_angry


The fact that ATi's current generation of cards is so much better than nVidia's means that the game will most likely run better on any comparable ATi card than on any nVidia card, and there's nothing that nVidia's propaganda division or John Carmack's choice of code path can do about that (unless he went totally mad).

However, the game won't be released until the next generation of cards is in full swing, so it's hard to tell whose cards will run it best. Carmack publicly stated that he was having trouble coding using the NV3x code path, though.

As we know, the ATi R420 will be out at the end of this month, and the NV40 will be out sometime around the middle of May. However, there was a rumour recently that nVidia had scrapped the NV40 and rushed the NV45 (rebadging it as the NV40 so that nobody would notice) because they knew the R420 was gonna kick its ass. Hence the reason it's coming out so far after the R420.

    If you want to play any DX9 game on a current generation card, you want to be buying an ATi. 6 months from now? Who knows...


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by mr_angry


    If you want to play any DX9 game on a current generation card, you want to be buying an ATi. 6 months from now? Who knows...


Admittedly I currently have an FX5950 Ultra, so I could be accused of bias, but I switched from a Radeon 9800pro. I've had Radeons for the last 2 years (and I switch cards a lot, so many Radeons) and could have got the 9800XT at a good price, but it just got boring using ATI for so long..

The fact remains that NV HW is usually faster in DX8 games and is faster in Doom3 (as per the Anandtech sneak peek), while being on par with its ATI counterparts in Stalker (Doom3 and Stalker being "DX9" gen games). It tanks in Tomb Raider and, to a lesser degree, Half-Life 2. It's a bit hit and miss, and generally ATI cards seem the "safer" bet for DX9, but it's far from as black and white as you portray it.

    nVidia has other advantages too:
nVidia has better SW vendor support; better drivers under both Windows and *nix (switching back to NV drivers after so long really reminded me; though ATI drivers aren't bad, they aren't as polished); vastly better 3D stereo support; sometimes better image quality (although both of them can be accused of playing with quality to "fix" scores); better DX7/8 support; more robust HW (can run at a higher AGP speed and overclocks well, so expect to see good early PCI Express implementations); and the ingenious OTES cooling system alone on the NV reference cards is worth $60+.

What's happening to nVidia now among the digerati is a lot like the OTT critique that plagued 3DFX in their later days. People latched onto their lack of 32-bit colour support and dogged them on every mishap afterwards, even when they were on the verge of releasing a vastly superior product. People like to see market leaders go down in flames, deservedly or not. There are still people who claim, as if it were a matter of fact, that NV cards are "hot and noisy", despite the fact that more recent NV cards compare favourably to ATI cards. When the Voodoo3 3500 was released, it was the fastest and most feature-packed product on the market, but nobody cared, as NV marketing did such a good job convincing them that they needed HW T&L that it never received the accolades and commendation it deserved.

Sure, NV cards have problems, but compare them for a second to their other competitors' "attempts": Intel, S3 or VIA cards. nVidia are also doing a lot of things right.


    Matt


  • Closed Accounts Posts: 975 ✭✭✭j0e9o


From what I have read, the best card to have for this game is the Ti4800. A bit rippin', since I have a 9800pro.


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by thomasmckinless
    from what i have read the best card to have for this game is the ti4800, a bit rippin since i have a 9800pro

    Umm... Doom3? Ti4800? Where did you see that?

Here's the Anand article I mentioned:
    http://www.anandtech.com/video/showdoc.html?i=1821&p=22


    Matt


  • Registered Users Posts: 3,312 ✭✭✭mr_angry


    I appreciate that you're trying to give balance to the argument Matt, and the last thing I want to see is nVidia bullied out of the marketplace (which is a pretty unlikely scenario). The competition in the GPU market at the moment is the single greatest driving force which is giving consumers fabulous choice and high-quality products. If the NV40 turns out to be better than the R420, I'll be quite happy to promote their superior products to people.

    However, from a purely consumer oriented perspective, the Radeon 9600 and 9800 series almost universally outperformed the comparable nVidia cards in relation to DX9 games. I agree - the nVidia cards do have slightly better performance on DX8 games, and they have a nice compact set of utilities (I myself was an nVidia customer for more than 3 years), but this is not a substitute for real performance, especially when we're talking about a DX9 game. The fact that nVidia hadn't implemented the Shaders 2.0 standard on the FX series left them at a serious disadvantage, and put both developers and consumers in a difficult position. It is entirely a hardware issue, and thus could not be resolved by later driver releases. For most, it was easier to switch to supporting ATi.

As for the performance indicators you mentioned: the Doom3 leak was a thoroughly unoptimised alpha (the article in question doesn't seem to indicate what kind of Doom3 sneak peek they had, so I'm assuming it was last year's leak), and it's impossible to determine any of these issues before release, let alone before the near-compulsory 1.01 patch and driver updates from the vendors. As for Stalker, I'm not aware of any near-playable version of the game which could be described as fit for deciding this argument. In all of the market-release DX9 gaming benchmarks I've seen, the ATi cards have come out on top. The 5950 Ultra made serious inroads into that, but the majority of the ATi series cards won out over their respective competitors. The fact that the 5950 is a pricey card to achieve such results with is a downside, though.

    Overclocking is an interesting comparison. I myself have a 9800 non-pro which I have overclocked from:

    Core: 324MHz
    Memory: 290.3MHz

    to:

    Core: 411.8MHz (XT speeds)
    Memory: 330.8MHz

    One of my housemates did even better, overclocking from the same initial values to:

    Core: 490MHz
    Memory: 382MHz

    Nobody can deny that they are some pretty impressive results, although they seem limited somehow to Sapphire cards - another of my housemates has a Hercules 9800 Pro, and achieved almost zero increase. Nevertheless, I have yet to see an nVidia card achieve similar increases.
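Those gains can be put in percentage terms with a quick sketch (the clock values are taken from the figures above; the `gain` helper is just illustrative, not from the thread):

```python
# Percentage overclock gains from the clocks quoted above (MHz).
def gain(stock: float, overclocked: float) -> float:
    """Return the overclock as a percentage increase over stock."""
    return (overclocked - stock) / stock * 100

# (stock, overclocked) pairs: my 9800 non-pro, then my housemate's card.
clocks = {
    "my core":        (324.0, 411.8),
    "my memory":      (290.3, 330.8),
    "housemate core": (324.0, 490.0),
    "housemate mem":  (290.3, 382.0),
}

for name, (stock, oc) in clocks.items():
    print(f"{name}: +{gain(stock, oc):.1f}%")
```

That works out to roughly a 27% core gain on my card and just over 51% on my housemate's.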

I really hope nVidia come out with a great card in the next product cycle, but the NV3x has not been good for them. If the time comes, I will be happy to praise nVidia for the good work they've done, and suggest to people that they purchase their cards. However, I will not advise people to buy an extremely high-priced nVidia card on the basis of a thoroughly unoptimised demo benchmark, nor am I happy to see people proclaiming nVidia will be superior simply because they have a sticker on the Doom3 box. In most realistic tests so far, ATi have come out on top, and until that changes, I will keep advising people to buy their product.

    Long live healthy competition!


  • Registered Users Posts: 15,815 ✭✭✭✭po0k


ATi have been a slow juggernaut of a graphics company; the only thing that kept them going through the pre-Radeon years was the OEM contracts (you'd have a very high chance of finding a RAGE128 on most OEM servers these days, and looking back at Gateway, Dell etc., I remember seeing people get these new machines with nice new fast CPUs, then finding an onboard ATi chip inside them, sighing and saying "**** for games").

    Glad to see that has changed as it gave nVidia a bit of a kick up the arse.
I was afraid of seeing them become complacent and rest on their laurels, concentrating on "fluff" cinematic effects while trading off performance, as (IMHO) was the case with the Voodoo4/5/6 series and their T-buffer etc.
And not having 32-bit support in the Voodoo3 was a bit of a failing on their part :)
Things like shadows and fine gradients look terrible in 16-bit.


  • Registered Users Posts: 6,560 ✭✭✭Woden


    Originally posted by mr_angry


    Nobody can deny that they are some pretty impressive results, although they seem limited somehow to Sapphire cards - another of my housemates has a Hercules 9800 Pro, and achieved almost zero increase. Nevertheless, I have yet to see an nVidia card achieve similar increases.


That's a nice overclock, especially your housemate's. However, I find that nVidia cards will overclock just as well, especially the 5900LX/XT range, which starts off at something like 400/350 and can be brought up to 5900 Ultra frequencies, and in a number of cases 5950 speeds at 475/475, which is quite impressive for a circa €200 GFX card.

Personally, my 9800pro will do 452/392, and that's a PowerColor one.


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


Given that Doom3 and Stalker are both NV promo games, I would bet that Stalker will match ATI performance and Doom3 will beat it. As regards Doom3, you have to remember that nVidia has always had (and continues to have) the best OpenGL support on the market, combined with the fact that they (and ATI) write their own extensions to the OpenGL spec, providing a better-fitting API than DirectX ever could be. The "sneak peek" Doom3 bench was sanctioned by id, so I would assume it was a little bit further along than the leaked alpha. The final product will probably be heavily optimised for nVidia (tbh, the FX will need it), but that really doesn't matter to the end user. Let's not forget that nVidia cards support HW-accelerated stencil shadowing, a major part of Doom3! :p

As regards those overclocking results, impressive stuff. As I have the top-end clocked FX, I can't match those figures percentage-wise (perhaps a 5900XT could?), but I've managed

    Core: 475MHz -> 580MHz
    Memory: 950MHz -> 1.04GHz

    With stock cooling, no mods on an already heavily overclocked system. I assume with another cooler and voltage mods it would push the core over 600MHz.

All FX cards support PS2.0, and the NV30 range is just a handful of instructions from supporting PS3.0 (the spec wasn't final when the HW was completed). I agree though, there is no getting away from the fact that NV's PS2.0 is much slower than ATI's.




    Matt


  • Closed Accounts Posts: 2,918 ✭✭✭Deadwing


    Originally posted by mr_angry

    Nobody can deny that they are some pretty impressive results, although they seem limited somehow to Sapphire cards - another of my housemates has a Hercules 9800 Pro, and achieved almost zero increase. Nevertheless, I have yet to see an nVidia card achieve similar increases.

    I really hope nVidia come out with a great card in the next product cycle, but the NV3x has not been good for them. If the time comes, I will be happy to praise nVidia for the good work they've done, and suggest to people that they purchase their cards. However, I will not advise people to buy an extremely high priced nVidia card on the basis of a thoroughly un-optimised demo benchmark, nor am I happy to see people proclaiming nVidia will be superior simply because they have a sticker on the Doom3 box. In most realistic tests so far, ATi have come out on top, and until that changes, I will keep advising people to buy thier product.

    Long live healthy competition!
Well said indeed :) I've used both ATI and Nvidia cards, so I'm not really biased in either one's favour. (Though I do love my 9800pro)
As I said, Deus Ex 2 has the Nvidia sticker on the box, yet runs without a hitch whatsoever, with a fairly blistering framerate on mine. (Haven't had ANY slowdown yet.) So I highly doubt Doom3 will be a jerky, jaggy crapfest on ATI cards.


  • Registered Users Posts: 3,312 ✭✭✭mr_angry


Fair enough - I accept those points. But the Doom3 benchmark in question was run on the 12th of May last year, and the alpha released not long before that wasn't even capable of running on most ATi cards without modification. To say it's a fair benchmark is... well, dubious.

The only real test is when the game actually comes out. I'm not trying to say ATi cards will definitely run it better. In fact, this agreement suggests that nVidia cards will perform better. But that is marketing talk, not actual results, and I just wanted to point out that nobody should jump to conclusions on the basis of this story, as some people were doing earlier in the thread.

