
Q3 and T&L

  • 30-03-2000 5:41pm
    #1
    Closed Accounts Posts: 17


    does Q3 take advantage of hardware T&L?


Comments

  • Closed Accounts Posts: 1,341 ✭✭✭Koopa


it does, although so far only the 5.xx beta drivers for the GeForce have it enabled.. or something


  • Closed Accounts Posts: 7,488 ✭✭✭SantaHoe


    Dunno if this is T&L, but I saw some screenshots of the players having cool after-trails.
    So I think it does.

Or was I just tripping?! :)


  • Closed Accounts Posts: 1,341 ✭✭✭Koopa


that's nothing to do with hardware support for T&L

    [This message has been edited by Koopa (edited 30-03-2000).]


  • Registered Users Posts: 486 ✭✭acous


    i saw this guy once...
he was running along, next thing he shoots the ground with a rocket and jumps at the same time, causing him to explode upwards in a flying ball of virtual body parts... this means that there's hardware T&L support.

    or am i just a nutter?


  • Closed Accounts Posts: 2,313 ✭✭✭Paladin


What worries me, acous, is that you don't mention Quake anywhere. You must live in a rough neighbourhood.


  • Closed Accounts Posts: 1,039 ✭✭✭Vorosha


    Anything that isn't Quake related please take to a different board. Thank you.

    Daire O'Neill.


  • Registered Users Posts: 486 ✭✭acous


rocket == hardware, explosion == lighting, man -> guts == transformation.

    i rest my case

    or am i just a nutter?


  • Registered Users Posts: 4,146 ✭✭✭_CreeD_


I think, therefore I spam.... No, that's not what I meant.... Let's see.

Anything that uses OpenGL's T&L path will use hardware T&L (so long as the drivers don't have it turned off, à la Savage 2000). Carmack has said that they always use OpenGL as purely as possible. So Quake 1/2/3 and anything using their engines do automatically make use of it. To what extent I don't know; chances are they had to optimise the thing for CPUs instead and have some trickery in there that may take away from the advantage of onboard T&L, since the improvement isn't meant to be stellar.
Actually, in a lot of stuff CPU T&L on a 700MHz+ chip has proven faster than the GeForce's. The advantage, though, is that even if you get the same FPS, your CPU will be breathing a bit easier with onboard T&L taking the load.

The 5.13s do speed up OpenGL a LOT, and add 2-tap full-screen AA, which does slow things down but is worth a look (actually Quake 2 engine games (I mainly play HL/TFC) run perfectly with it on, since it isn't pushing your system anywhere near as hard as Q3).
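To illustrate the point (a hypothetical sketch, not anything from id's code): whether a game "uses the T&L path" mostly comes down to who multiplies the vertices by the modelview matrix. Submit raw vertices plus the matrix and the driver is free to run the transform on a T&L chip; pre-transform on the CPU and submit finished geometry, and it can't.

```python
def transform(vertex, matrix):
    """Multiply a 4x4 row-major matrix by a homogeneous vertex."""
    return tuple(sum(matrix[r][c] * vertex[c] for c in range(4))
                 for r in range(4))

# One vertex in model space (x, y, z, w).
vertex = (1.0, 2.0, 0.0, 1.0)

# A modelview matrix that translates 5 units along -z.
modelview = [[1, 0, 0, 0],
             [0, 1, 0, 0],
             [0, 0, 1, -5],
             [0, 0, 0, 1]]

# "Software T&L": the game does this multiply for every vertex, every
# frame, on the CPU.  "Hardware T&L": the driver hands the very same
# multiply to the card's geometry unit.  The rasterizer sees identical
# results either way - which is why pure API usage can pick it up free.
print(transform(vertex, modelview))  # (1.0, 2.0, -5.0, 1.0)
```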


    [This message has been edited by _CreeD_ (edited 01-04-2000).]


  • Closed Accounts Posts: 1,341 ✭✭✭Koopa


no it won't
the game has to support hardware T&L; anything running OpenGL WON'T use hardware T&L; Quake 1 and Quake 2 DON'T use hardware T&L

i don't think even UT supports hardware T&L, as it's using the same engine as Unreal i think


  • Closed Accounts Posts: 7,488 ✭✭✭SantaHoe


Yes it WILL, Koopa, it WILL.
Yes, Quake 1 DOES support full h/w T&L, as do DOOM and Wolfenstein.
YES they DO.

LOL :)

    Jesus, don't freak out man.


  • Closed Accounts Posts: 1,341 ✭✭✭Koopa


**** sorry man, i almost got out of my chair for that one.. but being as fat as i am, i decided it wasn't worth causing another earthquake in japan


  • Registered Users Posts: 4,146 ✭✭✭_CreeD_


Koopa, Q3 does. OpenGL has had the ability since 1.0 (since most high-end cards had some form of geometry/T&L processor). The crux was whether the engine deliberately bypassed this feature and used its own - so you're right about OpenGL games not necessarily using it (UT has its own engine, even in OGL mode). But.... from what I've read in interviews with Carmack, they did not bypass it. So while Q3 was not written specifically with HW T&L in mind, it will use it to some extent if it's there.
I heard the same said of Quake 1 and 2; they used the same ideology of 'pure' OpenGL code. You just won't really notice it, since any system that has a GeForce is fast enough to blast them along anyway (and Quake 1 wasn't exactly a polygon whore).

The biggest problem is DirectX. It has had a software T&L engine since DX6, but that was crap and ignored, and specific HW T&L support only arriving in DX7 means it's a rarity to see games with support built in. So they're usually said to have 'special support added', when all they're really doing is finally playing by the book with their DX code (at least, by Microsoft's book anyway).

Doom and Wolfenstein?.... I wonder if ZDoom uses the OGL path? Hmmm, wasted horsepower to boast about..... And they're doing an OGL Wolfenstein, so you never know.

    [This message has been edited by _CreeD_ (edited 02-04-2000).]


  • Closed Accounts Posts: 1,341 ✭✭✭Koopa


i know Q3 does, but Q2 and Q1 definitely don't, and i'm pretty sure UT doesn't either

most high-end cards had a T&L processor?
nvidia claim that the GeForce was the first card with hardware support for transform and lighting..
the Voodoo-based cards didn't have onboard transform and lighting processors in hardware, and neither did any previous cards from nvidia..

it has nothing to do with whether the game code was written with T&L in mind or not; it has to do with how many triangles per frame a game has, and how fast the processor or T&L unit is. games "designed" for hardware T&L basically just have to make sure they have exactly the number of triangles per frame (or fewer) that the T&L unit they're targeting can handle at a decent framerate, no more.. while software T&L games have to decide this triangle count using the throughput of an "average" CPU
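The triangle-budget arithmetic being described works out like this (all throughput figures here are illustrative assumptions, not measured numbers):

```python
# A throughput figure (CPU or T&L unit, triangles/sec) divided by a
# target frame rate gives the per-frame triangle count a game can
# "design" for.

def tris_per_frame(throughput_tris_per_sec, target_fps):
    """Largest per-frame triangle count that still hits target_fps."""
    return throughput_tris_per_sec // target_fps

# GeForce's advertised peak vs. a made-up figure for software T&L on a
# fast CPU - both numbers assumptions for the sake of the example.
print(tris_per_frame(15_000_000, 60))  # 250000
print(tris_per_frame(4_000_000, 60))   # 66666
```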



  • Closed Accounts Posts: 1,341 ✭✭✭Koopa


ah, you edited your post while i was posting, which makes my first line unnecessary:
"i know Q3 does, but Q2 and Q1 definitely don't"


  • Registered Users Posts: 4,146 ✭✭✭_CreeD_


Okay, knowing this is one of those minor things that could actually escalate into 200 posts of "Yes it does... Oh no it doesn't", and since I have made the odd <Cough> wrong comment here before...... here's what Brian Hook (programmer at id for Q1/2/3) had to say when details of T&L on the GeForce were leaked.
This is what you do when you're in work on a Sunday morning.... :)


    _____________________________________________

    August 11, 1999 - (6:30am MDT)

    Dear Brian,

    What's the deal with T&L that's done onboard and how they are used by API's. IE: for a game to use of onboard T&L extensions in OpenGL does it need to specifically be written to make use of those OpenGL extensions? Or does OpenGL figure that out already and the 3d card manufacturer just has to include it in their OpenGL driver?

    If you could fill us all in with a little more on onboard T&L info that would kick ass.

    Aaron


    From Brian:
    Aaron,

    Hardware transformations and lightings do not need OpenGL extensions if an application was written to use OpenGL's native transformation and lighting capabilities. Good old GLQuake supports hardware T&L on, say, Intergraph workstations with hardware T&L. No special code needed to be written, just the regular API was supported.

    The same is somewhat true of D3D, although the engineers at Microsoft have had some "problems" getting a hardware T&L driver spec sorted out for the past few years.


  • Registered Users Posts: 4,146 ✭✭✭_CreeD_


Oh dear.... :) .... I didn't see your second post before I dropped the above one in.

By high end I meant the 1000+ mark. Voodoos etc. are high-end gaming cards, not CAD cards - a whole different ballgame (which may have crap fillrates, but have had blistering T&L for years; even cheapy Permedia 2s had a Glint T&L processor, it was just useless as a game card because of its texture-handling features and fill rate).
Nvidia officially claimed to have the first GPU (Graphics Processing Unit) because it was the first that merged great fillrates with HW T&L, not because it was the first card ever with HW T&L. It's meant to be the best of both worlds.

The principle of knowing how many triangles to process is pretty much the same for CPU and 'GPU' (stupid term). But a lot of games are starting to use Level of Detail for polygon counts too (or make them user-adjustable, like Quake 3), so they'll scale the count with your hardware. It won't be exact, but it's still an improvement over wasting the system resources.
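The Level of Detail idea can be sketched like this (the level sizes and budgets are invented purely for illustration):

```python
# Scale polygon count to the hardware instead of hard-coding one
# triangle budget: pick the most detailed mesh whose total per-frame
# cost still fits what the CPU or T&L unit can push.

LOD_TRIANGLES = [2000, 800, 300]  # per-model detail levels, high to low

def pick_lod(frame_budget, visible_models):
    """Most detailed level whose total per-frame cost fits the budget."""
    for tris in LOD_TRIANGLES:
        if tris * visible_models <= frame_budget:
            return tris
    return LOD_TRIANGLES[-1]  # everything on screen at lowest detail

print(pick_lod(100_000, 40))  # 2000 - fast hardware gets full detail
print(pick_lod(20_000, 40))   # 300 - slow hardware scales down
```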

    [This message has been edited by _CreeD_ (edited 02-04-2000).]


  • Closed Accounts Posts: 890 ✭✭✭Wyverne


wolfenstein 3d, man, those were the days, hell yes.

    /me is all bleary eyed and reminiscent


  • Closed Accounts Posts: 1,484 ✭✭✭El_Presidente


    T&L

    buh?


  • Registered Users Posts: 488 ✭✭Sonic


sam, professional graphics cards like the Cobalt chipset from Silicon Graphics and chipsets from Evans & Sutherland have always had separate T&L support, but the GeForce was the first to put it all on one chip and be at a reasonable price (under £2000)

    [This message has been edited by Sonic (edited 03-04-2000).]


  • Registered Users Posts: 488 ✭✭Sonic


i think i should read all the posts b4 i reply.. ya, what u said creed :cool:


  • Registered Users Posts: 4,432 ✭✭✭Gerry


    T&L = Transform and Lighting. If this can be done on the graphics card and not on the cpu, it takes some of the load off the cpu, giving a higher frame rate.


  • Registered Users Posts: 488 ✭✭Sonic


    yes bai


  • Closed Accounts Posts: 1,484 ✭✭✭El_Presidente


    I see, but how much does this free weekend COST?


  • Registered Users Posts: 77 ✭✭pox


    And Just when IS this free weekend?


  • Registered Users Posts: 4,146 ✭✭✭_CreeD_


    9 Tales of woe spoken in black and white at the end of silence.

Those not wishing to listen to another useless lecture, run for your lives. El Presidente, I hope this answers your question.

Traditionally your 3D card (game card) handles the rendering, i.e. the 'painting' of the scene. But your CPU is still creating the polygons it's doing the decoration job on; it's still responsible for all the geometry.
Think of it like your CPU is still drawing good old line vector graphics, those lurvely see-through triangley jobbies from the old days (like Elite on the C64). A large part of this is called Transform and Lighting (there are a few other stages, but I don't know much about them).
Transform is basically the formation and control of the hundreds/thousands of triangles that make up your model (they're the most flexible, easily computed shape; when you move, or the view changes, the polygon shapes have to change). Lighting refers to how dynamic light sources will affect these polygons (e.g. if polygon 100 is over there, what angle is light 4 going to hit it at? - your video card uses this lighting info to render alpha blends or textures or whatever the engine uses for lighting, but the actual co-ordinates are decided by the CPU).
This takes a LOT of CPU horsepower.
A card with hardware T&L does this instead of the CPU. The theory is that, like using traditional cards to render, this specialised bit of hardware can do it much faster than a generalised thing like the processor.
If the GeForce lived up to its 15 million triangles a second claim it would kill any CPU out there. But of course it doesn't. High-end PIIIs and Athlons can outdo it in some instances.
But if you use it, that leaves your CPU free to do other things.
From my personal experience with the 5.13 drivers, the GeForce is about 10% faster than an Athlon 700 in benchmarks with hardware T&L on vs. off (in 3DMark 2000, anyway).
This also creates a problem: instead of just transmitting finished polygon data and textures to the video card, you're now also sending lots of raw model data to be transformed. Most reviewers speculate that AGP 4x will help the most when high-poly-count games appear, to handle all the extra traffic. And since most Athlon systems have the GeForce running at 1x, this might help to explain the smaller gap.
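A worked example of the "lighting" half described above (a sketch of textbook per-vertex diffuse lighting, not any particular engine's code):

```python
import math

# Classic per-vertex diffuse lighting is the dot product of the surface
# normal with the direction to the light, clamped at zero.  This, times
# thousands of vertices and several lights every frame, is the kind of
# arithmetic a hardware T&L unit lifts off the CPU.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def diffuse(normal, to_light):
    """Lambertian intensity in [0, 1] for one vertex and one light."""
    return max(0.0, dot(normalize(normal), normalize(to_light)))

# Vertex facing straight up, light directly overhead: full intensity.
print(diffuse((0, 1, 0), (0, 1, 0)))            # 1.0
# Same vertex, light at 45 degrees: cos(45) ~ 0.707.
print(round(diffuse((0, 1, 0), (1, 1, 0)), 3))  # 0.707
```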


    _____________________________________________
Okay, that's it; methinks it's time for me to part ways with this thread.
    Have fun.

    [This message has been edited by _CreeD_ (edited 03-04-2000).]

