
GeForce...right!

  • 07-09-1999 2:01pm
    #1
    Registered Users Posts: 2,518 ✭✭✭


    When's this out? :)


Comments

  • Registered Users Posts: 2,010 ✭✭✭Dr_Teeth


    The second week of October, roughly. It'll probably be the end of October before we'll be able to get a hold of one though..

    Teeth.


  • Registered Users Posts: 2,010 ✭✭✭Dr_Teeth


    Mind you, with its not-too-impressive fill rate, I reckon the next-gen 3Dfx card will kick it around the place.. it's all very well putting a hardware pipeline on a card, but all of the current games are still fill-rate limited at high res.

    Q3 has a max of around 10,000 polygons per scene, which really isn't a problem for decent machines these days - it's fill-rate and texture handling that need a boost in the short-term.

    Think of the GeForce not as an accelerator for the current generation of games, but as an enabler of the next generation. Games will have to be written to take full advantage of the GeForce, whereas 3Dfx's card is squarely aimed at pushing the performance of current games through the roof.. ah well, it should be a very interesting market in the next few months.. I feel sorry for Matrox and their G400 Max. :)

    Teeth.


  • Registered Users Posts: 2,518 ✭✭✭Hecate


    Yes, it looks and sounds impressive, but most of that is marketing blurb, and it won't be out here for quite a while - then we'll see how it performs.

    In the meantime a Neon or Xentor would be the best bet until 3Dfx's latest offering comes along.


  • Closed Accounts Posts: 6,275 ✭✭✭Shinji


    3dfx's new offering doesn't look interesting at all, I fear :(

    Even their top-secret announcement at ECTS, which we're all not allowed to talk about, wasn't all that interesting, since it was to do with software rather than next-gen hardware.

    The GeForce is very impressive - the fill rate isn't as high as anticipated, but it makes up for that with other features which reduce the fill rate you're actually going to need in the first place. Anyone who saw their tech demo knows that speed isn't gonna be an issue - with a processor twice as complex as a PIII and up to 128Mb of RAM onboard, this card is the best thing out there at the moment. The new 3dfx cards pale in comparison, and they look like they're playing catch-up with the TNT2 and the Neon while NVIDIA themselves race ahead on the next-gen path.

    Ja,
    Rob


  • Registered Users Posts: 2,518 ✭✭✭Hecate


    lol, Rob, you'd almost swear that you had a GeForce or something :)


  • Registered Users Posts: 2,010 ✭✭✭Dr_Teeth


    Think of it this way - on your current hardware, do any of the current games or betas of future games run too slowly at 640x480x16?

    If this is the case (ie, you have a processor slower than a P2) then you need more polygon crunching power, either in the shape of a faster CPU, or a graphics chip with hardware T&L.

    If (as is the case for most people now) all of the current games and most of the new ones (such as Q3 and UT) run great at low res and 16bit colour - then you already have all the T&L power you need. As you increase resolution and colour depth, the number of polygons being handled stays exactly the same, whereas the fill-rate goes through the roof.

    The current generation of games - up to and beyond Q3 - are completely fill-rate limited; the only effective way to increase the performance of these games (on decent CPUs) is more fill-rate (see the rough numbers at the end of this post).

    Hardware T&L is of course 'the way forward', but at the moment a card like the GeForce isn't going to make much of a difference beyond its increased fill-rate and memory.

    When the next generation of games, with their ultra-high polygon counts and CPU-intensive physics engines and AI, comes along, the GeForce and similar cards will stand out - but then, you lot will just drop the poly levels, screw with mip-cap settings, and kill the lighting so it looks like Quake 1 anyway, so why worry? :)

    In any case, if 3Dfx can bring out a card with significantly higher fill-rate than the competition, I reckon their cards will be the fastest on any game that we will see until the middle of next year - I hope for their sake they have a hardware T&L card out by then, or they'll eat Nvidia's dust for sure!

    Teeth.
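
    A rough back-of-the-envelope illustration of the fill-rate point above - the overdraw and frame-rate figures here are assumptions for the sake of the example, not benchmarks:

        /* Illustrative only: fill-rate demand grows with resolution while the
         * per-frame polygon count (and so the T&L load) stays constant.
         * Overdraw and target fps are assumed round numbers. */
        #include <stdio.h>

        int main(void)
        {
            const double fps      = 60.0;     /* assumed target frame rate              */
            const double overdraw = 3.0;      /* assumed average times each pixel drawn */
            const double polys    = 10000.0;  /* ~Q3-era polygons per scene             */
            const int res[][2]    = { {640, 480}, {800, 600}, {1024, 768}, {1600, 1200} };

            for (int i = 0; i < 4; i++) {
                double fill = res[i][0] * res[i][1] * overdraw * fps;
                printf("%4dx%-4d: %6.1f Mpixels/s fill needed, %.0f polys/s of T&L\n",
                       res[i][0], res[i][1], fill / 1e6, polys * fps);
            }
            return 0;
        }

    The polys/s column is the same at every resolution; only the fill requirement climbs, which is the sense in which current games are fill-rate limited.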


  • Closed Accounts Posts: 13 Geebee


    OK, but don't forget the GeForce is still fuppen fast too - it puts fill-rate up by 40% in Q3 with raw early drivers (I don't think this was even handing geometry over to the GeForce; it was done on the CPU - correct me if I'm wrong).

    This means the GeForce will be great for the current wave of games, and it will then be a nice card to have when geometry becomes a must. I suspect that by that stage (mid next year) everyone else will have caught up, so at the moment its timing is excellent. 3dfx have a point, giving more speed (80fps minimum with all effects on at 800x600.. anti-aliasing does make the res look higher) which will support the games for the next 6 months VERY well. But if you go the 3dfx route you will need to buy twice in the next year to stay with an excellent card, whereas Nvidia will offer you a pretty good card which should last the whole year.

    The demos are something else when compared to today's stuff. Walk around this tree-filled island and down to the sea, with excellent lighting effects, great mapping and loads of polys all the way.. 120,000 in one scene.

    Oh, developer types will HAVE to get the GeForce - it's gonna outperform any of the 3Dlabs cards at a far lower price. 3D Studio MAX should be very stable with it.

    Greenbean


  • Closed Accounts Posts: 6,275 ✭✭✭Shinji


    Teeth - I seem to recall a discussion on the OpenGL list about GPU systems, and the final conclusion was that if NVIDIA write the GeForce OpenGL ICD right, games running in OGL will get a significant speed boost from the on-card geometry functions without having to rewrite any game code (there's a quick sketch of what that looks like at the end of this post). DX7 supports GPU stuff transparently too, though not as well as OpenGL does.

    3dfx's fill rate in the next-gen isn't going skyward - they're concentrating on lamer features like the fabled T-Buffer (like anyone in their right mind is gonna use that; it'd be like turning around in this day and age and writing a GLIDE-only game) and other ummmm stuff that we're not allowed to talk about on public boards... :) But I can assure you all that it ain't earth-shaking :)

    Ja,
    Rob
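
    A minimal sketch of the 'no game code changes' point, assuming a plain OpenGL 1.x fixed-function path (the helper name is made up): the app just submits matrices, lights and vertices through the standard calls, and it's entirely up to the ICD whether the transform and lighting happen on the CPU or on a hardware T&L chip.

        /* Plain OpenGL 1.x fixed-function drawing. Nothing here is GeForce-specific;
         * whether the per-vertex transform & lighting runs on the CPU or on
         * hardware T&L is the driver's (ICD's) decision, not the game's. */
        #include <GL/gl.h>

        void draw_lit_triangle(void)           /* hypothetical helper */
        {
            glMatrixMode(GL_MODELVIEW);
            glLoadIdentity();
            glTranslatef(0.0f, 0.0f, -5.0f);   /* modelview transform, done wherever the ICD likes */

            glEnable(GL_LIGHTING);             /* per-vertex lighting, ditto */
            glEnable(GL_LIGHT0);

            glBegin(GL_TRIANGLES);
            glNormal3f(0.0f, 0.0f, 1.0f);
            glVertex3f(-1.0f, -1.0f, 0.0f);
            glVertex3f( 1.0f, -1.0f, 0.0f);
            glVertex3f( 0.0f,  1.0f, 0.0f);
            glEnd();
        }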


  • Registered Users Posts: 2,518 ✭✭✭Hecate


    Seems 3dfx are up a certain sh1t creek without a paddle.

    They're coming out with all these standards and fancy names but no hardware. Like throwing out a bunch of figures and benchmarks is actually gonna sell anything.


  • Registered Users Posts: 3,308 ✭✭✭quozl


    Hmm, the T-Buffer is a load of crap? Doubt it. Companies will support it because there'll be a large enough userbase within weeks of its release - many games even now support a couple of proprietary 3D audio standards, none of which has dominance over the others.
    T-buffering will be easy to add in, and will have a major effect on gameplay by hugely dropping the fps needed to be playable. The only reason we need 40+ frames per second to be playable is the lack of motion blur in each frame; the extra frames are needed so our eyes can't keep up and add in the blur themselves, making games look more like the real world and less jarring to our perceptions.
    However, with T-buffering, framerates of 25, like in films, will be perfectly acceptable to the eye without any of the jerkiness apparent to us at the moment (a rough software analogue of the idea is sketched after this post). So that's a pretty cool feature - instantly helping lower-end CPUs (3dfx once again being the way to go for ppl with crap CPUs) and adding a nice improvement in quality for all users.
    Not exactly an unimportant and useless piece of hardware.
    Still think Nvidia will walk all over them (hope so anyway) :)
    Q
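
    Nobody outside 3dfx has said exactly how the T-Buffer does this in hardware, but the software analogue of that motion blur is OpenGL's accumulation buffer: render a few sub-frames spread across the frame's time interval and average them. A sketch under that assumption (render_scene is a hypothetical function), not 3dfx's actual method:

        /* Accumulation-buffer motion blur: average N sub-frames spread across
         * one frame's time interval. A software analogue of the effect only -
         * NOT how the T-Buffer is implemented internally. */
        #include <GL/gl.h>

        extern void render_scene(double t);    /* hypothetical: draws the world at time t */

        void draw_motion_blurred_frame(double frame_start, double frame_length)
        {
            const int subframes = 4;

            glClear(GL_ACCUM_BUFFER_BIT);
            for (int i = 0; i < subframes; i++) {
                double t = frame_start + frame_length * i / subframes;
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                render_scene(t);
                glAccum(GL_ACCUM, 1.0f / subframes);   /* add this sub-frame, weighted */
            }
            glAccum(GL_RETURN, 1.0f);                  /* write the averaged result back */
        }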


  • Closed Accounts Posts: 1,341 ✭✭✭Koopa


    There is one snag.. if you have lower FPS, you have a lower sampling rate of the user's input as well.. remember all the ****e id Software got when they had Q2 running on a 10Hz input sampling rate or something (well, you shouldn't remember this Greg, cos you didn't even play Quake back then..!!)
    I think they removed that limit in Q2 long ago, I'm not totally sure.. if they didn't then they should remove it now!


  • Closed Accounts Posts: 13 Geebee


    Koopa, that was a restriction in Windows of 25 samples per second or something (maybe it's 40). This still exists, I think, unless you have run a program called ps2rate which can take it up to 100 (even 200) samples a second.

    The new Nvidia card will be very friendly to lower-end CPUs (in theory) because it frees them up. The downside is that as newer games take proper advantage of the GPU, they will fill up the freed CPU time with AI, etc. Theoretically a P200 will run Quake 3 as well as a P600 with the GPU active. This will probably turn out to be fairly true, but there are probably other factors like bus bandwidth on older computers etc.


  • Closed Accounts Posts: 1,341 ✭✭✭Koopa


    Nope, it's not ps2rate - this is before that, and was a limit in Q2 only (probably only the early versions, cos id Software got a lot of **** over it from QuakeWorld players who were switching at the time).

    The default Win95/98 mouse rate is 40Hz for PS/2.


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    I reckon within two years they'll have an AI card for improved bots :)


  • Registered Users Posts: 488 ✭✭Sonic


    Yes boy!! I want one of them!!!!


  • Closed Accounts Posts: 6,275 ✭✭✭Shinji


    There are a few.... issues.... with the T-Buffer. For a start, there's 3dfx's market share, which is still tiny due to the lack of OEM deals. Sure, they have about 50% of the retail 3D accelerator market, but that's a very small market - how many people buy standalone 3D accelerators off the shelf?

    The T-Buffer's blurring is poxy; it tends to motion-blur the object as well as the area behind it, which looks a bit crap close-up. It's good for some special effects, but ultimately does nothing that a GPU can't do with sheer poly-pushing. That blur, for example, can be done by a PlayStation - look at Metal Gear Solid - which shows up the current generation of PC accelerators somewhat!

    Ja,
    Rob

