
Next generation graphics cards just around the corner

  • 14-04-2004 9:04pm
    #1
    Registered Users Posts: 6,007 ✭✭✭


The Reg:

    After the arguably disappointing GeForce FX family of products, Nvidia has launched its latest graphics chip, which has been the subject of rumours from all corners of the world over the past few weeks. But now it's finally here and it's officially called the GeForce 6800. That's it, no fancy moniker in front of the numbers this time, just plain old 6800. But this time Nvidia has a product that will blow your socks off and no, I don't just mean a noisy fan this time around, I'm talking about the quality of the graphics and the outlandish performance, writes Lars-Göran Nilsson

    ...

    If you've saved up all your hard earned cash and planned to remortgage your house to get one of these new wonder cards, the good news is that as with recent top-end nVidia products, the GeForce 6800 Ultra will be launched at £399 inc VAT. And even if you don't have the fastest computer in the world, the GeForce 6800 Ultra will still do it justice as long as you're willing to play all your games at 1600 x 1200 resolution with 8x anti-aliasing and 8x anisotropic filtering.

    ...

But what Nvidia should be supplying in the box is a discount voucher for a new power supply, as the GeForce 6800 Ultra needs two power lines to itself from the power supply. But worse than this is the fact that Nvidia recommends a minimum 480W power supply to make sure that the graphics card is supplied with enough juice. In return, you do get the most powerful graphics card on the market, which should convince enough users to fork out the extra money for a new power supply. Just don't expect to put one of these babies in a small form factor box.

    ...
Tom's Hardware:

12,000 points in 3DMark 2003. A score of over 60,000 in AquaMark 3. Over 60fps in Halo at 1600x1200 and more than 50fps in FarCry with high FSAA and 4-tap anisotropic filtering at 1024x768 - these are numbers that will bring tears of joy to PC enthusiasts everywhere.

You'd have to go back quite a bit in the history of graphics cards to find a performance leap of similar magnitude. Maybe the transition from 3dfx's Voodoo1 to the Voodoo 2 comes close. Or the jump from NVIDIA's TNT2 to the GeForce 256 DDR, or perhaps the transition from ATi's Radeon 8500 to the 9700 Pro... These developments might come close if, for the moment, we leave aside the technological quantum leaps of the past. But let's start at the beginning.

    ...

Interesting, to say the least.



    Re prices:

The Reg lists £399 (€597) at release.

TH lists them as follows (see the rough fill-rate sketch below):
    6800 Ultra, $499(€417), 16 pipes, two Molex connectors, two slots, 400/550
    6800, $299(€249), 12 pipes, one Molex connector, one slot, TBD
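
For a rough sense of what those pipeline counts mean, theoretical pixel fill rate is commonly estimated as pipelines × core clock. A minimal sketch; the plain 6800's clocks are listed as TBD above, so the 350MHz figure below is purely a placeholder assumption:

```python
# Rough theoretical pixel fill rate: pipelines x core clock (MHz) -> Mpixels/s.
cards = {
    "GeForce 6800 Ultra": {"pipes": 16, "core_mhz": 400},
    "GeForce 6800":       {"pipes": 12, "core_mhz": 350},  # placeholder clock, listed as TBD above
}

for name, spec in cards.items():
    fill_mpixels = spec["pipes"] * spec["core_mhz"]
    print(f"{name}: ~{fill_mpixels} Mpixels/s theoretical fill rate")
```

With those inputs the Ultra works out at around 6,400 Mpixels/s against roughly 4,200 for the plain card, though the latter depends entirely on the assumed clock.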



Comments

  • Registered Users Posts: 6,560 ✭✭✭Woden


Hell yeah it is, come on ATI, show us what ya got


  • Moderators, Computer Games Moderators Posts: 23,176 Mod ✭✭✭✭Kiith


It's just a shame you'll have to sell your computer to afford it.


  • Registered Users Posts: 7,136 ✭✭✭Pugsley


Don't need new gfx cards atm, but they'll help in a few years' time, and drop the prices of current gfx cards, so no harm in it :)


  • Registered Users Posts: 6,560 ✭✭✭Woden


Ah, they'll be no more expensive than any other high end card when they are released, I reckon.


  • Closed Accounts Posts: 944 ✭✭✭Captain Trips


I'd find it hard to justify this; a 9700 Pro and 2800XP+ get FarCry running smoothly at the standard Very High settings. Maybe it depends on HL2 and D3, if they ever turn up.


  • Closed Accounts Posts: 2,918 ✭✭✭Deadwing


It'll be cool to see what they can do in the long run, but right now I'll keep my money and my 9800 Pro. There are no games at the moment that will really need that kind of raw power, and shelling out 400-odd euro for an FPS boost of 10 or 20 in Far Cry is going a bit overboard, methinks.


  • Moderators, Category Moderators, Computer Games Moderators Posts: 51,391 CMod ✭✭✭✭Retr0gamer


    The power supply business is a bit dodgy to say the least. It will be interesting to see if the ATI card will have the same inconvenience and if not how it will affect its performance and sales.


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


The power supply recommendation is high because Intel's CPUs are already soaking up stupid amounts of power. If they got things under control on their end then the recommended PSU would prolly be 350W to 430W. Besides, people neglect their PSUs, 'bout time they upgraded. ;)

    In regards "only a bit faster"... have to wonder what you play. Sure, if you play one of the few games, that only show a 33%+ performance increase while improving image quality (and only then at low res), then perhaps you personally can be satisifed with the current gen. I play at 1920x1080, which is too high a res to put AA or AF on with current gen cards, not to mention enabling Stereoscopic 3D, which causes a huge performance hit.

Some games just don't run well on current cards at even "standard" settings... look at Halo and LockOn; a jump from 23fps to 50fps will make them different games.



    Matt


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by Deadwing
It'll be cool to see what they can do in the long run, but right now I'll keep my money and my 9800 Pro. There are no games at the moment that will really need that kind of raw power, and shelling out 400-odd euro for an FPS boost of 10 or 20 in Far Cry is going a bit overboard, methinks.


    That 20fps equates to an 80% increase in FPS:

    http://www.firingsquad.com/hardware/nvidia_geforce_6800_ultra/page23.asp
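
For anyone checking the arithmetic, an 80% gain from a 20fps jump implies a baseline of around 25fps. A minimal sketch of the percentage calculation; the 25fps/45fps pair below is illustrative, not taken from the linked review:

```python
# Percentage FPS increase going from a baseline framerate to a new one.
def fps_gain_percent(baseline_fps: float, new_fps: float) -> float:
    return (new_fps - baseline_fps) / baseline_fps * 100.0

# Illustrative figures only: a 20fps gain on a 25fps baseline is an 80% increase.
print(fps_gain_percent(25.0, 45.0))  # -> 80.0
```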

Unless you don't put AA on your existing EUR400 card?



    Matt


  • Closed Accounts Posts: 2,918 ✭✭✭Deadwing


Actually I don't put AA on my card because I find the difference negligible in most cases, and it's not worth the performance hit.
I'm sure the NV40 (or 6800) will be a great card, but buying another 400 euro (if it's even that cheap) card when my current one does the same job? No thanks. I'll wait until the next gen cards really come into their own with Doom 4 or Half-Life 3 or something before they warrant a purchase.


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by Deadwing
Actually I don't put AA on my card because I find the difference negligible in most cases, and it's not worth the performance hit.
I'm sure the NV40 (or 6800) will be a great card, but buying another 400 euro (if it's even that cheap) card when my current one does the same job? No thanks. I'll wait until the next gen cards really come into their own with Doom 4 or Half-Life 3 or something before they warrant a purchase.


    Well, in that case I envy your poor eyesight and/or small monitor! :p



    Matt


  • Registered Users Posts: 2,614 ✭✭✭BadCharlie


You play at such a high res, how on earth do you see anything on the screen??

You have a 21-inch monitor or something?


  • Registered Users Posts: 15,815 ✭✭✭✭po0k


    Originally posted by Matt Simis
    Well, in that case I envy your poor eyesight and/or small monitor! :p



    Matt

I play games at resolutions from 800x600 (Far Cry) to 1280x1024, and tbqh, once I go past 1024x768 I would never consider enabling AA.
    I simply don't see the difference and it's of no benefit to me, especially in fast-moving games (of which the majority of my gaming time is spent playing).


  • Registered Users Posts: 1,272 ✭✭✭i_am_dogboy


Here's a link to a review of a reference model:
http://www.beyond3d.com/previews/nvidia/nv40/index.php?p=2
To be honest I think Nvidia have either ****ed up royally somewhere or they are scared ****less of ATI's next line of cards.......if you look back over the last few years, they haven't exactly released any cards with such ridiculously high specs; not even the GeForce 2's specs (and that was an absolute beast) can compare to this. Anyway, I won't be buying one.........I'd much rather be able to buy a new Voodoo :(

It would be wise to wait and see what S3 and XGI have up their sleeves before buying a new card, hell, even Matrox might do something on a par with the G400

edit
I forgot to mention, as the article points out, Nvidia have been known to bull**** a bit about their pipelines; the GeForce FX 5950 was said to have 8x1 pixel pipelines, which would suggest 8 pixel pipelines, but it had 4 pipes with 2 texture units per pipe.......which is a completely pointless addition to my post


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by SyxPak
I play games at resolutions from 800x600 (Far Cry) to 1280x1024, and tbqh, once I go past 1024x768 I would never consider enabling AA.
    I simply don't see the difference and it's of no benefit to me, especially in fast-moving games (of which the majority of my gaming time is spent playing).

I appreciate we all see things differently (literally).. but seriously, what size monitors do you use??

I have a 24" CRT; even at 1280x1024 the pixels are close to 1mm in size, more than big enough to see. Also, without AA you suffer "pixel popup": objects in the distance that are just too far away to render correctly at low res, so they pop in and out of the visual gameworld (obviously this happens on any monitor, it's a problem with low res). This is all ignoring AF, which at full whack turns the slimy, blurred textures of the standard filtering into sharp, detailed ones. Without it, it's clear you are walking in an "orb" of sharper textures that ends about 2.5m in front of you. And just like AA, the absence of AF means certain details will not be rendered correctly (or even be visible). See FiringSquad's LockOn pictures.
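
As a rough sanity check on pixel size, horizontal pixel pitch is just the visible screen width divided by the horizontal resolution. A minimal sketch, assuming a 16:10 tube and treating the quoted 24" as the visible diagonal (the real viewable area is a bit smaller, and how a 5:4 mode is mapped onto the tube changes the figure):

```python
import math

def pixel_pitch_mm(diag_inches: float, aspect_w: int, aspect_h: int, res_x: int) -> float:
    """Approximate horizontal pixel pitch (mm) from diagonal size, aspect ratio and resolution."""
    diag_mm = diag_inches * 25.4
    width_mm = diag_mm * aspect_w / math.hypot(aspect_w, aspect_h)
    return width_mm / res_x

# Assumed figures: 24" diagonal, 16:10 tube, 1280 horizontal pixels.
print(round(pixel_pitch_mm(24, 16, 10, 1280), 2))  # ~0.4 mm per pixel with these assumptions
```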

Virtually all games played on such high end cards are "fast moving". If they weren't fast moving, it's unlikely you would need such fast cards. I play BF:Vietnam at 1600x1000.. and frankly it looks like a different game when 2xAA and 8xAF are turned on. I simply couldn't play at lower settings.

I'm amazed that there is anyone who could play at lower settings in this day and age, especially when you have meaty cards like the 9800s.



    Matt


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by i_am_dogboy
Here's a link to a review of a reference model:
http://www.beyond3d.com/previews/nvidia/nv40/index.php?p=2
To be honest I think Nvidia have either ****ed up royally somewhere or they are scared ****less of ATI's next line of cards.......if you look back over the last few years, they haven't exactly released any cards with such ridiculously high specs; not even the GeForce 2's specs (and that was an absolute beast) can compare to this. Anyway, I won't be buying one.........I'd much rather be able to buy a new Voodoo :(

It would be wise to wait and see what S3 and XGI have up their sleeves before buying a new card, hell, even Matrox might do something on a par with the G400


Look at what they were competing against back in the GF2 days.. They are merely responding to the competition; there is nothing shocking or "iffy" about the GF6800, it's merely the competitor to the incredibly fast ATI cards. I wouldn't even list the GF2 as groundbreaking: GF1 or GF3, yeah, but the GF2 was just a small evolutionary step. Remember the overpriced GF2 Ultra.. all it had was a higher clock speed and green heatsinks?

If you want Voodoo cards, buy the GeForce 6800; many of the 3dfx people worked on this card, in fact one of them is a team leader on the design team at nVidia.


    Matt


  • Registered Users Posts: 1,272 ✭✭✭i_am_dogboy


I wasn't saying the GeForce 2 was impressive, but the specs for the time were pretty amazing; the GeForce 1 couldn't even handle the features it was meant to provide, while the GeForce 3 was a great card. And when I say I want a Voodoo, I want a Voodoo by 3dfx........it just doesn't seem the same without them.......I remember the excitement the day I bought my Voodoo 3.

The 6800 is looking good, but is the top range model just going to be a mythical piece of hardware like the Voodoo 5 6000?

Still, I'm gonna wait and see the retail models and what happens with the other companies before I form a proper opinion.


  • Closed Accounts Posts: 944 ✭✭✭Captain Trips


    Originally posted by i_am_dogboy


It would be wise to wait and see what S3 and XGI have up their sleeves before buying a new card, hell, even Matrox might do something on a par with the G400


    Sure guy, right as soon as my Amiga order arrives, I'll order a video card from Matrox.


  • Registered Users Posts: 3,754 ✭✭✭Big Chief


    Availability
    The first GPUs based on the NVIDIA GeForce 6 Series, the GeForce 6800 Ultra and GeForce 6800 models, are manufactured using IBM’s high-volume 0.13-micron process technology and are currently shipping to leading add-in-card partners, OEMs, system builders, and game developers.

    Retail graphics boards based on the GeForce 6800 models are slated for release in the next 45 days.

Does this mean they will be on a shelf near us soon?

I am awaiting ATI's reply to this...


  • Registered Users Posts: 15,815 ✭✭✭✭po0k


0.13µ is a bit shit though, compared to IBM's mastery of the 90nm process (970FX, 2GHz 64-bit @ 24W :)).
'Spose it does allow them to punch out more chips using a tried and tested fab process, though not as many as 90nm on 300mm wafers would....
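
The rough die-count arithmetic behind that point: a bigger wafer and a smaller process both multiply the number of candidate chips per wafer. A minimal sketch using the common gross-die approximation; the 280mm² die area and the wafer sizes are purely illustrative, since neither is quoted here:

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Common gross-dies-per-wafer approximation (ignores defects and yield)."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

die_130nm = 280.0                        # illustrative die area at 0.13 micron, in mm^2
die_90nm = die_130nm * (90 / 130) ** 2   # ideal area scaling for a shrink to 90nm

print(gross_dies_per_wafer(200, die_130nm))  # the illustrative die on a 200mm wafer
print(gross_dies_per_wafer(300, die_130nm))  # the same die on a 300mm wafer
print(gross_dies_per_wafer(300, die_90nm))   # the shrunk die on a 300mm wafer
```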


  • Registered Users Posts: 6,560 ✭✭✭Woden


I'd say it will be 2 months before cards hit the shelves.

@dogboy, you can be sure that if the retail cards don't perform like that, Nvidia will be laughed at. I'm sure the retail cards will be better, with variability in the cooling setups and, if they desire, somewhat higher frequencies.


  • Registered Users Posts: 6,892 ✭✭✭bizmark


    Originally posted by Captain Trips
    Sure guy, right as soon as my Amiga order arrives, I'll order a video card from Matrox.

    LOL :D

What are you talking about, dogboy? The 6800 is a masterpiece, double the performance of the 9800XT. How could Nvidia have "****ed up royally" by producing a card that kicks the **** out of the last generation's best card??

Methinks you're stuck in a time warp or something.


  • Registered Users Posts: 1,272 ✭✭✭i_am_dogboy


    Originally posted by bizmark
    LOL :D

What are you talking about, dogboy? The 6800 is a masterpiece, double the performance of the 9800XT. How could Nvidia have "****ed up royally" by producing a card that kicks the **** out of the last generation's best card??

Methinks you're stuck in a time warp or something.
Yeah......I'm kind of stuck in the past when it comes to games......I wanna play Sonic 1 again......

What I meant by the ****ed up royally comment is that they might have screwed something up to require such specs (benchmarks don't prove every aspect of performance); in most cases the new generation of cards kicks the ass of the previous generation anyway.....I'm thinking of the Radeon 9500 Pro taking on the Ti 4400 and 4600 for a far better price than both.

And data, I wasn't saying the retail model would perform worse, I just said I wanted to see the retail models; I meant that I wanted to see the different cards available and how they compare to each other.

And on another note, Matrox made great cards a few years ago. Did anyone see Slave Zero with bump mapping? At the time there was nothing like it out there.


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by i_am_dogboy
Yeah......I'm kind of stuck in the past when it comes to games......I wanna play Sonic 1 again......

What I meant by the ****ed up royally comment is that they might have screwed something up to require such specs (benchmarks don't prove every aspect of performance); in most cases the new generation of cards kicks the ass of the previous generation anyway.....I'm thinking of the Radeon 9500 Pro taking on the Ti 4400 and 4600 for a far better price than both.

And data, I wasn't saying the retail model would perform worse, I just said I wanted to see the retail models; I meant that I wanted to see the different cards available and how they compare to each other.

And on another note, Matrox made great cards a few years ago. Did anyone see Slave Zero with bump mapping? At the time there was nothing like it out there.

1) The Radeon 9700 was introduced when the Ti4600 reigned, not the 9500 (it came later). By your extremely unusual reasoning, it also must have "****ed up" because it had vastly better specs than the many incarnations of DX8 cards it trumped.
2) The Matrox G400 had good bump mapping, though the GeForce hardware had a similar method as well as the superior dot product bump mapping technique. Matrox did surprisingly well getting vendors to support them.. but the days of small companies competing on the high end front are over, it's simply not financially viable. Any of these minnows (Matrox, VIA, S3, PowerVR etc) could come out with a low end, very competitive card, but the staggering R&D and production costs of the high end cards (which sell less than 5% of the volume of the low-to-mid range cards, so there's no chance to recoup costs unless you have a "top to bottom" chip family ready to go) mean only the powerhouses of the industry need apply. Forget about your white knight.


    Matt


  • Registered Users Posts: 16,666 ✭✭✭✭astrofool


    dogboy's just trolling.

Well, I hope so, cos the other option is that he's extremely stupid; either way, it's probably safe to ignore :)

    3dfx powa! ;)


  • Registered Users Posts: 5,460 ✭✭✭shinzon


I will laugh heartily when these cards come out, when everyone rushes out to get one and install it in their machines and the inevitable happens:

1) I'm getting artifacts in such and such

2) This game won't work

3) Drivers are incompatible, etc etc

There's an old saying that I believe is applicable in this instance: "don't believe the hype". Until one of these cards is in your machine, up and running with no problems whatsoever, then believe it; don't take some tech bod's word for it at some hardware expo, he's there to sell the card and rubbish everything else.


    Shin


  • Registered Users Posts: 1,272 ✭✭✭i_am_dogboy


    Originally posted by Matt Simis
1) The Radeon 9700 was introduced when the Ti4600 reigned, not the 9500 (it came later). By your extremely unusual reasoning, it also must have "****ed up" because it had vastly better specs than the many incarnations of DX8 cards it trumped.
2) The Matrox G400 had good bump mapping, though the GeForce hardware had a similar method as well as the superior dot product bump mapping technique. Matrox did surprisingly well getting vendors to support them.. but the days of small companies competing on the high end front are over, it's simply not financially viable. Any of these minnows (Matrox, VIA, S3, PowerVR etc) could come out with a low end, very competitive card, but the staggering R&D and production costs of the high end cards (which sell less than 5% of the volume of the low-to-mid range cards, so there's no chance to recoup costs unless you have a "top to bottom" chip family ready to go) mean only the powerhouses of the industry need apply. Forget about your white knight.


    Matt
Ok, my original comment didn't really come out the way I meant it. What I was trying to say was that it strikes me as odd that Nvidia, a company who usually pioneer new technologies, have produced such a powerhouse of a card; it's not really like them........I just thought that maybe they may have screwed up some aspect of performance and are using the massive fill rate to compensate, kind of like covering up an achilles heel. I wasn't in any way implying that they ****ed up by designing and producing a very good, fast card. It is a bit extreme, I know, but it could happen with a reputation like theirs to protect. I didn't say that they definitely did **** up either.......I also suggested that they may be competing with ATI.

And the mention of the 9500 instead of the 9700 was just to prove the point of one generation being better than the previous; I thought the 9500 was a better example because it was a mid range card taking on the high end offerings from Nvidia.

You are right on the matter of Matrox and the other companies though.......It's a shame really, I was a big fan of theirs.

Shinzon, I gotta agree with you completely; the amount of problems I had with my GeForce was unreal and the performance just wasn't what I was expecting either. It didn't live up to the hype for me.


  • Registered Users Posts: 6,420 ✭✭✭Doodee


It is quite possible that Matrox or PowerVR could come out with a card to compete. For one, PowerVR backed the whole tile-based rendering approach, which was far more efficient; they ****ed up though by not providing hardware T&L (or was it texture compression?) on the cards.

If I recall, when those cards originally came out, everyone blew on about how their performance was better than the GF3-based cards in favoured games like Counter-Strike or Quake 3. I also remember suggestions about a new type of tile-based card, with significant performance increases etc.
Then again, don't believe what you read.

This is simply a jump in the generations. The GF4 cards and FXs belong to the 2.0GHz+ bracket of CPUs. It's now up to 3.4GHz and over a year since they arrived, so it's no wonder this card is out. Also, cards will go hand in hand with the games industry, that's sort of obvious. So what with HL2 meant to be here by now, etc, Nvidia are only supplying to the market what was needed. Nobody seems to want to settle for games just running anymore, they wanna show off their flash cards and big CPUs because they're overcompensating :D

I'm just curious as to whether ATI will support the newer generation of processes or do as they did with the 9500 and increase the performance of an older and more reliable one. (Nvidia backed their newer process in the FX series.)

Gah, it's been so long since I read a 15 page article on the latest and greatest GPUs *sniffle*


  • Closed Accounts Posts: 947 ✭✭✭neXus9


    Originally posted by Matt Simis
I appreciate we all see things differently (literally).. but seriously, what size monitors do you use??

I have a 24" CRT; even at 1280x1024 the pixels are close to 1mm in size, more than big enough to see. Also, without AA you suffer "pixel popup": objects in the distance that are just too far away to render correctly at low res, so they pop in and out of the visual gameworld (obviously this happens on any monitor, it's a problem with low res). This is all ignoring AF, which at full whack turns the slimy, blurred textures of the standard filtering into sharp, detailed ones. Without it, it's clear you are walking in an "orb" of sharper textures that ends about 2.5m in front of you. And just like AA, the absence of AF means certain details will not be rendered correctly (or even be visible). See FiringSquad's LockOn pictures.

Virtually all games played on such high end cards are "fast moving". If they weren't fast moving, it's unlikely you would need such fast cards. I play BF:Vietnam at 1600x1000.. and frankly it looks like a different game when 2xAA and 8xAF are turned on. I simply couldn't play at lower settings.

I'm amazed that there is anyone who could play at lower settings in this day and age, especially when you have meaty cards like the 9800s.



    Matt



What monitor have you got, and how much was it? It would be damn expensive to get a monitor that big with that kind of resolution and a decent refresh rate. What's your refresh rate anyway, at 1600x1000???


  • Registered Users Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by neXus9
What monitor have you got, and how much was it? It would be damn expensive to get a monitor that big with that kind of resolution and a decent refresh rate. What's your refresh rate anyway, at 1600x1000???

Sony W900 24" Widescreen; cost me EUR600, but they appear in the Buy and Sell and the Boards For Sale forum (Dubseller selling them afaik) for around the EUR400 mark from time to time. This model is old, but was originally very expensive (EUR2.5K). As long as you can get over the curvature (it's not a flat screen; its successor, the FW900, is), it's a great monitor.

    Resolutions of note:
    1920x1200, 75Hz
    1920x1080, 85Hz
    1600x1024, 85Hz
    1600x900, 100Hz
    800x600, 140Hz

The top 4 are all widescreen; the top two are the "ideal" modes, as you can fit two A4 pages side by side. I run at 1920x1080 on the desktop, great for web browsing. I'd need something as fast as the GeForce 6800 to play games at that res though. Any decent 21" monitor will do (at least) 1600x1200 nicely however, and why not run at high res if you have cards fast enough.. why retard your games to a lower res needlessly?
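
To put numbers on why the higher modes need so much more card, raw pixel throughput scales with resolution × framerate. A minimal sketch comparing a few of the modes listed above at an assumed 60fps target (the target is an assumption, not a measured figure):

```python
# Pixels per second the card has to fill at a given resolution and framerate.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

target_fps = 60  # assumed target framerate
for width, height in [(1024, 768), (1600, 1024), (1920, 1080)]:
    mpix = pixels_per_second(width, height, target_fps) / 1e6
    print(f"{width}x{height} @ {target_fps}fps: ~{mpix:.0f} Mpixels/s")
```

At that target, 1920x1080 is well over two and a half times the pixel load of 1024x768, before AA or AF enter the picture.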


    Matt

