
9800GX2 or GTX280?


Comments

  • Registered Users Posts: 3,357 ✭✭✭papu


    DanGerMus wrote: »
    Moose, I think what they've done is abandon the high-end single-core cards in favour of developing cheaper mid-range cards and then just slapping two together for the high end. Saves on development, I suppose.

    Squall, with the GTX 260 at that price you've made me think twice about my 4870 purchase. It hasn't shipped yet, though. With the 4870s apparently being such bad clockers, how far past them do you think the 260 will clock?

    The 4870s have no proper clocking tool yet because of the GDDR5; give it a few weeks, CCC will only take it so far. There's an overclocked edition coming soon at 800/1200 which will be taking on the 280, I believe. ATI cards usually have a lot of headroom left for clocking :D


  • Closed Accounts Posts: 852 ✭✭✭blackgold>>


    Squall, we all know you're a die-hard Nvidia fan, but why would you buy a card for €267 when a €140 card can do exactly the same thing?

    And as far as overclocking goes, why the hell do you buy a card that's more expensive, overclock it 50-100MHz and talk about it like you're actually making a difference to your gameplay? Overclocking is pointless and totally blown out of proportion, to the point of being ridiculous.


  • Registered Users Posts: 2,044 ✭✭✭Sqaull20


    blackgold>> wrote: »
    Squall, we all know you're a die-hard Nvidia fan, but why would you buy a card for €267 when a €140 card can do exactly the same thing?

    And as far as overclocking goes, why the hell do you buy a card that's more expensive, overclock it 50-100MHz and talk about it like you're actually making a difference to your gameplay? Overclocking is pointless and totally blown out of proportion, to the point of being ridiculous.

    Yeah, you probably wouldn't even notice the difference with the overclock, but it's nice to say you got a GTX 280 for €267, minus 128MB of memory :D

    The HD4850 is a fine card, but it's not as fast as a GTX 260, at least in the games I play anyway.


  • Closed Accounts Posts: 852 ✭✭✭blackgold>>


    But you don't have a 280 for €267, you have a 260 for €267, when a 4850 is on par, better in some games and vice versa.
    Your point of view is distorted by your support for one company over another. I fail to see your point about bragging that you overclocked anything; would you brag about overclocking a Core Duo from 1.7GHz to 1.8?


  • Closed Accounts Posts: 5,111 ✭✭✭MooseJam


    Why do graphics cores run at ~600MHz while CPUs run in the GHz range? Why are no GPUs clocked at a GHz?


  • Registered Users Posts: 3,357 ✭✭✭papu


    Whole different architecture. CPUs have 2 or 4 cores now, but in a GPU each stream processor is basically a small CPU, so you get hundreds of slow, simple cores instead of a couple of fast ones.
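    A minimal sketch of that idea, assuming a CUDA-style programming model (nothing from the thread, just an illustration): a GPU makes up for its ~600MHz clock by running the same tiny piece of work across thousands of threads at once, one per "small CPU".

        // Hedged sketch: many slow, simple cores beating one fast one on throughput.
        #include <cuda_runtime.h>
        #include <cstdio>

        // Each GPU thread scales exactly one element; the hardware schedules
        // thousands of these across its stream processors in parallel.
        __global__ void scale(float *data, float factor, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
            if (i < n)
                data[i] *= factor;
        }

        int main(void)
        {
            const int n = 1 << 20;                        // a million elements
            float *d = NULL;
            cudaMalloc((void **)&d, n * sizeof(float));   // device buffer
            scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);  // ~4096 blocks of 256 threads
            cudaDeviceSynchronize();                      // wait for the GPU to finish
            cudaFree(d);
            printf("done\n");
            return 0;
        }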


  • Closed Accounts Posts: 852 ✭✭✭blackgold>>


    They're a totally different kettle of fish, as papu said. I would read up on oscillator crystals, clock rates, buses and generally how CPUs work. Your question can't be answered sufficiently by us farts.
    This is a good read if you're really interested.
    http://en.wikipedia.org/wiki/Clock_rate


  • Registered Users Posts: 3,357 ✭✭✭papu


    1337MHz.


  • Registered Users Posts: 171 ✭✭WEST


    blackgold>> wrote: »
    Squall, we all know you're a die-hard Nvidia fan, but why would you buy a card for €267 when a €140 card can do exactly the same thing?

    And as far as overclocking goes, why the hell do you buy a card that's more expensive, overclock it 50-100MHz and talk about it like you're actually making a difference to your gameplay? Overclocking is pointless and totally blown out of proportion, to the point of being ridiculous.

    I can never understand the brand loyalty myself. Why would someone spend over €100 more for a card that does not provide better performance? It's not like a car, where everyone can see it. A graphics card is hidden away where no one can see it.

    I can see the marketing departments rubbing their hands in delight when they see the fanboys (aka nutjobs) reaching for their credit cards.


  • Closed Accounts Posts: 852 ✭✭✭blackgold>>


    I think this picture encompasses it quite nicely.
    [attached image: fanboy-anatomy.jpg]


  • Advertisement
  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    LMFAO :D


  • Registered Users Posts: 2,848 ✭✭✭Fnz


    WEST wrote: »
    I can never understand the brand loyalty myself. Why would someone spend over €100 more for a card that does not provide better performance?

    I'm not too up to date on all things GPU-related, but I have been told that Nvidia are more 'prompt' when it comes to driver updates. Things like that inspire loyalty.


  • Registered Users Posts: 3,357 ✭✭✭papu


    Fnz wrote: »
    I'm not too up to date on all things GPU-related, but I have been told that Nvidia are more 'prompt' when it comes to driver updates. Things like that inspire loyalty.

    They release a LOT of beta drivers, but more often than not they cause instability. There was a study that showed that about 30% of Vista crashes were caused by Nvidia drivers while ATI had only 9.3% (link).


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    Fnz wrote: »
    I'm not too up to date on all things GPU-related, but I have been told that Nvidia are more 'prompt' when it comes to driver updates. Things like that inspire loyalty.

    Nvidia use tactics similar to Creative: they drop support for certain devices at driver level. My mate's GeForce 5300 wouldn't run under Vista, as Nvidia decided it'd be best to force customers to buy a new GPU. An older ATI X1300 has a Vista driver and is supported.


  • Registered Users Posts: 2,848 ✭✭✭Fnz


    papu wrote: »
    They release a LOT of beta drivers, but more often than not they cause instability. There was a study that showed that about 30% of Vista crashes were caused by Nvidia drivers while ATI had only 9.3% (link).
    Yeah, I did hear about the Vista crash stats.

    It was a friend who told me about Nvidia "being better for drivers". I assume he knows what he's talking about (he's really into his games). My impression was that ATI users would be waiting longer for (sometimes vital) bug fixes (for new games, in particular).
    PogMoThoin wrote: »
    Nvidia use tactics similar to Creative: they drop support for certain devices at driver level. My mate's GeForce 5300 wouldn't run under Vista, as Nvidia decided it'd be best to force customers to buy a new GPU. An older ATI X1300 has a Vista driver and is supported.

    That does sound bastardish, alright. I can see why people might become ATI 'fanboys' as a result of such treatment.


  • Registered Users Posts: 3,357 ✭✭✭papu


    Fnz wrote: »
    Yeah, I did hear about the Vista crash stats.

    It was a friend who told me about Nvidia "being better for drivers". I assume he knows what he's talking about (he's really into his games). My impression was that ATI users would be waiting longer for (sometimes vital) bug fixes (for new games, in particular).

    That does sound bastardish, alright. I can see why people might become ATI 'fanboys' as a result of such treatment.
    Yeah, but ATI do release hotfixes which fix a lot of problems with new games; BioShock and the HD2900XT was one I can remember.


  • Registered Users Posts: 16,614 ✭✭✭✭astrofool


    ATI release drivers monthly, nVidia, at most, quarterly.

    The 5300 was the same generation as the Radeon 9700, two generations before the X1300. ATI still support the 9700, however :)


  • Registered Users Posts: 8,067 ✭✭✭L31mr0d


    Anyone think it's worth waiting for the 1GB HD4870?


  • Registered Users Posts: 171 ✭✭WEST


    L31mr0d wrote: »
    Anyone think it's worth waiting for the 1GB HD4870?

    That will depend on what resolution you play at and whether you use AA. If you have a 24" monitor or less, the 512MB should be enough. Best check the HardOCP review of the HD4870; it shows the highest playable settings for the card in a few games. From a quick glance at the review, it seems 512MB is enough.

    Plus it will depend on the games you play too. Crysis uses a lot of memory with AA, then again that game is not really playable with AA.


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    L31mr0d wrote: »
    Anyone think it's worth waiting for the 1GB HD4870?

    Apparently not. I read on XS that there's no need for 1GB as it's GDDR5; there's enough bandwidth with 512MB.


  • Registered Users Posts: 16,614 ✭✭✭✭astrofool


    Amount of RAM and bandwidth are independent measures.

    1GB is probably overkill for now, but I'm sure nVidia will be pushing devs to use the 1GB on their GTX 280.
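    To illustrate why the two are independent (a rough sketch with assumed figures, not something posted in the thread): bandwidth comes from the data rate and the bus width, so fitting 1GB instead of 512MB on the same bus changes capacity but not GB/s.

        // Hedged back-of-the-envelope sums; the HD 4870 figures below are assumed.
        #include <cstdio>

        // Theoretical bandwidth in GB/s: effective transfer rate (GT/s) times
        // bus width in bytes. Capacity does not appear anywhere in the formula.
        double bandwidth_gbs(double effective_gtps, int bus_bits)
        {
            return effective_gtps * (bus_bits / 8.0);
        }

        int main(void)
        {
            // GDDR5 at ~3.6 GT/s effective on a 256-bit bus, whether 512MB or 1GB is fitted
            printf("HD 4870: ~%.1f GB/s either way\n", bandwidth_gbs(3.6, 256));
            return 0;
        }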


  • Registered Users Posts: 3,357 ✭✭✭papu


    Yeah, but still it's a lot, and that's taken up in the page file as well, so really if it's not being used you're paying for the unused RAM, plus more memory being used by Windows!


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    Apparently Nvidia need more than 512MB and ATI don't. I'm searching for the post on XS that explains it.


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    Nvidia are dropping the prices of the 260 and 280 to compete; the HD4870 is better than they had hoped, more here


  • Registered Users Posts: 2,044 ✭✭✭Sqaull20


    PogMoThoin wrote: »
    Nvidia are dropping the prices of the 260 and 280 to compete; the HD4870 is better than they had hoped, more here

    It's supposed to be much bigger than that.

    Something like a €70-80 price drop on each card.

    As cheap as €410 for the GTX 280 and €230 for the GTX 260.

    SLI'd 260s look a good choice.


  • Registered Users Posts: 16,614 ✭✭✭✭astrofool


    PogMoThoin wrote: »
    Apparently Nvidia need more than 512MB and ATI don't. I'm searching for the post on XS that explains it.

    I'd guess that either they are not compressing, need more cache due to their architecture and so need more space, or it's to do with the number of chips needed to support a 512-bit bus (8 x 64-bit channels, though the 2900XT had a 512-bit bus and came with 512MB of RAM).
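    Rough sums on that last point (a sketch with assumed chip sizes, not something confirmed in the thread): the bus width fixes how many memory chips the board needs, and the per-chip capacity then fixes which frame buffer sizes are practical.

        // Hedged sketch: assumes 32-bit-wide GDDR chips, which is typical but
        // not stated anywhere in this thread.
        #include <cstdio>

        int main(void)
        {
            int bus_bits  = 512;                    // GTX 280 / 2900XT-style bus
            int chip_bits = 32;                     // assumed interface width per chip
            int chips     = bus_bits / chip_bits;   // 16 chips needed to fill the bus
            printf("%d chips -> %dMB with 32MB chips, %dMB with 64MB chips\n",
                   chips, chips * 32, chips * 64);  // 512MB or 1024MB totals
            return 0;
        }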

