
1GB 2900XT :o

Comments

  • Registered Users Posts: 3,969 ✭✭✭christophicus


    LOL, nice link ;) :P

    I don't think it will be that useful really. I don't think it's the RAM holding that card back at all; I defo think the GPU is underpowered/not being used efficiently.


  • Closed Accounts Posts: 4,757 ✭✭✭8T8


    It is clocked higher as well, though whether that will make any difference is another matter. I think this is the ill-fated XTX: as the XT was unable to match the 8800GTX they canned the XTX, so it's basically coming back as a souped-up XT, which allows them to save face by not calling it an XTX.

    2900XT 512MB
    740MHz Core
    1650MHz RAM

    2900XT 1GB
    825MHz Core
    2100MHz RAM

    It will be interesting to see if the 1GB really makes a difference at extreme resolutions with lots of AA/AF applied.
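    A rough way to quantify what the faster GDDR4 buys: peak memory bandwidth is just the effective RAM clock times the bus width. A back-of-the-envelope sketch in Python, assuming R600's 512-bit bus and treating the clocks above as effective (post-DDR) data rates:

        # Peak bandwidth = effective transfer rate x bus width in bytes.
        # Assumes the R600's 512-bit bus and that the quoted RAM clocks
        # are effective (post-DDR) rates.
        BUS_WIDTH_BITS = 512

        def peak_bandwidth_gb_s(effective_mhz):
            return effective_mhz * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9

        print(peak_bandwidth_gb_s(1650))  # 2900XT 512MB: ~105.6 GB/s
        print(peak_bandwidth_gb_s(2100))  # 2900XT 1GB:   ~134.4 GB/s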


  • Registered Users Posts: 3,357 ✭✭✭papu


    The 2900XTX is also 65nm, which will hopefully fix the heat/power issue.

    It's clocked higher, and the XT can usually do 870/1100 on stock anyway,

    so the higher-clocked XTX will hopefully do 900-1000 on the GPU, and probably not much higher on the RAM as it's already clocked insanely :eek:

    Can't wait for benchies and decent drivers XD


  • Closed Accounts Posts: 459 ✭✭Offalycool


    Check this out... a dual Radeon HD 2600 XT from HIS.
    Sketchy info at the mo... may never see the light of day.
    http://www.hwupgrade.com/news/video/dual-radeon-hd-2600-xt-from-his_111.html


  • Registered Users Posts: 3,969 ✭✭✭christophicus


    I could have sworn MSI had already displayed a dual 2600XT, although the GPUs were on separate pieces of PCB; it was still all the one card.


  • Closed Accounts Posts: 4,757 ✭✭✭8T8


    I think they did, but in one of the Computex reports (from a site I forget) they said performance was crap and not worth it.

    Rumour has it the Radeon 2600XT will be just on par with the 8600 series but priced cheaper, so if you really wanted dual-card action it would probably be cheaper to get 2x 2600 cards.


  • Closed Accounts Posts: 459 ✭✭Offalycool


    I wonder how long it will take them to stick more GPU cores on a single die, like dual/quad CPUs.


  • Closed Accounts Posts: 4,757 ✭✭✭8T8


    It's not a question of when but whether it is worth doing, and if it was worth doing they would have already done it by now.

    GPUs are highly parallel to begin with, so having two cores doesn't make much sense when you can just build a bigger one.
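    A toy model of why: for embarrassingly parallel work, doubling the units on one die scales throughput almost linearly, while two separate GPUs (alternate-frame rendering, SLI/Crossfire style) pay sync and driver overhead on top. The scaling factors below are illustrative guesses, not measurements:

        # Toy comparison: one die with 2N shaders vs two GPUs with N each.
        # Both scaling factors are made-up illustrative numbers.
        def one_big_die(base_fps):
            return base_fps * 2 * 0.95   # near-linear; minor shared-bandwidth loss

        def dual_gpu_afr(base_fps):
            return base_fps * 2 * 0.80   # frame-pacing/driver overhead per frame

        print(one_big_die(30.0))   # ~57 fps
        print(dual_gpu_afr(30.0))  # ~48 fps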


  • Registered Users Posts: 44 tedstokes168


    So you know, the reason the XTX has never had a public release is because it was an OEM experimental version for companies to try different cooling solutions.

    That's why you will never see an "XTX" version on shelves.


  • Closed Accounts Posts: 4,757 ✭✭✭8T8


    Not quite. ATI aimed for the 8800GTX with the 2900XT/XTX but failed to match it. They went into face-saving mode and re-branded the 2900XT as a lower-tier high-end GPU targeting the 8800GTS.

    The XTX is coming back {as the 2900XT 1GB} probably because ATI were able to get yields to an acceptable level at that core clock, which was probably the original core clock for the XTX {as was the GDDR4}, so they can push out a few 2900XT 1GB cards and claw their way past the GTS. But until we see benchmarks we won't know if it is competitive with the 8800GTX.

    The original 2900 XTX was up in the air as to whether it was going to launch or not; then it was said to be OEM-only, using a longer board that was cheaper to make; and then it was canned and dropped off the radar.

    This new 2900XT 1GB is a retail product, and I think it's safe to say it is the XTX. They just don't want to call it that, because if their flagship XTX failed to beat even the 8800GTX it would look very poor. This way they leave themselves a little room to breathe should they ever get their act together and produce a competitive 2900XTX or 2950XTX.


  • Closed Accounts Posts: 459 ✭✭Offalycool


    8T8 wrote:
    It's not a question of when but whether it is worth doing, and if it was worth doing they would have already done it by now.

    GPUs are highly parallel to begin with, so having two cores doesn't make much sense when you can just build a bigger one.


    Point taken... but I have to wonder if sticking 3 cards in your rig (SLI/Crossfire plus a dedicated physics GPU) is really more about extorting consumers than providing a more economical and efficient system.


  • Closed Accounts Posts: 4,757 ✭✭✭8T8


    Well the physics thing is indeed rubbish; multi-core CPUs are far more likely to do that kind of work, and no products actually make use of GPU-based physics.

    SLI/Crossfire does have its uses, not so much at the low end, but at the high end, if you just want raw power for very high resolutions, that's the way to go. It is still very much a niche, though.


  • Registered Users Posts: 3,357 ✭✭✭papu


    Before the 2900 series was released, even on Wikipedia, there were listings of the range, which was:

    2900XT
    2900XTX
    2900XTX2 (dual GPUs)

    The XT was never their flagship product; the XTX2 was, but I suppose they never got around to making it. Probably too impractical.


  • Registered Users Posts: 3,969 ✭✭✭christophicus


    Can you imagine the power draw on such a card??? LOL


  • Registered Users Posts: 3,357 ✭✭✭papu


    Well, they were meant to be 65nm and GDDR4, which would have reduced the power draw a lot. It'd still be huge though...!


  • Closed Accounts Posts: 459 ✭✭Offalycool


    christophicus wrote:
    Can you imagine the power draw on such a card??? LOL

    God, imagine the power draw on two in Crossfire!! I think 1GB of memory on a graphics card eats into the amount of system memory Windows can recognise. If you had 2GB across cards in Crossfire I'm sure this would be detrimental to overall system performance.


  • Closed Accounts Posts: 4,757 ✭✭✭8T8


    Well the power drain would be high, but it's not as if we don't already have PSUs capable of delivering that.

    The amount of RAM on the GPU has no bearing on 32-bit Windows as long as you are not going over 2GB of system RAM. If you put in 3-4GB, address mapping kicks in and you lose about 1GB of RAM, putting you back down to 2GB or roughly higher; with some jiggery pokery you might get it to 3GB.

    Anyway, performance-wise it has no effect on Windows, and each 2900XT would be treated as having 1GB of RAM to itself. The RAM isn't pooled: both SLI and Crossfire duplicate everything in both cards' RAM.
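    The arithmetic behind that: 32-bit Windows has a 4GB physical address space, and the card's memory aperture plus other device mappings are carved out of the top of it, crowding out installed RAM. A rough sketch, with the aperture and MMIO sizes as illustrative assumptions:

        # Rough model of 32-bit address space exhaustion.
        # Aperture and MMIO sizes are illustrative assumptions.
        ADDRESS_SPACE_GB = 4.0

        def usable_ram_gb(installed_gb, gpu_aperture_gb, other_mmio_gb=0.25):
            reserved = gpu_aperture_gb + other_mmio_gb
            return min(installed_gb, ADDRESS_SPACE_GB - reserved)

        print(usable_ram_gb(2.0, 1.0))  # 2.0  -- 2GB installed, nothing lost
        print(usable_ram_gb(4.0, 1.0))  # 2.75 -- 4GB installed, ~1.25GB crowded out

    And because SLI/Crossfire mirror their contents, two 1GB cards give you 1GB of effective texture memory, not 2GB.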


  • Subscribers Posts: 6,408 ✭✭✭conzy


    8T8 wrote:
    jiggery pokery


    :)


  • Registered Users Posts: 3,579 ✭✭✭BopNiblets


    Hmm, I had my eye on an 8600GTS, so a 2600XT would be cheaper and better?
    Sounds like a sweet deal, especially if that offer of including Episode 2, Portal and TF2 is carried over.


  • Registered Users Posts: 1,525 ✭✭✭DanGerMus


    'Tis on pre-order on www.overclockers.co.uk now.
    But I wouldn't hold my breath for much better performance; I've read reviews that say the architecture of these cards is just designed badly, that they can't use all their processing power and bandwidth efficiently, and that no driver can fix this.


  • Registered Users Posts: 3,357 ✭✭✭papu


    Link plz?


  • Closed Accounts Posts: 4,757 ✭✭✭8T8


    There is an interview with the lead designer on Beyond3D where he talks about a lot of stuff relating to R600. It's very techy, though he also responds to other questions in the thread.

    It appears the 2900XT is okay in that it is on par with the GTX/GTS as long as you don't turn on AA/AF, which of course everyone does with such a high-end unit. In response to why performance tanks, he says this in the thread:
    The AF hit can be more significant on R600 at this time. It should show higher quality in exchange (not on the LOD isotropy front, but on the mipmap transition front). I believe some of that will be addressed, in part, in future drivers. I can't promise a % improvement at this time.

    So essentially, as long as you don't use AA/AF it's fine for the time being, which is not really an acceptable situation with such a high-end card. Hopefully it is the roughness of the drivers that is the cause and, as he says, it will be addressed.


  • Registered Users Posts: 1,525 ✭✭✭DanGerMus


    papu wrote:
    Link plz?

    How hard would it be to go there and check it yourself... lol...
    Here you go, you lazy git :p ...jk

    http://www.overclockers.co.uk/showproduct.php?prodid=GX-061-OK&groupid=701&catid=56&subcat=922

    Or was it the architecture issue you wanted? Whoops... I'll dig that one up asap.


  • Registered Users Posts: 3,357 ✭✭✭papu


    DanGerMus wrote:
    But I wouldn't hold my breath for much better performance; I've read reviews that say the architecture of these cards is just designed badly, that they can't use all their processing power and bandwidth efficiently, and that no driver can fix this.

    I did get it myself, you even posted it yourself.

    I was asking for the link to this ^^^^^^^^^^

    :P


  • Registered Users Posts: 1,525 ✭✭✭DanGerMus


    lol, yeah, it occurred to me immediately after I posted it. Here it is, page 5 in particular. Apparently the "lack of texture units (16) will probably hold it back", I'm paraphrasing. And I was referring to the R600 in general, not the 1GB version only, but AFAIK they're pretty much the same card, just with more RAM and a higher clock. Here's the link:

    http://www.xbitlabs.com/articles/video/display/r600-architecture.html

    Now I know nothing about this stuff, but this guy seems to know a bit, so take from it what you will.
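    For a sense of why those 16 texture units are the suspect: peak texel fillrate is roughly texture units times core clock, so more RAM and bandwidth don't move it at all; only the clock bump on the 1GB card does. A quick sketch using the clocks posted earlier in the thread, assuming one bilinear texel per unit per clock as a simplification:

        # Peak texel fillrate ~= texture units x core clock.
        # Assumes one bilinear texel per unit per clock (a simplification).
        TEXTURE_UNITS = 16  # R600, per the x-bit labs article

        def texel_fillrate_gt_s(core_mhz):
            return TEXTURE_UNITS * core_mhz * 1e6 / 1e9

        print(texel_fillrate_gt_s(740))  # 2900XT 512MB: ~11.8 GTexel/s
        print(texel_fillrate_gt_s(825))  # 2900XT 1GB:   ~13.2 GTexel/s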

