
9800pro flashed to XT Temps

  • 10-08-2004 8:47pm
    #1
    Registered Users Posts: 508 ✭✭✭davkav


    Just out of curiosity, what temps are people getting on their flashed cards, and at what core/mem speeds?


    I have mine at 444/392 at a temp of 65°C. I would have thought it would be lower with the Arctic cooler.


Comments

  • Registered Users Posts: 968 ✭✭✭Adeptus Titanicus


    Mine idles around 45°C (high fan setting) and 55°C (low fan setting) at 411.75/364.5. I have a CoolDrive unit that beeps when it goes over 65°C, and this only happened once, recently, while testing an overclock of 427.5/405 on Doom 3 with everything on high at 1024x768. It was on the low setting though, so switching it to high keeps it under 65°C (but at XT speeds, without the extra overclock).

    I usually play Unreal Tournament 2004 with it set at low fan speed, and it sticks around 60°C. Your core is well overclocked at 444. I'd have thought with temps like that you'd be getting artefacts showing up running ATI Tool. How does it look running the Mother Nature test in 3DMark03?


  • Registered Users Posts: 508 ✭✭✭davkav


    The 3DMark test looks grand; anything higher, though, and artifacts show up all over the place.
    Doom 3 works fine when overclocked as well, even though I've seen posts around saying it won't run as well as it does on defaults.
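
    For what it's worth, the usual way to find that ceiling is just to creep the core up in small steps and back off as soon as the artifact scan fails. Here's a rough Python sketch of that loop; set_clock and has_artifacts are placeholder callables standing in for whatever overclocking/scanning tool you actually use (e.g. ATI Tool), not a real API.

    [code]
    # Minimal sketch of the "creep up until artifacts" routine described above.
    # set_clock and has_artifacts are caller-supplied placeholders wrapping whatever
    # tool you actually use to change the core speed and scan for artifacts.
    def find_max_stable_core(start_mhz, set_clock, has_artifacts, step_mhz=5):
        """Raise the core clock in small steps until artifacts appear, then back off one step."""
        clock = start_mhz
        set_clock(clock)
        while not has_artifacts():
            clock += step_mhz
            set_clock(clock)
        return clock - step_mhz  # last speed that scanned clean
    [/code]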


  • Registered Users Posts: 968 ✭✭✭Adeptus Titanicus


    Good stuff. Mine went over 65°C again today when I had it at a low fan setting and was running the beta of Dawn of War. Made me think it's time to get water cooling... or clean the dust out of it ;)


  • Registered Users Posts: 2,942 ✭✭✭Mac daddy


    What are you guys using as a temp probe to check temps on the GFX card?
    As standard the 9800 Pro doesn't have this option; it's only available on the XT 256MB model - same as Overdrive?


  • Registered Users Posts: 508 ✭✭✭davkav


    The R360 has a temp diode in it. It just isn't active under the 9800 Pro BIOS, but when flashed to XT it becomes active.


  • Closed Accounts Posts: 5,115 ✭✭✭Pacifico


    Stupid question: I've a 9800 XT. How do you check the core temp? It's an OEM card, so no software was included :confused:

    Thanks in advance :D


  • Registered Users Posts: 20,553 ✭✭✭✭Dempsey


    Could I flash my 9800Pro to XT with the stock cooler?


  • Closed Accounts Posts: 1,368 ✭✭✭-ADREN-


    Probably not the best idea.


  • Registered Users Posts: 20,553 ✭✭✭✭Dempsey


    Is there any other way of enabling the thermal diode?


  • Registered Users Posts: 968 ✭✭✭Adeptus Titanicus


    As far as I know you need to have a thermal diode on the card itself. I would love to be proved wrong, as it'd be nice to have this feature unlocked. Has anyone here unlocked it? If so, what brand/memory chips/BIOS did you use?

    Have a look here, there's a pic of where the thermal diode should be.
    http://www.boards.ie/vbulletin/showthread.php?p=1536985#post1536985
    My card doesn't have this.

    At the moment I have a CoolDrive unit that has 4 temp probes, one of which I slid under the heatsink so it's close to the core.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    The PCB has to be an XT PCB for you to have a thermal diode, so you may have an R360 core but still not have the thermal sensor if the PCB is not an XT one.




  • Registered Users Posts: 20,553 ✭✭✭✭Dempsey


    I can't spot the thermal diode position even with the picture. How would I check if it's an R360 core?


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Removing the heatsink and checking the core is the only way to be sure.

    Another way is to try running 3DMark03: if you get artifacts during the pixel shader 2.0 test then it's an R350; if you don't, it's an R360.

    Come on admins ban I Love You already.



  • Registered Users Posts: 20,553 ✭✭✭✭Dempsey


    Is that the test with the elephants? I didn't notice any artifacts or anything odd. How do you go about taking off the heatsink and putting it back? Is it similar to a CPU heatsink?


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Yeah there should be 4 screws holding it in place. You will need to apply some thermal paste when putting it back on. That's a good sign though.




  • Registered Users Posts: 307 ✭✭Thordon


    BloodBath wrote:
    Another way is to try running 3DMark03: if you get artifacts during the pixel shader 2.0 test then it's an R350; if you don't, it's an R360.

    Do you mean after you softmod it? It would seem a bit strange if a stock 9800 Pro would show artifacts when my 9600 non-Pro doesn't show artifacts on the same test. What exactly do artifacts look like anyway?


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    It will show artifacts if it's an R350 core, because the XT BIOS will think you have an R360 and will try to access stuff that isn't available on the R350 core.
    If your 9800 Pro has an R360 you shouldn't have a problem, unless the memory isn't able for the increased clock speed.


  • Registered Users Posts: 508 ✭✭✭davkav


    Thordon wrote:
    What exactly do artifacts look like anyway?

    Artifacts usually show up as little white dots on the screen, i.e. missing pixels, which means your GFX card is clocked too high.
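
    To picture what a scanner is doing there: it renders a known scene and counts pixels that drift from a reference image. A toy Python version of the idea is below; it's illustrative only, using plain lists of (r, g, b) tuples, not how ATI Tool actually implements its scan.

    [code]
    # Toy illustration of artifact scanning: compare a rendered frame against a
    # known-good reference and count pixels that drift too far per channel.
    def count_artifact_pixels(frame, reference, tolerance=8):
        bad = 0
        for (r, g, b), (rr, rg, rb) in zip(frame, reference):
            if abs(r - rr) > tolerance or abs(g - rg) > tolerance or abs(b - rb) > tolerance:
                bad += 1
        return bad

    # Example: one stuck-white pixel against a mid-grey reference flags as one artifact.
    reference = [(128, 128, 128)] * 4
    frame = [(128, 128, 128), (255, 255, 255), (129, 127, 128), (128, 128, 128)]
    print(count_artifact_pixels(frame, reference))  # -> 1
    [/code]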


  • Registered Users Posts: 614 ✭✭✭dent


    Flashed mine too. Standard XT clock speeds. Temps are around 65°C too, and go as high as 75°C, but no artifacts in 3DMark03. Games play fine. Using an Arctic Silencer. The temp is reported by ATI Tool; I wonder if it's right.


  • Closed Accounts Posts: 90 ✭✭Billy Kovachy


    Found this in the Rivatuner Readme doc:

    Q: Why do I see noticeable difference in core temperatures monitored by RivaTuner and ATI Overdrive tab on my RADEON 9800XT? Does RivaTuner incorrectly read info from the sensor? Will you fix it?

    A: I will not fix anything; quite the opposite, I'd suggest ATI fix their own control panel. RivaTuner displays the only core temperature that can be retrieved from the RADEON 9800XT hardware sensor with the maximum accuracy. ATI have already admitted that RADEON 9800XT boards don't have an on-die thermal diode, and the temperature is monitored by a thermistor located near the graphics processor. So the temperature displayed by the control panel's Overdrive tab is just an attempt to approximate the real on-die temperature by adding a constant 20°C offset to the real sensor's temperature. If you believe that such a correction can approximate the real temperature with +/-2°C inaccuracy like ATI claims, you are free to specify a 20°C offset for temperatures monitored by RivaTuner. To do it, right-click the core temperature graph, select Setup from the popup menu, then enter a 20°C temperature offset and click OK. If you prefer to see the real sensor's temperatures instead of trying to guess the on-die temperature, just use the default settings.
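
    In other words, the Overdrive figure is just the board thermistor reading plus a fixed 20°C. A quick worked example in Python, using the numbers from the readme above:

    [code]
    # Per the RivaTuner readme quoted above: ATI's Overdrive tab approximates the
    # on-die temperature by adding a constant 20 degrees C to the board thermistor
    # reading, while RivaTuner shows the raw sensor value unless you set that offset.
    OVERDRIVE_OFFSET_C = 20

    def approximate_die_temp(thermistor_c):
        return thermistor_c + OVERDRIVE_OFFSET_C

    print(approximate_die_temp(45))  # sensor reads 45C -> Overdrive would show about 65C
    [/code]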

