
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 4,033 ✭✭✭BArra


    Haven't mounted them, my 1080 Gamerock was a 3 slot card so would only fit slim fans like the noctua 15mm. That and the usb 3 port position on this asus b450 means you have to loop the cable back and it interferes with the mounting holes.

    Have a usb 2 to usb 3 header so will slap that in at some point and stick two fans under it, but the temps have been pretty decent since I've had it. Usually hit max 65ish on the cpu and 70-75 on the gpu when stress testing and that was with a pretty late fan curve.

    raise that top glass and get 680x feet, corsair will send you them free, pm me if you want.

    i have a 280x also but minus the badly needed 3080fe!


  • Registered Users Posts: 21,654 ✭✭✭✭Squidgy Black


    BArra wrote: »
    raise that top glass and get 680x feet, corsair will send you them free, pm me if you want.

    i have a 280x also but minus the badly needed 3080fe!

    Deadly cheers, didn't know about the 680x feet. Have the top air filter removed and a couple washers under it as the aio rad is exhaust but could probably do with a bit more height


  • Registered Users Posts: 12,408 ✭✭✭✭TheValeyard


    Homelander wrote: »
    According to a review from the AX860 in 2013 even that one came with 3 x 8 pin. I guess it's possible as well you might have lost or misplaced one cable over time though. Major pain.

    Got sorted. Checked the attic. Found a spare 8pin PCIe.
    Whoop whoop.

    All Eyes On Rafah



  • Registered Users Posts: 2,541 ✭✭✭Fitz II


    Got sorted. Checked the attic. Found a spare 8pin PCIe.
    Whoop whoop.

    What is a man without a box of random wires... we all have one, and if not... for shame


  • Registered Users Posts: 8,798 ✭✭✭MiskyBoyy


    3080/3090 cards coming with Black Ops Cold War codes from participating retailers from today until December 10th

    https://www.nvidia.com/en-gb/geforce/campaigns/cod-black-ops-cold-war-bundle/


  • Registered Users Posts: 858 ✭✭✭one armed dwarf


    As above, watch out to make sure that the PCIe pin is the same as the one you already have, i.e. the same PSU. Or you could be a very sad panda

    My asus strix 3070 has been shipped just now, whoop. Order was placed at 5 to 1 so it was one of the early ones.

    Feel bad to be paying 3080 FE prices for a 3070 but I don't think RRP will be a thing with these for a long time and I don't feel like checking scan every day for FEs anymore


  • Registered Users Posts: 12,408 ✭✭✭✭TheValeyard


    As above, watch out to make sure that the PCIe pin is the same as the one you already have, i.e. the same PSU. Or you could be a very sad panda

    My asus strix 3070 has been shipped just now, whoop. Order was placed at 5 to 1 so it was one of the early ones.

    Feel bad to be paying 3080 FE prices for a 3070 but I don't think RRP will be a thing with these for a long time and I don't feel like checking scan every day for FEs anymore

    Yeah, it's the exact same type of 8pin (6+2) but by itself. Not shared. Think I'm alright. Came with the PSU cabling.




  • Registered Users Posts: 4,402 ✭✭✭Homelander


    Got an email to say I'm 9th in Queue for the RTX3070 I bought, a KFA SG. I find that mad considering I only ordered close enough to 2PM on a whim and am not that pushed about it.


  • Registered Users Posts: 12,609 ✭✭✭✭errlloyd


    Homelander wrote: »
    Got an email to say I'm 9th in Queue for the RTX3070 I bought, a KFA SG. I find that mad considering I only ordered close enough to 2PM on a whim and am not that pushed about it.

    If you're looking to offload it ;)


  • Registered Users Posts: 663 ✭✭✭MidlanderMan


    Setting her up now. Wiping drivers with ddu etc.

    Jesus the position of the pins is annoying and I absolutely hate the extension cable. I'm going to have to buy some sleeved extension and hide it all in the basement. There's no tidy way to route the cables with the in box extension/adaptor.


  • Registered Users Posts: 12,408 ✭✭✭✭TheValeyard


    Setting her up now. Wiping drivers with ddu etc.

    Jesus the position of the pins is annoying and I absolutely hate the extension cable. I'm going to have to buy some sleeved extension and hide it all in the basement. There's no tidy way to route the cables with the in box extension/adaptor.

    Because I wanted my glass case on the other side, I had to invert everything and everything is kinda upside down by PSU. So cables also inverted, tis annoying.




  • Registered Users Posts: 663 ✭✭✭MidlanderMan


    All that aside. It's a ****ing gorgeous card.


  • Registered Users Posts: 21,654 ✭✭✭✭Squidgy Black


    Setting her up now. Wiping drivers with ddu etc.

    Jesus the position of the pins is annoying and I absolutely hate the extension cable. I'm going to have to buy some sleeved extension and hide it all in the basement. There's no tidy way to route the cables with the in box extension/adaptor.

    It's awful, isn't it? The fact it's so short too. There aren't many 3rd party extension cables yet, but as soon as they become widely available I'll be getting one.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Setting her up now. Wiping drivers with ddu etc.

    Jesus the position of the pins is annoying and I absolutely hate the extension cable. I'm going to have to buy some sleeved extension and hide it all in the basement. There's no tidy way to route the cables with the in box extension/adaptor.

    1st world problems eh....you slaaaaaaaag :p


  • Registered Users Posts: 3,724 ✭✭✭Metric Tensor


    Put mine in last night. It's so well built - not a hint of sag because the thing is so bloody solid to start with.

    The extension looks like it was thrown together at the last minute, in the dark, by a child, who had 6 fingers. I've a Corsair PSU and they have published details of a 12 pin lead they intend to release so the OCD might get the better of me in due course.

    Other than that everything is sweet as a nut - my overclocked 6700k seems to be keeping up with it for the most part although I get the odd framerate drop.


  • Registered Users Posts: 4,402 ✭✭✭Homelander


    You should use Rivatuner to monitor usage. I actually wouldn't expect any big frame drops on a 6700K, but in certain games you will get relatively low usage on a 3080.

    That would be 1080P high refresh though. At 4K a 6700K would largely be well able for it.


  • Registered Users Posts: 3,724 ✭✭✭Metric Tensor


    I plan to do just that, Homelander - I think the drops may be due to poor coding - I have only tested PUBG, which has a reputation for such things!

    I play at 1440p with relatively high quality settings so middle of the road balance between resolution and refresh rate.


  • Registered Users Posts: 12,408 ✭✭✭✭TheValeyard


    Put mine in last night. It's so well built - not a hint of sag because the thing is so bloody solid to start with.

    The extension looks like it was thrown together at the last minute, in the dark, by a child, who had 6 fingers. I've a Corsair PSU and they have published details of a 12 pin lead they intend to release so the OCD might get the better of me in due course.

    Other than that everything is sweet as a nut - my overclocked 6700k seems to be keeping up with it for the most part although I get the odd framerate drop.

    Would not have expected framedrops on a 6700k? Is that with older games at 1080 or 2K?




  • Registered Users Posts: 4,402 ✭✭✭Homelander


    You have to remember that a 6700 at this point is comparable to a Ryzen 3100 or Intel 10th gen i3.

    Still a grand CPU but not one to be paired with 3080 at high refresh.


  • Registered Users Posts: 1,703 ✭✭✭JoyPad


    Homelander wrote: »
    You have to remember that a 6700 at this point is comparable to a Ryzen 3100 or Intel 10th gen i3.

    Still a grand CPU but not one to be paired with 3080 at high refresh.

    Fact.
    My son can't play Watch Dogs Legion on his 6700K with an MSI RTX 3090. He could, at Low or Medium settings, but driving or fighting gets FPS dropping to unplayable levels.
    Meanwhile, I have a 7700K and a Zotac RTX 3090 (which is a bit worse than the MSI), and I play at Ultra + DLSS Quality.

    He chose to park the game until I get him a new CPU.


  • Registered Users Posts: 4,402 ✭✭✭Homelander


    Not sure I get the point there? The 7700K is only marginally better than the 6700K. I think something like 5-10% faster.

    About the same as the Ryzen 3300X or an Intel i3-10230.

    Watch Dogs has issues on PC; even people with high-end CPUs are having FPS problems.

    If you use something like RivaTuner, unless you are playing at 4K, a 7700K is a huge bottleneck to a 3090 in new games as well.


  • Registered Users Posts: 858 ✭✭✭one armed dwarf


    I'd heard that RAM speed can affect Watch_Dogs also.


  • Registered Users Posts: 1,703 ✭✭✭JoyPad


    Homelander wrote: »
    Not sure I get the point there? The 7700K is only marginally better than the 6700K. I think something like 5-10% faster.

    Bit more than that, since his CPU is no longer OC'd, due to power limits. Runs at stock clock (4GHz). We both have 3200MHz RAM.
    But it seems to be all that is needed to go from 60+ FPS to 40 FPS.


  • Registered Users Posts: 3,724 ✭✭✭Metric Tensor


    Would not have expected framedrops on a 6700k? Is that with older games at 1080 or 2K?
    Homelander wrote: »
    You have to remember that a 6700 at this point is comparable to a Ryzen 3100 or Intel 10th gen i3.

    Still a grand CPU but not one to be paired with 3080 at high refresh.


    Sorry guys. I should have been more precise I suppose.

    Current set up:
    6700K OCed to 4.6GHz
    3080 RTX
    32 GB DDR4 at 3100 (slight OC)
    M2 SSD




    First Night - Playing PUBG at 1440p, mostly medium/high settings, framerates unlocked:
    Framerates topped out at over 200 at times but odd drops to as low as 110 ish.


    Second Night - Playing PUBG at 1440p, mostly high/ultra settings, framerates capped at 140Hz:
    Framerates stuck at the cap almost all the time. Less frequent drops to somewhere in the 120s




    A few points:
    1. PUBG is an awful game for benchmarking due to its spaghetti code and different map behaviours, etc.
    2. The above numbers are based on me watching the FPS figure in the bottom. I haven't put rivatuner running yet.
    3. 120FPS for the lows is hardly a crime against humanity! For reference I used to lock my 1070 to 90FPS and would still have drops into the 80s at times (on medium-high settings).




    Thanks for the interest - I'll keep ye up to date!


  • Registered Users Posts: 663 ✭✭✭MidlanderMan


    Metric Tensor wrote: »
    Sorry guys. I should have been more precise I suppose. [snip]



    Your CPU is the bottleneck here. Games with massive open maps like PUBG are very CPU sensitive.


  • Registered Users Posts: 3,724 ✭✭✭Metric Tensor


    Your CPU is the bottleneck here. Games with massive open maps like PUBG are very CPU sensitive.


    I know that - for the time being it's a matter of tuning things as best I can because the CPU is staying the way it is for a while yet.


  • Registered Users Posts: 4,402 ✭✭✭Homelander


    JoyPad wrote: »
    Bit more than that, since his CPU is no longer OC'd, due to power limits. Runs at stock clock (4GHz). We both have 3200MHz RAM.
    But it seems to be all that is needed to go from 60+ FPS to 40 FPS.

    There is something else wrong there. A stock 7700K is not much faster than a 6700K; granted you can OC, but unless it's at close to 6GHz on LN2 there is no possible way it is 50% faster.

    On the more general discussion about bottlenecking, a big part of the "problem" is that people dismiss it, because they don't see or perceive it.

    I mean taking PUBG as an example, even with a low-tier processor, going from any card to an RTX3090 will give huge gains.

    However, that doesn't mean it's not bottlenecking, or that a lot of the card's potential is just sitting on the table idle.

    For PUBG, an RTX3090 is capable of driving 1440P ultra at around 200FPS average. You won't get close to that with a 6700K.

    Again, RivaTuner will show that the GPU is not at full utilization due to CPU ceiling.

    It's not about crapping on anyone with a 6700 or 7700K by the way, just being technical.

    If you can afford an RTX3090, you can afford a CPU to feed it surely! In fact an RTX3080 + 10700K would be cheaper than an RTX3090 alone, and would demolish that card + 6700/7700K in games.


  • Registered Users Posts: 3,724 ✭✭✭Metric Tensor


    I agree with all of the above. There will be a bottleneck in every system.

    My 1070 was my bottleneck. It was running at 100% whilst my 6700 had some power left on the table and was only running at 60ish percent (ignoring idle cores, etc for simplicity).

    Now my bottleneck is the 6700 as it is running at full tilt and the 3080 is about 70% (according to afterburner).

    I've still gone from an average of 90FPS to 130+ so I've gotten a great upgrade even if the 3080 isn't fully used. Would a cheaper 3070 have done the same thing? Definitely, yes, but in a year's time when I upgrade to the latest processor I'll be better off having a 3080 than a 3070, so I'm willing to spend it now.

    I don't have a 3090 myself and I do think that if I had 3090 money to spend I'd have gone new Mobo + Processor + 3080 instead. But that's me on my budget. I have no problem with people who buy a 3090 now and are hanging on for the new Ryzen or whatever.
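    The back-of-envelope reasoning in the post above (GPU at ~70% while the CPU runs flat out) can be turned into a quick headroom estimate. A minimal sketch, not from any poster: the function name and the linear-scaling assumption are mine, and it's only a sanity check, since real FPS rarely scales perfectly with utilization.

    ```python
    def headroom_estimate(observed_fps: float, gpu_utilization: float) -> float:
        """Rough upper bound on FPS if the GPU were the only limit.

        Assumes FPS scales roughly linearly with GPU utilization once the
        CPU bottleneck is removed -- crude, but fine for a sanity check.
        """
        if not 0 < gpu_utilization <= 1:
            raise ValueError("gpu_utilization must be in (0, 1]")
        return observed_fps / gpu_utilization

    # Figures from the post: ~130 FPS with the 3080 at ~70% utilization
    # suggests roughly 186 FPS of potential with a faster CPU.
    print(round(headroom_estimate(130, 0.70)))
    ```

    Plugging in the 1070 numbers (90FPS at 100% GPU) instead just returns 90, which matches the observation that the old card, not the 6700K, was the limit then.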


  • Registered Users Posts: 1,703 ✭✭✭JoyPad


    I have no problem with people who buy a 3090 now and are hanging on for the new Ryzen or whatever.

    That's the exact case for my son's PC. The only part missing is the 5000-series CPU, and we're hoping they will not have Nvidia-style availability.
    Homelander wrote: »
    There is something else wrong there. A stock 7700K is not much faster than 6700K, granted you can OC but unless it's at close to 6Ghz on LN2 there is no possible way it is 50% faster.

    I don't think that 10% difference in CPU leads to 10% difference in FPS, particularly when the GPUs are so powerful.


  • Registered Users Posts: 12,609 ✭✭✭✭errlloyd


    Another week begins. More spamming of links and refresh for me.

