
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 21,649 ✭✭✭✭Squidgy Black


    errlloyd wrote: »
    Also, the 3080 requires a 750 Watt PSU? My current PSU is 650. I've read some stuff about differing numbers of connector pins etc., so is there anything to look out for from the PSU, or will I just buy the best-value 750 out there?

    I'm in a similar boat, but going to wait until the reviews come out before making a decision. Nvidia are recommending a 750W, but it all depends on your existing setup, I guess.


  • Registered Users Posts: 602 ✭✭✭Aodhan5000


    Talk about an awful time to need to upgrade your PSU, with the pandemic.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    JoyPad wrote: »
    Can anyone clarify something, please?

    PCIe gen 4: required or optional? If optional, how bad will it be on gen 3?

    Hardware Unboxed did some testing and saw about a 1-2% boost on the 5700 XT in certain scenarios. I don't think it's going to be as optional as many say; I think it's more complicated than just looking at uni-directional bandwidth and saying it'll be grand. That said, pulling numbers out of thin air, I don't think we'll see more than single-digit gains.
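
    For anyone who wants rough numbers, here's a quick sketch of the theoretical per-direction bandwidth from the spec figures (peak numbers, not what games actually push over the bus):

    ```python
    # Theoretical peak PCIe bandwidth per direction (spec figures, not
    # real-world throughput). Gen 3 and Gen 4 both use 128b/130b encoding.
    ENCODING = 128 / 130                     # usable fraction after line encoding
    LANES = 16
    GT_PER_LANE = {"Gen 3": 8, "Gen 4": 16}  # gigatransfers/s per lane

    for gen, gt in GT_PER_LANE.items():
        gb_s = gt * ENCODING * LANES / 8     # Gbit/s -> GB/s
        print(f"PCIe {gen} x16: ~{gb_s:.1f} GB/s per direction")
    # Gen 3 x16 is ~15.8 GB/s vs ~31.5 GB/s for Gen 4 -- double the pipe,
    # but current games rarely saturate Gen 3, hence the small gains.
    ```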



  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    errlloyd wrote: »
    Also, the 3080 requires a 750 Watt PSU? My current PSU is 650. I've read some stuff about differing numbers of connector pins etc., so is there anything to look out for from the PSU, or will I just buy the best-value 750 out there?

    It has a 320W TDP and requires 2x separate 8-pin PCIe connectors.
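
    For what it's worth, the in-spec power budget works out comfortably above that TDP (75W from the slot and 150W per 8-pin connector, per the PCIe spec):

    ```python
    # In-spec power available to the card vs its quoted TDP.
    SLOT_W = 75        # PCIe x16 slot limit
    EIGHT_PIN_W = 150  # per 8-pin PCIe power connector
    TDP_W = 320        # quoted 3080 TDP

    budget = SLOT_W + 2 * EIGHT_PIN_W
    print(f"Budget {budget}W vs TDP {TDP_W}W -> {budget - TDP_W}W headroom")
    # 375W available vs 320W TDP leaves ~55W of in-spec headroom for spikes.
    ```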


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    I'm very tempted to jump in and buy an RTX card; what's niggling me is, if Nvidia is this good, what have AMD got up their sleeves? Will AMD have more widely accepted standards for RT, given they're in the consoles? I'm half thinking about the 3080 but more the 3070 - hopefully we'll have some more info from AMD by then.


  • Registered Users Posts: 655 ✭✭✭L


    Samuel T. Cogley wrote: »
    I'm very tempted to jump in and buy an RTX card; what's niggling me is, if Nvidia is this good, what have AMD got up their sleeves? Will AMD have more widely accepted standards for RT, given they're in the consoles? I'm half thinking about the 3080 but more the 3070 - hopefully we'll have some more info from AMD by then.

    The AMD date's still the start of October, isn't it? For the sake of a month, I'd wait and see. Chances are good you won't even have a 30 series in your hands by the time you've got the AMD data.

    Nvidia's no charity - if they're bringing their prices down and their performance up like this, AMD is very likely to be releasing something that makes the 20-series-style gouging non-viable.


  • Registered Users Posts: 670 ✭✭✭Nibs05


    errlloyd wrote: »
    Also, the 3080 requires a 750 Watt PSU? My current PSU is 650. I've read some stuff about differing numbers of connector pins etc., so is there anything to look out for from the PSU, or will I just buy the best-value 750 out there?

    The 750W recommendation was based on an i9-10900K system, but I'll wait for reviews.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    So Zotac partner cards are up on OCUK; the Trinity version is the same price as the Founders Edition. They also have £639 and £659 versions.

    The 5-year warranty they give makes that a tempting card for me.

    I have a Zotac 1080 Amp Edition; not the best cooler, but I've never had any issues once I set a custom fan profile.

    It also means at least some partners may release cards below Founders Edition cost. If more don't, I can just see everyone going for the Founders Edition.


  • Registered Users Posts: 2,449 ✭✭✭Rob2D


    A 3080 Ti this year @ $999???? I wouldn't have expected that so soon.

    https://www.techpowerup.com/gpu-specs/geforce-rtx-3080-ti.c3581


  • Registered Users Posts: 5,930 ✭✭✭Cordell


    errlloyd wrote: »
    Also, the 3080 requires a 750 Watt PSU? My current PSU is 650. I've read some stuff about differing numbers of connector pins etc., so is there anything to look out for from the PSU, or will I just buy the best-value 750 out there?

    Those recommendations are grossly overestimated. It's very likely that your PSU is able to power it up just fine, and with a comfortable power reserve.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    Rob2D wrote: »
    A 3080 Ti this year @ $999???? I wouldn't have expected that so soon.

    https://www.techpowerup.com/gpu-specs/geforce-rtx-3080-ti.c3581

    This just confirms people should wait until at least Big Navi launches, imo. That will be hard, for myself too.

    Nvidia are obviously holding products back for more strategic launches.

    Even with benches for the 3070 and 3080, we still wouldn't really have a full picture of price/performance this gen until we see more 30-series cards and Big Navi.


  • Registered Users Posts: 655 ✭✭✭L


    Cordell wrote: »
    Those recommendations are grossly overestimated. It's very likely that your PSU is able to power it up just fine, and with a comfortable power reserve.

    See how it goes, but it may be tighter than normal. 3080's supposed to have a 320W peak draw, which means that the rest of your system needs to be under 200W to fit under the 80% mark on a 650W power supply.

    That's enough for a decent system, but it's not exactly leaving a lot of spare headroom for overclocking, extra storage or the future.
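
    The arithmetic behind that, if anyone wants to plug in their own numbers (the 80% figure is a common rule of thumb for sustained load, not a hard spec):

    ```python
    # PSU headroom check using the "stay under ~80% of rated capacity"
    # rule of thumb (a guideline, not a hard limit).
    PSU_W = 650
    GPU_PEAK_W = 320   # quoted 3080 peak draw
    usable = PSU_W * 0.80

    rest_of_system = usable - GPU_PEAK_W
    print(f"Usable: {usable:.0f}W, "
          f"left for CPU/board/drives/fans: {rest_of_system:.0f}W")
    # 520W usable - 320W GPU = 200W for everything else, as above.
    ```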


  • Registered Users Posts: 2,449 ✭✭✭Rob2D


    A lot of people are going nuts for AMD news now, saying they really need a "leak" to stay in the game. I agree they should probably leak something small, just to show that they'll be competitive and stop as many Nvidia pre-orders as they can.


  • Registered Users Posts: 7,882 ✭✭✭frozenfrozen


    They could time it so the leak lands while people are in their return window for the 3000 series, but that might be too evil.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    [image]

  • Registered Users Posts: 655 ✭✭✭L


    Free Watch Dogs: Legion with the 30 series as well, until the end of October.

    I half expect Nvidia's next move to be offering to come to your house and give you a backrub if you don't wait to see what AMD has.

    I think Nvidia's done more for my curiosity about what AMD is going to produce than AMD themselves have. :D


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    K.O.Kiki wrote: »
    [image]

    Do you think the extra RAM is going to be useful/needed at 1440p or 4K?


  • Registered Users Posts: 5,930 ✭✭✭Cordell


    L wrote: »
    See how it goes, but it may be tighter than normal. 3080's supposed to have a 320W peak draw, which means that the rest of your system needs to be under 200W to fit under the 80% mark on a 650W power supply.

    That's enough for a decent system, but it's not exactly leaving a lot of spare headroom for overclocking, extra storage or the future.

    You're right, it depends on the system and what you do with it, but for most, the rest of the system will be under 200W, maybe even under 150W. Also, not all 650W units are equal, so YMMV :)


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Samuel T. Cogley wrote: »
    Do you think the extra RAM is going to be useful/needed at 1440p or 4K?

    Extra RAM is always handy... it's money for old rope!


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    This is the beast I'm gonna get as soon as it's in stock... A lot of the cards up on OCUK are quite bland, but I notice in the last few hours Palit and KFA2 have been added to the selection. The Palit cards are also nice looking.

    https://www.overclockers.co.uk/inno3d-geforce-rtx-3080-ichill-x4-10gb-gddr6x-pci-express-graphics-card-gx-07p-in.html

    https://www.overclockers.co.uk/palit-geforce-rtx-3080-gaming-pro-oc-10gb-gddr6x-pci-express-graphics-card-gx-05e-pl.html


  • Registered Users Posts: 12,609 ✭✭✭✭errlloyd


    Cordell wrote: »
    You're right, it depends on the system and what you do with it, but for most, the rest of the system will be under 200W, maybe even under 150W. Also, not all 650W units are equal, so YMMV :)

    To be fair, I am quite thankful for your advice.

    I've a Corsair VS series 650W. It's running a couple of fans, a hard drive, two SSDs and a midrange processor.

    I have a GTX 1060 and my current rig isn't that pleasant to play on. The frame rate in the games I play isn't great, and isn't stable. Given lockdown means I'm gaming more, my number one priority is actually just upgrading the card ASAP; my number two priority is not buying something that's poor value (like if I had bought a 2080 last month, for example).


  • Registered Users Posts: 21,649 ✭✭✭✭Squidgy Black


    [image]

  • Registered Users Posts: 5,574 ✭✭✭EoinHef



    Jaysus, that looks awful, like the fan was an afterthought and just bolted on!


  • Registered Users Posts: 21,649 ✭✭✭✭Squidgy Black


    EoinHef wrote: »
    Jaysus, that looks awful, like the fan was an afterthought and just bolted on!

    It pretty much is, from the description it sounds like it's an optional clip-on.

    It's interesting as well that all of the third-party cards so far are triple-fan, apart from one of the Inno3D cards, which means cooling's probably going to be a big deal.

    Bit of a pain in the arse for me though, as large cards are a tight fit in the 280x case: 300mm max, but even then the power cables can be a nightmare.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki



    I take it back, maybe the AIBs are trying to kill themselves to leave only Nvidia standing! :pac:


  • Registered Users Posts: 3,571 ✭✭✭2ndcoming


    As someone who's only ever had AMD graphics cards, I'd be very sceptical of their ability to match the 3080 with their next batch of cards.

    Based on everything they've done in the GPU division for the past five or even ten years, I'd say their goal was to beat the 2080 Ti with a $400 or so card. Nvidia's pricing here is a pretty aggressive manoeuvre, I would say, based on that same expectation.

    Now they could surprise us all, but it would be a very big surprise for them to be able to catch and pass the people they've been playing catch-up to for years, in the process completely blowing away anything they've ever done before, and deliver it at less than $700.


  • Registered Users Posts: 21,649 ✭✭✭✭Squidgy Black


    [image]


  • Registered Users Posts: 7,879 ✭✭✭Mr Crispy


    Mr Crispy wrote: »
    According to Lenovo, there's a 16GB 3070 Ti on the way.... (Videocardz).

    "Lenovo also lists NVIDIA GeForce RTX 3070 SUPER"


    Surely a mistake somewhere along the line. They're not releasing vanilla, Super, and Ti variants of the 3070 around the same time.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    EVGA is also listed now on OCUK, which looks nice. I'd say the benchmarking will start just before the release date, around the 15th or so. Until then the propaganda machine will be in overdrive and the green and red fanboys will be out in force.


  • Advertisement
  • Registered Users Posts: 655 ✭✭✭L


    2ndcoming wrote: »
    Now they could surprise us all, but it would be a very big surprise for them to be able to catch and pass the people they've been playing catch up to for years, in the process completely blowing away anything they've ever done before and deliver it at less then $700.

    At the same time, *something* has Nvidia reacting way more strongly than I'd have predicted.

    I was expecting the 30 series to be a significant improvement, given the crap 20-series process (transistor density was appalling on the 2080 Ti, which suggested poor yields), and it is. However, the cards are also much cheaper, hotter and more power-hungry than I'd have predicted. It feels like Nvidia reaching to make sure they stay ahead of AMD.

    Hell, 30 series has shades of an old school "free electric heater" AMD design philosophy to it. :D

