
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 2,541 ✭✭✭Fitz II


    Well, the general consensus is that this is a winner of a card. I wonder will it undo the damage done during the 20 series launch? I am very excited to see where the 3090 lands on the benchmarks, obviously a law of diminishing returns, but by god this is a monster of a card. Interested to see how well this plays with Ryzen versus Intel, and the overclocking potential.

    A good day to be a PC gaming enthusiast.


  • Registered Users Posts: 7,875 ✭✭✭Mr Crispy


    VideoCardz have links to written and video reviews, with new ones added as they go up.


  • Registered Users Posts: 4,401 ✭✭✭Homelander


    While I think the 8GB on the RTX 3070 is a bit of a joke, I am not hugely pushed. I'm OK with textures being set at high rather than ultra, or medium down the line if it really needs be, when everything else can be maxed out at massive FPS.

    I mean in most games, comparing medium vs ultra textures, there's not a huge difference visually for a massive VRAM hit.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Homelander wrote: »
    While I think the 8GB on the RTX 3070 is a bit of a joke, I am not hugely pushed. I'm OK with textures being set at high rather than ultra, or medium down the line if it really needs be, when everything else can be maxed out at massive FPS.

    I mean in most games, comparing medium vs ultra textures, there's not a huge difference visually for a massive VRAM hit.

    Stop countering my justifications for not upgrading my 5700XT! :pac:


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    I am aware of the allocation v actual use situation, but Hardware Unboxed showed the performance hit the 8GB cards were taking in its benchmarks.

    I think it will be interesting to see how the 3070 8GB performs before we use the old 8GB models to make assumptions.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Hyzepher wrote: »
    I think it will be interesting to see how the 3070 8GB performs before we use the old 8GB models to make assumptions.

    I haven't seen much about any advance in compression technology this generation.


  • Registered Users Posts: 7,882 ✭✭✭frozenfrozen


    do the reviews have anything about the storage decompression thing on the cards?

    Can't imagine 10GB of VRAM is an issue if you can swap at 6GB/s from disk
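    Rough back-of-the-envelope on that, assuming the headline 6 GB/s figure actually holds up in practice (the data sizes below are hypothetical, just to show the scale):

    # How long would it take to stream texture data back into VRAM at a given
    # effective throughput? Assumed figure: the ~6 GB/s mentioned above, which is
    # roughly a fast PCIe 4.0 NVMe drive's sequential read speed, not a measurement.

    def swap_time_seconds(data_gb: float, throughput_gb_s: float = 6.0) -> float:
        """Time in seconds to stream `data_gb` gigabytes at `throughput_gb_s` GB/s."""
        return data_gb / throughput_gb_s

    for data_gb in (2, 4, 10):
        print(f"{data_gb} GB at 6 GB/s -> {swap_time_seconds(data_gb):.2f} s")
    # 2 GB -> 0.33 s, 4 GB -> 0.67 s, 10 GB -> 1.67 s; fast streaming helps with
    # partial swaps between scenes, not with refilling all of VRAM mid-frame.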



  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    I think the 8GB 3070 is being unfairly compared to the 2080 Ti - mainly due to the Nvidia press release - but it really should be compared to the 2070/Super, and I think we'll see the same performance increase despite the VRAM size.


  • Registered Users Posts: 28,033 ✭✭✭✭TitianGerm


    do the reviews have anything about the storage decompression thing on the cards?

    No games utilising it at the moment.


  • Registered Users Posts: 7,875 ✭✭✭Mr Crispy


    do the reviews have anything about the storage decompression thing on the cards?

    Can't imagine 10GB of VRAM is an issue if you can swap at 6GB/s from disk

    That won't be available until 2021 at the earliest.


  • Registered Users Posts: 11,979 ✭✭✭✭Giblet


    Hyzepher wrote: »
    I think the 8GB 3070 is being unfairly compared to the 2080 Ti - mainly due to the Nvidia press release - but it really should be compared to the 2070/Super, and I think we'll see the same performance increase despite the VRAM size.

    Well, a lot of people were saying the 3080 is going to completely destroy the 2080 Ti, but it seems to be a 20% increase across the board, and less at lower resolutions (and that power draw... sheeesh!). The 3070 doesn't have GDDR6X, so yes, it'll trash the 2070 Super I imagine, but I just don't see it winning all benchmarks against the 2080 Ti, which has about 38% more memory bandwidth than the 3070. (Prob a 2080 killer?)

    RTX performance is great, but I mean, that's barely a few games still.
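    For reference, peak memory bandwidth is just (bus width / 8) x effective data rate, so a quick sketch with the published memory specs (352-bit at 14 Gbps GDDR6 on the 2080 Ti, 256-bit at 14 Gbps GDDR6 on the announced 3070) puts the gap at roughly 38%:

    # Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
    def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
        return bus_width_bits / 8 * data_rate_gbps

    rtx_2080_ti = bandwidth_gb_s(352, 14)  # 616 GB/s
    rtx_3070 = bandwidth_gb_s(256, 14)     # 448 GB/s
    print(f"2080 Ti: {rtx_2080_ti:.0f} GB/s, 3070: {rtx_3070:.0f} GB/s, "
          f"gap: +{(rtx_2080_ti / rtx_3070 - 1) * 100:.0f}%")  # ~ +38%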


  • Registered Users Posts: 4,401 ✭✭✭Homelander


    Hyzepher wrote: »
    I think the 8GB 3070 is being unfairly compared to the 2080 Ti - mainly due to the Nvidia press release - but it really should be compared to the 2070/Super, and I think we'll see the same performance increase despite the VRAM size.

    The whole system just really got messed up when AMD stopped being competitive and Nvidia ran absolutely riot with pricing.

    The GTX 10 series in particular is where it really started; the 1070 ended up being €500, compared to about €350 for the GTX 970.

    In a normal run of things, it would've been a €350-400 class replacement card, and beaten the 980Ti handily despite being €200-300 cheaper.

    Until relatively recently, it was always completely normal for the "mainstream" performance card to be comparable to the prior gen's flagship at a significantly lower price, close to half in many cases.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Giblet wrote: »
    RTX performance is great, but I mean, that's barely a few games still.

    From what I saw, there's no improvement in RT efficiency there; the hit is about the same as on Turing.


  • Registered Users Posts: 11,979 ✭✭✭✭Giblet


    From what I saw, there's no improvement in RT efficiency there; the hit is about the same as on Turing.

    I saw Quake II and Minecraft, which use pure path tracing, have huge improvements. Control also.

    Those AAA games, Quake II and Minecraft...


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Giblet wrote: »
    I saw Quake II and Minecraft, which use pure path tracing, have huge improvements. Control also.

    Those AAA games, Quake II and Minecraft...

    Better than just the overall improvement in performance?


  • Registered Users Posts: 11,979 ✭✭✭✭Giblet


    Better than just the overall improvement in performance?

    https://www.eurogamer.net/articles/digitalfoundry-2020-nvidia-geforce-rtx-3080-review?page=6

    They have the benchmark tools.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    Giblet wrote: »
    Well, a lot of people were saying the 3080 is going to completely destroy the 2080 Ti, but it seems to be a 20% increase across the board, and less at lower resolutions (and that power draw... sheeesh!). The 3070 doesn't have GDDR6X, so yes, it'll trash the 2070 Super I imagine, but I just don't see it winning all benchmarks against the 2080 Ti, which has about 38% more memory bandwidth than the 3070. (Prob a 2080 killer?)

    RTX performance is great, but I mean, that's barely a few games still.

    What I see is a €700 card outperforming the previous €1200 flagship. The 3080 isn't even the replacement for the 2080 Ti from a tier level perspective; there's an argument that the 3090 is.

    Currently, the 3080 is a 20xx killer in both performance and price, and when compared directly with the similarly priced 2080/Super, it provides a huge upgrade.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    The numbers for anyone coming from a 1080 look great.

    Gonna be a big boost for me. Who knows when I'll be able to get my hands on a GPU though. Initially for me it would be Founders or nothing; I'd be worried about some AIB coolers given the power usage. Would have to wait on individual reviews, and my price range would be in the lower AIB range I'm guessing. Some are priced at near £800, and screw paying that.

    If the Founders is scarce I won't be too bothered, as waiting for AMD would be the sensible thing to do anyway.


  • Registered Users Posts: 28,033 ✭✭✭✭TitianGerm


    EoinHef wrote: »
    The numbers for anyone coming from a 1080 look great.

    Gonna be a big boost for me. Who knows when I'll be able to get my hands on a GPU though. Initially for me it would be Founders or nothing; I'd be worried about some AIB coolers given the power usage. Would have to wait on individual reviews, and my price range would be in the lower AIB range I'm guessing. Some are priced at near £800, and screw paying that.

    If the Founders is scarce I won't be too bothered, as waiting for AMD would be the sensible thing to do anyway.

    Massive increases for me moving from a 2060. I better get one tomorrow now!!


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Pretty impressive all around. 4K high refresh is doable on a high-end 3080/3090 now, even with all the bells and whistles enabled.

    If you need more fps enable DLSS. By next gen the mid range cards will be able to do it.

    The price of those monitors needs to come down a lot now. Still 1k+ for 4K/120Hz+ monitors.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    TitianGerm wrote: »
    Massive increases for me moving from a 2060. I better get one tomorrow now!!

    Wonder what time they will open up sales, can't see it being 12.01am but that would suit me.


  • Registered Users Posts: 7,875 ✭✭✭Mr Crispy


    BloodBath wrote: »
    The price of those monitors needs to come down a lot now. Still 1k+ for 4K/120Hz+ monitors.

    And even at those high prices it's still a crap shoot in terms of quality control. I can understand why some people are saying they may as well go for a 48" OLED if spending that much.


  • Registered Users Posts: 28,033 ✭✭✭✭TitianGerm


    EoinHef wrote: »
    Wonder what time they will open up sales, can't see it being 12.01am but that would suit me.

    It's meant to be 2pm. Overclockers had something posted on their forums.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    Homelander wrote: »
    The whole system just really got messed up when AMD stopped being competitive and Nvidia ran absolutely riot with pricing.

    The GTX 10 series in particular is where it really started; the 1070 ended up being €500, compared to about €350 for the GTX 970.

    In a normal run of things, it would've been a €350-400 class replacement card, and beaten the 980Ti handily despite being €200-300 cheaper.

    Until relatively recently, it was always completely normal for the "mainstream" performance card to be comparable to the prior gen's flagship at a significantly lower price, close to half in many cases.

    https://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-geforce-gtx-1070-review_6
    We picked up a GTX 970 for £250 back in the day, and it had a $330 US ticket price - Nvidia's latest has a baseline £330/$380 price-point.

    Not sure where you're remembering €500 from; at those exchange rates they would've been about €420 (unless you mean the FE).


  • Registered Users Posts: 21,650 ✭✭✭✭Squidgy Black


    I think Nvidia slightly overshot their marketing on this.

    They released a card that has a noticeable improvement on the last flagship card, for around 500 quid less, but left themselves wide open to "unimpressed" or "letdown" reviews/reveals because they pushed their marketing too hard, saying it was going to be twice as fast and make the previous cards redundant.

    It's still class and a noted improvement, but I'd say those 2080ti owners who sold for dirt cheap are absolutely fuming (albeit it's their own fault for panicking).


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Nvidia always push their marketing too hard.

    They are still super impressive cards.

    I think 2080ti owners would be looking at the 3090 rather than the 3080 as well. Or hold out for the super or ti variants.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    I think Nvidia slightly overshot their marketing on this.

    They released a card that has a noticeable improvement on the last flagship card, for around 500 quid less, but left themselves wide open to "unimpressed" or "letdown" reviews/reveals because they pushed their marketing too hard, saying it was going to be twice as fast and make the previous cards redundant.

    It's still class and a noted improvement, but I'd say those 2080ti owners who sold for dirt cheap are absolutely fuming (albeit it's their own fault for panicking).

    Those 2080 Ti owners are idiots, because every Nvidia slide was saying "60-70% better than 2080, a world away from 1080 Ti" :pac:

    But then, we did kinda call them idiots for buying that GPU in the first place.


  • Registered Users Posts: 30,123 ✭✭✭✭Star Lord


    Blender benchmark:

    [Chart: Blender 2.90 Classroom scene render times (Cycles, CUDA and OptiX), NVIDIA GeForce RTX 3080]

    Hoping to build a 3080 system, probably at some stage early next year. Coming from an i5 4690 with a GTX 970 to this, it's going to be a quantum shift for me! :D

    Just checked the render time for the Classroom scene the above is based on, and for a GTX 970 it's 479.164-502.605 seconds... compared to 43 seconds on the 3080 using CUDA (for direct comparison), and even faster using OptiX...
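    Quick sanity check on the speedup those numbers imply:

    # Classroom scene render times quoted above:
    # GTX 970 (Cycles, CUDA): ~479-503 s; RTX 3080 (CUDA): ~43 s.
    gtx_970_times = (479.164, 502.605)  # seconds
    rtx_3080_time = 43.0                # seconds, CUDA

    for t in gtx_970_times:
        print(f"{t:.0f} s -> {t / rtx_3080_time:.1f}x faster on the 3080")
    # ~11.1x and ~11.7x, before counting the further gain from OptiX.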


  • Registered Users Posts: 3,356 ✭✭✭seraphimvc


    Second round of suicide watch for those who sold their 2080 Ti at around the 500 mark :pac: Poor things.

    Now I can't wait for Big Navi news. Definitely an exciting time for everyone.

