
AMD Navi Discussion


Comments

  • Registered Users Posts: 7,878 ✭✭✭Mr Crispy


    I'm guessing they'll use the ray tracing built into DX12 Ultimate, or whatever it's called.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Yeah I can't see them catching up with Nvidia this gen. They may well have specialist RT cores but will they have tensor cores?

    Nvidia have nearly tripled the tensor cores with the 3000 series over the 2000 series. That's how much they believe in them. They're driving lots of innovation like DLSS and all the other cool AI-driven features: microphone voice isolation, virtual green screen, 2D picture or video to 3D model and animation, voice to facial animation, camera to skeletal mesh animation, and a bunch of other things.

    There's a bunch of software utilizing these cores for workloads now as well. It's like having a supercomputer at home. You can, or will be able to, train game AI using them, and you can create new unique 2D images and/or 3D models based on thousands of source samples and choose the one you like rather than trying to create it from scratch.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    Azza wrote: »

    I also suspect they have something in the works comparable to RTX IO for speeding up data decompression.

    I think Nvidia mentioned that RTX IO is derived from implementations by Xbox, so with AMD dominating the next-gen consoles I wouldn't be surprised if their IO offering was already superior to Nvidia's and Nvidia just got their hype in first.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    My predictions are better performance per watt.
    Something approaching the 3080 at a similar price.
    More RAM on the cards - which will be quickly countered.
    Nothing to counter DLSS.


  • Registered Users Posts: 2,449 ✭✭✭Rob2D


    Nothing to counter DLSS.

    Which is what they actually need.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Rob2D wrote: »
    Which is what they actually need.

    I don't disagree.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Really hope they have something good and give us consumers something to think about... competition is good and the only winner is the end user (us lot).


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Wonda-Boy wrote: »
    Really hope they have something good and give us consumers something to think about... competition is good and the only winner is the end user (us lot).

    The issue though is it'll have to be really good. I mean at least 20% better at the same price point, with a USP. nVidia have the 'mindshare' or whatever it's called.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    RX 6000 series named in Fortnite?
    https://twitter.com/MissGinaDarling/status/1301721126846963712?s=19

    Possible pricing:
    https://twitter.com/coreteks/status/1301839482287796224

    Igorslab spilling possible details:
    [...]Big Navi with 275 watts is somewhere between 3070 and 3080 and possibly with more power consumption (300W+) somewhere around 3080 performance. Big Navi will not be able to attack the 3090. Take this with a lot of salt.
    [...]No AIBs in 2020.

    Also reminder, reference cooler confirmed to not be a blower model.


  • Registered Users Posts: 21,648 ✭✭✭✭Squidgy Black


    No AIBs in 2020 would be disappointing, unless AMD go down a similar route and have a much different design than the usual blower.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    No AIBs in 2020 would be disappointing, unless AMD go down a similar route and have a much different design than the usual blower.

    The pic above makes it look like a dual-fan design. That's been around a while.

    That blower cooler has done AMD damage over the years; how they still used it for the 5000 series is beyond me.

    Too many reviews on launch say hot and loud because they're reviewing the reference design. Talk about self-harm.


  • Registered Users Posts: 2,465 ✭✭✭SweetCaliber


    EoinHef wrote: »
    The pic above makes it look like a dual-fan design. That's been around a while.

    That blower cooler has done AMD damage over the years; how they still used it for the 5000 series is beyond me.

    Too many reviews on launch say hot and loud because they're reviewing the reference design. Talk about self-harm.

    They should go back to their AIO cards, like the R9 Fury X for example! :pac::pac:



  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    Cost of those is the thing. I'm not saying they were mad expensive for the end user to buy, but they were expensive for AMD to produce.

    I actually had one of those cards. Pure silence while gaming. Not that noise bothers me, as I wear headphones while gaming, but non-AIO cards can sometimes ramp up the fans while running a demanding game.


  • Moderators Posts: 5,558 ✭✭✭Azza


    The 16GB VRAM might be a handy thing to have with the advent of the next generation of consoles. At the start of this console generation, PC GPUs of the time were lagging behind the consoles in the amount of VRAM they offered. Performance was still better on PC of course, but if you didn't have at least 4GB VRAM you had to lower texture quality settings. I know I did with my GTX 680 2GB at the time with games like Titanfall and Watch Dogs. I remember thinking crikey when Middle-earth: Shadow of Mordor offered an optional HD texture pack and it required 6GB VRAM.

    Considering the next-gen consoles will come with 16GB, VRAM requirements in PC games may go up as well.


  • Registered Users Posts: 21,648 ✭✭✭✭Squidgy Black


    Azza wrote: »
    The 16GB VRAM might be a handy thing to have with the advent of the next generation of consoles. At the start of this console generation, PC GPUs of the time were lagging behind the consoles in the amount of VRAM they offered. Performance was still better on PC of course, but if you didn't have at least 4GB VRAM you had to lower texture quality settings. I know I did with my GTX 680 2GB at the time with games like Titanfall and Watch Dogs. I remember thinking crikey when Middle-earth: Shadow of Mordor offered an optional HD texture pack and it required 6GB VRAM.

    Considering the next-gen consoles will come with 16GB, VRAM requirements in PC games may go up as well.

    They're 16GB of shared memory as opposed to dedicated graphics memory though, are they not? The same as an 8GB VRAM card now.


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    I'm not sure, but I think it works out at around 12GB of VRAM, while the rest of the 16GB is for the OS. That's off the top of my head so I could be wrong, and I haven't looked up the console specs since they were announced.
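
    For illustration, the split being guessed at above is just subtraction over the shared pool. A quick Python sketch; the reservation figures here are assumptions for the sake of the arithmetic, not official console specs:

    ```python
    # Back-of-the-envelope split of a console's 16 GB unified memory pool.
    # OS and CPU-side figures are assumed for illustration, not official specs.
    TOTAL_GB = 16.0
    OS_RESERVED_GB = 2.5   # assumed OS/system reservation
    CPU_SIDE_GB = 1.5      # assumed CPU-side game data (code, audio, logic)

    def usable_graphics_gb(total, os_reserved, cpu_side):
        """What's left of the shared pool for GPU work (textures, buffers)."""
        return total - os_reserved - cpu_side

    print(usable_graphics_gb(TOTAL_GB, OS_RESERVED_GB, CPU_SIDE_GB))  # 12.0
    ```

    So the "around 12GB for graphics" guess only holds if the OS plus CPU-side data really take about 4GB between them; a dedicated-VRAM PC card doesn't have to share like this.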


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    I don't follow leakers; well, I used to watch Adored until he went for podcast-type videos. Who is this _rogame fellow? What he's saying is believable though, although I still don't think AMD can match the 3090, and this is from someone who likes AMD.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    I'd love to wait and see what AMD bring to the table, if they'd even release some info about a possible release date. I'd also love to wait and see if the 3080 Ti comes along quite soon after the launch of the 3080, but I am badly stuck for a card now.


  • Moderators Posts: 5,558 ✭✭✭Azza


    I'm itching to know myself. Think you are best off waiting a few months before deciding either way.

    The last two generations AMD's competition has forced NVIDIA to counter with faster cards. The Vega 56 beat the GTX 1070 in terms of performance and forced NVIDIA to release the GTX 1070Ti. The 5700 and 5700XT forced NVIDIA to counter with the 2060 Super and 2070 Super.

    I'd imagine that will likely be the case again.

    I'm probably going to go AMD if it's remotely close between them. I don't want to lose FreeSync support or be forced to change my monitor to get G-Sync if I opted for NVIDIA.

    I'm not into spending thousands on a GPU, nor do I need the absolute best performance. What I would like is to max out the 165Hz refresh rate of my monitor at its native 1440p resolution more often in games that use rasterization, and to use ray tracing features in games that have them and still hit 60FPS at 1440p. If the new GPUs are capable of accelerating data decompression, that would be a good plus as well.
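
    For what it's worth, those refresh-rate targets translate directly into per-frame render budgets; a quick sketch (the numbers are just 1000/fps):

    ```python
    # Per-frame time budget for a given frame-rate target.
    def frame_budget_ms(fps):
        """Milliseconds available to render one frame at the given rate."""
        return 1000.0 / fps

    print(round(frame_budget_ms(165), 2))  # 6.06 ms -> 165Hz raster target
    print(round(frame_budget_ms(60), 2))   # 16.67 ms -> 60FPS ray-tracing target
    ```

    So the 165Hz rasterization target leaves the GPU barely 6ms per frame, while the 60FPS ray-tracing target allows nearly three times as long, which is roughly the kind of headroom RT effects need.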


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Azza wrote: »
    I'm itching to know myself. Think you are best off waiting a few months before deciding either way.

    The last two generations AMD's competition has forced NVIDIA to counter with faster cards. The Vega 56 beat the GTX 1070 in terms of performance and forced NVIDIA to release the GTX 1070Ti. The 5700 and 5700XT forced NVIDIA to counter with the 2060 Super and 2070 Super.

    I'd imagine that will likely be the case again.

    I'm probably going to go AMD if it's remotely close between them. I don't want to lose FreeSync support or be forced to change my monitor to get G-Sync if I opted for NVIDIA.

    I'm not into spending thousands on a GPU, nor do I need the absolute best performance. What I would like is to max out the 165Hz refresh rate of my monitor at its native 1440p resolution more often in games that use rasterization, and to use ray tracing features in games that have them and still hit 60FPS at 1440p. If the new GPUs are capable of accelerating data decompression, that would be a good plus as well.

    Nvidia cards also support FreeSync now.


  • Moderators Posts: 5,558 ✭✭✭Azza


    BloodBath wrote: »
    Nvidia cards also support FreeSync now.

    Thanks for the tip, wasn't aware of that.

    Just looked into it, and my monitor, even though it's not officially listed as doing so, does support G-Sync.

    That's great, removes a constraining factor in my purchasing decision.

    Still going to wait and see what Big Navi is like before deciding.


  • Moderators Posts: 5,558 ✭✭✭Azza


    Rumored price cut for Big Navi already.

    https://www.pcgamer.com/amd-rx-6000-series-price-drop-pre-launch/

    Would indicate that in terms of performance AMD don't have anything to directly compete with the 3080 and will again have to compete on price.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    Azza wrote: »
    Rumored price cut for Big Navi already.

    https://www.pcgamer.com/amd-rx-6000-series-price-drop-pre-launch/

    Would indicate that in terms of performance AMD don't have anything to directly compete with the 3080 and will again have to compete on price.

    https://twitter.com/_rogame/status/1301955306021416970

    The above article is speculation & rumors, but I'll be VERY surprised if AMD doesn't have a 3080 competitor.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    I doubt it. Biggest Navi is only supposed to be around 500mm² die size; the 3080's GA102 is 628mm². Granted, there are a few caveats: Nvidia have RT and tensor cores on that die, and it's Samsung's 8nm process vs TSMC's 7nm.

    They could go all out for rasterisation and maybe beat Nvidia there, but there are supposed to be dedicated RT cores on Big Navi as well. There are rumors that it's delayed until next year now too, so it's going to be too late to compete with the 3070 and 3080, and by the time it comes out, if it's close, Nvidia will just release Ti or Super versions.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    Delay till next year and they'll barely have any share at all in the GPU market.


  • Registered Users Posts: 7,878 ✭✭✭Mr Crispy


    What's the source for the delay rumours?

