
AMD Navi Discussion


Comments

  • Registered Users Posts: 12,411 ✭✭✭✭Skerries




  • Registered Users Posts: 1,389 ✭✭✭KillerShamrock


    Just looking over the graphs and stuff, and I can't help feeling that unless you have a 5000 series CPU and a 500 series mobo, the figures shown won't be obtained and the cards may actually fare a small bit worse than the Nvidia counterparts in terms of raw FPS.

    I hope I'm wrong and I'm just nitpicking, but there is a bang of Apple-style "buy into our ecosystem" marketing coming from AMD with this launch.

    If I'm right, though, and the figures shown rely on the ecosystem, then unless you're willing and able to build new (or have been holding off on a build for a while), the cards will come up a little short bar the cheaper price, which for the price difference isn't a bad thing.

    Also, the 6800 was shown with the smart thing on; if that's the case it's worse than a 2080 Ti/3070 for a higher price, unless, as above, you're in line for a new CPU/mobo.

    As usual I hope I'm wrong, and that the reviewers will show the cards on multiple systems to show the real difference.


  • Registered Users Posts: 1,450 ✭✭✭jebidiah


    KillerShamrock wrote: »
    Just looking over the graphs and stuff, and I can't help feeling that unless you have a 5000 series CPU and a 500 series mobo, the figures shown won't be obtained and the cards may actually fare a small bit worse than the Nvidia counterparts in terms of raw FPS.

    I hope I'm wrong and I'm just nitpicking, but there is a bang of Apple-style "buy into our ecosystem" marketing coming from AMD with this launch.

    If I'm right, though, and the figures shown rely on the ecosystem, then unless you're willing and able to build new (or have been holding off on a build for a while), the cards will come up a little short bar the cheaper price, which for the price difference isn't a bad thing.

    Also, the 6800 was shown with the smart thing on; if that's the case it's worse than a 2080 Ti/3070 for a higher price, unless, as above, you're in line for a new CPU/mobo.


    Will be interesting to see the third party reviews with this in mind. I'm assuming Nvidia also cherry-pick the absolute best of everything for their benchmark numbers as well.

    It makes sense for AMD to be really working side by side with both departments, since they are doing this for the consoles. Leverage that console experience into desktops to make a more streamlined process. Sure it costs more to get in, but for the cost of a 3090 you could probably get that full setup of mobo, CPU and 6900 (give or take a few quid).
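    A rough back-of-the-envelope on that 3090-vs-full-AMD-build point, using approximate launch MSRPs (assumed figures for illustration, not quoted anywhere in the thread):

```python
# Approximate launch MSRPs in USD (assumed, not from the thread):
# compare an RTX 3090 on its own against the full AMD "ecosystem"
# trio of a B550 board, a Ryzen 5 5600X and an RX 6900 XT.
rtx_3090 = 1499

amd_trio = {
    "B550 motherboard (typical)": 150,
    "Ryzen 5 5600X": 299,
    "RX 6900 XT": 999,
}

total = sum(amd_trio.values())
print(f"AMD trio: ${total} vs RTX 3090 alone: ${rtx_3090} "
      f"(difference: ${rtx_3090 - total})")
```

    Give or take board choice, the two really do land within a few quid of each other.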


  • Registered Users Posts: 13,753 ✭✭✭✭Inquitus


    Overheal wrote: »
    The wrinkle there is Nvidia cards apparently aren't overclocking that well. Because GDDR6X doesn't like it?

    AMD went with GDDR6 not the GDDR6X that Nvidia have opted for.


  • Registered Users Posts: 1,389 ✭✭✭KillerShamrock


    jebidiah wrote: »
    Will be interesting to see the third party reviews with this in mind. I'm assuming Nvidia also cherry-pick the absolute best of everything for their benchmark numbers as well.

    Oh yeah, they 100% did. Remember the "2x faster than a 2080 Ti" jazz for the 3080, which turned out to be in ray-traced Quake II or Minecraft only (or along those lines), and that the 3070 is faster than a 2080 Ti, which it is in certain cases.
    jebidiah wrote: »
    It makes sense for AMD to be really working side by side with both departments, since they are doing this for the consoles. Leverage that console experience into desktops to make a more streamlined process. Sure it costs more to get in, but for the cost of a 3090 you could probably get that full setup of mobo, CPU and 6900 (give or take a few quid).

    All well and good if you are buying in at that level or doing a fresh/first-time build, but at the 6800/6800 XT and 3080/3070 level that adds a good chunk of change to the system, and if you are coming from Intel then even more so.
    The price-to-performance ratio will most likely change drastically if you go down this route.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    KillerShamrock wrote: »
    Oh yeah, they 100% did. Remember the "2x faster than a 2080 Ti" jazz for the 3080, which turned out to be in ray-traced Quake II or Minecraft only (or along those lines), and that the 3070 is faster than a 2080 Ti, which it is in certain cases.

    All well and good if you are buying in at that level or doing a fresh/first-time build, but at the 6800/6800 XT and 3080/3070 level that adds a good chunk of change to the system, and if you are coming from Intel then even more so.
    The price-to-performance ratio will most likely change drastically if you go down this route.

    Yeah, I could well and truly swap out my 3900X for a 5000 series, but I can't help thinking it's a bit of a gimmick at this stage. It'd be a total waste of time doing it for 5-10 FPS gains.

    Should have it performing better with Zen 2 also.....;)


  • Registered Users Posts: 3,356 ✭✭✭seraphimvc


    jebidiah wrote: »
    Will be interesting to see the third party reviews with this in mind. I'm assuming Nvidia also cherry-pick the absolute best of everything for their benchmark numbers as well.

    It makes sense for AMD to be really working side by side with both departments, since they are doing this for the consoles. Leverage that experience into desktops to make a more streamlined process. Sure it costs more to get in, but for the cost of a 3090 you could probably get that full set up of Mobo, CPU and 6900 (give or take a few quid)

    Tech Jesus said he will test the 500 series mobos with the 6000 series cards. I can't wait to see the benchmarks and how exactly the 'smart memory' works.

    Feeling good now about my €200 B550 mobo; first time in 10+ years that I might consider an AMD card :pac:


  • Registered Users Posts: 13,753 ✭✭✭✭Inquitus


    I still have an R9 290X in one of my older PCs; it's like a combination of a hoover and a space heater!


  • Registered Users Posts: 4,400 ✭✭✭Homelander


    Not all of the AMD benchmarks were using SMA and Rage.

    Also, didn't they have slides showing the gap with said features on/off? Most games it was 5%.

    That still puts the 6900XT at parity, or a few % worse than the RTX3090 at $500 less. Still a major win for consumers.
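    If those figures hold up, the value case is easy to put numbers on (both the performance gap and the MSRPs below are assumptions for illustration, not AMD's figures):

```python
# Assume the 6900 XT lands 5% behind the 3090 (the pessimistic end of
# the claimed gap) and compare launch MSRPs: $999 vs $1499.
perf_3090 = 1.00
perf_6900xt = 0.95

value_3090 = perf_3090 / 1499      # performance per dollar
value_6900xt = perf_6900xt / 999

print(f"6900 XT offers {value_6900xt / value_3090:.2f}x the performance per dollar")
```

    Even in the worst case on AMD's slides, the price gap dominates.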


  • Registered Users Posts: 1,389 ✭✭✭KillerShamrock


    Homelander wrote: »
    Not all of the AMD benchmarks were using SMA and Rage.

    Also, didn't they have slides showing the gap with said features on/off? Most games it was 5%.

    That still puts the 6900XT at parity, or a few % worse than the RTX3090 at $500 less. Still a major win for consumers.

    They didn't, and yep, it's a win for consumers.

    On the 6900 XT the slides had "+ Smart Access and Rage Mode", and then for the 6800 it had Smart Access.

    For the 6800 XT it just compared wattage at the bottom; also there are no numbers on any of the Nvidia bars, so we don't know yet what they are comparing to, realistically.
    Reviews will be interesting to say the least, and anyone who reviews them with only the latest AMD board and CPU will need to justify why they didn't use a few different systems, or they'll probably end up being branded AMD shills/fanboys by Nvidia fanboys :pac:


  • Moderators, Computer Games Moderators Posts: 4,281 Mod ✭✭✭✭deconduo


    Inquitus wrote: »
    I still have an R9 290X in one of my older PCs; it's like a combination of a hoover and a space heater!

    That's what I have in my current build, time for an upgrade for Christmas I think.


  • Registered Users Posts: 21,648 ✭✭✭✭Squidgy Black


    There'll be an awful lot of salty 3090 owners regardless of how the benchmarks turn out for the 6900XT. It'll be like the 2080ti owners after the 3080 launch all over again.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    I just got done upgrading my mobo to X570 Tomahawk Wifi so waiting on benchmarks of 5600X / 5800X & 6800XT.


  • Registered Users Posts: 3,571 ✭✭✭2ndcoming


    I'd be fairly sure there will be a 6700 XT to undercut the 3070 at around the $400 price point, but below that I wouldn't be at all surprised to see RX660, 670, 680 come out in the $150-250 price range. That's the biggest unit-shifter range, and AMD have already been competitive there for years.


  • Registered Users Posts: 4,400 ✭✭✭Homelander


    KillerShamrock wrote: »
    They didn't, and yep, it's a win for consumers.

    On the 6900 XT the slides had "+ Smart Access and Rage Mode", and then for the 6800 it had Smart Access.

    For the 6800 XT it just compared wattage at the bottom; also there are no numbers on any of the Nvidia bars, so we don't know yet what they are comparing to, realistically.
    Reviews will be interesting to say the least, and anyone who reviews them with only the latest AMD board and CPU will need to justify why they didn't use a few different systems, or they'll probably end up being branded AMD shills/fanboys by Nvidia fanboys :pac:

    Isn't that what this is though? 6800XT gains using SMA and Rage?

    [attached image: v8Aykue.png, AMD benchmark slide]


  • Registered Users Posts: 3,571 ✭✭✭2ndcoming


    There were three slides as they announced each extra. First straight comparison, then with rage mode, then with rage mode and smart access.


  • Registered Users Posts: 4,400 ✭✭✭Homelander


    Maybe I'm getting my wires crossed, but that's what I meant before. The base figures are the 6800 XT without Rage or SMA, so we do have a direct comparison to the 3080 for those of us who won't be using those features.

    Insofar as we can take AMD's figures at face value, of course, pending actual reviews.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    Their first slide had a direct comparison - 3080 vs 6800XT - more or less on par


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    awec wrote: »
    The 6900 is very tempting but I have a g-sync monitor. :(

    This is one of the things I've said before about this proprietary feature: it brand-locks you.


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    Inquitus wrote: »
    AMD went with GDDR6 not the GDDR6X that Nvidia have opted for.

    That was my point: that the Radeon 6k cards should be more overclock-stable than Nvidia's RTX 3k series.


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    Inquitus wrote: »
    I still have a R9 290X in one of my older PC's its like a combination between a hoover and a space heater!

    Just the 290 here and an FX 8150. This is definitely the year to upgrade!

    Guys, whatever about the slides, they differentiate between the Rage+SAM performance in white and the stock card in red. Even neck and neck with any other build, those are great numbers at that price. To actually get more performance on a Zen 3 system is free performance for those building new. And at lower power consumption.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    Overheal wrote: »
    That was my point: that the Radeon 6k cards should be more overclock-stable than Nvidia's RTX 3k series.

    You're basing that on what?

    Memory is only one part of an overclock, and GDDR6X is already clocked higher than GDDR6, so I'm not sure what your point is?


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    EoinHef wrote: »
    You're basing that on what?

    Memory is only one part of an overclock, and GDDR6X is already clocked higher than GDDR6, so I'm not sure what your point is?

    Don't quote me on that; it's just what I've heard around. Maybe I have it wrong; it's not something I've been keenly read up on.


  • Registered Users Posts: 3,356 ✭✭✭seraphimvc


    Overheal wrote: »
    Just the 290 here and an FX 8150. This is definitely the year to upgrade!

    Guys, whatever about the slides, they differentiate between the Rage+SAM performance in white and the stock card in red. Even neck and neck with any other build, those are great numbers at that price. To actually get more performance on a Zen 3 system is free performance for those building new. And at lower power consumption.

    I'm expecting Linus's next video to be on FREE PERFORMANCE for the 6000 cards. Oh man, he might go to town now: FREE+FREE performance from an AMD CPU+GPU :pac:


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    seraphimvc wrote: »
    I'm expecting Linus's next video to be on FREE PERFORMANCE for the 6000 cards. Oh man, he might go to town now: FREE+FREE performance from an AMD CPU+GPU :pac:

    /shrug

    That's the case though. If you put the card in an Intel or older-AMD machine, you're leaving performance per dollar on the table, if not per watt.


  • Registered Users Posts: 21,648 ✭✭✭✭Squidgy Black


    Memory isn't what's stopping the overclocks with the 3080; it's the power limit, which they've capped at 350W for reference boards and 370W for the FE.

    Unless AMD have managed to configure their boards differently and allow a higher power limit, I can't see there being much scope for higher clock speeds than the mental speeds they're already pushing the cards to with Rage Mode etc.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    KillerShamrock wrote: »
    Just looking over the graphs and stuff, and I can't help feeling that unless you have a 5000 series CPU and a 500 series mobo, the figures shown won't be obtained and the cards may actually fare a small bit worse than the Nvidia counterparts in terms of raw FPS.

    I hope I'm wrong and I'm just nitpicking, but there is a bang of Apple-style "buy into our ecosystem" marketing coming from AMD with this launch.

    If I'm right, though, and the figures shown rely on the ecosystem, then unless you're willing and able to build new (or have been holding off on a build for a while), the cards will come up a little short bar the cheaper price, which for the price difference isn't a bad thing.

    Also, the 6800 was shown with the smart thing on; if that's the case it's worse than a 2080 Ti/3070 for a higher price, unless, as above, you're in line for a new CPU/mobo.

    As usual I hope I'm wrong, and that the reviewers will show the cards on multiple systems to show the real difference.

    How much of a boost do you think it's actually getting from this? I'd say it's no more than 2%, and little more than a marketing gimmick.

    The 6800 was about 20% faster than the 3070. It sure as hell isn't getting that much of a boost from it.

    Most of the boost, I would say, is from the OC, which probably also decimates the power consumption numbers.
    Squidgy Black wrote: »
    There'll be an awful lot of salty 3090 owners regardless of how the benchmarks turn out for the 6900XT. It'll be like the 2080ti owners after the 3080 launch all over again.

    Why will they be salty? They will still have the best GPU on the market. The people willing to spend that kind of money do it for that very reason. If they were willing to spend that much extra over the 3080 for very little gaming benefit, then they have other reasons not to care. There are still a bunch of other benefits to having an Nvidia GPU over AMD: more memory, better software support, better RT, and DLSS.
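    A quick sanity check on the 20%-lead-vs-2%-SAM point (both percentages are assumptions from the posts above, not measured numbers):

```python
# Remove an assumed ~2% Smart Access Memory uplift from an assumed
# ~20% overall lead of the 6800 over the 3070.
lead_with_sam = 1.20    # 6800 shown roughly 20% ahead
sam_uplift = 1.02       # assumed ~2% from SAM

lead_without_sam = lead_with_sam / sam_uplift
print(f"Lead without SAM: ~{(lead_without_sam - 1) * 100:.1f}%")
```

    So even stripping SAM out entirely, the bulk of the lead would remain.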


  • Registered Users Posts: 1,871 ✭✭✭Simi


    Although I'll probably go for a 6800 XT myself, I'm very interested to see what both sides have planned for their lower end parts.

    Will Nvidia leave the 3060 as their lowest-end part with RT, or will they offer RT in a 3050/3050 Ti?

    If AMD's RT implementation is built into the CUs, does that mean we could potentially see RT support at the sub-$300 price range for the first time? Perhaps in the 6500 XT.

    I know RT on low-end cards isn't going to be very performant, but the PS5 is doing it with only 36 CUs.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    I think AMD's RT is software-based, so nothing to do with the CU count, other than the more you have the better.


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    Hyzepher wrote: »
    I think AMD's RT is software-based, so nothing to do with the CU count, other than the more you have the better.

    "Bringing new HW features into the architecture"

    https://youtu.be/CKuJNFMeoCo?t=399

    It's - ambiguous?

