
AMD Navi Discussion


Comments

  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Hyzepher wrote: »
    I think AMDs RT is software based, so nothing to do with the CU count. Other than the more you have the better

    It's not software based. It's hardware based on the CU count.


  • Registered Users Posts: 1,871 ✭✭✭Simi


    Hyzepher wrote: »
    I think AMDs RT is software based, so nothing to do with the CU count. Other than the more you have the better

    It's 'hardware accelerated' and it supposedly has X RT accelerator(s) per CU. So it's not software based but doesn't have dedicated tensor cores like Nvidia. It's unclear exactly how it's being implemented.

    https://youtu.be/Np0WbGaDdWE

    Ampere could be much faster for RT, but if RT is going to be implemented on much weaker console hardware using RDNA2 architecture then it might not matter much.

    It might just be the case that Nvidia cards simply unlock an Ultra RT setting in games that won't run well on AMD hardware.


  • Registered Users Posts: 54,170 ✭✭✭✭Headshot


    I've always been an Nvidia guy, but if the AMD Radeon RX 6900 XT can somewhat match the 3090 at that price difference I may be swayed.

    When are reviews expected?


  • Registered Users Posts: 940 ✭✭✭GHOST MGG


    I'm interested in how AMD's Smart Access Memory is going to be implemented using system RAM on a board with a 5000 series CPU and a 6000 series GPU. I wonder will faster memory clocks help here?


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    BloodBath wrote: »
    It's not software based. It's hardware based on the CU count.

    [Image: hardware-accelerated raytracing slide]


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    GHOST MGG wrote: »
    I'm interested in how AMD's Smart Access Memory is going to be implemented using system RAM on a board with a 5000 series CPU and a 6000 series GPU. I wonder will faster memory clocks help here?

    IDK, but the benchmarks they posted were from a test bench using a Ryzen 9 5900X and 16GB of DDR4-3200 RAM


  • Registered Users Posts: 321 ✭✭Mucashinto


    I wonder if Jensen got on the phone to the CEO of Intel yesterday and had a bit of a vent about Lisa with him in solidarity. Could always team up and hire someone to "make the problem go away" :)

    The GPUs from both companies so far seem to favour 4K gamers though. The 6800 XT and 3080 look like better propositions all around than the 3070/6800 to me, especially as 3070s only seem gettable at £50 or more above MSRP. Holding on for an FE 3080 would start to come into play at that stage imo.

    Really weird that AMD compared the 6800 to the 3070 but then priced it higher, only card they did that for. Maybe there's some 6700s on the way meant for 1440p but getting the 6800 over the xt, there would have to be more than an $80 saving in it to make it worth it surely.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    There's a 12gb card on the way (6700?) that will compete directly with the 3070.

    I'd say the 6800 will be competing with the 3070ti.


  • Registered Users Posts: 321 ✭✭Mucashinto


    BloodBath wrote: »
    There's a 12gb card on the way...

    Ah, right. Okay, I'll wait for that then and if there's a good deal on some of the last gen cards new or used tbh I might just snap one up and move on.

    PC gaming tech isn't really capturing me now anyway tbh. I think it's a bit overpriced and uninspiring atm considering the alternatives - got an Oculus Quest 2 there on a whim and it's been a genuinely new and exciting experience so far. Bit of a gimmick, but what a gimmick. To pay double that to get more frames and slightly better graphics in a game I can already play reasonably well :o And then thinking of how cloud gaming could take off, making hardware irrelevant? If I had any discipline I'd ride this 1070 into the ground and then see how the landscape looks...


  • Registered Users Posts: 1,871 ✭✭✭Simi


    Third party liquid cooled 6800 XT on the way. Bet it'll be pricey! I wonder if anyone will do a 6900 XT.

    https://videocardz.com/press-release/asus-reveals-custom-radeon-rx-6800-rog-strix-lc-rog-strix-oc-and-tuf-series

    Cloud gaming is interesting, and the launch of games like Control on the Nintendo Switch is cool, but I don't see it as a genuine replacement for actual hardware.

    The best streaming system in the world is still dependent on your local ISP to deliver it. That's before you even consider people's home setup.


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    I'm really interested in seeing reviews for the cloud gaming. My own experience just with SteamLink has been, well, ****. And that's on the same router. Granted, it was an 802.11n laptop with a 3rd gen Core i5 I was wirelessly linking to; the Switch has 802.11ac and a newer APU, which would (hopefully) bridge the kind of performance shortfalls I noticed, and I don't think the downstream/upstream speed would be the issue.

    My router shouldn't be the bottleneck anyway; it's a bit of a battlestation:

    https://www.amazon.com/gp/product/B076HRZJ66/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1

    Really, buying it was simply a big fuuuuuck youuuuu to everyone in my apartment complex contributing to ****ty wifi crosstalk on the old setup I had. I don't have that problem anymore, they might :P


  • Registered Users Posts: 165 ✭✭MugsGame


    GHOST MGG wrote: »
    I'm interested in how AMD's Smart Access Memory is going to be implemented using system RAM on a board with a 5000 series CPU and a 6000 series GPU. I wonder will faster memory clocks help here?

    AFAIK it's got nothing directly to do with system RAM, it's direct access from the CPU to the GPU RAM without having to use a sliding addressing window (PCI BAR) - instead the window can be expanded to the full GPU address space (variable BAR).

    Comments on Reddit suggest this is already supported on Linux for older AMD cards (AMD actually contributed the code) and that Windows has support for the relevant parts of the PCIe standard.

    There's no reason in principle Smart Access can't be enabled for Zen 2+Big Navi, but Windows (unlike Linux) needs a BIOS update, which it sounds like AMD only plan to qualify for Big Navi+Zen 3.

    Once that new BIOS flag is enabled, existing DirectX games get a free performance boost, as the API calls to move data into the GPU can automatically take advantage of it with suitable driver support. And future games (e.g. coming from console) can optimize even more for it. Coupled with DirectStorage you can imagine some games barely touching CPU RAM.
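The sliding-window point above can be illustrated with a toy calculation. The sizes are illustrative assumptions (a classic fixed BAR aperture of 256 MiB against a 16 GiB VRAM pool), not figures from any AMD spec:

```python
# Toy illustration of resizable BAR: with a fixed 256 MiB aperture the CPU
# must remap the window repeatedly to reach all of VRAM; with the BAR
# expanded to cover the whole GPU address space, one mapping suffices.
# All sizes below are illustrative assumptions.

MIB = 1024 ** 2
GIB = 1024 ** 3

def remaps_needed(vram_bytes: int, bar_bytes: int) -> int:
    """Distinct aperture positions needed to touch every byte of VRAM."""
    return -(-vram_bytes // bar_bytes)  # ceiling division

fixed_bar = remaps_needed(16 * GIB, 256 * MIB)     # classic fixed-size BAR
resizable_bar = remaps_needed(16 * GIB, 16 * GIB)  # BAR covers all of VRAM

print(fixed_bar, resizable_bar)  # 64 1
```

The win isn't the remap count itself so much as the driver bookkeeping each remap implies; with a full-size BAR the CPU can address any VRAM location directly.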


  • Registered Users Posts: 3,356 ✭✭✭seraphimvc


    I just saw this on Reddit going over the official release data; you can have a look at the pics and data there: https://www.reddit.com/r/Amd/comments/jkq11y/rx6000_series_performance_analysis_official_data/
    Conclusion:

    At 2K (1440p), the 6900 XT is about 7% faster than the 3090; the 6800 XT is slightly faster than the 3090 (1.5%) and about 10% faster than the 3080; the 6800 is close to the 3080 (5% slower) and about 20% faster than the 2080 Ti and 3070.

    At 4K, the 6900 XT is about 3% faster than the 3090, so we can say they are on par with each other. The 6800 XT is about 5% slower than the 3090 and about 5% faster than the 3080, and the 6800 is about 15% faster than the 2080 Ti and 3070.

    All the data is from AMD's official site; there is the possibility that AMD selected their preferred games, but it is real data.

    My conclusion is that the 6800 XT is probably close to the 3090, and the 6800 is aimed at a 3070 Ti/Super. By the way, all the above tests had AMD's Smart Access Memory enabled, but Rage Mode was not mentioned.

    This gets a bit mental but it is a good thing - I CANNOT WAIT FOR STEVE'S REVIEW (need 3rd party reviews urgently lol)
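To see how those 1440p percentages chain together, here's a quick sanity-check script. Everything is indexed to the 3080 = 100, and the multipliers are the post's quoted claims, not independent benchmarks:

```python
# Chain the quoted 1440p deltas into a single relative-performance index.
# Baseline: RTX 3080 = 100. Multipliers are the quoted claims only.

rtx_3080 = 100.0
rx_6800xt = rtx_3080 * 1.10    # "6800 XT is about 10% faster than 3080"
rtx_3090 = rx_6800xt / 1.015   # "6800 XT is slightly faster than 3090 (1.5%)"
rx_6900xt = rtx_3090 * 1.07    # "6900 XT is about 7% faster than 3090"
rx_6800 = rtx_3080 * 0.95      # "6800 is close to 3080 (5% slower)"

for name, score in [("RTX 3080", rtx_3080), ("RX 6800 XT", rx_6800xt),
                    ("RTX 3090", rtx_3090), ("RX 6900 XT", rx_6900xt),
                    ("RX 6800", rx_6800)]:
    print(f"{name:>10}: {score:6.1f}")
```

Chained like that, the implied 3090 lead over the 3080 at 1440p is only about 8%, which is in the same ballpark as the 4K figures in the same post.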


  • Registered Users Posts: 7,410 ✭✭✭Icyseanfitz


    Imagine if these cards have any bit of overclocking headroom, be amazing


  • Registered Users Posts: 165 ✭✭MugsGame


    Amazon.co.uk have the 6800 XT temporarily out of stock at only £169.03. I hope the scalpers pick these up if they ever come back in stock :pac:

    Nvidia should trademark 6800 XT to try and block the Radeon launch ...


  • Registered Users Posts: 13,753 ✭✭✭✭Inquitus


    Imagine if these cards have any bit of overclocking headroom, be amazing

    When's the Embargo lifted for performance results?


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Inquitus wrote: »
    When's the Embargo lifted for performance results?

    Usually a day before, if not 2 days before... Really need to see how the Radeon cards perform with RT, it's becoming much more of a feature now, so a 3080 will "have to do" :P


  • Registered Users Posts: 1,871 ✭✭✭Simi


    Wonda-Boy wrote: »
    Usually a day before, if not 2 days before... Really need to see how the Radeon cards perform with RT, it's becoming much more of a feature now, so a 3080 will "have to do" :P

    RT will probably be poor in currently available RT games, because they've all been designed around Nvidia's RT implementation.

    Future releases might fare a bit better given they'll be targeting a light RT touch for the consoles using a non-proprietary implementation. I'm sure Nvidia cards will still be faster, but it might just be a case of turning down the slider a notch for AMD cards to get similar performance.

    More press releases:

    https://www.sapphiretech.com/en/consumer/21304-01-20g-radeon-rx-6800-xt-16g-gddr6

    This one confirms HDMI 2.1 VRR support. They also recommend a 750W power supply. Yikes!


  • Registered Users Posts: 1,740 ✭✭✭kingtiger


    Simi wrote: »
    They also recommend a 750W power supply. Yikes!

    I thought the 6800xt only uses 300w :confused:

    750w seems a bit over the top


  • Registered Users Posts: 1,871 ✭✭✭Simi


    kingtiger wrote: »
    I thought the 6800xt only uses 300w :confused:

    750w seems a bit over the top

    Third party card so they've probably upped the volts for a tiny overclock over stock. I'm sure a decent 600W PSU will be fine if you're using it with a Ryzen CPU.

    If you're using it with an Intel 9th or 10th gen on the other hand...
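The PSU-headroom argument above is easy to sketch with rough numbers. Every figure below is a ballpark estimate (board power, transient spikes, CPU and system draw), not a vendor measurement:

```python
# Rough power budget showing how a ~750 W recommendation can arise for a
# 300 W card. Every figure here is a ballpark estimate, not a measured value.

gpu_board_power = 300       # 6800 XT rated board power, watts
gpu_transient_spike = 100   # brief excursions above board power (estimate)
cpu_load = 105              # e.g. a 105 W TDP-class Ryzen under load
rest_of_system = 75         # motherboard, RAM, drives, fans (estimate)

peak_draw = gpu_board_power + gpu_transient_spike + cpu_load + rest_of_system
# Vendors size PSUs so sustained load sits well inside the rating,
# roughly 80% of it here:
recommended_psu = peak_draw / 0.8

print(peak_draw, round(recommended_psu))  # 580 725
```

A higher-TDP CPU (or a factory-overclocked card) pushes that figure toward the 750 W the press release quotes.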


  • Registered Users Posts: 1,740 ✭✭✭kingtiger


    Simi wrote: »
    Third party card so they've probably upped the volts for a tiny overclock over stock. I'm sure a decent 600W PSU will be fine if you're using it with a Ryzen CPU.

    If you're using it with an Intel 9th or 10th gen on the other hand...

    aye, I only think in terms of power draw with a Ryzen these days, as I wouldn't dream of using an Intel platform at the moment


  • Registered Users Posts: 1,871 ✭✭✭Simi


    kingtiger wrote: »
    aye, I only think in terms of power draw with a Ryzen these days, as I wouldn't dream of using an Intel platform at the moment

    The role reversal is crazy. Intel is edging closer and closer to FX9590 territory with each release


  • Registered Users Posts: 1,740 ✭✭✭kingtiger


    Simi wrote: »
    The role reversal is crazy. Intel is edging closer and closer to FX9590 territory with each release

    the king is dead....


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Found this while looking for info... 6800 XT vs 3070 (not sure why it's not the 3080)



    6900XT vs 3090




    Sorry if posted already.


  • Registered Users Posts: 663 ✭✭✭MidlanderMan


    Wonda-Boy wrote: »
    Found this while looking for info... 6800 XT vs 3070 (not sure why it's not the 3080)


    It's probably not real tbh.


  • Registered Users Posts: 4,400 ✭✭✭Homelander


    Simi wrote: »
    The role reversal is crazy. Intel is edging closer and closer to FX9590 territory with each release

    Ah, I wouldn't say it's really that comparable.

    At least they still have really solid performance and are still top dog by a good margin for gaming.

    The FX9590 guzzled about 250W and needed liquid cooling to trade blows with a 50W i3 in some games; it was truly just a god-awful CPU.


  • Registered Users Posts: 1,389 ✭✭✭KillerShamrock


    Wonda-Boy wrote: »
    Found this while looking for info... 6800 XT vs 3070 (not sure why it's not the 3080)


    6900XT vs 3090


    Sorry if posted already.

    Yeah, that whole thing is BS, some rando channel with no subs and like 6 videos all put up today :pac: Plus wouldn't the 6800 XT be way more than 10 fps faster than a 3070 at 4K?

    They also have a vid with a 3090 vs a 6900 XT, so I call pure BS


  • Registered Users Posts: 165 ✭✭MugsGame


    Homelander wrote: »
    are still top dog by a good margin for gaming.

    Consensus seems to be that Zen 3 takes the gaming performance crown from Intel, for now ...


  • Registered Users Posts: 13,753 ✭✭✭✭Inquitus


    Homelander wrote: »
    Ah, I wouldn't say it's really that comparable.

    At least they still have really solid performance and are still top dog by a good margin for gaming.

    The FX9590 guzzled about 250W and needed liquid cooling to trade blows with a 50w i3 in some games, it was truly just a god-awful CPU.

    Aye as mentioned above Zen 3 takes the crown, even single-threaded!


  • Registered Users Posts: 4,400 ✭✭✭Homelander


    I know that, I'm talking about the comparisons with FX. It's not really a fair one, as Intel are still excellent for gaming.

    Ryzen have shaken up the market and brought viable competition, but until this moment in time Intel have remained the best for gaming.

    Even at the lower end. For example, it never gets mentioned, but the 10th Gen i3 is significantly faster than the 3300X in some games at the same cost.

    People sort of got it into their heads that Intel is now worthless but it's not at all true.

    The new generation of Ryzen will also be as expensive as, or more expensive than, its Intel counterparts in some cases.

    I mean, 10700k is way, way faster than 3700x, but also more expensive.

    However the suggested price places 5800x well beyond the cost of 10700k.

    Like anyone else, I'm waiting for reviews, but even at this point in time I'm not counting Intel out.

    So, while this is all great for consumers, comparisons with FX are a bit dramatic.

