
AMD Navi Discussion


Comments

  • Registered Users Posts: 2,625 ✭✭✭fergus1001


    Simi wrote:
    RT will probably be poor in currently available RT games, because they've all been designed around Nvidia's RT implementation.


    Not really; they'll also be designed for RT on the consoles, which use RDNA2, so Nvidia might not have the advantage you would think here.


  • Registered Users Posts: 2,625 ✭✭✭fergus1001


    Homelander wrote:
    Like anyone else, I'm waiting for reviews, but even at this point in time I'm not counting Intel out.


    Intel got caught napping and of


  • Registered Users Posts: 54,170 ✭✭✭✭Headshot


    Have we heard anything about these AMD GPUs having HDMI 2.1?


  • Registered Users Posts: 165 ✭✭MugsGame


    Yes, the consoles have HDMI 2.1 VRR and the desktop GPUs have it as standard. See e.g. https://www.sapphiretech.com/en/consumer/21304-01-20g-radeon-rx-6800-xt-16g-gddr


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    [image attachment: 531559.jpg]


  • Registered Users Posts: 663 ✭✭✭MidlanderMan


    Partner cards usually launch 1-3 weeks after the AMD reference cards.

    AMD-exclusive partners like PowerColor will release before the likes of MSI, Gigabyte, etc.


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    I hate all their partner brand names. Oooooh Power Color! Sapphire. Meh. /has 2 Sapphire cards in the closet


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Sapphire are meant to be the best AIB brand to get for AMD, are they not... then PowerColor?


  • Registered Users Posts: 82,389 ✭✭✭✭Overheal


    Wonda-Boy wrote: »
    Sapphire are meant to be the best AIB brand to get for AMD, are they not... then PowerColor?

    I dunno. I'm Homer Simpson when it comes to brand differentiation, except for knowing that MSI, ASUS, EVGA, etc. are high-tier PCB makers.


  • Registered Users Posts: 2,449 ✭✭✭Rob2D


    I've always wanted a DEVIL card ever since the 7870 days. Those are PowerColor as well, I think?

    Shame they've never made a really good one.


  • Advertisement
  • Registered Users Posts: 4,400 ✭✭✭Homelander


    PowerColor are OK; Sapphire are probably the best all-rounders. The Pulse is a good budget card and the Nitro is usually excellent.

    MSI, Gigabyte and Asus are always a mixed bag. Grand at the higher end of things, but their budget models can sometimes be poor.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Hoping for an all-aluminium TUF Gaming card like the Nvidia ones.


  • Moderators Posts: 5,558 ✭✭✭Azza


    I think AMD's decision to opt for 16GB of VRAM across the board is looking like a wise move.

    Hardware Unboxed are reporting that Watch Dogs: Legion exceeds 8GB of VRAM at 1440p on its highest settings, and there is a performance hit incurred as a result.

    Microsoft Flight Simulator seems to use as much VRAM as possible; on 16GB cards it shows over 10GB usage at 1440p and 12.5GB at 4K, though that doesn't appear to have much impact on performance.

    Doom Eternal uses over 11GB of VRAM at 4K with ultra texture settings, and anything with less VRAM takes a performance hit. At those settings the Radeon VII actually outperforms the 1080 Ti and 2080 Super as a result.

    Lastly, the developers of Godfall have said that 4K/Ultra settings will require 12GB of VRAM, so even the RTX 3080 might take a performance hit in that game.


  • Registered Users Posts: 4,400 ✭✭✭Homelander


    I utterly loathe Nvidia's stingy approach to VRAM. They are always doing it, artificially putting a ceiling on the lifespan of their cards and hoping the majority of people who upgrade will stick with Nvidia.

    I have a 3070 on order out of curiosity, but that is exactly the reason I will not be keeping it. 8GB of VRAM is ridiculous on these 3070 cards, when it is standard GDDR6. They are already struggling at launch in certain titles; in a few years it will become a big issue even though the core of the card will still be well capable.


  • Registered Users Posts: 6,948 ✭✭✭circadian


    One of the main reasons I bought a Radeon VII was for the 16GB of VRAM. To see high-end cards not having this as standard over a year later is madness.


  • Registered Users Posts: 740 ✭✭✭z0oT


    There was a version of the RX 480 with 8GB of VRAM all the way back in 2016, and it was solidly marketed as a mid-range graphics card at the time.

    Four years later the 3070 has only 8GB and the 3080 just 10GB, and both are massively more high-end cards than the RX 480 ever was. It really makes both of them look silly compared with the 16GB on offer from AMD.

    Bit of a shame, because the 3070 would be quite a good buy otherwise.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    We are only really starting to see games need more than 8GB though.

    Like, how does that 480 perform now? Not great, despite it having 8GB of VRAM. 8GB felt like enough until recently, imo.


  • Registered Users Posts: 8,615 ✭✭✭grogi


    EoinHef wrote: »
    We are only really starting to see games need more than 8GB though.

    Like, how does that 480 perform now? Not great, despite it having 8GB of VRAM. 8GB felt like enough until recently, imo.

    Once you limit yourself to 1920x1080, the RX 480 performs more than decently. 8GB was unnecessary five years ago, but today it is the bare minimum.

    Nvidia cards manage memory slightly better than Radeons; 8GB on an RTX card stretches about as far as ~10GB on a Radeon. That doesn't change the fact that a card with the raw power of the 3080 should have more memory than it has.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    grogi wrote: »
    Once you limit yourself to 1920x1080, the RX 480 performs more than decently.

    Nvidia cards manage memory slightly better than Radeons; 8GB on an RTX card stretches about as far as ~10GB on a Radeon. That doesn't change the fact that a card with the raw power of the 3080 should have more memory than it has.

    How much of that 8GB of VRAM is actually being used at 1080p though?

    I'd agree that going forward more than 10GB is desirable; I just don't think it's been necessary until recently.


  • Registered Users Posts: 8,615 ✭✭✭grogi


    EoinHef wrote: »
    How much of that 8GB of VRAM is actually being used at 1080p though?

    I'd agree that going forward more than 10GB is desirable; I just don't think it's been necessary until recently.

    Memory requirements are not really linked to the output resolution. At 1920x1080 with triple buffering, you'd need ~25MiB for the frame buffers. At 4K (3840x2160) with triple buffering, you use ~100MiB. That's a mere 75MiB of difference.

    What eats the VRAM is higher texture resolution.
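
    To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python (assuming uncompressed 4-byte RGBA pixels; real engines use compressed texture formats like BCn, so treat the texture figure as a worst case):

    # Back-of-the-envelope VRAM maths, assuming uncompressed 4-byte RGBA pixels.
    # Triple-buffered framebuffers are tiny next to textures, which is the point
    # made above; compressed formats would shrink the texture figure considerably.

    def framebuffer_mib(w: int, h: int, bpp: int = 4, buffers: int = 3) -> float:
        """Total size of `buffers` full-screen colour buffers, in MiB."""
        return w * h * bpp * buffers / 2**20

    def texture_mib(side: int, bpp: int = 4) -> float:
        """Square texture plus its full mip chain (~4/3 the base level), in MiB."""
        return side * side * bpp * (4 / 3) / 2**20

    print(f"1080p triple-buffered: {framebuffer_mib(1920, 1080):5.1f} MiB")  # ~23.7 MiB
    print(f"4K triple-buffered:    {framebuffer_mib(3840, 2160):5.1f} MiB")  # ~94.9 MiB
    print(f"One 4096x4096 texture: {texture_mib(4096):5.1f} MiB")            # ~85.3 MiB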


  • Advertisement
  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    grogi wrote: »
    Memory requirements are not really linked to the output resolution. At 1920x1080 with triple buffering, you'd need ~25MiB for the frame buffers. At 4K (3840x2160) with triple buffering, you use ~100MiB. That's a mere 75MiB of difference.

    What eats the VRAM is higher texture resolution.

    Yes, I'm aware; modded Skyrim is the only game that has ever requested close to the full 8GB on my 1080, and that was because I crammed it with mods that had high-res textures.

    How often do you think a 480 has actually used, not just allocated, its full 8GB of memory?

    But that's beside the point; the point is that up until recently 8GB has been sufficient, imo.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Interesting article:

    Keith Lee, CEO of Counterplay Games (Godfall), explains that playing Godfall at 4K with Ultra settings requires a minimum of 12GB of VRAM.
    He gushes over the latest AMD offerings; it seems a little backhanded to me, the way he is ramping up AMD cards, even if it's true.

    But it's food for thought.

    https://videocardz.com/newz/godfall-requires-12gb-of-vram-for-ultrahd-textures-at-4k-resolution


  • Registered Users Posts: 4,400 ✭✭✭Homelander


    EoinHef wrote: »
    Yes, I'm aware; modded Skyrim is the only game that has ever requested close to the full 8GB on my 1080, and that was because I crammed it with mods that had high-res textures.

    How often do you think a 480 has actually used, not just allocated, its full 8GB of memory?

    But that's beside the point; the point is that up until recently 8GB has been sufficient, imo.

    But surely that is a kind of redundant point when we're talking about brand new, high-end cards in 2020?

    For the past two years there have been games that require 8GB of VRAM for high/ultra textures.

    Whatever about GDDR6X, a 3070 with 8GB of GDDR6 is a major bum deal from Nvidia.

    You will have a situation in a few years where the card will still be able to effortlessly drive ultra settings, but will have to reduce textures for lack of VRAM.

    Same rubbish they pulled with the 1660/RTX 2060 6GB, the GTX 1060 3GB, and so on: artificially limiting a card's potential based on their market share and the likelihood of customers upgrading to another Nvidia card.

    If AMD can get a semi-strong foothold back, this practice will stop.


  • Registered Users Posts: 740 ✭✭✭z0oT


    I've personally seen the 8GB on my 5700 XT nearly full many times in the last year. The frame rates have been solid throughout, so it hasn't bothered me any.

    This is at 1440p.


  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    Homelander wrote: »
    But surely that is a kind of redundant point when we're talking about brand new, high-end cards in 2020?

    For the past two years there have been games that require 8GB of VRAM for high/ultra textures.

    Whatever about GDDR6X, a 3070 with 8GB of GDDR6 is a major bum deal from Nvidia.

    You will have a situation in a few years where the card will still be able to effortlessly drive ultra settings, but will have to reduce textures for lack of VRAM.

    Same rubbish they pulled with the 1660/RTX 2060 6GB, the GTX 1060 3GB, and so on: artificially limiting a card's potential based on their market share and the likelihood of customers upgrading to another Nvidia card.

    If AMD can get a semi-strong foothold back, this practice will stop.

    Originally someone said cards should have had more VRAM years ago; I just said that until recently it hasn't really needed to be more than 8GB.

    I agree that going forward there needs to be more VRAM; I've already said as much. If, as is most likely, 8GB is not enough, buying a 3070 would not be a good idea, and I'm not buying one. 8GB on a 3060, 10GB on a 3070 and 16GB on a 3080 would seem fairer, but it is Nvidia we're talking about here; nothing is free with them.

    I'd also agree that's where Nvidia may get people to upgrade mid-cycle rather than generationally.

    None of that changes the point that 8GB has been enough until recently, really. Also, allocation and usage do differ; I'd actually like to see if someone has done a breakdown with figures, etc.
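
    On that allocation-vs-usage point, here is a minimal sketch of reading VRAM figures on an Nvidia card via NVML (assuming the nvidia-ml-py Python bindings). Note that NVML only reports memory allocated on the device; the per-frame working set isn't exposed, which is exactly why published "usage" figures overstate what a game really needs:

    # Minimal NVML sketch (pip install nvidia-ml-py). NVML reports memory
    # *allocated* on the device, not the working set a game actually touches
    # each frame, so these figures are an upper bound on real usage.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"total:     {info.total / 2**20:.0f} MiB")
    print(f"allocated: {info.used / 2**20:.0f} MiB")
    pynvml.nvmlShutdown()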


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Azza wrote: »
    I think AMD's decision to opt for 16GB of VRAM across the board is looking like a wise move.

    Hardware Unboxed are reporting that Watch Dogs: Legion exceeds 8GB of VRAM at 1440p on its highest settings, and there is a performance hit incurred as a result.

    Microsoft Flight Simulator seems to use as much VRAM as possible; on 16GB cards it shows over 10GB usage at 1440p and 12.5GB at 4K, though that doesn't appear to have much impact on performance.

    Doom Eternal uses over 11GB of VRAM at 4K with ultra texture settings, and anything with less VRAM takes a performance hit. At those settings the Radeon VII actually outperforms the 1080 Ti and 2080 Super as a result.

    Lastly, the developers of Godfall have said that 4K/Ultra settings will require 12GB of VRAM, so even the RTX 3080 might take a performance hit in that game.

    That's going to become a lot more common with the new consoles having 16GB of VRAM, with about 12-13GB usable by games.

    Nvidia really cheaped out, and I wouldn't touch any of the current cards for that reason. My problem with AMD is their poor software support outside of games, so there's no ideal card for me yet.

    Hoping Nvidia do a reasonably priced 3070 Ti with at least 12GB of VRAM, or AMD sort out their software support.


  • Moderators Posts: 5,558 ✭✭✭Azza


    BloodBath wrote: »
    That's going to become a lot more common with the new consoles having 16GB of VRAM, with about 12-13GB usable by games.

    Nvidia really cheaped out, and I wouldn't touch any of the current cards for that reason. My problem with AMD is their poor software support outside of games, so there's no ideal card for me yet.

    Hoping Nvidia do a reasonably priced 3070 Ti with at least 12GB of VRAM, or AMD sort out their software support.

    Out of curiosity what software support do you find lacking with AMD?


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    A lot of stuff in the 3D rendering space, like Houdini, Substance, Blender, etc., and game engines like Unreal.

    CUDA just generally has way better support. On top of that are the RT features that have been added to some of that software, and some tensor-based workflows have been implemented as well, which can accelerate work and improve image quality.

    At least AMD will have better RT support this time around, the memory amount is huge, and they're giving full DX12 Ultimate support.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    What do we reckon on VR? Nvidia all the way?


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Samuel T. Cogley wrote: »
    What do we reckon on VR? Nvidia all the way?

    I'd imagine DLSS is huge for VR.

