
AMD Navi Discussion


Comments

  • Registered Users Posts: 13,984 ✭✭✭✭Cuddlesworth


    Mr Crispy wrote: »
    Interesting that he suspects AMD are using HBM (although he could be reading far too much into that model).

    The rest of the industry has serious doubts about that. It's just too expensive to produce.


  • Posts: 0 [Deleted User]


    I find most of the tech reviewers really irritating, from the laid-back types who sound thoroughly uninterested to the obnoxious shouty types who have a bunch of dopes running about taking orders from them [that Linus guy for one] - but we have to get the info someplace, and here and there between them all we do learn something [eventually], as they will always get the gear first. They are obviously going to milk the absolute crap out of these new cards and convince us all that we 'need' to upgrade asap. Their channels wouldn't be doing much if there wasn't new tech to push/review/moan about.




    Yeah, Linus can be really grating. If I had to, I'd listen to someone like JayzTwoCents; yeah, he can rant about random things, but he comes across as a general grumbling tech head as opposed to screaming into the screen while peons stagger around.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    They are obviously going to milk the absolute crap out of these new cards and convince us all that we 'need' to upgrade asap. Their channels wouldn't be doing much if there wasn't new tech to push/review/moan about.

    GamersNexus' Steve recently admitted that he still runs an FX-8350 at home and affirmed his stance that you should buy for longevity where possible.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Have to say I find it ridiculous advertising a GPU in Fortnite.... I mean come on, ffs! Especially for the very first insight into a GPU that is so important to AMD; maybe after a few weeks of launch for some PR, but it's madness.


  • Registered Users Posts: 7,878 ✭✭✭Mr Crispy


    Wonda-Boy wrote: »
    Have to say I find it ridiculous advertising a GPU in Fortnite.... I mean come on, ffs! Especially for the very first insight into a GPU that is so important to AMD; maybe after a few weeks of launch for some PR, but it's madness.

    It really isn't. People getting upset over this need to calm down.


  • Registered Users Posts: 740 ✭✭✭z0oT


    I find most of the tech reviewers really irritating, from the laid-back types who sound thoroughly uninterested to the obnoxious shouty types who have a bunch of dopes running about taking orders from them [that Linus guy for one] - but we have to get the info someplace, and here and there between them all we do learn something [eventually], as they will always get the gear first. They are obviously going to milk the absolute crap out of these new cards and convince us all that we 'need' to upgrade asap. Their channels wouldn't be doing much if there wasn't new tech to push/review/moan about.
    What irks me about some of the tech YouTubers/websites is when they try to go into detail about the technical electronic principles behind things. In that case it's often evident they don't know what they're talking about.

    As for Navi, this video is worth a watch. The prediction is that AMD will have a 3070 & 3080 competitor, and they'll probably do something like what the Radeon VII was to compete with the 3090. Time will of course tell.
    https://www.youtube.com/watch?v=BR70xbcwB6U


  • Posts: 0 [Deleted User]


    Mr Crispy wrote: »
    It really isn't. People getting upset over this need to calm down.




    It's a highly popular game with massive throughput; many people with high-end rigs play it, AND it has a large base of people with fairly basic rigs.


    Those low-enders are prime targets for enticing towards a self-built mid/high-end rig, with no affiliation to Nvidia.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    Wonda-Boy wrote: »
    Have to say I find it ridiculous advertising a GPU in Fortnite.... I mean come on, ffs! Especially for the very first insight into a GPU that is so important to AMD; maybe after a few weeks of launch for some PR, but it's madness.


    There's no insight in the Fortnite thing.
    It's just a big render.

    First insight will be on 28 October, as they already announced earlier in the month.


  • Registered Users Posts: 10,013 ✭✭✭✭Wonda-Boy


    Mr Crispy wrote: »
    It really isn't. People getting upset over this need to calm down.

    Jaysus buddy, think of my feelings will ya....:P

    Getting upset indeed....more of an observation than anything else :rolleyes:

    According to Statista, a survey done in late 2020 found that the majority of people playing Fortnite are 18-24 year olds (62%).

    22% are aged 25-34.

    12% are aged 35-44.

    2% are aged 45-54.

    To me, that is not the base AMD would need to be targeting, as the majority of high-end PC users looking at extreme high-end GPU purchases would not be 18-24 years of age. More like 30+, present company included.


  • Posts: 0 [Deleted User]


    Wonda-Boy wrote: »
    Jaysus buddy, think of my feelings will ya....:P

    Getting upset indeed....more of an observation than anything else :rolleyes:

    According to Statista, a survey done in late 2020 found that the majority of people playing Fortnite are 18-24 year olds (62%).

    22% are aged 25-34.

    12% are aged 35-44.

    2% are aged 45-54.

    To me, that is not the base AMD would need to be targeting, as the majority of high-end PC users looking at extreme high-end GPU purchases would not be 18-24 years of age. More like 30+, present company included.




    At 37 I was in an office as a sysadmin trying to justify a 1080 purchase (considering kids, mortgage, car, and a wife wondering about said purchase)
    vs.
    a bunch of 24-year-old (paid) interns splashing out on 2080s a few months later.


    Also, think of those 16 year olds saying to 40ish parents "Dad/Mom, you like games too. Can we get a high end GPU together, for Xmas?"


    Edit: Bastard interns actually asked me if I knew what a pokéball was.


  • Registered Users Posts: 21,648 ✭✭✭✭Squidgy Black


    There's a HUGE influx of younger people convincing their parents to buy them new PCs over in the States too, with remote learning being pretty popular over there. And then you've college students too, who will all be remote, so there's a lot of people investing in systems and saying why not go full whack.

    It's a gimmick to be honest; not a huge deal, but if it gets them publicity then why not. I'm sure Epic were more than happy to work with them on it. Sure, Nvidia used Fortnite as one of their launch examples of games that would have RTX support rolled out.


  • Registered Users Posts: 7,878 ✭✭✭Mr Crispy


    Wonda-Boy wrote: »
    Jaysus buddy, think of my feelings will ya....:P

    Getting upset indeed....more of an observation than anything else :rolleyes:

    According to Statista, a survey done in late 2020 found that the majority of people playing Fortnite are 18-24 year olds (62%).

    22% are aged 25-34.

    12% are aged 35-44.

    2% are aged 45-54.

    To me, that is not the base AMD would need to be targeting, as the majority of high-end PC users looking at extreme high-end GPU purchases would not be 18-24 years of age. More like 30+, present company included.

    I wasn't aiming that specifically at you, more so at the general outrage about it online (although you did sound upset :p ).

    Anyway, ignoring those figures (which just show that there are still tens of millions of Fortnite players over the age of 24), you have to remember that this is only one element of AMD's marketing. If their whole plan is to publicise this launch through Fortnite, then yeah, I'll grab a pitchfork alongside the rest of the internet. But if it's just one measly element of getting this thing talked about, I really don't see the harm.

    Maybe they should have just revealed it by pulling it out of an oven. *shrugs*


  • Registered Users Posts: 663 ✭✭✭MidlanderMan


    The screw hole pattern on the back of the Navi render in Fortnite suggests that whatever that card is, it doesn't have GDDR; there are screw holes where the GDDR chips would be.
    If that's true, it means that *maybe* the highest-end cards will use HBM2, which might make sense with the 80 CU and 24GB rumours.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley




    Skip to about 11 minutes in.


  • Registered Users Posts: 7,878 ✭✭✭Mr Crispy


    Images from the video;

    526470.jpg

    526471.jpg

    526472.jpg


  • Registered Users Posts: 6,947 ✭✭✭circadian


    I like JayzTwoCents' liquid cooling and HVAC cooling videos. Even the benchmark stuff is decent, but when it comes to news/rumours etc. he comes across as a total fanboy and completely biased.

    Mind you, most of the tech YouTubers do, so at least he has the other material that's decent.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    The screw hole pattern on the back of the Navi render in Fortnite suggests that whatever that card is, it doesn't have GDDR; there are screw holes where the GDDR chips would be.
    If that's true, it means that *maybe* the highest-end cards will use HBM2, which might make sense with the 80 CU and 24GB rumours.

    I don't think there's any truth to that. The memory can be laid out multiple ways.

    I'm pretty sure I used to have a GPU with that X-shaped pressure plate on the back that wasn't HBM.

    It's just too expensive for consumer cards and is only used for server/AI GPUs.

    I think AMD learned that lesson already with the Radeon VII.


  • Registered Users Posts: 4,400 ✭✭✭Homelander


    AMD drivers are the reason I won't really entertain Big Navi.

    Back in the day, cards like the R9 290, HD 7970, whatever else, were hugely viable alternatives to Nvidia. Even the RX series were fine.

    But their drivers have just been so horrible for the Vega/XT cards in comparison to Nvidia's.


  • Registered Users Posts: 1,016 ✭✭✭Ultrflat


    circadian wrote: »
    Even the benchmark stuff is decent, but when it comes to news/rumours etc. he comes across as a total fanboy and completely biased.

    I think his fanboy days are over; he's become very critical of companies and their practices. He gave MSI grief over a BIOS issue, and he's called out Nvidia multiple times in the past year over poor practices. He also has no issue making Intel look bad, as well as AMD.

    He used to give the impression he was a fanboy; I just think he was going with whatever card/CPU was working best, as opposed to sticking to one side.


  • Registered Users Posts: 21,648 ✭✭✭✭Squidgy Black


    I'm going to assume that if the 6900XT runs close to the 3080 performance-wise, there's going to be zero chance of getting one this side of Christmas, considering the demand for the 3080 so far.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,662 Mod ✭✭✭✭Hyzepher


    I'm going to assume that if the 6900XT runs close to the 3080 performance-wise, there's going to be zero chance of getting one this side of Christmas, considering the demand for the 3080 so far.

    There will always be more demand for Nvidia imo. But given how little stock both companies provide out of the gate, I'm also assuming it will be hard to get one.


  • Moderators Posts: 5,558 ✭✭✭Azza


    So from what I'm hearing around the web, AMD have at least a competitor to the 3080.
    Their 6700 will be their competitor to the 3070, but it won't be available this year.
    Rasterization performance will be around the same as, or within spitting distance of, Ampere.
    Ray tracing performance is a bit of an unknown, but the speculation is that it's not as good as Ampere's but better than Turing's.
    They will have more VRAM than the 3070/3080 cards, with 16GB.
    Much better power consumption than Ampere.
    The reference cooler will no longer be a blower design.
    They will again undercut Nvidia on price.


  • Registered Users Posts: 1,871 ✭✭✭Simi


    It won't matter if it has more vram, better power consumption and a lower price. Unless it solidly beats the equivalent Nvidia card and has zero driver issues from day one, people simply aren't going to pick an AMD card over an Nvidia one.

    It's not good enough to get within 10% of or trade blows with the 3080. The AMD equivalent has to be unquestionably better in every way to stand even a chance of stealing some market share.


  • Registered Users Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    Simi wrote: »
    It won't matter if it has more vram, better power consumption and a lower price. Unless it solidly beats the equivalent Nvidia card and has zero driver issues from day one, people simply aren't going to pick an AMD card over an Nvidia one.

    It's not good enough to get within 10% of or trade blows with the 3080. The AMD equivalent has to be unquestionably better in every way to stand even a chance of stealing some market share.

    As much as I agree with the above, it simply won't be, sadly.

    It'll be as good, if not slightly better, at 1440p
    Worse at 4K
    RT performance will cause a bigger hit than on Nvidia cards
    No answer to DLSS
    Cheaper-made reference and AIB designs
    Driver issues, especially on launch

    This won't be Radeon's Zen moment unless they massively undercut on cost - which I don't think they can afford to do, and which would also be easily countered.


  • Moderators Posts: 5,558 ✭✭✭Azza


    Simi wrote: »
    It won't matter if it has more vram, better power consumption and a lower price. Unless it solidly beats the equivalent Nvidia card and has zero driver issues from day one, people simply aren't going to pick an AMD card over an Nvidia one.

    It's not good enough to get within 10% of or trade blows with the 3080. The AMD equivalent has to be unquestionably better in every way to stand even a chance of stealing some market share.

    The same sources I was looking at believe three cards are coming: the 6700, 6800 and 6900.

    If AMD want to win on rasterization they can do so, but only by upping power consumption well outside the chip's sweet spot. I suspect that will be the 6900 card.

    Looking at Hardware Unboxed's reviews of the 3080, they show that in terms of ray tracing performance the improvements are pretty marginal relative to Turing, so if AMD do land in between Turing and Ampere you are not going to notice a difference.

    VRAM may prove a factor down the road.

    As for DLSS, the same sources say they are working on a counter-solution, but there's absolutely no ETA on it.


  • Registered Users Posts: 4,573 ✭✭✭Infini


    Simi wrote: »
    It won't matter if it has more vram, better power consumption and a lower price. Unless it solidly beats the equivalent Nvidia card and has zero driver issues from day one, people simply aren't going to pick an AMD card over an Nvidia one.

    It's not good enough to get within 10% of or trade blows with the 3080. The AMD equivalent has to be unquestionably better in every way to stand even a chance of stealing some market share.

    In all honesty though, I've an AMD rig as well and there's not a bother out of it for me. As it stands AMD isn't that bad; they are improving, and if anything it's because of them that Nvidia aren't gouging as much on prices this time around.

    I'd wait and see what happens, but AMD might be in better shape this time around.


  • Moderators Posts: 5,558 ✭✭✭Azza


    I've had a Vega 56 and a 5700XT, and driver-wise I never had an issue with the Vega 56.
    On the 5700XT I got the black screen issue once, and had an issue with micro stutter introduced by a GPU driver update in Sniper Elite 4. I ended up doing 4-5 driver rollbacks and updates before it was fixed, as each rollback/update would temporarily fix the issue before it came back. Weirdly, it resolved itself when I went back to the driver that first introduced the issue. Not sure what happened; it was annoying at the time. Overall I wouldn't say AMD driver support has been bad, nor has it been perfect, but I think their reputation for poor drivers is a little exaggerated, or a little out of date.


  • Registered Users Posts: 6,947 ✭✭✭circadian


    I'm definitely curious to see what RDNA2 can do, but I'm waiting until the next generation before upgrading anyway. It seems AMD have taken the same approach as Zen by creating a more modular architecture and focusing on different characteristics each generation. I suspect RDNA3 and 4 will be a very mature GPU offering that could well be able to tackle Nvidia at the high end.

