
AMD RX 4xx Discussion Thread


Comments

  • Registered Users Posts: 1,634 ✭✭✭Gehad_JoyRider


    Redfox25 wrote: »
    Are we still looking at q2 ish 2017 for Vega to launch?




    Yep September for me :D


  • Registered Users Posts: 1,965 ✭✭✭jonerkinsella


    Fast forward to about 15 mins.


  • Registered Users Posts: 3,878 ✭✭✭Robert ninja


    Well, Deus Ex just reset my colour settings. Went to AMD Settings and what do you know... they removed colour management from the AMD settings. Left with that god-awful Windows colour wizard thing (CCC.exe). Another feature gone. My screen looks like shyte.


  • Registered Users Posts: 1,115 ✭✭✭chrismon


    How are ye finding the cards now that they are out a few months?


  • Registered Users Posts: 577 ✭✭✭Justwinginit


    Just ordered an MSI Gaming X 8GB for £249 off amazon...Wahoo!


  • Registered Users Posts: 86 ✭✭ReZonator


    chrismon wrote: »
    How are ye finding the cards now that they are out a few months?

    I picked up a Powercolor RX 470 Red Devil 4GB for about €200. Couldn't wait till Black Friday. Overclocked out of the box, and I believe overclocked memory makes a difference. The card is a bit weird as the logo is upside down, and it was supposed to have a demonic symbol on the backplate but it's blank. I had no problems with the install, but it was a tight squeeze with the internal USB 3 header. Once installed it looks nice really, not that I really care as it's out of sight. I had an issue with the DVI as it would work with my TV but stopped working with my monitor. I got a DisplayPort cable cheap so no big deal.

    Have a 4K telly so HDMI 2.0 is what I needed. The hands-down best feature of the card is that the fans idle until temps get high. It makes the PC very quiet, and I like films so it's a big plus. Also, it doesn't pick up dust in the blades and cooler as I only game now and then. Witcher 3 and GTA V are smooth as butter. I really want FreeSync now, but I can live with Vsync for another while yet. I wanted the 8GB 480, but after looking at benchmarks I couldn't justify another €100. I don't regret it, but if I spent more time gaming maybe I would! I would recommend this card to anyone who games at 1080p 60Hz. Bargain really. Thumbs up to AMD for the recent drivers; they fixed my issue with 1080p on my 4K TV. Had to use 1200p until the last update. Could be just me. :confused::cool:


  • Registered Users Posts: 1,115 ✭✭✭chrismon


    ReZonator wrote: »
    I picked up a Powercolor RX 470 Red Devil 4GB for about €200. Couldn't wait till Black Friday. Overclocked out of the box, and I believe overclocked memory makes a difference. The card is a bit weird as the logo is upside down, and it was supposed to have a demonic symbol on the backplate but it's blank. I had no problems with the install, but it was a tight squeeze with the internal USB 3 header. Once installed it looks nice really, not that I really care as it's out of sight. I had an issue with the DVI as it would work with my TV but stopped working with my monitor. I got a DisplayPort cable cheap so no big deal.

    Have a 4K telly so HDMI 2.0 is what I needed. The hands-down best feature of the card is that the fans idle until temps get high. It makes the PC very quiet, and I like films so it's a big plus. Also, it doesn't pick up dust in the blades and cooler as I only game now and then. Witcher 3 and GTA V are smooth as butter. I really want FreeSync now, but I can live with Vsync for another while yet. I wanted the 8GB 480, but after looking at benchmarks I couldn't justify another €100. I don't regret it, but if I spent more time gaming maybe I would! I would recommend this card to anyone who games at 1080p 60Hz. Bargain really. Thumbs up to AMD for the recent drivers; they fixed my issue with 1080p on my 4K TV. Had to use 1200p until the last update. Could be just me. :confused::cool:

    Thanks for that!
    In need of a mini upgrade very soon.


  • Registered Users Posts: 940 ✭✭✭GHOST MGG


    I'm amazed at the Sapphire Fury Nitro I installed in my son's machine... been comparing it to my Gigabyte 1070 Gaming and to be honest the 1070 isn't as far ahead of it in the games I've tested.. very impressed!


  • Registered Users Posts: 13,986 ✭✭✭✭Cuddlesworth


    GHOST MGG wrote: »
    I'm amazed at the Sapphire Fury Nitro I installed in my son's machine... been comparing it to my Gigabyte 1070 Gaming and to be honest the 1070 isn't as far ahead of it in the games I've tested.. very impressed!

    AMD cards tend to improve their performance over time while Nvidia's stay relatively static. This is usually because Nvidia's drivers are on point at launch, whereas AMD is always playing catch-up.


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    AMD cards tend to improve their performance over time while Nvidia's stay relatively static. This is usually because Nvidia's drivers are on point at launch, whereas AMD is always playing catch-up.

    That is true. My old R9 290 (overclocked) beats two highly clocked GTX 970s by a fair margin, as was seen in benchmark score threads on here, but I don't think that was the way when they were first launched.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    GHOST MGG wrote: »
    I'm amazed at the Sapphire Fury Nitro I installed in my son's machine... been comparing it to my Gigabyte 1070 Gaming and to be honest the 1070 isn't as far ahead of it in the games I've tested.. very impressed!

    Out of curiosity, what games and settings are you benching? The RX480 is a great card and AMD longevity tends to be much better than Nvidia's, but a GTX1070 is head and shoulders above an RX480....

    Many people actually hit a processor ceiling depending on settings and resolution, and mistakenly think cards are closer than they actually are for that reason.

    E.g. if you had an i5-4670 with an RX480 and an i5-4670 with a GTX1070, both would give similar frames in CPU-bound games like Battlefield 1.

    But replace with an i7-6700 and the GTX1070 takes a massive lead as the bottleneck is removed.

    In other games which aren't massively CPU bottlenecked, run everything at ultra with AA and supersampling applied, whatever you can to put the GPU under immense strain, and the difference will also show itself in avg and min frames.
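That bottleneck argument can be put into a toy model (illustrative only; the FPS figures below are made up for the example, not benchmark results):

```python
# Toy model: delivered FPS is capped by the slower of the CPU and the GPU.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers for a CPU-bound title: a slower CPU
# hides the gap between a weaker and a stronger GPU entirely...
assert delivered_fps(cpu_fps=80, gpu_fps=100) == delivered_fps(cpu_fps=80, gpu_fps=140)

# ...while removing the CPU bottleneck lets the stronger GPU pull ahead.
assert delivered_fps(cpu_fps=150, gpu_fps=100) < delivered_fps(cpu_fps=150, gpu_fps=140)
```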


  • Registered Users Posts: 940 ✭✭✭GHOST MGG


    I was testing on my own machine: i7-6700K, 16GB RAM, Gigabyte Gaming 1070, on an Acer Predator XB281HK 28-inch screen.
    Did one raft of tests at 4K (3840x2160), then switched out cards and repeated. (Now that was a pain in the ass with cleaning drivers etc.)

    I tested Fallout 4, Total War: Warhammer, Doom (2016) and The Witcher 3: Wild Hunt with all the game settings set as high as the options allowed.
    RESULTS
    Average FPS in Warhammer was 39 for the 1070 and 32 for the Nitro
    Average FPS in Doom was 58 for the 1070 and 59 for the Nitro
    The Witcher was 42 for the 1070 and 34 for the Nitro
    Fallout 4: the 1070 hit 49 FPS and the Nitro 31
    So to me, all in all, it wasn't as far behind on those titles as I had imagined.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    Well, to be fair, that's a whopping 60% faster in Fallout 4 and 25% in Witcher 3... curious about the Doom results though - is that using Vulkan? It's one particular title where AMD does actually excel in general, along with Hitman and I think Tomb Raider.

    I bought that same monitor actually, still using my older 27" 1440p Predator; was humming and hawing about going from 144Hz to 60Hz but I figure it'll be worth it for 4K... hopefully.
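Those percentages can be sanity-checked straight from the averages posted above (just a quick sketch):

```python
# Relative lead of the 1070 over the Fury Nitro, per title,
# using the average FPS pairs (1070, Nitro) posted above.
results = {
    "Total War: Warhammer": (39, 32),
    "Doom (2016)": (58, 59),
    "The Witcher 3": (42, 34),
    "Fallout 4": (49, 31),
}

for title, (gtx1070, nitro) in results.items():
    lead_pct = (gtx1070 / nitro - 1) * 100
    print(f"{title}: 1070 leads by {lead_pct:+.0f}%")

# Fallout 4 comes out around +58% and The Witcher 3 around +24%,
# so the "60%" and "25%" figures are fair rounding; Doom is a dead heat.
```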


  • Registered Users Posts: 13,986 ✭✭✭✭Cuddlesworth


    Well, to be fair, that's a whopping 60% faster in Fallout 4 and 25% in Witcher 3... curious about the Doom results though - is that using Vulkan? It's one particular title where AMD does actually excel in general, along with Hitman and I think Tomb Raider.

    I bought that same monitor actually, still using my older 27" 1440p Predator; was humming and hawing about going from 144Hz to 60Hz but I figure it'll be worth it for 4K... hopefully.

    At 4K the Fury is VRAM-limited in a lot of games. But it was released against the 980 Ti, not the 1070, and neither does 4K well. I think the 1070 and the 1080 were the first true 4K cards, and AMD has no competitor to them at the moment.

    Here is a decent example of the AMD v Nvidia performance fight. 8 months ago, the 1060 was the stronger card, with the AMD card making a good show in DX12 titles. Well, now the 480 is neck and neck with it and beats it in most new titles. But because of how head-to-head releases work, the numbers that are there at review time remain for the lives of the cards. For Nvidia, this is a great thing. For AMD, it's really bad business.

    In 2 years' time? I would guess the 1060 will be at least 10-20% behind in new titles, based on previous head-to-heads retested years later.


  • Registered Users Posts: 940 ✭✭✭GHOST MGG


    GDDR5 stores unused textures in RAM. HBM can change and replace on demand due to the bandwidth and tech. Previously you added more GDDR5 to increase bandwidth, but with HBM you can use memory to swap textures back and forth on demand. In simple terms, 4GB HBM equals roughly 6GB GDDR5. Previously you added more RAM, but now you can optimise how the driver and game handle textures. How this changes with HBM2 is unknown though.


  • Registered Users Posts: 13,986 ✭✭✭✭Cuddlesworth


    GHOST MGG wrote: »
    GDDR5 stores unused textures in RAM. HBM can change and replace on demand due to the bandwidth and tech. Previously you added more GDDR5 to increase bandwidth, but with HBM you can use memory to swap textures back and forth on demand. In simple terms, 4GB HBM equals roughly 6GB GDDR5. Previously you added more RAM, but now you can optimise how the driver and game handle textures. How this changes with HBM2 is unknown though.

    Sorry, but that's false info dumped around before the launch of HBM. It's still 4GB of RAM, with more bandwidth to the core and a bit more to the system memory. AMD can control how and what is prefetched through drivers, but it does not turn 4GB of VRAM into 6GB of VRAM. Compression might, but Nvidia are actually better at that.

    Stolen from reddit; tl;dr is that the Fury hits the 4GB cap quite often, with low framerates as it struggles to get textures loaded. Overall it can still do 4K though.

    Once again I repost my findings, backed with benchmarks:
    http://www.gamersnexus.net/hwreviews/2438-nvidia-geforce-gtx-1080-founders-edition-review-and-fps-benchmark/page-5
    - Black Ops 3 / High, FXAA, Ultra HD: "The Fury X hits its 4GB VRAM limitation, though, and suffers for it. 0.1% low frametimes get knocked down to just 13.3FPS, creating visible frame “drops” when playing. This is worse than AMD's own R9 390X, leveraging its 8GB GDDR5 to keep above board." http://media.gamersnexus.net/images/media/2016/gpu/gtx-1080/gtx-1080-benchmark-blops3-4k.png
    - Mirror's Edge Catalyst / Ultra, 1440p: http://media.gamersnexus.net/images/media/2016/gpu/gtx-1060/gtx-1060-bench-mec-1440.png
    - Rainbow Six Siege / Ultra HD, TAA (ultra texture pack): "Rainbow Six Siege was tested using the 6GB ultra textures pack and it appears as though these high resolution textures are hampering the performance of the 4GB HBM cards. This is made evident by the fact that 8GB 390X and 390 cards don't suffer the same fate as the Fury range." http://www.techspot.com/articles-info/1174/bench/Siege_02.png
    - Shadow of Mordor / Ultra, Ultra HD: http://media.gamersnexus.net/images/media/2016/gpu/gtx-1060/gtx-1060-oc-mordor-4k.png
    - GTA V: "At 4K – Very High / Ultra, GTA V again reveals the Fury X's capacity limitations by knocking the 0.1% low performance to half the R9 390X" http://media.gamersnexus.net/images/media/2016/gpu/gtx-1080/gtx-1080-benchmark-gta-4k.png
    - Battlefield 1 alpha / VHQ, 2160p: http://gamegpu.com/images/stories/Test_GPU/Action/Battlefield_1_Alpha/bf1_3840.png "It is also worth noting that in 4K for AMD (Fiji) there were some problems with the lack of memory, which as you can see has led to a substantial decrease in the FPS"
    Fixed issues:
    - ROTS with Very High textures was maxing out the VRAM at 2160p, but this was addressed with their latest patch (it took 5 months). http://www.hardocp.com/images/articles/1455189919EDyKUcGV8E_9_4.gif "Frankly put, the limitation of 4GB of VRAM is a problem at maximum game settings. The game stuttered badly and CrossFire dropped off many times. The crash occurred because the game got so stuttery the game just gave up. AMD Radeon R9 390X CrossFire holds its own quite well at 4K, it scales well and its 8GB of VRAM proves to be an advantage."
    - Dying Light in Ultra HD back in July 2015: "In this graph we have turned everything on at max settings at 4K. The GTX 980 Ti is 24% faster. What's more, see that dip in the blue line at the start of the run-through on the AMD Radeon R9 Fury X? That is a dip created by the game skipping or pausing or stuttering due to VRAM capacity bottleneck. The GTX 980 Ti and TITAN X never did this at any time during testing." http://www.hardocp.com/images/articles/1437535126iwTl74Zfm5_5_3.gif
    Keep in mind that:
    - This might not be game-breaking, but you could get slowdowns when loading new assets between levels.
    - VRAM allocation can increase massively in more demanding parts of a game, so a single benchmark will not represent the whole picture (Doom's 'Nightmare' textures at 2160p).
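For what it's worth, the capacity-vs-bandwidth distinction can be made concrete with the usual peak-bandwidth formula (bus width in bits / 8 × effective per-pin data rate). The card specs below are the commonly published ones for the Fury X and R9 390X, so treat the figures as approximate:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * data_rate_gbps

# Fury X: 4GB HBM on a 4096-bit bus at roughly 1 Gbps effective per pin.
fury_x = peak_bandwidth_gbs(4096, 1.0)   # 512 GB/s
# R9 390X: 8GB GDDR5 on a 512-bit bus at roughly 6 Gbps effective per pin.
r9_390x = peak_bandwidth_gbs(512, 6.0)   # 384 GB/s

# HBM wins on bandwidth, but capacity is still just 4GB, which is
# exactly why the Fury hits its VRAM cap in the benchmarks above.
print(fury_x, r9_390x)
```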


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    I'm starting to wonder what would happen if sites stopped testing at Ultra, and test at High instead.

    Seems Nvidia GameWorks titles introduce lots of unnecessary tessellation, which slows down AMD cards badly.


  • Registered Users Posts: 13,986 ✭✭✭✭Cuddlesworth


    K.O.Kiki wrote: »
    I'm starting to wonder what would happen if sites stopped testing at Ultra, and test at High instead.

    Seems Nvidia GameWorks titles introduce lots of unnecessary tessellation, which slows down AMD cards badly.

    For older cards, they would do better. For the new 4x0 range, no difference, since they are quite good at tessellation.

    I'm an AMD fanboy (because I dislike Nvidia) and I like Adored. But my god does that guy have a bias.

    AMD went heavily for async compute and greater access to the hardware through low-level APIs, which make things harder for developers. Nvidia went for raw horsepower while moving towards lower power usage, great low-CPU-overhead drivers, and creating tools that made developers' jobs easier and played to their strengths in terms of their hardware.

    It's only now that the industry is heading down AMD's path, but DX12 is not the answer we think it is. At the moment it's not being implemented fully; developers are picking and choosing parts of it. When they start to fully implement it, you are going to have severe swings in either direction, unless developers pretty much develop two games, one for each brand.


  • Registered Users Posts: 2,125 ✭✭✭game4it70


    So we should hopefully get some decent info at CES, although if it's like the Zen preview we may get very little.
    http://ve.ga/


  • Registered Users Posts: 1,634 ✭✭✭Gehad_JoyRider


    Well, to be fair, that's a whopping 60% faster in Fallout 4 and 25% in Witcher 3... curious about the Doom results though - is that using Vulkan? It's one particular title where AMD does actually excel in general, along with Hitman and I think Tomb Raider.

    I bought that same monitor actually, still using my older 27" 1440p Predator; was humming and hawing about going from 144Hz to 60Hz but I figure it'll be worth it for 4K... hopefully.



    The RX480 in The Division is now faster than the 1060 in DX12.

