
AMD Zen Discussion Thread


Comments

  • Registered Users Posts: 5,574 ✭✭✭EoinHef


    Inevitably one will end up on Adverts and be double the price of a standard one on there :P


  • Registered Users Posts: 940 ✭✭✭GHOST MGG


    Why would they improve clock speeds on something that's about to be replaced in a few months anyway?

    I presumed, with the pricing of the card being more expensive than the RRP of the normal 2700X, they would have to add value to it.


  • Closed Accounts Posts: 7,683 ✭✭✭Subcomandante Marcos


    GHOST MGG wrote: »
    I presumed with the pricing of the card which is more expensive than the rrp of the normal 2700x they would have to add value to it

    It's a collector's item, not a general release. A collector's item released just before the general release of a new architecture. It would make no sense to improve it. Honestly, it makes little sense to release it at all.


  • Site Banned Posts: 328 ✭✭ogsjw


    I'm looking at building a workstation for 3D. Would it make sense to put in a 1700 for the time being, overclock it, and replace it with a better Zen chip down the road? I've seen some stuff about potential RAM incompatibility, but that should be more mobo related, right?

    Unless you're helping out Dneg with simming for Episode IX shots, you're never gonna feel limited by a Ryzen 1700 haha. There were some issues with Ryzen gen 1 and RAM, so do your homework there for sure.


  • Registered Users Posts: 56 ✭✭WolfSpinach


    ogsjw wrote: »
    Unless you're helping out Dneg with simming for Episode IX shots, you're never gonna feel limited by a Ryzen 1700 haha. There were some issues with Ryzen gen 1 and RAM, so do your homework there for sure.

    Thanks, yeah, I actually put it together last weekend. Seems to be a lovely little machine, very quiet and feels good. I put a Dark Rock TF in to overclock but it's still dead quiet. If only my old i7 4790 could meet with a nasty accident...


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki




  • Registered Users Posts: 1,307 ✭✭✭Xenoronin


    K.O.Kiki wrote: »

    I'm alright with this. Let the fans have their collectables and don't bother with pre-binning. We already know there isn't a lot of headroom on Zen anyway, so any binned chips would be underwhelming too. I hope the employees get something nice on an occasion like this.


  • Moderators, Society & Culture Moderators Posts: 15,750 Mod ✭✭✭✭smacl


    Anyone heard anything about the next generation threadrippers? I've heard speculation that they'll also double core counts but have seen nothing much beyond that. Putting together a build in a couple of weeks and was thinking 2700x with a potential to move to 3900x but now wondering about starting at TR1920 and upgrading in a year to newer threadripper.


  • Registered Users Posts: 13,984 ✭✭✭✭Cuddlesworth


    smacl wrote: »
    Anyone heard anything about the next generation threadrippers? I've heard speculation that they'll also double core counts but have seen nothing much beyond that. Putting together a build in a couple of weeks and was thinking 2700x with a potential to move to 3900x but now wondering about starting at TR1920 and upgrading in a year to newer threadripper.

    Last two Threadripper releases were mid-August. With Zen 2 most likely getting a 16-core version, it seems likely the new workstation platform will scale to 64 cores like the EPYC chips announced. Although they might just release a lower core count version until Intel comes close to parity, similar to what Nvidia is doing with AMD.

    I'd really have to ask though, why bother? The AM4 16-core release is going to cover 99% of multithreaded scenarios, meanwhile 8 cores is still overkill for most systems.


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    Last two Threadripper releases were mid-August. With Zen 2 most likely getting a 16-core version, it seems likely the new workstation platform will scale to 64 cores like the EPYC chips announced. Although they might just release a lower core count version until Intel comes close to parity, similar to what Nvidia is doing with AMD.

    I'd really have to ask though, why bother? The AM4 16-core release is going to cover 99% of multithreaded scenarios, meanwhile 8 cores is still overkill for most systems.

    I agree for gaming, but Threadripper has its purpose with more PCIe lanes and quad-channel memory, if you need them. As I said though, it is pointless buying a Threadripper system just for gaming.


  • Moderators, Society & Culture Moderators Posts: 15,750 Mod ✭✭✭✭smacl


    I'd really have to say though, why bother. The AM4 16 core release is going to cover 99% of multithreaded scenarios, meanwhile 8 core is still overkill for most systems.

    Depends very much on the application. The performance of those I develop and quite a few of those I use (e.g. compiling large C++ projects, meshing huge models, rectifying large photography sets) is largely dependent on the number of available threads and having enough RAM to give those threads sufficient resources. You don't see it with big data in games so much, as the data has already been meshed and organised in a spatially efficient manner, but how much welly do you reckon it would take to mesh a model like this so you could drop it into a game engine like Unity?
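
The pattern smacl describes, independent chunks of work fanned out across however many threads the CPU offers, can be sketched like this (an illustrative Python sketch only, not anyone's actual code; `mesh_chunk` is a made-up stand-in for the real per-chunk work):

```python
# Illustrative only: the "split the model into independent chunks and
# fan them out to workers" shape that lets meshing/rectification scale
# with core count. mesh_chunk is a toy stand-in for the real work.
from concurrent.futures import ThreadPoolExecutor

def mesh_chunk(points):
    # Pretend per-chunk meshing work.
    return sum(x * x for x in points)

def mesh_model(points, workers=8):
    # Roughly one chunk per worker. For CPU-bound Python you'd use
    # processes instead (the GIL serialises threads); in C++ these
    # would be real threads.
    size = max(1, len(points) // workers)
    chunks = [points[i:i + size] for i in range(0, len(points), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(mesh_chunk, chunks))
```

Throughput climbs with worker count only while each worker has enough RAM to itself, which is exactly the constraint smacl mentions.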


  • Closed Accounts Posts: 7,683 ✭✭✭Subcomandante Marcos


    smacl wrote: »
    Depends very much on the application. The performance of those I develop and quite a few of those I use (e.g. compiling large C++ projects, meshing huge models, rectifying large photography sets) is largely dependent on number of available threads and having enough RAM to allow those threads sufficient resources. You don't see it with big data in games so much as the data has already been meshed and organised in a spatially efficient manner, but how much welly do you reckon it would take to mesh a model like this so you could drop it into a gaming engine like unity?



    I had a chance to use a second-gen Threadripper paired with a high-end Quadro as my work computer last year, generating large, complex 3D GIS models and compiling modelling code. The difference between it and the standard Xeon workstations I'd been used to was phenomenal: hours of wasted time wiped out per model.


  • Registered Users Posts: 13,984 ✭✭✭✭Cuddlesworth


    smacl wrote: »
    Anyone heard anything about the next generation threadrippers? I've heard speculation that they'll also double core counts but have seen nothing much beyond that. Putting together a build in a couple of weeks and was thinking 2700x with a potential to move to 3900x but now wondering about starting at TR1920 and upgrading in a year to newer threadripper.
    smacl wrote: »
    Depends very much on the application. The performance of those I develop and quite a few of those I use (e.g. compiling large C++ projects, meshing huge models, rectifying large photography sets) is largely dependent on number of available threads and having enough RAM to allow those threads sufficient resources. You don't see it with big data in games so much as the data has already been meshed and organised in a spatially efficient manner, but how much welly do you reckon it would take to mesh a model like this so you could drop it into a gaming engine like unity?

    If your workload scales linearly with core count (which is unusual) then why are you asking the question in the first place? AM4 will never scale past 16 cores, TR4 will. And the price difference between the 16-core Threadripper and the 16-core AM4 system will probably be in the motherboard.
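
Cuddlesworth's aside about linear scaling being unusual is just Amdahl's law: whatever fraction of the job stays serial caps the speedup, no matter how many cores you buy. A quick sketch:

```python
def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: the serial remainder of the job limits the speedup
    # that extra cores can deliver.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even a 95%-parallel workload tops out well short of its core count:
# 16 cores gives ~9.1x, 32 cores only ~12.5x.
```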


  • Moderators, Society & Culture Moderators Posts: 15,750 Mod ✭✭✭✭smacl


    If your workload scales linearly with core count (which is unusual) then why are you asking the question in the first place? AM4 will never scale past 16 cores, TR4 will. And the price difference between the 16-core Threadripper and the 16-core AM4 system will probably be in the motherboard.

    TBH, I'd looked at threadripper builds when the chip first came out and they worked out more expensive than I was willing to pay. I had a look at K.O.Kiki's post on another high-end build thread the other day and realised this has since changed dramatically and will most probably go that way. Any idea if newer generation TR4 chips will also be drop in replacements for current gen?


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    AMD have made no promises (unlike with socket AM4) but I'm going to assume "yes" - unless there's some incompatibilities, e.g. DDR5, chipset limitation, or insufficient VRMs.


  • Registered Users Posts: 6,948 ✭✭✭circadian


    So there's been plenty of leaks on the 3rd gen Ryzen. If the info is accurate then I see this as being another solid lineup.

    Navi, however, remains to be seen. I'm close to pulling the trigger on a build but I'd love to see if AMD can touch the 2080/ti and at what price.


  • Moderators, Society & Culture Moderators Posts: 15,750 Mod ✭✭✭✭smacl


    If your workload scales linearly with core count (which is unusual) then why are you asking the question in the first place? AM4 will never scale past 16 cores, TR4 will. And the price difference between the 16-core Threadripper and the 16-core AM4 system will probably be in the motherboard.

    Built my TR4 box today, with many thanks for the advice from those in the know here. Working out of the country next week, so it'll be the following week before I get to put it through its paces. Love the TR4 chip placement method, zero stress.


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    circadian wrote: »
    So there's been plenty of leaks on the 3rd gen Ryzen. If the info is accurate then I see this as being another solid lineup.

    Navi, however, remains to be seen. I'm close to pulling the trigger on a build but I'd love to see if AMD can touch the 2080/ti and at what price.

    Just get a Vega 56 or 64, or if you're feeling rich, a VII.

    I have been using AMD for a few years now, R9 290, R9 Fury X and now a Vega 56, and the games run brilliantly. I wouldn't get too caught up in reviewers' benchmarks as you do not notice the difference.

    Sure, AMD once set up a blind test with a Zen CPU and AMD card against an Intel/Nvidia combo, and nobody could tell the difference when they played the games.


  • Registered Users Posts: 6,948 ✭✭✭circadian


    Just get a Vega 56 or 64, or if you're feeling rich, a VII.

    I have been using AMD for a few years now, R9 290, R9 Fury X and now a Vega 56, and the games run brilliantly. I wouldn't get too caught up in reviewers' benchmarks as you do not notice the difference.

    Sure, AMD once set up a blind test with a Zen CPU and AMD card against an Intel/Nvidia combo, and nobody could tell the difference when they played the games.

    That's a fair point. I really want the VII but I'm playing on a 4K monitor. I doubt the difference between a Radeon VII and a 2080 is noticeable, but the noise and power draw would be (building in an SG13).


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    circadian wrote: »
    That's a fair point. I really want the vii but I'm playing on a 4k monitor. I doubt the difference between an R vii and 2080 is noticeable but the noise and power draw would be (building in an SG13).

    I game on a 4K TV with a Vega 56. I do not hear any noise. Fans are running at around 1300-1600 rpm at around 55-70 degrees, with an undervolt on the core and a slight voltage bump and overclock on the HBM.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    circadian wrote: »
    That's a fair point. I really want the vii but I'm playing on a 4k monitor. I doubt the difference between an R vii and 2080 is noticeable but the noise and power draw would be (building in an SG13).

    The RTX 2080 may have a lower TDP but it uses slightly more power at peak.
    If you're sensible with your CPU choice, neither would require more than a 450W PSU though.
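
K.O.Kiki's sizing logic, add up the peak draws and leave some headroom, can be sketched like this (all wattages below are illustrative placeholders, not measured figures for any particular card or chip):

```python
def recommended_psu(component_watts, headroom=0.30):
    # Sum the peak draws and pad ~30% for transients and PSU ageing.
    peak = sum(component_watts.values())
    return peak * (1.0 + headroom)

# Placeholder numbers for a sensible mid-range build:
build = {"gpu": 230, "cpu": 65, "board_and_drives": 45}
# recommended_psu(build) comes to about 442 W, inside a 450 W unit.
```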


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    If you're playing on a 60hz screen I think the VII is fine, but for me personally, on a 144hz display it's far too inconsistent versus the 2080 to consider it from a purely gaming POV.

    In some games it performs well enough, but in others it's worse than even the 2070. I think if it was cheaper it'd make more sense (or obviously if you've other uses for the card outside of gaming, where the 16GB of HBM2 can shine).


  • Registered Users Posts: 6,948 ✭✭✭circadian


    Yeah, it's a 60Hz screen with FreeSync so that's swinging me towards the Radeon VII; I think if I overclocked it a bit I'd get the performance I'm looking for. Not to mention a lot of reviews are based on games that aren't optimised very well, and the visual difference between Ultra/High settings is negligible.

    I'm definitely getting a 2600x though, in terms of bang for buck I don't see any other processor coming close.


  • Closed Accounts Posts: 7,683 ✭✭✭Subcomandante Marcos


    circadian wrote: »
    Yeah, it's a 60Hz screen with FreeSync so that's swinging me towards the Radeon VII; I think if I overclocked it a bit I'd get the performance I'm looking for. Not to mention a lot of reviews are based on games that aren't optimised very well, and the visual difference between Ultra/High settings is negligible.

    I'm definitely getting a 2600x though, in terms of bang for buck I don't see any other processor coming close.

    If you're overclocking then you may as well get the non-X 2600 and OC that, same performance as the X for less money with a bit of fiddling.


  • Registered Users Posts: 6,948 ✭✭✭circadian


    If you're overclocking then you may as well get the non-X 2600 and OC that, same performance as the X for less money with a bit of fiddling.




    I'd agree but this is going into a Sugo 13 so I'd like to keep cpu to stock in terms of cooling.


  • Registered Users Posts: 7,636 ✭✭✭Inviere


    circadian wrote: »
    I'd agree but this is going into a Sugo 13 so I'd like to keep cpu to stock in terms of cooling.

    Sweet little SFF cases those, not much room for cooling though. I think the general consensus is to use blower-style GPU cooling for cases that small, so the heat is actually exhausted from the case rather than left to sit inside it. My first SFF build was in a Sugo 13, and I'd a 1070 blower in it which ran quite happily. I moved to a Core V1 from the Sugo though, for a bit of extra flexibility.


  • Registered Users Posts: 6,948 ✭✭✭circadian


    Inviere wrote: »
    Sweet little sff cases those, not much room for cooling though. I think the general consensus is to use blower style gpu cooling for cases that small, so the heat is actually exhausted from the case rather than let sit inside the case. My first sff build was in a Sugo 13, and I'd a 1070 blower in it which ran quite happy. I moved to a Core V1 from the Sugo though, for a bit of extra flexibility.

    Yeah, there's no blower-style Radeon VII that I can find, at least not one that'll fit in the case. Noise is a problem with them too.


  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    Inviere wrote: »
    Sweet little sff cases those, not much room for cooling though. I think the general consensus is to use blower style gpu cooling for cases that small, so the heat is actually exhausted from the case rather than let sit inside the case. My first sff build was in a Sugo 13, and I'd a 1070 blower in it which ran quite happy. I moved to a Core V1 from the Sugo though, for a bit of extra flexibility.

    I'd never suggest a blower card.
    Better to get a GPU with a real heatsink and mount a Noctua/be quiet! fan somewhere to vent it.


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    K.O.Kiki wrote: »
    I'd never suggest a blower card.
    Better to get a GPU with a real heatsink & mount a Noctua/be quiet fan somewhere to vent it.

    I agree, a decent fan set to a faster speed to vent is best, and a decent one would still be quiet doing so.

    I've a switch on my case to set the CPU and exhaust fans to speed up when I game. Not needed, but I do so anyway.

    Also, you would hear a blower GPU fan far louder than any exhaust fan, even on an Nvidia card.


  • Registered Users Posts: 3,495 ✭✭✭Lu Tze


    Seems Gigabyte have released a BIOS enabling PCIe 4.0 on their B450 and X470 boards, ready for when the 3000 series is released.

