
Nvidia VS AMD

  • 28-05-2015 3:49pm
    #1
    Registered Users, Registered Users 2 Posts: 7,182 ✭✭✭


    My last 3 GPU purchases have been Nvidia (5 if you count PCs built for friends): a 560Ti and a staggered 670 SLI. Before that was the HD4870, and before that some 7xxx series Nvidia.

    I've always gone for price/performance over anything else. Always. I'd had my share of driver mishaps from both sides so I don't take that into consideration when buying.

    But lately I'm getting tired of Nvidia's attitude: everything is closed source. It's making me lean heavily towards AMD for my next GPU, and if the Zen cores are up to scratch then maybe a move away from Intel is in order as well.

    This Hairworks/Gameworks debacle on Project Cars and The Witcher 3 is really annoying me. The lack of decent driver updates for non-Maxwell cards too.

    Anyone else feel the same? Am I mental?


Comments

  • Registered Users, Registered Users 2 Posts: 1,143 ✭✭✭jumbobreakfast


    Not mental. I'm hoping that AMD can come up with something that trounces Nvidia GPUs and Intel CPUs too. It would be great for competition and a strong AMD offering would be good for Nvidia/Intel users.

    Nvidia disrupted consumers' buying decisions by announcing tech like G-Sync and VR SLI but then delaying the release. Some people bought two 980s because Nvidia suggested that VR SLI would be a feature, but they still haven't delivered.

    On the other hand, AMD have been very generous in opening up FreeSync and especially Mantle (MS apparently copy-pasted a whole chunk of it: http://s169.photobucket.com/user/Rocketrod6a/media/Mantle%20vs%20DX12%20pgm%20guide.jpg.html).
    Valve also gave special mentions to AMD when talking about the help they got for the Vive development.

    GPUs are expensive so even though I'd like to change to AMD I might be swung towards better game support in the end.


  • Registered Users, Registered Users 2 Posts: 11,397 ✭✭✭✭Digital Solitude


    Can't see them catching up in the CPU line, Intel are leagues ahead currently.

    Think the GPUs are personal preference tbh, I'd prefer AMD's cards with Intel CPUs myself. nVidia being a bit of a shower isn't helping.

    Hopefully we see some good things with the new series of GPUs anyway.


  • Registered Users, Registered Users 2 Posts: 85,193 ✭✭✭✭Overheal


    I prefer AMD over both Intel and Nvidia. Intel was found guilty of antitrust violations that basically served to illegitimately keep AMD's feet glued down for years while they raced ahead with their market lead. Meanwhile, Nvidia knew about the series of defective GPUs that plagued notebooks from multiple brands in the mid 2000s (including Dell, HP, and Apple) and still decided to push the product on to consumers. Even the Xbox 360 was not immune from this executive decision. I know you're familiar with the Red Ring of Death, GC - caused by poor manufacturing processes in the (ding ding ding) Nvidia GPU baked into the console.

    AMD meanwhile had a no-complaints track record in the GameCube, Wii, Wii U, Xbox One, PlayStation 4... well, you get the idea.

    AMD CPUs have always had a pretty solid price:performance ratio, and I've never had a complaint with the GPUs. I hear rave things from friends who game on AMD A8/A10-driven laptops; Dell, Apple and HP unsurprisingly favor AMD heavily over Nvidia when marketing multimedia/gaming laptops, the Mac Pro, MBP, etc.

    My current rig is 2x 5770s and a 1055T hexacore, built in 2009; I'm only now getting to the point where I might find a game or two in Steam Early Access to be relatively unplayable - and Star Citizen chugs a fair bit :pac:


  • Registered Users, Registered Users 2 Posts: 3,878 ✭✭✭Robert ninja


    I'm running Nvidia hardware but the next time I upgrade I'll be going AMD for sure. Almost entirely because of this.

    In short, Nvidia allows games to override your colour settings. Anyone else running a gaming TN panel for responsiveness? Or a badly calibrated out-of-the-box IPS/PLS, or hell, just about any monitor out of the box? You know how important calibration is so you don't end up with washed-out colours. Well, it's all out the window with Nvidia drivers unless you play borderless windowed, and you'd be surprised how many games either don't have that option or have problems with it.

    And this is an issue from 2011, still with no software-side override feature for user colour profile settings. Please tell me what the point of Nvidia Hairworks (which is meh imo) and all these other fancy trademark effects is when in the end you can't even get your colours to stay calibrated? Ridiculous. I've never really wanted to upgrade until The Witcher 3, which only hits 60fps at 720p on close to minimum settings. Why it performs so badly I don't know... yes it looks great, but I've played Metro, Crysis and The Witcher 2, which can all be comparable at times in graphics. Nvidia Gameworks for ya, there. Get the latest 500+ quid GPU or take your pick of either sub-30fps or sub-1080p. Sick of it, honestly. I don't think I'll be buying another game that has Gameworks.


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    You're not mental at all.

    I purchased a Sapphire R9 290 Tri-X last week, on the pure basis that I don't want an NVIDIA product in my machine, and I'm not providing support via my purchase.

    My last card was AMD, and before that NVIDIA. Typically I went with best bang-for-buck performance. But NVIDIA have shifted the line in the sand by creating proprietary code and libraries that AMD can't access to optimise against.

    Coupled with their false advertising of the 970, and their shady contracts and limitations on developers, I simply didn't want a 970 in my machine.

    I had initially done my research around January, when I put a new motherboard and Intel chip into my machine, deciding I'd buy a new GPU in the summer, and I was 100% settled on the 970. Recent issues had me shift dramatically, though, and the final click was seeing the card I bought, practically brand new, up for sale second hand at an incredible price.

    I can somewhat appreciate those defending NVIDIA by saying it happens in every industry, that you need to be innovative and develop your own stuff, and that open source is a dreamworld. That might well be the case in other industries, but this is a drastic change for the GPU market.

    It wasn't so long ago that NVIDIA and AMD provided code open source on their websites for devs to use, and it wasn't so long ago that AMD were wiping the floor with NVIDIA and actually reached out, providing them consultation hours with their engineers and assistance to bring their products up to spec.

    I'll be firmly team red for the next few years. The new-game issues don't bug me as I typically don't buy games until a year or so after release, and AMD are getting drivers out fast enough, considering they have to do critical optimisation AFTER a game releases, as opposed to the 90-day lead time NVIDIA give themselves with their shady contracts and closed libraries.


  • Registered Users, Registered Users 2 Posts: 1,143 ✭✭✭jumbobreakfast


    TheDoc wrote: »
    I'll be firmly team red for the next few years. The new-game issues don't bug me as I typically don't buy games until a year or so after release, and AMD are getting drivers out fast enough, considering they have to do critical optimisation AFTER a game releases, as opposed to the 90-day lead time NVIDIA give themselves with their shady contracts and closed libraries.

    That's the only thing keeping me with Nvidia. I just can't afford to spend a lot of money on an AMD card that might not work well with the latest games even if the reasons are a bit dodgy. This is an interesting graph showing the numbers of discrete graphics cards sold by Nvidia versus ATI/AMD since 2003:

    https://i.imgur.com/bNqJYgA.png The graph might be a bit confusing: the light blue line is AMD, dark blue is Nvidia. The boxes show when a particular card was released, but could point at either line to highlight the effect it had on either manufacturer's market share.

    AMD are under severe pressure to reverse that trend, I'm sure, or else they might just stop making discrete GPUs altogether, and then we are all screwed. VR is the main reason I might switch to AMD if they come up with something good; I'd like to support them.


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    TheDoc wrote: »
    You're not mental at all.

    I purchased a Sapphire R9 290 Tri-X last week, on the pure basis that I don't want an NVIDIA product in my machine, and I'm not providing support via my purchase.

    My last card was AMD, and before that NVIDIA. Typically I went with best bang-for-buck performance. But NVIDIA have shifted the line in the sand by creating proprietary code and libraries that AMD can't access to optimise against.

    Coupled with their false advertising of the 970, and their shady contracts and limitations on developers, I simply didn't want a 970 in my machine.

    I had initially done my research around January, when I put a new motherboard and Intel chip into my machine, deciding I'd buy a new GPU in the summer, and I was 100% settled on the 970. Recent issues had me shift dramatically, though, and the final click was seeing the card I bought, practically brand new, up for sale second hand at an incredible price.

    I can somewhat appreciate those defending NVIDIA by saying it happens in every industry, that you need to be innovative and develop your own stuff, and that open source is a dreamworld. That might well be the case in other industries, but this is a drastic change for the GPU market.

    It wasn't so long ago that NVIDIA and AMD provided code open source on their websites for devs to use, and it wasn't so long ago that AMD were wiping the floor with NVIDIA and actually reached out, providing them consultation hours with their engineers and assistance to bring their products up to spec.

    I'll be firmly team red for the next few years. The new-game issues don't bug me as I typically don't buy games until a year or so after release, and AMD are getting drivers out fast enough, considering they have to do critical optimisation AFTER a game releases, as opposed to the 90-day lead time NVIDIA give themselves with their shady contracts and closed libraries.


    I agree 100% with everything you said, but I'm 90% sure I'll be going Nvidia for my next card. AMD are just dropping the ball over and over again by being constantly late with drivers for new games, and their policy of two main driver updates per year, with beta drivers to fill in the gaps, is just not good enough.

    Nvidia's proprietary tech such as PhysX, G-Sync, VSR and Hairworks is better supported and overall just works better than AMD's versions. Rumors that AMD's R9 3xx line is mostly rebrands are not going to help dig them out of the hole they're currently in either.


  • Moderators, Science, Health & Environment Moderators Posts: 9,018 Mod ✭✭✭✭mewso


    I've been AMD for a long time regardless of bang for buck. It changes constantly so I prefer to be loyal. Unfortunately, having said that, if you were thinking about buying an AMD card soon I would wait for the 300 series to come out. Not to buy one, but to see how much the 290 cards drop by. From what I can tell the 300 series will just be a slightly boosted 200 series: nothing game-changing, so the price drop on the previous range is the one to look for imo. The big advantage Nvidia have at the moment is low power consumption. I have a decent PSU so it's not that big a deal for me, but if you don't have a good PSU then Nvidia really is the best option. Maybe the 300 series will address the power thing.

    *Edit - I don't get all the big new games that come out, but I thought AMD were quick enough with their GTA driver. How often have they been slow with drivers for new games?


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    In fairness, those NVIDIA proprietary libraries are available to developers who sign up to use them, under strict conditions of use.

    A few big developers have already signalled they most likely won't use NVIDIA libraries after seeing what happened with Cars and The Witcher, and will either use the already open source libraries or go with AMD's libraries, which are not closed off.

    I think some sympathy is due to AMD here, having been given no time to prepare optimisation drivers. How exactly are you supposed to optimise around tech and source code that you cannot see? It's pure trial and error. And then you need to factor in the customised setups of millions of users. It's not like you're doing it for a PS4, where every machine is the same.

    In my job at the moment a new system is coming in that will feed data into the existing system I'm responsible for. I've been involved in the process for months, getting all the information I need to ensure the interfaces are correctly set up and we have an optimum flow of data coming across our network etc.

    If that vendor didn't share the specifics of the data flow with me, they wouldn't have been signed up as a vendor, simples. I wouldn't be running around trying to optimise and design interfaces in the "hope" they work.

    So I've massive sympathy with AMD here. Closed libraries from a competitor were implemented into the game three months before launch, and no details were provided to AMD to let them optimise. They had to wait for release, get copies of the launch titles, and then run through massive trial and error to get things working.

    I kinda hope AMD keep their open source policy. Much of the gains in the GPU market have actually come from them, and their specifically designed libraries are far superior to NVIDIA's. Sure, that's where NVIDIA even got the idea, and the help developing it, having been provided the open source libraries by AMD.

    It's entirely possible AMD could close off their new technologies and make a show of NVIDIA; I'm just not sure that's what they're into.


  • Registered Users, Registered Users 2 Posts: 13,084 ✭✭✭✭Kirby


    TheDoc wrote: »
    In fairness, those NVIDIA proprietary libraries are available to developers who sign up to use them, under strict conditions of use.

    A few big developers have already signalled they most likely won't use NVIDIA libraries after seeing what happened with Cars and The Witcher, and will either use the already open source libraries or go with AMD's libraries, which are not closed off.

    I think some sympathy is due to AMD here, having been given no time to prepare optimisation drivers. How exactly are you supposed to optimise around tech and source code that you cannot see? It's pure trial and error. And then you need to factor in the customised setups of millions of users. It's not like you're doing it for a PS4, where every machine is the same.

    In my job at the moment a new system is coming in that will feed data into the existing system I'm responsible for. I've been involved in the process for months, getting all the information I need to ensure the interfaces are correctly set up and we have an optimum flow of data coming across our network etc.

    If that vendor didn't share the specifics of the data flow with me, they wouldn't have been signed up as a vendor, simples. I wouldn't be running around trying to optimise and design interfaces in the "hope" they work.

    So I've massive sympathy with AMD here. Closed libraries from a competitor were implemented into the game three months before launch, and no details were provided to AMD to let them optimise. They had to wait for release, get copies of the launch titles, and then run through massive trial and error to get things working.

    I kinda hope AMD keep their open source policy. Much of the gains in the GPU market have actually come from them, and their specifically designed libraries are far superior to NVIDIA's. Sure, that's where NVIDIA even got the idea, and the help developing it, having been provided the open source libraries by AMD.

    It's entirely possible AMD could close off their new technologies and make a show of NVIDIA; I'm just not sure that's what they're into.

    Why? They aren't a childhood friend. They are a business. And they are getting outfought. That's not nice for them, but I'm not sure why anybody would feel sorry for them.

    Nvidia lie about their cards, as seen with the 970 debacle; they make exclusive deals with companies, all that terrible stuff. Despite all the ethical issues and shady crap that Nvidia does, right now they are the better option. As a consumer, that's all I care about. I want something that works without having to wait 3 months for drivers.

    I'm actually having a mixed time with The Witcher at the moment. It's crashing quite a lot, and it's frustrating. But it only serves to highlight how this hasn't happened in about a decade for me. After several ATI cards, my last 3 cards have been Nvidia ones and none of my games crash... pretty much ever. I haven't had any overheated or cooked cards, and performance has been good. The drivers are updated all the time.

    I agree that if AMD stopped making cards it would be bad for consumers. Competition is good. But that doesn't mean I'm going to buy an AMD card for that reason. I'm not altruistic. I just want to game.


  • Registered Users, Registered Users 2 Posts: 1,143 ✭✭✭jumbobreakfast


    AMD definitely seem to be the good guys, from better Linux drivers to making their tech open. I wonder if we'll see a further division, with Nvidia on Windows and AMD on SteamOS - I reckon there could be a sting in the tail of this free Windows 10 upgrade.


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    The reason we enjoy the advancements in gaming technology, and have enjoyed the games we've played over the last ten or fifteen years, is the open source sharing of technology advancements between AMD and NVIDIA.

    AMD are trying to maintain that long-standing practice, while NVIDIA break off into the modern model: build it, patent it, cash it in.

    It's easy to be naive and not appreciate that NVIDIA are entitled to make money and profits from their developments, but on the flip side, they never would have made those developments in the first place without AMD sharing their advancements in the early and mid 00s. Maybe people aren't aware that happened.

    At the end of the day, sure, if you just want a card and don't care about the surroundings, you'll buy whatever and not care about the consequences.

    However, there is a large portion of the gaming community who always keep an eye on how developments take place between AMD & Intel and AMD & NVIDIA, appreciating that if one or the other leaves the market, the consumer base as a whole is ****ed and PC gaming as we have come to enjoy it will suffer a rapid change in the landscape.

    I'm only one customer and won't change the world, but I simply don't buy things from companies who do **** I don't agree with or condone, if the practice is known to me. I'm not some mad eco-warrior; I just think the NVIDIA controversies have gone on month after month, and I felt more comfortable buying AMD last week.

    And I actually had an R9 270x in my rig for two years with no issues, so I was comfortable that I'd have a good card in a 290. That there are issues with The Witcher and Cars is really just a product of the shifting gaming market as a whole.


  • Registered Users, Registered Users 2 Posts: 7,182 ✭✭✭Genghiz Cohen


    Kirby wrote: »
    Why? They aren't a childhood friend. They are a business. And they are getting outfought. That's not nice for them, but I'm not sure why anybody would feel sorry for them.

    It's the means by which they are being beaten that should cause concern. It's like AMD and Nvidia are playing by two different rule sets, and Nvidia are winning because of that. We should support AMD's rules because they are better for the consumer.
    Kirby wrote: »
    Nvidia lie about their cards, as seen with the 970 debacle; they make exclusive deals with companies, all that terrible stuff. Despite all the ethical issues and shady crap that Nvidia does, right now they are the better option. As a consumer, that's all I care about. I want something that works without having to wait 3 months for drivers.

    I agree that if AMD stopped making cards it would be bad for consumers. Competition is good. But that doesn't mean I'm going to buy an AMD card for that reason. I'm not altruistic. I just want to game.

    You want to game, and you want the best bang for your euro. That's what everyone wants, and it's what you will want when it comes time to upgrade. But unless consumers speak up, AMD may be out of the running and stop making GPUs altogether, and that only benefits Nvidia.


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    The above is a real possibility.

    NVIDIA have circulated some obscure market-share reports showing them leading with around 70% of the PC market, which is not accurate imo. The latest Steam survey shows it's nearer 55-45 in favour of NVIDIA.

    But AMD are clearly the market leader in consoles, which I'd hazard have more units globally and reach a wider audience. They are also the market leader in the mobile gaming space, with a pretty massive market share there.

    They also just won the latest tender to have their chips in iMacs and MacBooks going forward for a few years.

    It might well get to the point where AMD don't see the need to invest heavily in the PC market, which would be a massive blow to the consumer base. Even seeing NVIDIA's libraries seemingly target poor optimisation at cards below the 900 series is an absolute scandal. A really horrible technique, in a blatant attempt to move to an Apple-style model where consumers feel they need to keep abreast of the latest releases, dropping 400-500 euro every 18 months, as opposed to previously, where you could get a good 2-3 years from a GPU, if not longer.


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    TheDoc wrote: »
    The above is a real possibility.

    NVIDIA have circulated some obscure market-share reports showing them leading with around 70% of the PC market, which is not accurate imo. The latest Steam survey shows it's nearer 55-45 in favour of NVIDIA.

    But AMD are clearly the market leader in consoles, which I'd hazard have more units globally and reach a wider audience. They are also the market leader in the mobile gaming space, with a pretty massive market share there.

    They also just won the latest tender to have their chips in iMacs and MacBooks going forward for a few years.

    It might well get to the point where AMD don't see the need to invest heavily in the PC market, which would be a massive blow to the consumer base. Even seeing NVIDIA's libraries seemingly target poor optimisation at cards below the 900 series is an absolute scandal. A really horrible technique, in a blatant attempt to move to an Apple-style model where consumers feel they need to keep abreast of the latest releases, dropping 400-500 euro every 18 months, as opposed to previously, where you could get a good 2-3 years from a GPU, if not longer.

    Every tech website and YouTube channel regards those market-share reports as pretty accurate though, so that's a pretty big scam for Nvidia to have pulled off. AMD suffered a $180 million loss for the first quarter of this year and are down 26.4% in sales compared to Q4 2014, which does not back up a 55-45 market split.

    The overall vibe from various tech websites and commenters regarding AMD's R9 3xx series is very meh, due to them just being another rebrand of a 2-year-old rebrand, and while AMD's Fiji chipset could be a Titan killer, the rumored $750-850 price tag puts it out of reach for the majority of people.

    While my current and previous GPUs are from AMD, they sadly are being outfought and outgunned by Nvidia, and I just see nothing changing in the future :(


  • Registered Users, Registered Users 2 Posts: 12,775 ✭✭✭✭Gbear


    I've a 7950 that's blown my mind as long as I've had it, but it's showing its age these days, and if I want new games, particularly ones running on the new generation of super-fast high-res monitors, then I need an upgrade.


    I've been pining for the r9 300 series since December or whenever it was announced.

    With the rumours that most of the r9 200 line will be rebranded plus a flagship HBM card to rival the Titan, I'm concerned that there's going to be a 980Ti-shaped hole in AMD's inventory.

    I'd still never go with Nvidia after all these years, so I might have to consider some of the old stock at knockdown prices - maybe a couple of those Sapphire 290x 8GBs or a 295x2.

    What I'm praying for is that the HBM card (I'm seeing rumours it's going to have a "Titan"-like name to distinguish it from the main line) won't be any more than £700. Any more and you're getting into silly territory.
    Or that the 390x will be either very competitively priced or able to trade blows with the 980Ti.


  • Registered Users, Registered Users 2 Posts: 3,387 ✭✭✭glynf


    Much as I dislike their current tactics, chances are I will go green again next time around. Two cards are a sizeable investment; I really don't want to risk not getting the best performance/compatibility with newer games. I had CrossFire 6970s and 7970s, in their day both excellent cards on their own, but the amount of hassle and frustration I had with slow & buggy driver releases and/or Eyefinity profiles and xfire support really pissed me off. I lost count of the hours spent trying DIY fixes for bastard microstutter.

    Snaily bastards that Nvidia tend to be, multi-monitor & SLI are generally a lot better supported. That said, I want to see the R9 390 do well... even if the performance is not as good as touted, I hope they drop their pricing & sell sh!te loads. More competition is needed, and it might put pressure on Nvidia to do something about their ridiculous pricing.


  • Registered Users, Registered Users 2 Posts: 85,193 ✭✭✭✭Overheal


    Kirby wrote: »
    Why? They aren't a childhood friend. They are a business. And they are getting outfought. That's not nice for them, but I'm not sure why anybody would feel sorry for them.

    Nvidia lie about their cards, as seen with the 970 debacle; they make exclusive deals with companies, all that terrible stuff. Despite all the ethical issues and shady crap that Nvidia does, right now they are the better option. As a consumer, that's all I care about. I want something that works without having to wait 3 months for drivers.

    I'm actually having a mixed time with The Witcher at the moment. It's crashing quite a lot, and it's frustrating. But it only serves to highlight how this hasn't happened in about a decade for me. After several ATI cards, my last 3 cards have been Nvidia ones and none of my games crash... pretty much ever. I haven't had any overheated or cooked cards, and performance has been good. The drivers are updated all the time.

    I agree that if AMD stopped making cards it would be bad for consumers. Competition is good. But that doesn't mean I'm going to buy an AMD card for that reason. I'm not altruistic. I just want to game.
    That's an odd position to take, though you must be somewhat of a rarity if you've never had a bad card from Nvidia. Hell, my first card from them was the GF4 MX 440. Talk about a card you couldn't do jack **** with: nah, it doesn't need this Pixel Shader 2.0. 'Well, now you can't play half the games on the market. Sucks to suck.' You also never had an Xbox 360 go bad on you.

    Both companies rebrand chips all the time - hell, Intel does basically the same bloody thing (I've been indoctrinated in their marketing material, mind you, through Intel Retail Edge). Even by their own admission, every other processor generation is essentially just a re-print of the last with better optimization for energy consumption; they work on a tick-tock cycle like the iPhone and many other products. GPUs have always been much the same way. It's absurdly expensive to design and tool up manufacturing to print a completely new generation of chip. Not trying to make this an industrial engineering thread, but here is a taste: http://en.wikipedia.org/wiki/List_of_Intel_manufacturing_sites - not all sites can affordably or feasibly keep tooling down to the current 14-nanometer process, for instance.

    Personally, I would wait to do anything until more is known about how the virtual reality market is going to pan out - the Oculus Rift, MS HoloLens, etc. are all still in development. Unless you go buy some horribly expensive beast of a GPU for the sake of future-proofing, whatever you buy in the sane range is going to have issues.


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    The Oculus Rift is set to launch very soon, I believe. The minimum specs have been listed: a GTX 970 or R9 290.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    AMD's physical cards are good, but their driver support isn't as good as Nvidia's.

    I prefer AMD's cards in most of the mid-to-upper range - 280x vs 960, R9 290 vs 970, etc. - higher power draw but better performance per euro.

    CPU-wise though, AMD are a disaster in the competitive desktop PC range, though their A8 and A10 APUs in laptops are alright; at least you can play the odd game on them, compared to Intel's integrated graphics.

    They will never catch up to Intel at this point in the competitive gaming market, though. They've been very far behind for almost a decade now. Perhaps they'll instead turn to strengthening their market in APUs for desktops, laptops and consoles.

    Overheal wrote: »
    That's an odd position to take, though you must be somewhat of a rarity if you've never had a bad card from Nvidia. Hell, my first card from them was the GF4 MX 440. Talk about a card you couldn't do jack **** with: nah, it doesn't need this Pixel Shader 2.0. 'Well, now you can't play half the games on the market. Sucks to suck.' You also never had an Xbox 360 go bad on you.

    ATI/AMD are as guilty of that as Nvidia. Remember the X-series incident with Shader Model 3.0? Overnight, AMD cards that would usually run games at high settings weren't able to run some games at all.

    And you do know that ATI provided the GPU for the Xbox 360? Not Nvidia. Nvidia supplied the GPU for the original Xbox in 2002.


  • Registered Users, Registered Users 2 Posts: 7,182 ✭✭✭Genghiz Cohen


    TerrorFirmer wrote: »
    They will never catch up to Intel at this point in the competitive gaming market, though. They've been very far behind for almost a decade now. Perhaps they'll instead turn to strengthening their market in APUs for desktops, laptops and consoles.

    I remember thinking the same thing about Intel back when AMD Athlon were THE gaming CPU. Then Conroe and the Core2 Duos came out.

    Let's see what Zen has for us before making any calls.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    I remember thinking the same thing about Intel back when AMD Athlon were THE gaming CPU. Then Conroe and the Core2 Duos came out.

    Let's see what Zen has for us before making any calls.

    Yeah, but the difference there is that Intel had massive funds to pour into research that AMD just did not have, and certainly do not have now.

    Intel is making billions in profit each year while AMD posts losses.

    They got lazy with the Pentium 4 and built on that tech for too long; when AMD embarrassed them they pulled up their slacks and crushed AMD with Conroe. That was 2006 and AMD never caught up - or even came close.

    Their strong market now is APUs on all platforms; dedicated processors are just falling too far behind. I'd be shocked if Zen is anything other than just playing catch-up to Intel's Core i lines.


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    Even if Zen completely destroys Intel's lineup, I'd bet good money Intel have some killer chips in the wings just waiting to be released. AMD's poor CPU offerings over the last few years mean Intel have just added 10-15% performance increases per generation and haven't had to break out the big guns.


  • Registered Users, Registered Users 2 Posts: 1,143 ✭✭✭jumbobreakfast


    Disappointing press conference from AMD this morning at Computex - all APU stuff announced. The Fiji reveal won't be until June 16th at E3. I wonder did Nvidia pressure them into a mad scramble to optimise their drivers?


  • Registered Users, Registered Users 2 Posts: 6,768 ✭✭✭raze_them_all_


    Disappointing press conference from AMD this morning at Computex - all APU stuff announced. The Fiji reveal won't be until June 16th at E3. I wonder did Nvidia pressure them into a mad scramble to optimise their drivers?

    The 980 Ti's performance and price probably caused a stir. I know my mate was delighted he didn't pull the trigger on a Titan.


  • Registered Users, Registered Users 2 Posts: 12,775 ✭✭✭✭Gbear


    The more that seems to come out about the 300 series, the more of a disaster it's starting to sound like.

    I think this could well be the death of AMD.

    I'm starting to strongly consider getting a 980ti.

    I want the market to be competitive, but only up to a point. I don't want to have to hamstring my PC just so I can sit on a high horse because I helped the "little guy".


  • Registered Users, Registered Users 2 Posts: 3,387 ✭✭✭glynf


    The Fury X looks like an interesting card;


    Hopefully they price it aggressively & give nvidia a run.


  • Registered Users, Registered Users 2 Posts: 12,775 ✭✭✭✭Gbear


    glynf wrote: »
    The Fury X looks like an interesting card;


    Hopefully they price it aggressively & give nvidia a run.

    In the end it'll be down to price/performance.

    Things aren't looking good for the regular cards though.

    And there seems to be an awful lot of noise about those and not much about the Fury. There's been some rumours that it still isn't ready.
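    To put the price/performance point in concrete terms, this is the rough euro-per-FPS sum most of us do in our heads. A minimal sketch - the card names, prices and FPS figures are made-up placeholders, not real benchmark numbers for any of these GPUs:

    ```python
    # Back-of-the-envelope price/performance: euro paid per average FPS.
    # Lower is better. All figures are hypothetical placeholders.

    def euro_per_fps(price_eur: float, avg_fps: float) -> float:
        """How many euro you pay per frame-per-second of performance."""
        return price_eur / avg_fps

    # (price in EUR, average FPS across a benchmark suite) - invented numbers
    cards = {
        "Card A": (700, 60),
        "Card B": (550, 50),
    }

    for name, (price, fps) in cards.items():
        print(f"{name}: {euro_per_fps(price, fps):.2f} EUR/FPS")
    ```

    It's crude (it ignores power draw, VRAM, features like adaptive sync), but it's usually the first filter before anything else.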


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    One of the big retail stores in the US screwed up and is selling the R9 380X for $450, which is a pretty big price jump over the normal R9 290X for just an extra 4GB of VRAM :confused:


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,101 Mod ✭✭✭✭AlmightyCushion


    Venom wrote: »
    One of the big retail stores in the US screwed up and is selling the R9 380X for $450, which is a pretty big price jump over the normal R9 290X for just an extra 4GB of VRAM :confused:

    I'd ignore those prices. They're from Best Buy, who aren't exactly known for being competitive. Also, big companies regularly set up products on their system with high prices before they get an official release date, and once the official release date hits they drop them to the normal price. It's to discourage people from buying them if they get put on display before the official release date.

    That said, as mentioned above, the more information we hear about the 300 series the less promising it sounds, so those could be the proper prices, or close at least. I hope not, but it is worrying.


  • Registered Users, Registered Users 2 Posts: 1,815 ✭✭✭imitation


    Just went red and got an R9 290. It was too good a bargain to pass up and a brilliant upgrade from a GTX 670. It's not a big gap between the 290 and a 970 performance-wise, but the price gap is massive. To be honest though, I was a fair bit nervous, because my previous AMD card was a 4870 that came factory overclocked and never quite worked correctly; I often had to clock it back myself to fix crashing issues. Nvidia have definitely gone gung-ho on custom features - it started with PhysX and it's just progressing.

    I think it's a dangerous game really; it's almost going back to the days of 3dfx Glide vs OpenGL vs DirectX. I'm hoping developers will see the risk in fracturing the PC market and stick with stuff compatible with both platforms, rather than taking Nvidia's tempting software. I think it's likely this will happen if you look at PhysX, which is more of a gimmick really.

    You would also think AMD would have a natural advantage driver-wise, having the APU for the PS4 and Xbone. I guess it's a clear indicator of how much middleware there is between the hardware and the game.


  • Registered Users, Registered Users 2 Posts: 14,698 ✭✭✭✭BlitzKrieg


    I was screwed by Nvidia way back with the GeForce 4 MX card (the one that was called a GeForce 4 but performance-wise could have passed for an old Voodoo card) and stuck with Radeon for years after, which I found reliable.

    Only gave Nvidia another chance when I was given one of those too-good-to-pass-up deals on a 560 Ti, which was, in fairness, a bloody brilliant card for its price.

    But went back to Radeon soon after when a better card was offered on the cheap.

    More so because this was at a point when GeForce Experience was at its most insufferable (i.e. just before the overhaul with ShadowPlay etc.).

    But as good as my Radeon HD 8770 was, it struggled in certain areas, and once again I got offered a really good deal from a friend, so I've only in the last few weeks swapped back to Nvidia with the GeForce 970. Didn't help that AMD started trying to copy GeForce Experience with their Gaming Evolved app, which was equally insufferable.

    Which is OK - the 970 is really quiet, but GeForce Experience even with ShadowPlay still feels far too big; though compared to AMD Gaming Evolved (whose recording function barely worked half the time), it's not as stupidly overlarge as it used to be.


    I guess I've had a better reliability record with AMD, but Nvidia did a lot of repair work on their reputation with me with the very solid 560 Ti.


  • Registered Users, Registered Users 2 Posts: 12,775 ✭✭✭✭Gbear


    Well, the normal lineup does mostly sound like a pile of meh, but the new cards - the Nano, the Fury and the Fury X, sound very compelling.


    $650 MSRP for the Fury X or probably something in the region of €700/£550.
    $550 for the Fury.

    The Nano is a little pint-sized version of the other two, I suppose.
    It's air cooled and about the size of a wallet but will presumably have 4k-ready levels of performance.
    Could be a game changer for M-ATX and laptops.

    I think both Furies will be released over the next 2 weeks.
    The Nano is a bit later.

    There'll be a dual Fiji coming in the autumn, which I assume will rend the earth asunder.


  • Closed Accounts Posts: 937 ✭✭✭Dair76


    Planning a build for around Skylake's release, so will be watching benchies and reviews of these cards closely.


  • Registered Users, Registered Users 2 Posts: 757 ✭✭✭platinums


    It's refreshing to see people not only taking note of Intel's underhand attitude and Nvidia's proprietary nature, but actually taking action by not purchasing their kit.

    My first processor was an AMD Athlon K7, a pure beast back then (think I still had a Voodoo!), and unmatched value for money (£800 for a 1000MHz CPU back then).

    I'm really hoping AMD can return to the golden years they once had. My next upgrade is coming up and if all goes well I will go pure AMD/ATI.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭EoinHef


    Other than maybe the Zen architecture, which is supposed to launch next year, saving them, AMD are out in the wilderness regarding CPUs. Bar some very specific uses, Intel are the way to go. I wish this wasn't the case but unfortunately it is. Even i3s are beating the 8-core 8350s in a lot of cases (especially gaming), and that's with a lot less power consumption. DX12 may close that gap a bit for games, but that's still in theory; I'd like to see some actual real-world benches for it.

    Their APUs seem to be doing reasonably well for them and that's a bright spot, but I don't think that on its own is enough to get excited about.


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,101 Mod ✭✭✭✭AlmightyCushion


    Looks like the Fury cards are performing very well. Obviously, we can't say for sure until third party reviews and benchmarks come out.

    http://hexus.net/tech/news/graphics/84080-amd-shares-r9-fury-x-far-cry-4-uhd-performance-charts/


  • Closed Accounts Posts: 937 ✭✭✭Dair76


    Good to see a strong contender from AMD. It makes choosing a new card for my new rig harder though - I'll be buying an adaptive sync monitor at the same time, which means I'll be tied to the one manufacturer for several years. Me no likey that. I wish they'd just both adopt the same feckin standard.


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,101 Mod ✭✭✭✭AlmightyCushion


    Dair76 wrote: »
    Good to see a strong contender from AMD. It makes choosing a new card for my new rig harder though - I'll be buying an adaptive sync monitor at the same time, which means I'll be tied to the one manufacturer for several years. Me no likey that. I wish they'd just both adopt the same feckin standard.

    I'd go FreeSync. It's more likely that Nvidia will support it in time than AMD ever supporting G-Sync.


  • Closed Accounts Posts: 937 ✭✭✭Dair76


    I'd go FreeSync. It's more likely that Nvidia will support it in time than AMD ever supporting G-Sync.

    True that. I just prefer not being tied to the same manufacturer. Ah sure, I can buy a new monitor every time I change GPU. :pac:

