
Intel or AMD CPU for gaming PC?


Comments

  • Closed Accounts Posts: 165 ✭✭Eamonn Brophy


    Serephucus wrote: »
    It's not so much what you have to do, versus what you're doing, with novice users. If you tell them to click here, then save this, then enter this number, fine, they'll do it. If they ask what they're doing, and you tell them they're overclocking - running a processor past its design spec - they'll instantly get antsy and unsure, and not want to risk screwing anything up.

    I build systems for a lot of people I know, and maintain them. I've had the above happen to me more than once. People would rather spend money and get something that they know "just works" - even if it does involve someone else replacing a part for them, they're familiar with that idea. Your tires go bad on your car, you replace them, and the car handles better.

    I've been into PCs for years, I'm a computer science student, and I've built several machines - and I still get on edge when overclocking!


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    Solitaire wrote: »
    Nope. It can cause low spikes in minimum FPS, but that's an exceptional case. Usually it's more like having very low or even zero FPS for a fraction of a second, then super-high FPS for the next 1-2 tenths of a second, and so on. This doesn't really affect average FPS much at all, as the same number (or, from the look of the benchmarks, more) of frames are being kicked out per second; it's what's happening within the fractions of a second where things have gone horribly wrong. This causes a disruptive visual effect where the image seems to lag and stutter: the (for CF users) infamous "microstutter".

    This just isn't clicking with me. Surely, if you have 0FPS for 1/2 of a second, and 60FPS for the other half, that gives you an average of 30FPS, meaning a dip.

    I did notice some slight screen tearing, now you mention it. With NVIDIA cards though you can toggle between AFR/SFR modes with SLI. Dunno if AMD have that option. SFR would give tearing, AFR microstutter.


  • Registered Users Posts: 27,645 ✭✭✭✭nesf


    Again, I'm not trying to be contentious, but your comments about some people never wanting to tinker with their systems would make a lot more sense to me if you weren't touting a processor upgrade route as one of the reasons to buy the i3. The 'overclock' on the X4 requires you to open a downloaded utility and about half a dozen mouse clicks in a big friendly no-brainer interface. It takes about 60 seconds and doesn't even require a restart.

    I agree overclocking is easy now, or at least tame overclocking is easy. The thing is, some people will upgrade a CPU no problem, but ask them to overclock and stress test a CPU and they won't even consider it. I don't think this is reasonable, but I've met a lot of people like this. The issue is a fear of things going wrong, and, well, yes, they can go wrong with overclocking, even with good overclocking chips like the K-series Sandy Bridges and the Phenom Black Editions. I think the problem comes from people not really understanding what happens to the chip when you overclock it and being afraid of messing with things they don't understand. Which is a fair enough way to tackle computing if you're not technically minded. Upgrading a CPU, on the other hand, is straightforward: look at X benchmark, see that it's better. Simples.
    Finally, I think you might find the below article from Tom's Hardware interesting, wherein they analyze CPU versus GPU bottlenecks. They basically find that in modern systems the bottleneck nearly always occurs at the GPU level, and also that 8 out of the 20 games they tested significantly benefitted from having a four-core CPU. They also state that the multi-core advantage is only likely to increase as time goes on.

    http://www.tomshardware.com/reviews/game-performance-bottleneck,2738-16.html

    In many games, sure, but note the ones that respond well to CPU overclocking - that's what I was talking about when I said the games you play matter in this. Multi-core is nice, but a dual-core Sandy Bridge will outperform a quad-core Phenom X4 in real-world testing. Like for like, more cores are better, but you can't take chips from different manufacturers and generations and say that the one with more cores will automatically be better.


  • Moderators, Technology & Internet Moderators Posts: 18,377 Mod ✭✭✭✭Solitaire


    Serephucus wrote: »
    This just isn't clicking with me. Surely, if you have 0FPS for 1/2 of a second, and 60FPS for the other half, that gives you an average of 30FPS, meaning a dip.

    The problem is that you'd have, say, 20FPS from all the "dips" in one second, and >100FPS from the good parts as the cards bunch up the frames when they come out of sync. So it's not always a dip in FPS.
    nesf wrote: »
    In many games, sure, but note the ones that respond well to CPU overclocking - that's what I was talking about when I said the games you play matter in this. Multi-core is nice, but a dual-core Sandy Bridge will outperform a quad-core Phenom X4 in real-world testing. Like for like, more cores are better, but you can't take chips from different manufacturers and generations and say that the one with more cores will automatically be better.

    This. Sandy has, what? An average 70% throughput advantage compared to Phenom II, clock-for-clock?

    The only real problems are really dodgy console ports that won't allocate the third thread to the spare cycles on the other cores properly without throwing a tantrum. So.... basically GTA4 :pac:
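
    To put rough numbers on the microstutter point above, here is a minimal Python sketch - the frame times are illustrative, not taken from any benchmark - that mimics the "bunched pairs" pattern described for out-of-sync AFR rendering and shows why an average FPS counter can look fine while the longest gaps are what you actually notice:

```python
# Minimal sketch: why average FPS can hide microstutter.
# Frame times below are illustrative only - they mimic the "bunched pairs"
# pattern described above for out-of-sync AFR multi-GPU rendering.

def fps_stats(frame_times_ms):
    """Average FPS over the sample, plus the FPS implied by the worst gap."""
    avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    avg_fps = 1000.0 / avg_frame_time
    worst_gap_fps = 1000.0 / max(frame_times_ms)
    return avg_fps, worst_gap_fps

# 40 frames in one second, delivered in bunched pairs:
# one frame 5 ms after the previous, then a 45 ms wait, repeated.
stuttery = [5, 45] * 20
# The same 40 frames, evenly paced at 25 ms apiece.
smooth = [25] * 40

print(fps_stats(stuttery))  # ~(40.0, 22.2): counter says 40 FPS, gaps feel like ~22
print(fps_stats(smooth))    # ~(40.0, 40.0): 40 FPS that actually feels like 40 FPS
```

    Both sequences report the same average, which is why the disagreement here is really about frame pacing rather than the number on the FPS counter.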


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    nesf wrote: »
    I agree overclocking is easy now, or at least tame overclocking is easy. The thing is some people will upgrade a CPU no problem but ask them to overclock and stress test a CPU and they won't even consider it.

    We're not talking about pushing a chip to its limits here - we're talking about adding around 20% to its clock rate, something the manufacturer of the Black Edition chips basically encourages you to do.

    Everyone I've shown how to do this with an overclocking utility couldn't believe how easy it was. I honestly can't imagine anybody would find that more traumatic than yanking their case open, unscrewing the heatsink and pulling out the CPU to replace it with a new one, with the possibility of permanently damaging components with static charges, bending pins, etc.

    The worst that can happen with a standard overclock is that the chip will overheat and shut down - it's virtually impossible to permanently damage one these days unless you were to run it at levels way beyond its design envelope, and the machine would give you continuous overheating errors if you did that, so you would have many chances to remedy the situation.

    Again, I don't know anyone who would be more comfortable messing around with their hardware than doing a software overclock like the one described.
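
    For a sense of the arithmetic behind that kind of multiplier-only overclock, here is a small Python sketch. The clocks and multipliers are illustrative (a Phenom II X4 965 BE ships at 200 MHz x 17 = 3.4 GHz), and raising the unlocked multiplier in the utility amounts to changing one number in this sum:

```python
# Rough sketch of a multiplier-only overclock on an unlocked (Black Edition) chip.
# Numbers are illustrative: a Phenom II X4 965 BE runs a 200 MHz reference clock
# with a 17x multiplier for 3.4 GHz at stock.
reference_clock_mhz = 200
stock_multiplier = 17
overclocked_multiplier = 20  # the "half a dozen clicks" change in the utility

stock_ghz = reference_clock_mhz * stock_multiplier / 1000
oc_ghz = reference_clock_mhz * overclocked_multiplier / 1000
gain_pct = (oc_ghz / stock_ghz - 1) * 100

print(f"{stock_ghz:.1f} GHz -> {oc_ghz:.1f} GHz (+{gain_pct:.0f}%)")
# 3.4 GHz -> 4.0 GHz (+18%), roughly the ~20% bump discussed above
```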


  • Registered Users Posts: 27,645 ✭✭✭✭nesf


    Solitaire wrote: »
    This. Sandy has, what? An average 70% throughput advantage compared to Phenom II, clock-for-clock?

    The only real problems are really dodgy console ports that won't allocate the third thread to the spare cycles on the other cores properly without throwing a tantrum. So.... basically GTA4 :pac:

    I don't know the numbers, but the benchmarks are insanely good for the Sandy Bridge chips at equal cores and clock speeds to the Phenom chips. No competition at all from AMD. Which is a real pity, as I always had a soft spot for AMD as the underdog. You need a really well optimised multi-threaded app to make the Phenom IIs look better, and games are far, far from being such apps these days, unfortunately.

    The Intel chips are lovely little workhorses though. I'm almost having buyer's remorse that I didn't go for a 2600K, though really it would have been stupidly excessive. :)


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    Solitaire wrote: »
    The problem is that you'd have, say, 20FPS from all the "dips" in one second, and >100FPS from the good parts as the cards bunch up the frames when they come out of sync. So it's not always a dip in FPS.

    Hm. Fair enough. I suppose it's just something I'd have to experience, but I get the idea now at least.
    We're not talking about pushing a chip to its limits here - we're talking about adding around 20% to its clock rate, something the manufacturer of the Black Edition chips basically encourages you to do.

    [snip]

    Again, I don't know anyone who would be more comfortable messing around with their hardware than doing a software overclock like the one described.

    Ok, let's assume for a minute that the person we're talking about is a technophobe. Any semi-competent person would probably be pretty easily persuaded to overclock once shown enough.

    If you were to tell someone "I'm going to swap your processor for this better one, it'll make your computer faster", they'd understand that. They might not know exactly what's involved, but they can equate it to something they know.

    If, on the other hand, you were to tell them "I'm going to make your processor go faster", most people find something inherently wrong with this, because, well, nothing is for free, so you must be doing something you otherwise shouldn't, and they'll get worried, and tell you they don't want anything being messed up.

    If they do manage to OK the overclocking, and you tell them when you're leaving that you have a stress test going, and that if it crashes, to call you, they'll tell you to undo whatever you did there and then.


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    nesf wrote: »
    In many games sure, but note the ones that respond well to CPU overclocking, that's what I was talking about when I said the games you play matters in this. Multi core is nice but a dual core Sandy Bridge will outperform a quad core Phenom X4 in real world testing. Like for like more cores are better but you can't take chips from different manufacturers and generations and say that the one with more cores will automatically be better.

    They state themselves in the conclusion that even in the games that benefit from overclocking, the benefits are small and trivial compared to the benefits from adding a more powerful graphics card. The takeaway is that the GPU is the bottleneck 90% of the time, provided you're starting from a moderately fast processor. That's why they typically spend around $100 on the processor in their budget gaming rig builds but over $150 on the graphics card.


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    Serephucus wrote: »

    Ok, let's assume for a minute that the person we're talking about is a technophobe. Any semi-competent person would probably be pretty easily persuaded to overclock once shown enough.

    If you were to tell someone "I'm going to swap your processor for this better one, it'll make your computer faster", they'd understand that. They might not know exactly what's involved, but they can equate it to something they know.

    If, on the other hand, you were to tell them "I'm going to make your processor go faster", most people find something inherently wrong with this, because, well, nothing is for free, so you must be doing something you otherwise shouldn't, and they'll get worried, and tell you they don't want anything being messed up.

    If they do manage to OK the overclocking, and you tell them when you're leaving that you have a stress test going, and that if it crashes, to call you, they'll tell you to undo whatever you did there and then.

    Sorry, I don't know where you're getting these assumptions and I just don't agree.


  • Registered Users Posts: 27,645 ✭✭✭✭nesf


    We're not talking about pushing a chip to its limits here - we're talking about adding around 20% to its clock rate, something the manufacturer of the Black Edition chips basically encourages you to do.

    Everyone I've shown how to do this with an overclocking utility couldn't believe how easy it was. I honestly can't imagine anybody would find that more traumatic than yanking their case open, unscrewing the heatsink and pulling out the CPU to replace it with a new one, with the possibility of permanently damaging components with static charges, bending pins, etc.

    The worst that can happen with a standard overclock is that the chip will overheat and shut down - it's virtually impossible to permanently damage one these days unless you were to run it at levels way beyond its design envelope, and the machine would give you continuous overheating errors if you did that, so you would have many chances to remedy the situation.

    Again, I don't know anyone who would be more comfortable messing around with their hardware than doing a software overclock like the one described.

    We're meeting different people then. I agree that people should look at software overclocking for unlocked chips and go "yeah, I could do that", but they don't necessarily. Never underestimate how much fear a person can have of technology and how Luddite they can be about the whole thing, yet completely depend on said technology for a plethora of activities. It's bizarre, really.


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    Sorry, I don't know where you're getting these assumptions and I just don't agree.

    They're not assumptions. I'm talking from personal experience here. As I've said, I build systems for friends / family all the time, and this is always what happens when I get the inevitable "My computer's getting slow, is there anything you can do to make it faster?" requests.

    Edit: Oh, RE: pricing, get a G840 instead! €75 vs. €105. It's basically a 2.8GHz 2100, minus the hyperthreading.


  • Registered Users Posts: 27,645 ✭✭✭✭nesf


    They state themselves in the conclusion that even in the games that benefit from overclocking, the benefits are small and trivial compared to the benefits from adding a more powerful graphics card. The takeaway is that the GPU is the bottleneck 90% of the time, provided you're starting from a moderately fast processor. That's why they typically spend around $100 on the processor in their budget gaming rig builds but over $150 on the graphics card.

    I don't know, this year's AnandTech benchmarks for a 2500K vs. an X4 965 BE show anything up to a 50 FPS gain for the i5 in some games at high enough resolutions. There are decent gains to be had with a CPU upgrade.

    Remember, that article is only looking at overclocking a CPU; it's not looking at going from a previous-generation chip to a current-generation chip and all the throughput advantage that provides.

    Edit: Here are the benchmarks for reference: http://www.anandtech.com/bench/Product/102?vs=288


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    Serephucus wrote: »
    They're not assumptions. I'm talking from personal experience here. As I've said, I build systems for friends / family all the time, and this is always what happens when I get the inevitable "My computer's getting slow, is there anything you can do to make it faster?" requests.

    Edit: Oh, RE: pricing, get a G840 instead! €75 vs. €105. It's basically a 2.8GHz 2100, minus the hyperthreading.

    Well, your personal experience differs greatly from mine. I've worked with computers for almost 25 years - I bought my first, a Mac SE, in 1987. I've supervised classroom and college computer labs where I taught media studies, and I've built and upgraded systems for dozens of people. In this case we're talking about encouraging a guy to do his own processor upgrade versus doing a software-based overclock which won't bring his CPU temperatures out of the 40s (when the chip is rated as safe into the 70s).

    I have this chip and ran it all last summer, when the temperatures in New York (where I live now) got into the 40s; it stayed cool and never crashed. If you don't give people any reason to be scared of such an operation, they don't become scared, in my experience. Again, we'll just have to agree to disagree, it seems.


  • Moderators, Technology & Internet Moderators Posts: 18,377 Mod ✭✭✭✭Solitaire


    nesf wrote: »
    You need a really well optimised multi-threaded app to make the Phenom IIs look better, and games are far, far from being such apps these days, unfortunately.

    Actually, you don't :p A really well-optimised multithreaded app would likely get an 80-100% performance boost from the i3's HyperThreading. It's only multithreaded apps that are really badly optimised at both thread streamlining and reassigning threads to occupied cores that make an i3 look bad in any way! :o


  • Registered Users Posts: 27,645 ✭✭✭✭nesf


    Solitaire wrote: »
    Actually, you don't :p A really well-optimised multithreaded app would likely get an 80-100% performance boost from the i3's HyperThreading. It's only multithreaded apps that are really badly optimised at both thread streamlining and reassigning threads to occupied cores that make an i3 look bad in any way! :o

    I stand corrected. :)

    The i3 2100 has HyperThreading? Didn't know that.


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    nesf wrote: »
    I don't know, this year's AnandTech benchmarks for a 2500K vs. an X4 965 BE show anything up to a 50 FPS gain for the i5 in some games at high enough resolutions. There are decent gains to be had with a CPU upgrade.

    Remember, that article is only looking at overclocking a CPU; it's not looking at going from a previous-generation chip to a current-generation chip and all the throughput advantage that provides.

    Edit: Here are the benchmarks for reference: http://www.anandtech.com/bench/Product/102?vs=288

    You're comparing a 2500K to a chip that costs a lot less than it - that's totally unfair. And I don't see any games in the list with a 50 FPS gain. Which one is that? In three of the four games at the bottom of the list the 2500K is only 10% or so faster; in Far Cry it does much better, but the AMD still does over 50 FPS at high settings, so it's not even something that most people would perceptually notice in the real world.


  • Registered Users Posts: 27,645 ✭✭✭✭nesf


    You're comparing a 2500K to a chip that costs a lot less than it - that's totally unfair. And I don't see any games in the list with a 50 FPS gain. Which one is that? In three of the four games at the bottom of the list the 2500K is only 10% or so faster; in Far Cry it does much better, but the AMD still does over 50 FPS at high settings, so it's not even something that most people would perceptually notice in the real world.

    There are more games further up in the benchmark. The 50 FPS one is Dragon Age: Origins.

    I'm deliberately comparing two very different chips to underline the difference in upgrading from one to the other and the gains to be had in doing so in certain games. I just want to show that you can get a lot out of a CPU upgrade here.


  • Moderators, Technology & Internet Moderators Posts: 18,377 Mod ✭✭✭✭Solitaire


    nesf wrote: »
    I stand corrected. :)

    The i3 2100 has HyperThreading? Didn't know that.

    Ooooooh yes ;) That's what makes it so good as a budget option; so long as an app isn't horrendously badly coded, the i3 has four virtual cores, not just the two! ;) In fact, since the first-gen i3s, the newer i7s have also had a massively improved HT - far better than that of the old first-gen i7s, let alone the horrid old implementation on P4s and Atom! :P


  • Registered Users Posts: 27,645 ✭✭✭✭nesf


    Solitaire wrote: »
    Ooooooh yes ;) That's what makes it so good as a budget option; so long as an app isn't horrendously badly coded, the i3 has four virtual cores, not just the two! ;) In fact, since the first-gen i3s, the newer i7s have also had a massively improved HT - far better than that of the old first-gen i7s, let alone the horrid old implementation on P4s and Atom! :P

    I have a first gen i7 in my iMac and the HT is quite nice for video converting and similar. Never found a game that made any decent use out of it though.

    That makes the i3 an absolutely lovely chip at that price point.
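
    For anyone who wants to check the physical-versus-logical core point themselves, here is a minimal Python sketch. It assumes the third-party psutil package is installed (os.cpu_count() on its own only reports logical cores); on an HT-enabled dual core like the i3-2100 it should report 2 physical and 4 logical cores:

```python
# Minimal sketch: physical vs. logical core counts.
# Requires the third-party psutil package (pip install psutil).
import os

import psutil

logical = os.cpu_count()                    # logical cores, including HT threads
physical = psutil.cpu_count(logical=False)  # physical cores only

print(f"{physical} physical cores, {logical} logical cores")
# On an HT-enabled dual core such as the i3-2100: "2 physical cores, 4 logical cores"
```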


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    nesf wrote: »
    There are more games further up in the benchmark. The 50 FPS one is Dragon Age: Origins.

    I'm deliberately comparing two very different chips to underline the difference in upgrading from one to the other and the gains to be had in doing so in certain games. I just want to show that you can get a lot out of a CPU upgrade here.

    The difference between these two chips in that game is that one system plays it at 100 FPS and the other at 150. This game is a weird outlier, and nobody who is playing it would actually be able to differentiate between them - after 60fps the human eye just doesn't notice. For 90% of games you'll get a far, far more significant frame rate increase by adding a second graphics card.

    Haven't we been over this territory before?


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    nobody who is playing it would actually be able to differentiate between them - after 60fps the human eye just doesn't notice.

    Ah, untrue. 60Hz/FPS is the refresh rate of most monitors, but the human eye actually tops out at about 200Hz. You will definitely notice the difference between 60 and 120Hz. I do, and I'm severely visually impaired.


  • Registered Users Posts: 27,645 ✭✭✭✭nesf


    The difference between these two chips in that game is that one system plays it at 100 FPS and the other at 150. This game is a weird outlier, and nobody who is playing it would actually be able to differentiate between them - after 60fps the human eye just doesn't notice. For 90% of games you'll get a far, far more significant frame rate increase by adding a second graphics card.

    Haven't we been over this territory before?

    Sure, but I'm thinking of the long term here. If I'm building a system where I want the chip to last 4 years or so, I want the chip that is overly good now so it'll support dual 7 series or whatever further down the line.

    I'm saying you'll get a nice performance bump out of the better CPU on your way to better graphics cards down the line. (Or vice versa) It's not an either/or issue, we want to upgrade both realistically.


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    Serephucus wrote: »
    Ah, untrue. 60Hz/FPS is the refresh rate of most monitors, but the human eye actually tops out at about 200Hz. You will definitely notice the difference between 60 and 120Hz. I do, and I'm severely visually impaired.

    You are wrong. I've worked for years in video and have given classes on this subject. Organizations such as SMPTE (the Society of Motion Picture and Television Engineers) and the ASC (American Society of Cinematographers) have studied this at the technical level several times to see if there would be any advantage in increasing the frame rate of cinema or TV beyond 60 frames per second. They have found that there is very little perceived increase in smoothness beyond 60 frames per second and practically none beyond 100.

    Even proponents of high-framerate cinema such as Peter Jackson and James Cameron do not think there is any reason to go beyond 60 frames per second, since they admit there is no perceived increase in clarity - what they are lobbying for is an increase from the current 24 fps standard. See below for more info:

    http://prolost.com/60p

    http://forums.creativecow.net/thread/267/1785


  • Registered Users Posts: 7,180 ✭✭✭Serephucus


    We'll agree to disagree there. I can't remember where I read that, and you probably know more on the subject than I do.


  • Closed Accounts Posts: 165 ✭✭Eamonn Brophy


    You are wrong. I've worked for years in video and have given classes on this subject. Organizations such as SMPTE (the Society of Motion Picture and Television Engineers) and the ASC (American Society of Cinematographers) have studied this at the technical level several times to see if there would be any advantage in increasing the frame rate of cinema or TV beyond 60 frames per second. They have found that there is very little perceived increase in smoothness beyond 60 frames per second and practically none beyond 100.

    Even proponents of high-framerate cinema such as Peter Jackson and James Cameron do not think there is any reason to go beyond 60 frames per second, since they admit there is no perceived increase in clarity - what they are lobbying for is an increase from the current 24 fps standard. See below for more info:

    http://prolost.com/60p

    http://forums.creativecow.net/thread/267/1785


    Are you STILL here arguing? Seriously? I've a wall that needs knocking that you could talk down for me....


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    Are you STILL here arguing? Seriously? I've a wall that needs knocking that you could talk down for me....

    Read the thread - was I the one that brought any of this up? And your contribution, apart from ad hominem blather, has been what exactly?


  • Closed Accounts Posts: 165 ✭✭Eamonn Brophy


    Read the thread - was I the one that brought any of this up? And your contribution, apart from ad hominem blather, has been what exactly?

    Another reply, trying to incite yet another back and forth argument about nonsense.


  • Registered Users Posts: 363 ✭✭Paul_Hacket


    Another reply, trying to incite yet another back and forth argument about nonsense.

    No, you really don't get to make statements like that when you yourself are the one being insulting and trying to rile people up. We were having a discussion about something - it wasn't acrimonious. You on the other hand are just spewing bile. I'm not interested.


  • Closed Accounts Posts: 165 ✭✭Eamonn Brophy


    Serephucus wrote: »
    Noted. Logic only from here on in.

    @aperture: Here's an alternate build from Dabs. I chose Dabs because they offer free delivery, and with budget builds like this, the €30 can make a difference.

    Item|Price
    ASRock S1155 Intel H61M DDR3 mATX|€49.19
    XIGMATEK Asgard Chassis|€32.52
    Intel Core i3-2100 3.10GHz LGA1155 3MB|€105.45
    XFX 550 Watt Core Edition Full Wired 80+ Bronze PSU|€51.20
    Kingston 8GB (2 x 4GB) HyperX Blu DDR3 1600MHz DIMM 240-pin CL9|€36.33
    Asus GeForce GTX 560 Ti 822MHz 1GB GDDR5 PCI-Express HDMI|€181.79
    Total build cost: €456.48 (incl. free delivery!)


    OP, go with this. It should max out anything at reasonable resolutions!
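
    As a quick sanity check on the quoted total, a throwaway Python sketch summing the listed prices (copied from the table above) reproduces the €456.48 figure:

```python
# Sanity check: sum the parts in the quoted Dabs build.
prices_eur = {
    "ASRock S1155 H61M motherboard": 49.19,
    "XIGMATEK Asgard chassis": 32.52,
    "Intel Core i3-2100": 105.45,
    "XFX 550W Core Edition PSU": 51.20,
    "Kingston 8GB HyperX Blu DDR3": 36.33,
    "Asus GTX 560 Ti 1GB": 181.79,
}
total = sum(prices_eur.values())
print(f"Total: EUR {total:.2f}")  # Total: EUR 456.48 (delivery is free)
```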


  • Closed Accounts Posts: 1,353 ✭✭✭Sasquatch76


    You are wrong. I've worked for years in video and have given classes on this subject. Organizations such as SMPTE (the Society of Motion Picture and Television Engineers) and the ASC (American Society of Cinematographers) have studied this at the technical level several times to see if there would be any advantage in increasing the frame rate of cinema or TV beyond 60 frames per second. They have found that there is very little perceived increase in smoothness beyond 60 frames per second and practically none beyond 100.

    Even proponents of high-framerate cinema such as Peter Jackson and James Cameron do not think there is any reason to go beyond 60 frames per second, since they admit there is no perceived increase in clarity - what they are lobbying for is an increase from the current 24 fps standard. See below for more info:

    http://prolost.com/60p

    http://forums.creativecow.net/thread/267/1785
    Those articles apply to movies, where motion blur compensates for the lower frame rate. Movies are perfectly watchable at 24 fps. Ever tried playing Quake at the same frame rate?


This discussion has been closed.