
Looking to build a new mid-range rig


Comments

  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki




  • Moderators, Regional North West Moderators Posts: 19,117 Mod ✭✭✭✭byte
    byte


    Oh, I do like both the Zalman and Antec cases.



  • Registered Users Posts: 2,293 ✭✭✭billybonkers


    Sorry to hijack, but if you order from AWD-IT do you get hit with the extra customs etc. here in Ireland? They say on the site you don't, but I am sceptical. Same for Amazon UK - do you still get hit with the extras? It's been a long time since I ordered PC parts...



  • Moderators, Regional North West Moderators Posts: 19,117 Mod ✭✭✭✭byte
    byte


    On AWD-IT's website, it says

    DELIVERING TO IRELAND (ROI)

    All our products can be shipped to Ireland at the checkout. There are no additional charges regarding VAT or Customs charges when you order from AWD-IT.

    Our website displays VAT at 20% based on UK. The correct VAT of 23% will be displayed at the checkout for Ireland.

    For Amazon it's the same, they change the VAT from 20% to 23% at checkout, which is why you see price differences at checkout compared to what you see earlier.

    As far as I'm aware, there are no custom duties on computer hardware.
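    To give a rough, hypothetical sketch of that 20% → 23% adjustment (the €120 price below is made up purely for illustration):

    def irish_checkout_price(uk_listed_price):
        # Strip the UK VAT (20%) that the site displays by default,
        # then apply the Irish rate (23%) shown at checkout.
        ex_vat = uk_listed_price / 1.20
        return round(ex_vat * 1.23, 2)

    print(irish_checkout_price(120.00))  # 123.0 -> a €120 listing becomes €123 at the Irish checkout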

    ********

    With regard to my build, I've ordered the Antec case from Scan, which I hope will arrive at its Scottish destination by Wednesday evening, as I've a relative coming over who will take it with them.



  • Moderators, Regional North West Moderators Posts: 19,117 Mod ✭✭✭✭byte
    byte


    So, after much messing around, I've compiled two builds, an Intel Core i5 and AMD Ryzen 5.

    AMD

    There's not that big a price difference between the Intel and AMD builds really, so I'm honestly leaning towards Intel, which is newer and marginally faster.

    Both sets of prices are from caseking.de for uniformity, but I'll see if Amazon can be used for some parts, as I've a credit balance from gift cards to use. Caseking seems to be a sister site of OCUK, so RMAs should hopefully be OK if the worst happens.

    I'm still tempted by liquid cooling, particularly if I go Intel, as the i5 runs hotter than the Ryzen.

    Anything I'm missing? :)

    And yea, my spend has jumped up a fair whack from my initial €1,600 guesstimate.



  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    Do not get the Hyper 212 - it's overpriced trash.

    For €13 more you can now get the excellent Scythe Fuma 2 rev.B, which also ships with Intel LGA1700 brackets 😎 It also comes with a tube of thermal paste, so you don't need to buy that separately.

    On the SSD, you're paying PCIe 4.0 prices for a PCIe 3.0 drive - might as well just get a Gigabyte Aorus drive at that stage.



  • Registered Users Posts: 246 ✭✭Jon Doe


    If you can afford to have an ethernet cable connecting the desktop to the router, you could get a cheaper X570/Z690 mobo - without wifi - and up your SSD: the 980 Pro or KOKiki's Aorus.

    As for the CPU, I'm biased against Intel. You should not reward Intel for building a "10" core (not really, more like 6 cores + 4 atoms) 150W CPU. To me, that indicates a crappy CPU: your machine doesn't perform properly in the 60-100W range, so you pump power into the thing so you can increase the frequency... That's never a good idea in electronics... :S Still, as far as CPU brands are concerned, it is very much a personal choice... :/



  • Moderators, Regional North West Moderators Posts: 19,117 Mod ✭✭✭✭byte
    byte


    Thanks again for your advice.

    I hadn't actually noticed that the 970 EVO was PCIe 3! So it'll be either the Aorus or 980 PRO for storage. I've also just realised that my existing 2.5" SSD is an 860 EVO 500GB, not 1TB like I had thought! It'll do for now as a secondary drive I guess, and I'll add another 1TB NVMe later. I'll still throw in one of the HDDs too, with media on it.

    The Scythe also looks the part! If I don't end up with a liquid cooler to suit LGA1700...

    @Jon Doe I was choosing the wifi option because they have Bluetooth too, which I'd find handy without the need for an external USB dongle.

    On the CPU, I thought it was 120W, not 150W!? I'm not sure it's a crappy CPU?



  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    The 12600K has a 125W TDP but can draw around 155W in normal application use. However, if you run it at stock / with thermal limits removed (not overclocked!) it shouldn't break 70°C.



  • Registered Users Posts: 246 ✭✭Jon Doe


    I was choosing the wifi option because they have Bluetooth too, which I'd find handy without the need for an external USB dongle.

    As I always say, the OP knows best what his/her needs (and budget!) are. If you need both, will make frequent use of them, and it's within your budget, go for it.

    On the CPU, I thought it was 120W, not 150W!? I'm not sure it's a crappy CPU?

    Maybe the CPU itself - the architecture - is not crappy. But Intel's fab process is absolute crap and they know it. It's the reason why they feel the need to bolt atom cores onto their desktop CPUs - AMD has 6-core CPUs? We have 10-core CPUs!! They have been having process problems since 2018~19. That, combined with Ryzen, is the reason why AMD has been gnawing market share from Intel. Intel itself says 150W:

    https://www.intel.com/content/www/us/en/products/sku/134589/intel-core-i512600k-processor-20m-cache-up-to-4-90-ghz/specifications.html

    If you really want to go with Intel and don't want to do overclocking - no need for a K CPU - then pick this one:

    65W base power, 117W maximum turbo power, and no atom-core foolishness.



  • Moderators, Regional North West Moderators Posts: 19,117 Mod ✭✭✭✭byte
    byte


    Hmm, you have made me reconsider again!

    150W seems a big jump over what the Ryzen 5 draws, for what are marginal performance differences. I also didn't realise that the extra 4 cores are basically there to pad the core count.

    I'm now thinking of going back to a Ryzen 5 as originally considered, with a B550 board, and putting the savings towards an additional 1TB M.2 drive.



  • Registered Users Posts: 246 ✭✭Jon Doe


    I'm now thinking of going back to a Ryzen 5 as originally considered, with a B550 board, and putting the savings towards an additional 1TB M.2 drive.

    Or maybe you should "invest" the difference elsewhere?

    geizhals.eu/kfa-geforce-rtx-3080-ti-sg-1-click-oc-38iom5md99dk-a2539212.html

    😋

    Just a final FYI: in my book a €1,800~1,900 desktop is not a mid-range gaming rig! Not even by a long shot! 😀



  • Moderators, Regional North West Moderators Posts: 19,117 Mod ✭✭✭✭byte
    byte


    Indeed, you are right. If I hadn't won an iPhone 13 Pro in a raffle, I'd not be at these figures right now!

    I personally don't think I need to go beyond a basic RTX 3080 as a casual gamer, so I think I'd be better off with more storage.

    I do intend to replace this AOC 27" 1080p monitor with something more decent a few months down the line too, hopefully...



  • Registered Users Posts: 602 ✭✭✭Aodhan5000


    The little cores are proven to be very useful in certain multithreaded applications. Not that everything works like Cinebench but, if you look at the Cinebench scores, the little cores are rock solid. The i5 12600K with its 6P+4E configuration is really good in multithreaded and single-threaded workloads (see the Hardware Unboxed video).

    With regard to power draw, yes, Intel is less efficient, but the difference in power draw while gaming is really not that significant, and gaming seems to be the use case for this build.

    But if the AMD system works out cheaper for you, and you would rather put that money to something more important, go for it.

    Source:

    https://youtu.be/LzhwVLUVork



  • Registered Users Posts: 246 ✭✭Jon Doe


    The little cores are proven to be very useful

    You'll keep on thinking like that until Thread Director makes one too many bad decisions: "Oh? You need to read a mesh from disk and load it into the graphics card's memory so that you can render the next frame? Don't worry: I'll get an atom core on it right away! Well... right away-ish..." Once you get tired of all the hiccups you'll just go into the BIOS settings and disable the economy cores so as to economize your patience... :P

    power draw while gaming is really not that significant

    Not significant? In your opinion, what other common activity demands more of the CPU? It is my understanding that gaming is the one thing that has been driving the x86 industry for the past few decades - and that implies increased power consumption. The original DOOM alone sold more 486s than any spreadsheet software you can think of.

    Concerning your source, I wouldn't put much stock in a guy who, on 6 Nov 2021, was recommending, for buyers on a budget, a CPU that was put on the market in June 2020. It may be the case that someone somewhere had a load of 10400Fs beached in some warehouse and spent some marketing dollars to help move that stock. Always keep in mind that your eyes can't see above 25~30 fps. I can't be sure, but I believe that if you can sense that a screen is delivering above 60fps (for example, you need more than 60fps to avoid getting headaches), you are a very rare exception.



  • Registered Users Posts: 602 ✭✭✭Aodhan5000


    If you can give me a RELIABLE source saying that the thread director makes decisions to use "atom cores", as you call them, to render frames over a P core, causing stuttering which impacts the gaming experience in a noticeable and problematic way, send it on.

    I said the DIFFERENCE in power draw is not that significant. You're just quoting me badly to misrepresent my views. For instance, in Aida 64, the difference in power draw would be much greater as it utilizes the entire CPU to a much greater degree than gaming. I don't really understand the rest of what you're saying there, it's not very coherent.

    The 10400f was a very solid budget CPU. Can't say as to whether or not he was recommending it at the end of 2021 but maybe your idea of budget is different to his.

    And as a generality, the human eye can see much more than 30fps. For me there is a clear difference between 60Hz and 144Hz even.

    Now I don't know if you're trolling or what, but stop the whole craic of misinforming people.



  • Registered Users Posts: 76 ✭✭ericfartman


    What in the **** are you talking about? Let's all go back to consoles and 30FPS gaming.



  • Registered Users Posts: 246 ✭✭Jon Doe


    I'm talking about this: https://www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/

    I'm talking about not spending more money than you need to. For me, a constant rate of 40/50 fps or more is enough. I can't tell the difference beyond that.

    I never mentioned consoles. If you check my previous posts you will learn that I don't like consoles due to vendor lock in. So... what the **** are you talking about?



  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    I accidentally set Tomb Raider to 30fps the other day & could tell instantly that it controlled & looked wrong.



  • Registered Users Posts: 246 ✭✭Jon Doe


    At 30 fps I probably would too. Have you tried progressively increasing the frame cap to determine when you cease to notice the difference?



  • Registered Users Posts: 18,706 ✭✭✭✭K.O.Kiki


    No, absolutely no point. Just run games at 60fps if they're graphically intensive, 120fps if they're not (e.g. competitive shooters). I can tell the difference between 33ms (30fps) and 16.7ms (60fps) frame times while playing. 16.7ms -> 8.3ms (120fps) is slightly harder, but it is there. Running games at 40fps would still be bad (25ms).
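    Those frame-time figures are just 1000ms divided by the frame rate; a quick sketch of the arithmetic (frame rates picked purely for illustration):

    def frame_time_ms(fps):
        # Milliseconds per frame at a given frame rate
        return 1000.0 / fps

    for fps in (30, 40, 60, 120):
        print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms")
    # 30 fps -> 33.3 ms, 40 fps -> 25.0 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms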



  • Registered Users Posts: 246 ✭✭Jon Doe


    If you can give me a RELIABLE source saying that the thread director makes decisions to use "atom cores", as you call them, to render frames over a P core, causing stuttering which impacts the gaming experience in a noticeable and problematic way, send it on.

    The thread debugger, experience, and a bit of common sense. If even the branch predictor is capable of making a misprediction, I can only guess at the problems the Thread Director will introduce. All for the sake of having a big-little architecture. On a desktop CPU... Jeezus!

    It wasn't my idea to call them atom cores, but if it quacks like a duck, walks like a duck and flies like a duck, what else could it be?

    I don't think it's likely for a 'little' core to be placed in charge of rendering frames, but it is conceivable that it could be put in charge of tasks that are critical to frame rendering, especially if those tasks are related to I/O.

    the difference in power draw while gaming is really not that significant

    The 5600X is a 65W CPU. The 12600K is a 150W CPU. That's an 85W difference. Gaming is the activity that is most likely to take your CPU to task. Sorry if I offended you, that wasn't the intention. I just think that spending €X more and Y more watts to get 90 fps instead of 60 fps is... uninspired? It just seems like a strategy very reminiscent of 'brute force', all for the sake of what can only be described as a Pyrrhic victory - sure, you got a 50% difference, but you don't notice it that much. :P And you burned a hole in your wallet...

    For instance, in Aida 64

    I don't know what you do for a living, but I don't use Aida 64. I measure performance in transactions per second and power consumption in €/year. And people who go to the market and pick a 150W part when there's a 65W part that delivers 90% of the performance have a tendency to get fired.
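    As a rough illustration of that €/year framing (the hours, load and unit price below are assumptions for the sake of the example, not measurements):

    # Hypothetical annual running-cost difference for an 85W gap at full CPU load
    extra_watts = 85          # 150W part vs 65W part
    hours_per_day = 4         # assumed hours of heavy use per day
    price_per_kwh = 0.30      # assumed electricity price, in euro per kWh
    extra_kwh = extra_watts / 1000 * hours_per_day * 365
    print(round(extra_kwh * price_per_kwh, 2))  # ~37.23 euro per year under these assumptions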

    I don't really understand the rest of what you're saying there, it's not very coherent.

    If you're interested you can ask for details on what bits you didn't understand, and I'll be more than happy to answer... :)

    The 10400f was a very solid budget CPU. Can't say as to whether or not he was recommending it at the end of 2021 but maybe your idea of budget is different to his.

    That is possible. Then again, there is also this:

    And as a generality, the human eye can see much more than 30fps. For me there is a clear difference between 60Hz and 144Hz even.

    Wow, if true that is impressive! Do you get headaches when the fps drops below 60?

    Now I don't know if you're trolling or what, but stop the whole craic of misinforming people.

    If by trolling you mean that I'm deliberately trying to get people in a foul mood, then the answer is no. It seems to me that your definition of "misinforming people" is me giving my opinion - an opinion that you happen not to agree with. Can't we just agree to disagree instead?



  • Registered Users Posts: 246 ✭✭Jon Doe


    You can tell the difference between 60 and 120fps? :O I wish my eyes were half as good as yours... :P Your eyes are close to those of a bird of prey!

    https://www.healthline.com/health/human-eye-fps#can-we-test-fps-vision



  • Registered Users Posts: 602 ✭✭✭Aodhan5000


    As I asked already, if you can give me a RELIABLE source showing that E cores cause PROBLEMATIC frame drops, then we can have a discussion there. I have seen some of the Alder Lake chips having weaker 1% lows, possibly as a result of the E cores, I don't know. How problematic that is, I also don't know. Simple solution: buy a 12400F instead of a 12600K if you're going Intel.

    Do not compare the power ratings of the chips; compare their real-world power consumption in the relevant use case, which here is gaming.

    Source once again: https://youtu.be/LzhwVLUVork

    I just used Aida 64 as an example of when the power draw difference would be much more significant than in gaming.

    At a glance, can't see anything relevant in that Reddit post. Be more specific by quoting it please.

    The majority of your sources for "seeing" (or whatever way you want to put it) more than 30/60 frames don't really give a concrete answer, frequently saying that seeing and perceiving are two different things. I'm not even going to try to differentiate the two.

    Instead, here's an interesting video showing the benefits not only of a higher-refresh monitor, but also of having more frames than your monitor is capable of showing. That can also have downsides, such as screen tearing, which I'm not really going to get into because it's off topic.

    Source: https://youtu.be/OX31kZbAXsA



  • Registered Users Posts: 4,400 ✭✭✭Homelander


    Weird argument. There is a monumental difference between 60fps and 90-120fps. When you're playing games you're not just "seeing", you're also providing input and feeling the response.

    Play on a 144Hz monitor for a few hours and go back to 60Hz and it feels radically different. I find it very hard to play at 60Hz now, so much so that I bought a new laptop with a 120Hz screen for when I'm at a friend's house.

    I may not "see" the difference but I most definitely feel it. I have a 240Hz primary monitor, but I don't really notice any difference between 120Hz and 240Hz.



  • Registered Users Posts: 246 ✭✭Jon Doe


    @Aodhan5000 all I'm saying is that the atom cores introduce needless complexity and uncertainty - in essence, headaches. This uncertainty is something that you don't seem to mind, but I certainly do. Take this user, for example:

    It seems he's been getting some inconsistent results - values that he can't explain. :( I'd rather not have this kind of thing on my worries list.

    @Homelander that really depends on each person. I, for example, can't tell the difference above 40/50 fps... If the study mentioned in the Healthline article above is to be trusted, some people can spot a specific image in the midst of a sequence of images running at 77fps (13ms). It depends on the viewer. A high frame rate, more than something that we all want, is something that some of us may need.


