
Is this the last of consoles as we know it?

13 Comments

  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    Retr0gamer wrote: »
    I heard it's only 4 cores with hyperthreading so it's 8 virtual cores. Whatever that means. Could be BS as well.

    So it's only hyperthreading then? Well, that is in a way good news too. Though again, marketing bull****.

    If they really do start using HT, then CPUs on PC won't be bottlenecked anymore. At the moment all you need for a gaming PC is an i5 2500k/3570k; that's sort of a ceiling right now. You can pay extra hundreds of euro for a better CPU, but the gains are so little it's even comical. If they mainstream HT on CPUs thanks to consoles, we could see the i7 with HT back in business, giving even more boost and power.

    As for new GPUs: last time I read, both sides had cancelled their GPU lines for this year. Do you have a link where they state that they will go for new GPUs? If they will, then happy days.


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,086 Mod ✭✭✭✭AlmightyCushion


    nesf wrote: »
    Sure. On Graphics power we have 1.7 TFLOPS (ps4) against 3.7 TFLOPS for a top end ATI Radeon card (the ps4 GPU is Radeon based so we can't use nVidia cards for comparison). Less than half the raw power* of a top end current card (never mind 2014 card). Optimisation will shrink that gap by quite a bit but close it? I sincerely doubt that. My current GPU from the previous generation is around 2.7 TFLOPS for comparison.


    *There is more to graphics cards than TFLOPS but it's useful as a rough guide when looking within a family of cards, since Sony is using a Radeon based chip this is a reasonable comparison to draw.


    Now, you don't need a top end card from the current generation to have pretty looking games. So I'm not saying that the ps4 graphics will look crap, I imagine that they will look very nice and be more than acceptable for people, especially those who've never gamed on a powerful PC which, let's face it, is most people. But close the gap with PCs? i.e. reach the potential of a high end PC in a year's time when even last generation high end cards are more powerful? I don't think so.

    You can't compare a console to a system running a 7970 or similar card. The PS4 will probably retail for a little more than the 7970 does. Sony would be daft to even try to get similar specs to a bleeding edge gaming PC, they'd price themselves out of the market.

    The 78XX series is a great card for 1080p performance; I'd only get a 79XX series card if I was gaming at 1440p or 1600p. Sony made the right choice: the 78XX is the sweet spot between price and performance for what they are looking for. I'd question their choice of processor more than the graphics card. Jaguar is aimed at tablets and netbooks, and I worry it may struggle under a heavy workload. Hopefully Sony know what they're doing.
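The TFLOPS figures quoted above come from a simple peak-throughput formula: shader cores × clock × 2 FP32 operations per cycle (a fused multiply-add). A back-of-envelope sketch using the publicly reported specs of the era (1152 shaders at ~800 MHz for the PS4 GPU, 2048 at 925 MHz for the HD 7970; treat the exact numbers as assumptions):

```python
# Peak single-precision throughput: cores x clock (GHz) x 2 ops/cycle (FMA).
# This is a theoretical ceiling, not sustained in-game performance.
def peak_gflops(shader_cores, clock_ghz):
    return shader_cores * clock_ghz * 2

ps4_gpu = peak_gflops(1152, 0.8)      # ~1843 GFLOPS, i.e. ~1.8 TFLOPS
hd7970 = peak_gflops(2048, 0.925)     # ~3789 GFLOPS, i.e. ~3.8 TFLOPS
print(hd7970 / ps4_gpu)               # ~2.1x: the "less than half" gap above
```

As the footnote in the quoted post says, this is only a rough guide, but it holds up reasonably well within a single family of Radeon cards.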


  • Moderators, Science, Health & Environment Moderators Posts: 10,079 Mod ✭✭✭✭marco_polo


    It's not hyperthreading; it's two Jaguar modules with four physical cores apiece.

    AMD will release new cards by year's end, but it has nothing to do with consoles and everything to do with the fact that they are making truckloads of cash on current sales of 7000 series GPUs.

    http://www.pcworld.com/article/2028408/amd-to-release-new-radeon-hd-8000-graphics-cards-in-2013.html

    I think the PS4 looks pretty OK hardware-wise for the probable price; there's not much more Sony could fit in there if they are looking to hit a 100W-ish power mark as well.
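The distinction being drawn here can be put in simple arithmetic: hyperthreading (SMT) multiplies the OS-visible logical core count, while the Jaguar design simply has eight physical cores. A tiny illustrative sketch (the function name is made up for illustration):

```python
# Logical (OS-visible) cores = physical cores x SMT threads per core.
def logical_cores(physical_cores, smt_ways):
    return physical_cores * smt_ways

# The rumour: 4 physical cores with hyperthreading (2-way SMT).
rumoured = logical_cores(4, 2)       # 8 logical cores
# The PS4: two Jaguar modules x 4 physical cores each, no SMT.
actual = logical_cores(2 * 4, 1)     # 8 physical (= logical) cores
print(rumoured, actual)              # both read as "8 cores" to software
```

Both layouts report eight cores to the OS, which is presumably why the rumour was easy to confuse with the real spec.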


  • Registered Users, Registered Users 2 Posts: 7,410 ✭✭✭Icyseanfitz


    marco_polo wrote: »
    It's not hyperthreading; it's two Jaguar modules with four physical cores apiece.

    AMD will release new cards by year's end, but it has nothing to do with consoles and everything to do with the fact that they are making truckloads of cash on current sales of 7000 series GPUs.

    http://www.pcworld.com/article/2028408/amd-to-release-new-radeon-hd-8000-graphics-cards-in-2013.html

    I think the PS4 looks pretty OK hardware-wise for the probable price; there's not much more Sony could fit in there if they are looking to hit a 100W-ish power mark as well.

    Thanks for that, ShadowHearth had me paranoid there for a bit lol. Nice to see they are still releasing in 2013.


  • Registered Users, Registered Users 2 Posts: 7,873 ✭✭✭Grumpypants


    Specs are pretty much redundant for the next few years. Developers are going to have the added cost of developing on two brand new consoles.

    With new PC graphics cards down the line there will be a boost in graphical power, but there are only a few developers who will be interested in pushing PCs beyond what the consoles can do.

    Look at Watch Dogs, for example: it is currently being developed for PS3, Xbox 360, PC, PS4 and Xbox 720 or whatever it is going to be called. Think of the cost and time that takes.

    It makes sense to have the next gen and PC games developed in tandem. And that's Ubisoft, who have the money to spend; most of the others outside those attached to the big publishers simply don't have the money.

    Add in that the devs and the big publishers need the consoles to sell. As they are the main revenue stream, they will not want them to appear outdated before they are even launched.

    The first year after release the consoles will be a testing ground, then once developers settle down you will see them start to push the consoles in the same way Naughty dog pushed the PS3.

    Factor in everything: the huge user base on PS3 and Xbox that will still sell the bulk of games for the next few years; the huge cost of developing for the next gen for a pretty small player base for the first year or two at least; and then not wanting to deflate interest in the console platforms.

    With all things considered I can't see much between the games on next gen or PC for at least 3 years.


  • Registered Users, Registered Users 2 Posts: 7,410 ✭✭✭Icyseanfitz


    You do know the PC has exclusives, yeah? Also, the dev cost on these new consoles (PS4 anyway) should be the same if not lower than it is now, as it's an x86 system.


  • Registered Users, Registered Users 2 Posts: 7,873 ✭✭✭Grumpypants


    You do know the PC has exclusives, yeah? Also, the dev cost on these new consoles (PS4 anyway) should be the same if not lower than it is now, as it's an x86 system.

    It is hard to compare PC exclusives to their console counterpart though :P

    It is an additional cost, not an instead-of cost. They are developing on three platforms now; next year that will be five platforms.


  • Moderators, Category Moderators, Computer Games Moderators Posts: 51,560 CMod ✭✭✭✭Retr0gamer


    Might be happening a little bit sooner since the architecture is well understood.

    I'd say it will be pretty similar to last gen, where you see them max out the hardware pretty quickly before coming up with a bunch of technologies to fudge certain processes, just like this gen with stuff like MLAA, FXAA and quarter-resolution HDR effects. I don't think we'll ever see every game running at 60FPS either. The amount of horsepower you get back from dropping to 30FPS is too tempting, and in this industry, if your game looks worse than a similar game but runs faster, people will only notice how bad the game looks.

    Of course those 'fudging' technologies aren't a bad thing; many of them are being used by PC games for functions that used to be brute-forced. I love how I can get AA from MLAA or FXAA with nearly no performance overhead, where before I'd be playing everything with AA turned off.
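The 30-vs-60 FPS trade-off mentioned above is just frame-budget arithmetic: halving the frame rate doubles the time the renderer gets for each frame.

```python
# Milliseconds of work a renderer gets per frame at a given frame rate.
def frame_budget_ms(fps):
    return 1000.0 / fps

print(round(frame_budget_ms(60), 1))  # 16.7 ms per frame
print(round(frame_budget_ms(30), 1))  # 33.3 ms: twice the room for effects
```

That extra ~16.7 ms per frame is the "horsepower" being traded away when a game targets 30FPS for better visuals.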


  • Registered Users, Registered Users 2 Posts: 27,645 ✭✭✭✭nesf


    You can't compare a console to a system running a 7970 or similar card. The PS4 will probably retail for a little more than the 7970 does. Sony would be daft to even try to get similar specs to a bleeding edge gaming PC, they'd price themselves out of the market.

    The 78XX series is a great card for 1080p performance; I'd only get a 79XX series card if I was gaming at 1440p or 1600p. Sony made the right choice: the 78XX is the sweet spot between price and performance for what they are looking for. I'd question their choice of processor more than the graphics card. Jaguar is aimed at tablets and netbooks, and I worry it may struggle under a heavy workload. Hopefully Sony know what they're doing.

    I don't disagree at all. If Sony went for 79xx level tech or better they'd be morons. My point is this thing about closing the gap with PCs completely doesn't hold up to even cursory scrutiny mainly because the needs are so different. Rephrase it as standing up to the average (read: 1-3 year old) 1080p gaming PC now and I might agree.

    Jaguar. Well. Maybe developers can extract a lot out of 8 low-powered cores when they know that's what they're designing for. I think it's most likely a heat thing here. I don't think a CPU designed for desktops, when pushed hard, would work very nicely in the kinds of cases expected from consoles now that they're expected to fit into the whole home entertainment set-up.


  • Moderators, Science, Health & Environment Moderators Posts: 10,079 Mod ✭✭✭✭marco_polo


    You can't compare a console to a system running a 7970 or similar card. The PS4 will probably retail for a little more than the 7970 does. Sony would be daft to even try to get similar specs to a bleeding edge gaming PC, they'd price themselves out of the market.

    The 78XX series are a great card for 1080P performance. I'd only get a 79XX series card if I was gaming at 1440P or 1600P. Sony made the right choice. The 78XX are a sweet spot between price and performance for what they are looking for. I'd question their choice of processor more than the graphics card. Jaguar is aimed at tablets and netbooks, I worry it may struggle under heavy workload. Hopefully, Sony know what they're doing.

    I wouldn't think the CPU will be a major problem, as one of the advantages of an HSA architecture is that the GPU cores can now be utilised for much more than just graphics and basic physics.

    It even opens up a few completely new possibilities, like having GPU physics effects influence gameplay; the example given in the presentation is smoke particles impairing AI vision.

    Interesting if slightly technical AMD presentation on the subject here.

    http://www.slideshare.net/zlatan4177/gpgpu-algorithms-in-games

    EDIT: The fact that Intel will have an obvious vested interest in not letting the AMD-driven HSA architecture gain any traction in mainstream desktop software development could actually mean that the PS4 remains a relatively decent machine for a good while yet, as developers will have free rein to do whatever they want with all available computing resources.
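The smoke-impairing-AI-vision example can be sketched as a line-of-sight test against a particle density field. Everything below is a hypothetical illustration, not anything from the AMD presentation; the point of HSA is that the density field could be GPU-resident data the CPU-side AI reads without an expensive copy.

```python
# Hypothetical sketch: AI visibility check through a smoke density field.
def can_see(density, path_cells, threshold=1.0):
    """Accumulate smoke density along the sight line; the target counts
    as hidden once accumulated density reaches the threshold."""
    accumulated = 0.0
    for cell in path_cells:
        accumulated += density.get(cell, 0.0)
        if accumulated >= threshold:
            return False
    return True

smoke = {(2, 0): 0.6, (3, 0): 0.6}                       # small smoke cloud
print(can_see(smoke, [(0, 1), (1, 1), (2, 1)]))          # True: clear line
print(can_see(smoke, [(0, 0), (1, 0), (2, 0), (3, 0)]))  # False: obscured
```

In a real engine the per-cell densities would come from the GPU particle simulation each frame; the cell coordinates and threshold here are made up.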


  • Registered Users, Registered Users 2 Posts: 7,410 ✭✭✭Icyseanfitz


    It is hard to compare PC exclusives to their console counterpart though :P

    It is an additional cost, not an instead-of cost. They are developing on three platforms now; next year that will be five platforms.

    why?

    As for having 5 platforms at once, that will change after release; no one's going to keep making new games for old consoles (except the odd EA/Activision game).

    Most games are made on PCs to begin with, which are x86 systems, so it will take a lot less in resources (funding) to port a game over to the PS4 than it would have with, say, the PS3's Cell architecture. This is why devs are so vocally positive about the PS4.


  • Registered Users, Registered Users 2 Posts: 12,546 ✭✭✭✭Varik


    As for having 5 platforms at once, that will change after release; no one's going to keep making new games for old consoles (except the odd EA/Activision game).

    Of course it'll change after release, but not for years. There are 140 million 360s & PS3s out there; publishers aren't going to abandon that market any time soon, and it'll be more than just EA & Activision.
    It is hard to compare PC exclusives to their console counterpart though :P
    why?

    Because anyone can make a PC exclusive, I've made a grand total of 3 games myself and 2 of them sucked.


  • Registered Users, Registered Users 2 Posts: 7,410 ✭✭✭Icyseanfitz


    Varik wrote: »
    Of course it'll change after release, but not for years. There are 140 million 360s & PS3s out there; publishers aren't going to abandon that market any time soon, and it'll be more than just EA & Activision.





    Because anyone can make a PC exclusive, I've made a grand total of 3 games myself and 2 of them sucked.

    In my opinion that's one of the great strengths of PCs; I don't understand how having an open system diminishes the value of its exclusives :confused:

    And I really don't think devs will be spending big chunks of money and time creating games for the PS3 + 360 three years after the PS4 + 720 are released.


  • Registered Users, Registered Users 2 Posts: 4,640 ✭✭✭Pushtrak


    Varik wrote: »
    Of course it'll change after release, but not for years. There are 140 million 360s & PS3s out there; publishers aren't going to abandon that market any time soon, and it'll be more than just EA & Activision.
    I wouldn't say years. This time next year is quite likely, some might say probable. I can't see it being even a year and a half, and I mean a year and a half from now, not necessarily from the date they get released. Though if we were to use that metric, I'd be even less likely to think years.

    Why am I saying this? Well, mostly because of the release schedule of games for the 360/PS3. A lot of games that would otherwise have sold in the final months of the year are being released earlier, as if publishers fear that people will move on to the next gen and not buy them this generation. Of course, a case could be made that it's a cynical attempt to get people to buy now and buy again later, which could be true.


  • Registered Users, Registered Users 2 Posts: 12,546 ✭✭✭✭Varik


    In my opinion that's one of the great strengths of PCs; I don't understand how having an open system diminishes the value of its exclusives :confused:

    Never said it was a bad thing, but comparing exclusives on the number of games alone is stupid.
    And I really don't think devs will be spending big chunks of money and time creating games for the PS3 + 360 three years after the PS4 + 720 are released.


    2 years and most support for the current gen will drop off. Games take a long while to develop and it's not like a developer can take many chances. Maybe 1-2 years after launch they'll have a good idea of potential sales and be able to focus on next gen, but it's hard to justify starting development on a game now for a console that hasn't released yet or has only been out a short while.


  • Registered Users, Registered Users 2 Posts: 4,640 ✭✭✭Pushtrak


    Varik wrote: »
    2 years and most support for the current gen will drop off. Games take a long while to develop and it's not like a developer can take many chances. Maybe 1-2 years after launch they'll have a good idea of potential sales and be able to focus on next gen, but it's hard to justify starting development on a game now for a console that hasn't released yet or has only been out a short while.
    That has always been a concern, and yet every gen we don't see significant issues. I can't remember when cross-gen games for the Xbox, PS2 and GameCube that were also made for the 360 and PS3 stopped appearing. Did it even get as far as summer '06? The timetable of releases this year shows they want to get games out in the tail end of this generation; they don't want to release at the end of the year when the new systems are out.


  • Registered Users Posts: 1,837 ✭✭✭intellectual dosser


    I had a conversation about this with someone last week; my line was that in five years people won't ask "What console do you use?", they will ask "What controller do you use?".

    I can see everything going PC-based. I can imagine Sony or Nintendo storing an engine in the cloud with a monthly subscription, or something similar to Steam. Why bother having a physical console that only limits what you can do?


  • Registered Users Posts: 6,921 ✭✭✭ebbsy


    Azza wrote: »
    Arcade needs to make a comeback...I have got a mad hankering to spend lots of money 50 cent at a time!

    +1

    Ikari Warriors, Outrun, I remember the first time I saw Gauntlet.........


  • Registered Users Posts: 6,921 ✭✭✭ebbsy


    I had a conversation about this with someone last week; my line was that in five years people won't ask "What console do you use?", they will ask "What controller do you use?".

    I can see everything going PC-based. I can imagine Sony or Nintendo storing an engine in the cloud with a monthly subscription, or something similar to Steam. Why bother having a physical console that only limits what you can do?

    Would parents still want to buy something physical that they can see for their children though ?


  • Registered Users Posts: 1,837 ✭✭✭intellectual dosser


    ebbsy wrote: »
    Would parents still want to buy something physical that they can see for their children though ?

    Hmmm... I am no longer a child, nor do I have children, so I didn't think of this.

    This way might save parents having to buy both a PC and a console... my folks hadn't a notion of controlling what I played on my Mega Drive, so it might be better for them all round.


  • Moderators, Social & Fun Moderators Posts: 28,633 Mod ✭✭✭✭Shiminay


    Parents are already getting used to the idea of buying kids "points cards" as gifts though - be it Xbox Live, MS Points, iTunes, Steam, etc. - so that may be less of a barrier than you might think, but it's worth keeping in mind nonetheless.

    Good devs will squeeze a LOT out of seemingly limited architecture when they know that they don't have to spend time allowing for variables. The XBox 360 still produces decent looking games, regardless of how it compares with a modern PC, so given a bit of time to streamline their internal processes, the devs will have us wondering why we were ever worried about them having "only" chosen a 78XX card.

    What I like about this is that the sharing of internal resources may lead to a couple of breakthroughs for all computing in terms of multi-task processors and general-purpose RAM. Tablets with ARM chips are pointing that way too (although you're starting to see several of them introduce a separate graphics chip as well). Imagine a computer where all you have is processor cores and RAM, and it decides how much is dedicated to physics, game engine, sound etc. Want to upgrade? Just plug in another CPU chip or some more RAM. "Minimum Requirements: 8 Cores and 8GB of RAM - Recommended Requirements: 16 Cores and 16GB of RAM" and that's that for a specs list. It wouldn't get any simpler than that if you made upgrading modular and idiot-proof.


  • Moderators, Category Moderators, Computer Games Moderators Posts: 51,560 CMod ✭✭✭✭Retr0gamer


    Shiminay wrote: »
    What I like about this is that the sharing of internal resources may lead to a couple of breakthroughs for all computing in terms of multi-task processors and general-purpose RAM. Tablets with ARM chips are pointing that way too (although you're starting to see several of them introduce a separate graphics chip as well). Imagine a computer where all you have is processor cores and RAM, and it decides how much is dedicated to physics, game engine, sound etc. Want to upgrade? Just plug in another CPU chip or some more RAM. "Minimum Requirements: 8 Cores and 8GB of RAM - Recommended Requirements: 16 Cores and 16GB of RAM" and that's that for a specs list. It wouldn't get any simpler than that if you made upgrading modular and idiot-proof.

    Not sure that would work. As far as I know, the silicon on main processors is optimised to be a jack of all trades, whereas the silicon on GPUs is built specifically to handle the 3D calculations and processes it is designed for, which is the advantage of using a dedicated GPU.


  • Registered Users, Registered Users 2 Posts: 27,645 ✭✭✭✭nesf


    ebbsy wrote: »
    Would parents still want to buy something physical that they can see for their children though ?

    I don't know about older kids (my oldest is six and has always lived in a world with virtual products) and kids with less geeky parents bringing them up but my kids have been happy with virtual apps and games since they were old enough to know what they are. They obviously like physical gifts too but when it comes to computer games they don't care whether it's a physical box and disc or something downloaded. Now other stuff like Lego, where there's virtual builder sets, they really prefer the real thing. They'll mess around with little brick building apps on the iPad but they only sink the real hours into the traditional box of mixed Lego that they have going.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    nesf wrote: »
    I don't know about older kids (my oldest is six and has always lived in a world with virtual products) and kids with less geeky parents bringing them up but my kids have been happy with virtual apps and games since they were old enough to know what they are. They obviously like physical gifts too but when it comes to computer games they don't care whether it's a physical box and disc or something downloaded. Now other stuff like Lego, where there's virtual builder sets, they really prefer the real thing. They'll mess around with little brick building apps on the iPad but they only sink the real hours into the traditional box of mixed Lego that they have going.

    You don't need to be a kid to sink hours in to lego... :D


  • Registered Users Posts: 12,775 ✭✭✭✭Gbear


    Retr0gamer wrote: »
    Not sure that would work. As far as I know, the silicon on main processors is optimised to be a jack of all trades, whereas the silicon on GPUs is built specifically to handle the 3D calculations and processes it is designed for, which is the advantage of using a dedicated GPU.

    I would imagine that that specialisation is only necessary up to a point.

    I think we're already at a stage where the upper echelons of GPU power are redundant except for some tiny minority of tasks, and certainly from a gaming point of view.
    Games were being released 5-10 years ago that couldn't actually be run at full settings on any commercially available hardware; I think the likes of Quake 4, Doom 3 or, more recently, Crysis were examples. That's no longer the case by a long shot.
    Things like the Titan can probably run Crysis 3 on 3x1600p in Eyefinity.

    Although a dedicated GPU will likely always give better performance, an amalgamated chip will pretty soon be able to run anything around at a decent resolution and framerate.


  • Registered Users, Registered Users 2 Posts: 8,824 ✭✭✭ShooterSF


    theidiots wrote: »
    It's about time that consoles let customers update the hardware; really, just release a new graphics card a year or two down the line.

    Oh god no. Console gaming is based around the idea that everyone has the same setup, and the devs only have to concern themselves with that setup. Fragmentation would be a horrible idea, and it is one of the reasons why things like the Kinect struggle outside of a few dedicated titles: devs aren't going to create two versions of a game for the same console depending on what hardware you have.


  • Registered Users, Registered Users 2 Posts: 27,645 ✭✭✭✭nesf


    ShooterSF wrote: »
    Oh god no. Console gaming is based around the idea that everyone has the same setup, and the devs only have to concern themselves with that setup. Fragmentation would be a horrible idea, and it is one of the reasons why things like the Kinect struggle outside of a few dedicated titles: devs aren't going to create two versions of a game for the same console depending on what hardware you have.

    This. As soon as different graphics cards or expandable RAM become available, developers won't be able to work the optimisation magic they currently use to get such good-looking games out of pretty weak hardware, in that they just wouldn't have the budgets (or time) to optimise for all the various configurations available.

    If more powerful graphics cards etc. were introduced, developers would still optimise around the base hardware set-up and only add some bells and whistles for the (almost certainly) rarer "high end" consoles. It's pretty much the same with PC games at the moment: they're optimised for dual-core processors; they can usually use, but do not need, four cores; and anything beyond four cores is almost never utilised by most types of game. It's similar with very high-end graphics cards: they only make sense if your resolution is above 1080p, because developers basically make games for the good-mid-range-card-plus-1080p-monitor population, which is a large chunk of the PC gaming market.
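The tiering described above (target the base configuration, gate extras on detected hardware) can be sketched as a simple capability check; the feature names and thresholds here are made up for illustration.

```python
# Hypothetical sketch: optimise for the base spec, enable extras on detection.
def choose_effects(cores, vram_gb):
    effects = ["base_lighting"]          # the path everyone gets, optimised
    if cores >= 4:
        effects.append("extra_physics")  # optional use of additional cores
    if vram_gb >= 2:
        effects.append("hires_textures") # bells and whistles for high end
    return effects

print(choose_effects(2, 1))  # base config: the optimised core path only
print(choose_effects(8, 3))  # high end: the same game plus the extras
```

The key point is that the base path is the one that gets the optimisation budget; the extras are additive and can be skipped without redesigning the game.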


  • Moderators, Category Moderators, Computer Games Moderators Posts: 51,560 CMod ✭✭✭✭Retr0gamer


    We've had upgradable hardware before: the Mega CD, the 32X, the N64 Expansion Pak, the TG-16 CD-ROM unit. I think history and sales numbers have shown what a stupid idea it is.


  • Registered Users, Registered Users 2 Posts: 4,640 ✭✭✭Pushtrak


    Retr0gamer wrote: »
    We've had upgradable hardware before: the Mega CD, the 32X, the N64 Expansion Pak, the TG-16 CD-ROM unit. I think history and sales numbers have shown what a stupid idea it is.
    And failed ones, like the N64DD, which didn't get a full release. Nothing else comes to mind, but I did want to post to mourn the N64DD. It was an idea with a lot of potential.

    That's actually what got Nintendo to do what they did with the Wii and not release a new controller method for the GameCube. It's a strategy that paid off for them. With the Kinect, I thought it was a bad idea to release it in the middle of a generation; they should have done a similar thing to what Nintendo did. Though I guess having people test out the device was probably ultimately worth it. Also, getting a return on investment sooner than waiting for the next gen makes sense.


  • Registered Users, Registered Users 2 Posts: 35,063 ✭✭✭✭o1s1n
    Master of the Universe


    Pushtrak wrote: »
    And failed ones, like the N64DD which didn't get a full release. Nothing else comes to mind, but I did want to post to mourn the N64DD. It was an idea with a lot of potential.

    The ones Retr0 mentioned were all failures.

    You missed the glory that was the Atari Jaguar CD attachment. You know, the add on that makes your console look like a jacks.


