
30/60fps.....why not 40fps?

  • 20-06-2013 3:29pm
    #1
    Banned (with Prison Access) Posts: 21,634 ✭✭✭✭


    Someone asked this question in the Playstation forum; would any technophiles be able to answer it?

    60fps seems to be the benchmark, but in terms of consoles....why can't they do, say, 40 or 45fps? Does it have to be either 30fps or 60fps? Anything in between?


    Also, can you explain why 60 is better than 30, or why fps is important?

    Thanks :)


Comments

  • Registered Users Posts: 12,223 ✭✭✭✭MadYaker


    Don't know the answer to the first few questions because I'm not a console gamer, but I know why fps is important. It stands for frames per second. More fps means smoother gameplay; it usually depends on the processing power of your graphics card. Better graphics card = more fps.


  • Closed Accounts Posts: 9,088 ✭✭✭SpaceTime


    Probably because it's a multiple of the North American / Japanese traditional TV refresh rate of 30 frames per second / 60 interlaced fields per second.

    In Europe it should probably be 25, 50 and 100FPS to give optimal compatibility with TVs.

    There are a lot of conventions in display technology that have nothing to do with the current systems but originate in the old analogue days and were just kept in digital systems.


  • Registered Users, Registered Users 2 Posts: 6,721 ✭✭✭Otacon


    Someone will probably come along with a far more succinct explanation later but...

    Frames per second refers to the number of still images rendered in a second. These frames, being sequentially shown, form the display of the game. The more frames each second, the more fluid and seamless the moving image appears.
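
    To put rough numbers on that: the frame budget is just one second divided by the target rate, so doubling the frame rate halves the time available to draw each frame. A quick back-of-the-envelope sketch (Python, purely illustrative):

        # Frame budget: how long the game has to finish rendering each frame.
        for fps in (30, 40, 60):
            budget_ms = 1000 / fps
            print(f"{fps} fps -> {budget_ms:.2f} ms per frame")

        # 30 fps -> 33.33 ms per frame
        # 40 fps -> 25.00 ms per frame
        # 60 fps -> 16.67 ms per frame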

    As you mention, 60FPS is the current benchmark. It is what developers aim for as it looks good. In the current generation, developers often had to sacrifice texture quality or resolution to achieve that smoothness.

    Now, I will wait for someone to come along and make me look stupid...


  • Registered Users, Registered Users 2 Posts: 7,019 ✭✭✭Wossack


    tis tied to the refresh rate of yer telly afaik - them being usually multiples of 30


  • Registered Users, Registered Users 2 Posts: 6,997 ✭✭✭circadian


    SpaceTime wrote: »
    Probably because it's a multiple of the North American / Japanese traditional TV refresh rate of 30 frames per second / 60 interlaced fields per second.

    In Europe it should probably be 25, 50 and 100FPS to give optimal compatibility with TVs.

    Was just about to post the same thing. Personally I can't see a massive difference between 30 and 60fps. Wipeout HD probably benefitted from 60fps but its main theme is speed. In a shooter or whatever I can't see the difference (UT excluded).


  • Registered Users, Registered Users 2 Posts: 2,943 ✭✭✭Burning Eclipse


    SpaceTime wrote: »
    Probably because it's a multiple of the North American / Japanese traditional TV refresh rate of 30 frames per second / 60 interlaced fields per second.

    In Europe it should probably be 25, 50 and 100FPS to give optimal compatibility with TVs.

    There are a lot of conventions in display technology that have nothing to do with the current systems but originate in the old analogue days and were just kept in digital systems.

    Listen to this guy.

    Most of the bog standard HDTVs sold in EMEA markets are 50Hz... Not sure how 60fps will work out here.

    So, 60 is better for smoothness of image, and importantly for input responsiveness as well! Input shouldn't be underestimated.


  • Closed Accounts Posts: 9,088 ✭✭✭SpaceTime


    The geeky explanation:

    The refresh rate of TVs was originally tied to the mains frequency of the power system.
    The TV tube's scanning gun was literally synchronised to the AC power, which was assumed to be (and is, in Ireland, Europe, the US etc.) very accurate. It was the handiest reference that allowed every TV in a given area to be synchronised.

    Old electric clocks used to use the power system frequency to control the timing of each tick too. It wasn't until quartz oscillators and various electronics came about that they were able to avoid doing that.

    In European countries (and the majority of the world, which followed European standards) it's 50Hz (50 cycles per second) of 'metric power', i.e. 100 peaks and troughs on the waveform per second. So TVs refreshed at 50Hz.

    In North America, half of Japan and a few other places it's 60Hz (60 cycles per second). The logic possibly was that there are 60 seconds in a minute, so why not break the second into 60 cycles too. So TVs refreshed at 60Hz.

    They picked those frequencies for totally practical reasons.
    The alternators (generators) in the power plants have to rotate at a particular speed to give you the correct frequency of power. So, you don't want the frequency set too high, or the generators have to rotate very fast, and that becomes rather impractical from a mechanical point of view, as you'd start to need gears and all sorts of stuff.

    However, they needed the frequency to be high enough that you wouldn't notice your incandescent lights flickering. So, that's how they ended up on those frequencies.

    There were some older systems that used lower frequencies like 25Hz and 16 2/3 Hz but people found the flickering lights intolerable.

    In these digital days, there's really no particular reason why we should be confining ourselves to the same refresh rates as the ESB use for generating power.
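
    (If you want the actual relationship: an alternator's speed is tied to its output frequency by rpm = 120 × frequency ÷ number of poles. A tiny illustrative sketch, in Python:)

        # Synchronous speed of an AC generator: rpm = 120 * frequency / poles
        def sync_rpm(freq_hz, poles):
            return 120 * freq_hz / poles

        for f in (50, 60):
            print(f"{f} Hz: {sync_rpm(f, 2):.0f} rpm (2-pole), {sync_rpm(f, 4):.0f} rpm (4-pole)")

        # 50 Hz: 3000 rpm (2-pole), 1500 rpm (4-pole)
        # 60 Hz: 3600 rpm (2-pole), 1800 rpm (4-pole)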


  • Registered Users, Registered Users 2 Posts: 6,721 ✭✭✭Otacon


    Otacon wrote: »
    Now, I will wait for someone to come along and make me look stupid...

    /looks up the thread

    That didn't take long.


  • Moderators Posts: 5,558 ✭✭✭Azza


    Some console games are 60fps, but they don't look as good as 30fps games.

    Console developers tend to prefer pushing graphics over performance.

    There isn't much point going for frame rates in between 30fps and 60fps if the game has enabled V-Sync.

    If you have a game running at 45fps with V-Sync on, you will get the frame rate capped at 30.

    Of course you could run the game without V-Sync and get the benefits of the higher frame rate but then your game will suffer from screen tearing.

    Higher frame rates give you smoother controls.
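
    To see why 45fps collapses to 30, here's a minimal sketch assuming simple double buffering on a 60Hz screen, where a finished frame has to wait for the next refresh (Python, illustrative only):

        import math

        REFRESH_HZ = 60
        refresh_ms = 1000 / REFRESH_HZ  # ~16.67 ms between screen refreshes

        def displayed_fps(render_ms):
            # With V-Sync on, a frame can only be swapped on a refresh boundary,
            # so each frame is held for a whole number of refresh intervals.
            intervals = math.ceil(render_ms / refresh_ms)
            return REFRESH_HZ / intervals

        for fps in (60, 45, 40, 30):
            print(f"renders at {fps} fps -> displayed at {displayed_fps(1000 / fps):.0f} fps")

        # renders at 60 fps -> displayed at 60 fps
        # renders at 45 fps -> displayed at 30 fps
        # renders at 40 fps -> displayed at 30 fps
        # renders at 30 fps -> displayed at 30 fps

    (Triple buffering relaxes this somewhat, but that steady-state cap is the idea.)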


  • Registered Users, Registered Users 2 Posts: 2,943 ✭✭✭Burning Eclipse


    And to really throw a spanner in the works... certain Blu-ray discs (and players/TVs) support 24fps, because that's what a large portion of movies are shot at!


  • Banned (with Prison Access) Posts: 21,634 ✭✭✭✭Richard Dower


    Ah!....so V-Sync is the culprit for 30/60 fps....gotcha! :cool:

    Azza wrote: »
    Some console games are 60fps, but they don't look as good as 30fps games.

    Console developers tend to prefer pushing graphics over performance.

    There isn't much point going for frame rates in between 30fps and 60fps if the game has enabled V-Sync.

    If you have a game running at 45fps with V-Sync on, you will get the frame rate capped at 30.

    Of course you could run the game without V-Sync and get the benefits of the higher frame rate but then your game will suffer from screen tearing.

    Higher frame rates give you smoother controls.


  • Moderators, Category Moderators, Computer Games Moderators Posts: 51,559 CMod ✭✭✭✭Retr0gamer


    60Hz is a standard framerate on all HD TVs. If you go to 40fps it creates technical hurdles to overcome, whereas showing a new frame every second refresh (i.e. 30fps) makes more sense.
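
    To make the 'every second frame' point concrete: 60 divides evenly by 30 but not by 40, so a 40fps game on a 60Hz panel has to alternate between holding a frame for one refresh and for two. A small sketch (my own illustration, nothing console-specific):

        import math

        # How many 60 Hz refresh ticks each game frame stays on screen.
        def hold_pattern(game_fps, refresh_hz=60, frames=8):
            last, pattern = 0, []
            for i in range(1, frames + 1):
                # Frame i is ready at time i/game_fps and flips on the next refresh.
                flip = math.ceil(i * refresh_hz / game_fps)
                pattern.append(flip - last)
                last = flip
            return pattern

        print(30, hold_pattern(30))  # [2, 2, 2, 2, 2, 2, 2, 2] -> even cadence
        print(40, hold_pattern(40))  # [2, 1, 2, 1, 2, 1, 2, 1] -> uneven cadence (judder)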


  • Registered Users Posts: 6,547 ✭✭✭Agricola


    Burning Eclipse wrote: »
    And to really throw a spanner in the works... certain Blu-ray discs (and players/TVs) support 24fps, because that's what a large portion of movies are shot at!

    Yeah, and that relatively slow frame rate originates from the early days of cinema at the start of the 20th century. I believe the first crank-operated cameras produced a 24fps output, as that was the speed dictated by the cameraman's arm! Despite all the innovations over the years, the movie industry kept it as standard, primarily because it's so much cheaper than shooting a movie at a higher frame rate.

    That may all change now that P Jackson is pioneering 48fps in the Hobbit films.


  • Registered Users Posts: 1,923 ✭✭✭Reati


    Agricola wrote: »
    That may all change now P Jackson is pioneering 48fps in the Hobbit films.

    And I think James Cameron is also trying to shoot the Avatar sequels at 48 or 60 FPS.


  • Closed Accounts Posts: 9,088 ✭✭✭SpaceTime


    Retr0gamer wrote: »
    60Hz is a standard framerate on all HD TVs. If you go to 40fps it creates technical hurdles to overcome, whereas showing a new frame every second refresh (i.e. 30fps) makes more sense.

    Actually, it's not.

    Standard *broadcast* HDTV in Europe is 50Hz and 60Hz in the US.

    The TVs can generally refresh a *LOT* faster than that, especially as you get towards higher end models. They use various software tricks to actually insert all sorts of extra frames between the broadcast frames to make the video seem a lot smoother.

    The traditional US NTSC frame rate is actually 29.97Hz, oddly enough, not 30Hz (a hack that allowed colour transmission on NTSC without interference), yet that stuck right into the digital era for US broadcasts.

    ...

    40 fps would probably cause ATSC and DVB systems to get upset, but 25, 30, 50 or 60 fps would be fine.

    40 fps is just an odd value for broadcast systems.
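
    For the curious, the exact figure comes from scaling 30Hz by 1000/1001:

        # NTSC colour frame rate: 30 Hz scaled by 1000/1001
        print(30 * 1000 / 1001)  # 29.97002997..., usually quoted as 29.97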


  • Registered Users, Registered Users 2 Posts: 3,739 ✭✭✭scamalert


    Dunno anything about TVs and the fps to do with them, but I know one thing: on PC, if your rig can't handle a game at 60fps it's almost impossible to play, as it creates lag/stutter. And I don't know how it works on consoles, but I think it comes down to console capabilities, thus giving developers exact knowledge of how much graphics and resources they can put into games; for example, they can reduce map size to increase graphics effects and vice versa.


  • Closed Accounts Posts: 9,088 ✭✭✭SpaceTime


    Bear in mind that recent higher end flat panel TVs have 700Hz+ refresh rates.
    Even the last generation of high end CRTs often had 100Hz or even 200Hz refresh rates and used software to improve the smoothness of the video.

    Technology like 'MotionPlus' on Samsung TVs basically works by using each broadcast frame as a keyframe and then drawing 'tweens' between them to smooth the animation out, adding in vastly more frames per second.

    So whether you're watching a 24fps movie or a regular DVB 50Hz broadcast, it looks a hell of a lot smoother.
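
    The 'tween' idea in a toy sketch (real sets do motion-compensated interpolation, which is far more sophisticated than this simple crossfade):

        # Blend two broadcast frames to synthesise an in-between frame.
        # t = 0.0 gives frame_a, t = 1.0 gives frame_b.
        def tween(frame_a, frame_b, t):
            return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

        frame_a = [0, 100, 200]   # three pixel values from one broadcast frame
        frame_b = [50, 150, 250]  # the same pixels one frame later
        print(tween(frame_a, frame_b, 0.5))  # [25.0, 125.0, 225.0]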

    AFAIK, you can feed in much higher frame rates over HDMI connections from a console.

    Always worth checking your TV's frame rates though. You're paying for refresh rates and expensive LED/LCD technologies that have very responsive pixels, and you're also paying for very sophisticated image processing software like MotionPlus or the Sony Bravia Engine, etc.

    Compare a high end LED TV to a bog standard CRT with a 50Hz screen and you'll see the difference!

    I'd actually say a lot of people would find a 1980s/90s era TV unwatchable, as we've become used to very steady images.


  • Registered Users, Registered Users 2 Posts: 1,143 ✭✭✭jumbobreakfast


    The new Nvidia 780 and 770s allow you to overclock your display refresh rate. They state in the blurb that 60Hz was just a hangover from the old CRT days.

    http://www.geforce.com/whats-new/articles/introducing-the-geforce-gtx-770#display-overclocking

    It doesn't explain why 30fps is a standard, but I thought someone might find it interesting. Retr0gamer's explanation made the most sense to me, but perhaps there's a better reason?


  • Moderators, Category Moderators, Computer Games Moderators Posts: 51,559 CMod ✭✭✭✭Retr0gamer


    I think Azza explains it best tbh :)

