
Effective monitor scaling

  • 19-01-2011 6:19pm
    #1
    Registered Users Posts: 1,216 ✭✭✭


    Having used CRT monitors for most of my life, I've gotten used to having resolutions where I want them. But the experience varies with LCD monitors. For example, I'd prefer running 1024 x 768 on a 17" LCD. On a 19" I might switch to 1280 if I was so inclined. This is non-widescreen, of course.

    The result is that the monitor scales to its native resolution, which on a 17" is 1280 x 1024. The end picture tends to vary in quality: on a Dell 17" a few years back the text ended up slightly fuzzy, while on a high-end NEC the text seemed just fine to me.

    This obviously applies to gamers too, who could easily be running at resolutions different from the native one. Not to mention LCD TVs, which run non-native pretty much all the time (standard TV is 576 lines).

    The question is: how do you get the best non-native experience from your monitor? The scaling hardware does seem to be getting better - my guess is this is driven heavily by the LCD TV market. But can you scale on the graphics card instead? Which is better?

    I'll also comment that on OS X, no one seems to care what the resolution is, because the desktop scales.


Comments

  • Registered Users Posts: 3,553 ✭✭✭lmimmfn


    The pixel grid is fixed on an LCD, whereas a CRT is analog and can do a variety of resolutions. Bottom line: if there are 2 pixels on the LCD and you want to light only 1.5 of them, it's impossible, and no scaling will ever fix that :)

    Non-native at lower resolutions is absolutely awful. If you've a 22" widescreen 1080p screen it does 1680 x 1050 well, simply because the pixels are a lot smaller so you don't really notice it.
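    To illustrate the "1.5 pixels" point - this is just a quick sketch in Python, nothing card- or panel-specific - map 1024 source columns onto a 1280-column panel and you hit fractional pixel boundaries straight away:

        # Sketch: spread 1024 source columns over a 1280-column panel.
        # Each source pixel gets 1.25 physical pixels - an impossible fraction.
        src_cols, panel_cols = 1024, 1280
        ratio = panel_cols / src_cols        # 1.25 panel pixels per source pixel
        for x in range(4):                   # look at the first few source columns
            print(f"source col {x} -> panel cols {x * ratio:.2f} to {(x + 1) * ratio:.2f}")
        # source col 0 -> panel cols 0.00 to 1.25
        # source col 1 -> panel cols 1.25 to 2.50
        # Those fractional boundaries are why the scaler has to blend
        # neighbouring pixels (interpolate) instead of lighting exactly one
        # panel pixel per source pixel.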


  • Registered Users Posts: 1,242 ✭✭✭Moon54


    It's always better to use the native resolution of the LCD monitor.
    Text, pictures, etc. will always look better.

    If text size, icon size, etc. are an issue (i.e. too small), they can be adjusted in the Control Panel.

    In Windows 7 it's super easy to do: just click anywhere on the desktop, hold down CTRL and use your mouse scroll wheel to size the icons to whatever size you like. Works in folders, and in your browser too!


  • Registered Users Posts: 1,216 ✭✭✭carveone


    Yes, I understand that native is better. But you don't always have that choice. I was really wondering whether there are graphics cards that do better scaling than the monitors, so you could request 1024 x 768 and the card would output 1280 x 1024, preferably over DVI.

    This has become an issue, otherwise Windows 7 wouldn't have included the ability to enlarge its icons and images (text hasn't been an issue for years really - not since TrueType in Windows 3.1). I'll admit it's a nice feature, and it seems to work a lot better than the DPI options in previous versions of Windows, which scaled the text but not the graphics or dialogs and made everything look like cack.

    Besides, as I said, PAL SD TV is 576i and LCD TVs are 1080p; there is considerable discussion on AV forums about which hardware does this best. Ditto for gamers.


  • Moderators, Science, Health & Environment Moderators Posts: 10,079 Mod ✭✭✭✭marco_polo


    carveone wrote: »
    Yes, I understand that native is better. But you don't always have that choice. I was really wondering whether there are graphics cards that do better scaling than the monitors, so you could request 1024 x 768 and the card would output 1280 x 1024, preferably over DVI.

    This has become an issue, otherwise Windows 7 wouldn't have included the ability to enlarge its icons and images (text hasn't been an issue for years really - not since TrueType in Windows 3.1). I'll admit it's a nice feature, and it seems to work a lot better than the DPI options in previous versions of Windows, which scaled the text but not the graphics or dialogs and made everything look like cack.

    Besides, as I said, PAL SD TV is 576i and LCD TVs are 1080p; there is considerable discussion on AV forums about which hardware does this best. Ditto for gamers.

    Not sure I follow; all a card can do is output the resolution requested by the OS or by the fullscreen application currently running. If you want 1024 x 768 or 1280 x 1024, then you select that resolution in the application / display manager. It is completely up to the display device to scale it properly.

    And if you scale a 16:9 or 16:10 resolution to a 4:3 one, then the display device has only three real options: huge distortion, huge chunks missing on either side, or letterboxing. Unless the aspect ratios are the same or very similar (e.g. 16:9 720p to 16:9 1080p), something has to give.
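    To put numbers on the letterbox option, here's the fit-inside arithmetic as a rough sketch (plain Python; the function name and resolutions are just my own example, not from any particular card or monitor):

        # Sketch: scale a source into a panel as far as the aspect ratio
        # allows, then pad the leftover space with black bars.
        def fit_inside(src_w, src_h, dst_w, dst_h):
            scale = min(dst_w / src_w, dst_h / src_h)    # biggest scale that still fits
            out_w, out_h = round(src_w * scale), round(src_h * scale)
            bar_x = (dst_w - out_w) // 2                 # left/right bars (pillarbox)
            bar_y = (dst_h - out_h) // 2                 # top/bottom bars (letterbox)
            return out_w, out_h, bar_x, bar_y

        # A 4:3 image on a 16:10 panel:
        print(fit_inside(1024, 768, 1680, 1050))         # (1400, 1050, 140, 0)
        # i.e. the image fills 1400 x 1050 with 140-pixel bars on each side.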

    EDIT: Well, you learn something new every day. It is not necessarily 'completely' up to the monitor: it turns out that the ATI Catalyst Control Center drivers give you a little bit of control over scaling, rather than leaving it up to the display device (I'm sure Nvidia drivers have similar features).

    http://support.amd.com/us/kbarticles/Pages/UnableToSetGPUScaling.aspx#centeredtimings


  • Registered Users Posts: 1,216 ✭✭✭carveone


    marco_polo wrote: »
    And if you scale a 16:9 or 16:10 resolutions to 4:3 ones

    Urgh. I'm not quite that bad! At most I'm upscaling, keeping the aspect ratio.

    (In fact I'm one of those people who go nuts when TVs are set to widescreen on a 4:3 input. And no one notices! How can you not notice? It's awful!)
    marco_polo wrote: »
    it turns out that the ATI Catalyst Control Center drivers give you a little bit of control over scaling, rather than leaving it up to the display device (I'm sure Nvidia drivers have similar features).

    GPU scaling. That's the term I'm looking for :)

    Technically speaking, I was thinking that monitors are only capable of dumb interpolation: filling in every 2nd or 3rd pixel with the value of a neighbour (nearest-neighbour). Better is linear interpolation, where each missing pixel gets the average colour value of its neighbours. And then there's bicubic interpolation, which would give the best quality, if the GPU would do it for you.
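    If you want to see the difference between those three for yourself, here's a quick sketch using Python's Pillow library (the filename and sizes are just examples I made up):

        # Sketch: upscale the same image with the three methods mentioned above.
        # Needs Pillow (pip install Pillow); the Resampling enum is Pillow 9.1+.
        from PIL import Image

        img = Image.open("desktop_1024x768.png")    # hypothetical screen grab
        native = (1280, 1024)                       # the panel's native resolution

        methods = {
            "nearest":  Image.Resampling.NEAREST,   # copy the closest pixel: blocky
            "bilinear": Image.Resampling.BILINEAR,  # average the 2x2 neighbours: soft
            "bicubic":  Image.Resampling.BICUBIC,   # cubic fit over 4x4: sharpest of the three
        }
        for name, method in methods.items():
            img.resize(native, resample=method).save(f"scaled_{name}.png")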


  • Registered Users Posts: 3,553 ✭✭✭lmimmfn


    carveone wrote: »

    GPU scaling. That's the term I'm looking for :)

    Technically speaking, I was thinking that monitors are only capable of dumb interpolation: filling in every 2nd or 3rd pixel with the value of a neighbour (nearest-neighbour). Better is linear interpolation, where each missing pixel gets the average colour value of its neighbours. And then there's bicubic interpolation, which would give the best quality, if the GPU would do it for you.
    Ahh, you mean upscaling. You can set this on cards, but tbh it's not great, because unless it's used with high levels of AA (on lower-res panels) it can't do much. AA is what will make games look amazingly better when running a lower resolution on a higher-res screen; of course, AA has a serious impact on framerates, depending on the GPU.

    As for movies, hmmm, you're better off setting it up in VLC or whatever to interpolate; for VLC see this (it's old but will be similar) - http://blog.thewombat.org/2008/11/how-to-use-vlc-096-as-upscaling-media.html


  • Moderators, Science, Health & Environment Moderators Posts: 10,079 Mod ✭✭✭✭marco_polo


    carveone wrote: »
    Urgh. I'm not quite that bad! At most I'm upscaling, keeping the aspect ratio.

    (In fact I'm one of those people who go nuts when TVs are set to widescreen on a 4:3 input. And no one notices! How can you not notice? It's awful!)

    Sorry, I wasn't trying to be a smartass or anything like that :) - I just wasn't sure exactly what you were looking for. I get where you're coming from now: you're just wondering if a video card will do a better job than the monitor.


    carveone wrote: »
    GPU scaling. That's the term I'm looking for :)

    Technically speaking, I was thinking that monitors are only capable of dumb interpolation: filling in every 2nd or 3rd pixel with the value of a neighbour (nearest-neighbour). Better is linear interpolation, where each missing pixel gets the average colour value of its neighbours. And then there's bicubic interpolation, which would give the best quality, if the GPU would do it for you.

    As you will have seen yourself with the 17" Dell vs the NEC, scaling quality depends largely on the quality of the monitor itself. The general consensus appears to be that GPU scaling does a much better job than cheaper monitors do, and is not discernibly different from that of a high-quality monitor.


  • Registered Users Posts: 1,216 ✭✭✭carveone


    marco_polo wrote: »
    Sorry, I wasn't trying to be a smartass or anything like that :) - I just wasn't sure exactly what you were looking for. I get where you're coming from now: you're just wondering if a video card will do a better job than the monitor.

    That's it in a nutshell. No worries - most monitors are widescreen these days. I didn't take offence :)

    So, the answer seems to be "maybe".
    marco_polo wrote: »
    As you will have seen yourself with the 17" Dell vs the NEC, scaling quality depends largely on the quality of the monitor itself. The general consensus appears to be that GPU scaling does a much better job than cheaper monitors do, and is not discernibly different from that of a high-quality monitor.

    After much googling, that's the answer I came up with too. The scaling in a monitor is done by a DSP, and the algorithm used varies - similar to choosing a scaling method in Photoshop. Cheap monitors have cheap DSPs.

    If you have a dumb DSP, use the GPU - Nvidia and ATI drivers have options to set this. If you have a good DSP, use the monitor - reviews by Prad or XbitLabs test interpolation and scaling. If you have a 700 quid CRT, I'd be inclined to use that :p And of course, the only thing better than a high-end monitor is two high-end monitors.

    As to performance on the graphics card: scaling is done by dedicated hardware, so it shouldn't cause a performance impact. AA (anti-aliasing) is a different matter though!

