
DirectX/OpenGL: is it feasible to run stream/block encryption routines on the GPU?

  • 01-08-2005 3:54am
    #1
    Closed Accounts Posts: 1,567 ✭✭✭


    The topic pretty much asks the question. I'm just wondering: are GPUs efficient at XORs, ANDs and ORs on bytes, and do you know of any links to source code?


Comments

  • Registered Users Posts: 2,426 ✭✭✭ressem


    Current GPUs are optimised for floating point (and not completely standard floating point at that), whereas encryption processing is usually integer-based.

    Apparently GPUs have been used for this in the past, with hacks to convert back and forth between integer and floating point and to recognise the errors that introduces.

    You could look at http://brook.sf.net or google 'gpgpu' (general-purpose GPU). There's source included in Brook for a few different types of app, but nothing in the encryption line.
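
    To illustrate the kind of round-trip hack I mean, here's a rough CPU-side C sketch (hypothetical, untested). A 32-bit float only has a 24-bit mantissa, so integers wider than 24 bits get rounded on the way through - spotting that rounding is what the error-recognition part is about:

        #include <stdio.h>

        /* Does the value survive being stored as a float, the way it
           would be on the GPU, and converted back again? */
        static int roundtrips_exactly(unsigned long v)
        {
            float f = (float)v;
            unsigned long back = (unsigned long)f;
            return back == v;
        }

        int main(void)
        {
            /* 2^24 - 1 fits in the mantissa: prints 1 */
            printf("%d\n", roundtrips_exactly(0xFFFFFFUL));
            /* 2^28 - 1 does not: prints 0 */
            printf("%d\n", roundtrips_exactly(0xFFFFFFFUL));
            return 0;
        }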


  • Registered Users Posts: 1,481 ✭✭✭satchmo


    There are certain API calls, like glLogicOp, that will let you perform logical bitwise operations on the framebuffer, but it would likely require a lot of reading back from the GPU, which would undermine the advantage of using it in the first place... my encryption is rusty at best, so I'm not sure exactly what would be needed. The alternative is to precompute a table of results and store it in a texture - this is the usual workaround for non-standard fragment operations, since texture lookups are blazingly fast in comparison. Also keep in mind that GPUs can't generate pseudo-random numbers, so if you need any you'll have to supply those in a texture too.
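
    Just to make the glLogicOp idea concrete, a rough sketch of what I mean (untested; assumes a GL context at least w x h pixels is already set up, each buffer holds w*h*4 bytes, and glosses over pixel formats and error checking):

        #include <GL/gl.h>

        /* XOR a block of plaintext against keystream bytes using the
           fixed-function logic op, then read the result back out. */
        void gpu_xor(const unsigned char *plaintext,
                     const unsigned char *keystream,
                     unsigned char *ciphertext, int w, int h)
        {
            /* glWindowPos2i is GL 1.4; on older contexts use
               glRasterPos2i with a suitable ortho projection */
            glWindowPos2i(0, 0);

            /* draw the plaintext into the framebuffer */
            glDrawPixels(w, h, GL_RGBA, GL_UNSIGNED_BYTE, plaintext);

            /* XOR incoming fragments against what's already there */
            glEnable(GL_COLOR_LOGIC_OP);
            glLogicOp(GL_XOR);
            glDrawPixels(w, h, GL_RGBA, GL_UNSIGNED_BYTE, keystream);
            glDisable(GL_COLOR_LOGIC_OP);

            /* the readback - this is the slow part I was on about */
            glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, ciphertext);
        }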

    You're not really going to find code you can drop straight into an app; you'll probably have to write it yourself if you do end up using the GPU. Have a look at Brook and gpgpu.org, and maybe a quick look at this paper, although it's a rework of a 2001 paper and so is fairly out of date.
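
    For the lookup-table idea I mentioned above, building a 256x256 XOR table as a texture might look something like this (again just a sketch; a fragment program would then fetch from it at coordinates derived from the two input bytes, with nearest filtering so you get exact table entries rather than blends):

        #include <GL/gl.h>

        /* Texture where the texel at row a, column b holds a XOR b. */
        GLuint make_xor_table(void)
        {
            static unsigned char table[256][256];
            GLuint tex;
            int a, b;

            for (a = 0; a < 256; a++)
                for (b = 0; b < 256; b++)
                    table[a][b] = (unsigned char)(a ^ b);

            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8, 256, 256, 0,
                         GL_LUMINANCE, GL_UNSIGNED_BYTE, table);
            return tex;
        }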

    Let us know how you get on either way.


  • Closed Accounts Posts: 1,567 ✭✭✭Martyr


    OK, thanks for the info.
    If anything surfaces, I'll be happy to let you all know.

