
Would you buy a Physics Card?

Options
  • 18-05-2005 9:03pm
    #1
    Registered Users Posts: 999 ✭✭✭


    A physics card basically takes the load off the CPU when it comes to working out in-game physics. So you buy a CPU, a graphics card AND a physics card. I believe they're gonna be priced similarly to graphics cards, with high-end and low-end options ($100-300).

    It's supposed to allow more detail on screen as well as more realistic behaviour of objects in games. I heard about this a little while ago and kinda thought it was a little bit rubbish. But ASUS have decided to sell these cards, lending some credibility to them. To read a bit more about them, visit the card manufacturer's website. There might be more info after E3, with games announcing support for the card. Currently, there's only 3 or 4 games on the horizon AFAIK.

    So I'm wondering, would anybody buy this physics card? Cos basically, if it gets popular, more games will be made for it, meaning games would either run slow without the card, or you'd have to run on low settings - meaning I'd have to buy one. I don't see the need for these cards for a while yet - I think they're a gimmick.

    Any thoughts?

    Would you buy a physics card? 43 votes

    Yeah, sounds cool. I'll probably buy one of these.
    0%
    I'll wait and see.
    16%
    7 votes
    Nah, sounds like a load of crap.
    83%
    36 votes



Comments

  • Closed Accounts Posts: 3,357 ✭✭✭secret_squirrel


    If it does take off, watch that company get snapped up by either ATI or nVidia - or watch them jump in with their own solutions.
    Personally, with all the dualies coming out, I would have thought a more economical solution would be to use one of the processors for the physics and the other for the rest of the game.
    Essentially it's not going to be long before games use all that extra processing power.

    Can't see it taking off personally - especially when they're predicting a slowdown in the GFX market after the silly release schedules of the last few years.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Nah, sounds like a load of crap.
    I think they're a gimmick.

    Not according to most of the companies. This is exactly how GPUs started out: a separate processor to take some load off the CPU when generating graphics, which eventually turned into an essential piece of hardware. With physics now being put to use quite a lot in games, the CPU needs a little help once again. Analysts are saying this will be a success. So you will have your CPU, your GPU and your PPU.


  • Hosted Moderators Posts: 18,115 ✭✭✭✭ShiverinEskimo


    Nah, sounds like a load of crap.
    Isn't the aim of multi-core and cell chip processors to create dedicated processors for things such as physics engine threads??

    Sounds like a money spinner..


  • Registered Users Posts: 999 ✭✭✭cregser


    Isn't the aim of multi-core and cell chip processors to create dedicated processors for things such as physics engine threads??
    That's what I would have thought. I'd rather get a dual-core processor over a PPU. And I'd be disappointed if it couldn't handle fancy physics. Anyway, water effects (for example) and simulated physics are pretty realistic now. I can only see this being used for computational simulations (of car crashes or whatever). Of course, I'd like to see the games on the horizon and why they feel they can take advantage of this card.

    [EDIT]Interview and game developer opinions at http://www.gdhardware.com/interviews/agiea/001.htm


  • Registered Users Posts: 3,969 ✭✭✭mp3guy


    Nah, sounds like a load of crap.
    Now that I think of it, it's a bit dumb. GPU for graphics, PPU for physics, SPU for sound. What about your CPU? Seems a bit pointless. Is your CPU just going to tell everything where to go? Seems quite the waste...


  • Registered Users Posts: 15,815 ✭✭✭✭po0k


    I'll wait and see.
    I'd rather use a hardware MPEG2 en/decoder than use a software method.
    And I'm rather fond of hardware-accelerated graphics.
    This isn't a money spinner.
    This is the difference between you lobbing a brick at a wall and knocking a few chips off, and a division of trebuchets hurling boulders at a castle with each and every boulder, block and peasant being flung through the air and crashing through the water surface perfectly.
    The sort of large-scale, high-actor-count physics this sort of hardware will enable will be truly something to add to games and playability. Look at the gravity gun in HL2.
    2D and 3D graphics got the hardware treatment initially, then sound, now physics; next will be AI, I'd imagine.
    Why use a general-purpose processor to do in 100 cycles what a custom ASIC can do in 5?


  • Closed Accounts Posts: 1,502 ✭✭✭MrPinK


    A dedicated physics processor doesn't sound like a bad idea, but stick it on the graphics card for god's sake. I don't want to have to buy a separate one, and I've got all my PCI slots filled up as it is.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Nah, sounds like a load of crap.
    Now that I think of it, it's a bit dumb. GPU for graphics, PPU for physics, SPU for sound. What about your CPU? Seems a bit pointless. Is your CPU just going to tell everything where to go? Seems quite the waste...

    Yes, exactly. CPUs are being stressed enough as it is with complicated physics calculations, A.I., 3D co-ordinates, etc. It's not like they're sitting there idle. CPUs haven't advanced much in the past couple of years compared to previous years. They do need the help.

    SyxPak hit the nail on the head. This allows a lot more complicated physics calculations than anything seen before and allows the CPU to concentrate on other factors like A.I.


  • Closed Accounts Posts: 978 ✭✭✭bounty


    I'll wait and see.
    the more processing units the better games will be

    its about time physics got taken seriously in games

    at the moment, shooting a wall looks crap, nothing happens except the texture creates a bullet mark

    i want to be able to throw a grenade and watch as the ground creates a large hole, and every particle of dirt is sprayed realistically about


  • Closed Accounts Posts: 2,653 ✭✭✭steviec


    Physics is the most interesting thing to happen to games in general in the last couple of years. Graphics are as good as they need to be, but there is sooooo much room for improvement in this area, and to achieve that (and stay ahead of those next-gen consoles, which no x86 CPU is gonna do for a long time) they need a specialised processor.

    And to be honest, I'd rather just have to buy a card to play the latest games, instead of upgrading a processor, which invariably means a new motherboard and new memory etc. My Athlon XP 2000+ handles top-quality graphics fine with its GPU, but newer games with more complex calculations are getting more and more difficult, and I know it's the processor that's the bottleneck rather than the graphics, so I'd be delighted if I could get a PPU instead of having to spend an absolute fortune in time and money bringing my system up to date.


  • Registered Users Posts: 1,831 ✭✭✭dloob


    Nah, sounds like a load of crap.
    It could be really good.
    A custom chip would easily beat the CPU at physics processing, even if a whole core is dedicated to it.
    It will be interesting to see how games handle its introduction.
    Bit of a chicken-and-egg scenario: make full use of the PPU, meaning the game won't run on the vast majority of machines, or only use it a little, meaning most of it goes to waste.

    Is physics as easy to scale as graphics?
    If the computer doesn't have a high-end graphics card you can decrease the resolution and reduce the polygon count etc.
    But could you do the same kind of thing with physics?


  • Registered Users Posts: 11,196 ✭✭✭✭Crash


    Surely this also feeds into the idea of CPUs no longer being a speed race, and more being about the way they handle things? I mean, if a CPU is no longer burdened with a lot of this, it means segmented, more easily upgraded machines, and the idea of the CPU clock speed race vanishing forever and ever and ever :P


  • Hosted Moderators Posts: 18,115 ✭✭✭✭ShiverinEskimo


    Nah, sounds like a load of crap.
    CPUs will have to move towards better handling rather than faster speeds - but gaming is just one small area of the PC world that a PCI board for physics, graphics or sound can benefit. At this rate you'll have an encoding board too, for DVD/CD ripping... You could build a dedicated board for most things, I reckon - I guess this is really the idea of Cell chips: many little chips, each with dedicated functions...


  • Registered Users Posts: 15,815 ✭✭✭✭po0k


    I'll wait and see.
    We're seeing a shift toward horizontal scaling as opposed to vertical scaling in performance. This happened years ago on enterprise kit.
    The Opteron (and hence A64s) were designed from the ground up to run at least 2 cores on the one die, and they've been around for a good 2 years at this stage, plus the time spent in development.

    Again, I reckon the next thing to be hardware accelerated will be AI, with Neural Net Processing Units (NNPU © SyxPak 2005) or whatever.

    As regards the PPU being on a card, I read up on the AGEIA stuff a few months ago when the first rumblings started (Go to their site and download the tech demo - runs on most machines - you'll be fiddling for ages with it).
    If you read through the press-releases and blurb they mention integrating the 'chip' into notebooks, laptops etc. etc. I wouldn't be too surprised to see it lobbed into motherboards soon enough, or as an extra chip on a sound or graphics card.
    Physics can be scaled: simply don't try to model as many actors, or reduce the precision. As far as I know it's fairly linear too, like 3D graphics.
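The scaling idea described in that post - fewer actors, less precision - can be sketched in a few lines of Python. This is a toy illustration with invented names and made-up preset numbers, not any real engine's or AGEIA's API:

```python
# Toy sketch: scale physics cost by simulating fewer actors and
# taking fewer solver substeps on weaker hardware.

def physics_budget(quality):
    """Map a quality setting to an actor cap and substep count."""
    presets = {
        "low":    {"max_actors": 100,   "substeps": 1},
        "medium": {"max_actors": 1000,  "substeps": 2},
        "high":   {"max_actors": 40000, "substeps": 4},
    }
    return presets[quality]

def step_world(actors, dt, quality):
    """Advance one frame; actors are dicts with pos/vel/acc floats."""
    budget = physics_budget(quality)
    # Only simulate the most important actors; ignore the rest.
    active = actors[:budget["max_actors"]]
    h = dt / budget["substeps"]
    for _ in range(budget["substeps"]):
        for a in active:
            # Semi-implicit Euler: update velocity, then position.
            a["vel"] += a["acc"] * h
            a["pos"] += a["vel"] * h
    return active
```

Dropping from "high" to "low" cuts both the number of simulated bodies and the integration substeps, which is roughly how a game could degrade gracefully on machines without the card.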


  • Closed Accounts Posts: 4,487 ✭✭✭Kevin_rc_ie


    I'd prefer a chemistry engine so they could produce smells.


  • Registered Users Posts: 3,969 ✭✭✭mp3guy


    Nah, sounds like a load of crap.
    I'd prefer a chemistry engine so they could produce smells.

    I can imagine it now, "I can smell burning, but i'm not playing anything...."


  • Registered Users Posts: 9,893 ✭✭✭Canis Lupus


    Not sure how I feel about a dedicated PPU. If it was cheap, i.e. less than €100, then maybe, and only maybe.


  • Registered Users Posts: 37,301 ✭✭✭✭the_syco


    Nah, sounds like a load of crap.
    SyxPak wrote:
    I wouldn't be too surprised to see it lobbed into motherboards soon enough, or as an extra chip on a sound or graphics card.
    This'd be a good idea. Like the CPU, it could then be upgraded. Also, if it's on the mobo, the main company could sell it off to other companies, etc.


  • Registered Users Posts: 1,272 ✭✭✭i_am_dogboy


    I'll wait and see.
    SyxPak wrote:
    Physics can be scaled, simply don't try to model as many actors or reduce the precision. As far as I know it's fairly linear too, like 3D graphics.
    Not only fairly linear, but incredibly so. With everything being determined by a set of core rules, the PPU designers can almost completely rule out the branching elements of the processor and concentrate heavily on vector manipulation and maths. From what I've seen, the only element of physics programming that involves any sort of branching is collision detection, and even then it's not as heavy in compound testing as AI or game logic. And maybe you can confirm this one: aren't processors like the Cell and the PPC incredibly optimised for maths and vector operations? So much so that they don't really handle branching and such, as is found in game logic and AI. I remember hearing one of the guys in the GDC "burning down the house" talk talking about in-order and out-of-order chips...

    Going a bit off topic, the idea of AI-specific CPUs, or at least effective ones, is probably still a few years off, as logic programming and traditional programming are so different, and I haven't seen any logic-specific or logic-optimised CPUs. When I say logic I mean from a programming standpoint - like a CPU designed to pound the **** out of Lisp, Prolog or similar languages. I think the idea of segregating the different aspects of development is great myself; it means the AI people can write AI code in an AI language, the physics people can write physics code in a numerical language, likewise for graphics and sound, and then development environments could include some nice APIs to bring it all together.
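The point above about branch-free vector maths can be made concrete with a small sketch. NumPy is used here purely as an illustration of wide vector arithmetic (nothing to do with what AGEIA ships): the core rigid-body update is the same arithmetic applied to every body at once, with no per-body branching.

```python
import numpy as np

def integrate(pos, vel, acc, dt):
    """Semi-implicit Euler over N bodies in one sweep.
    pos, vel, acc are (N, 3) arrays of positions, velocities and
    accelerations; the whole update is two branch-free vector ops."""
    vel = vel + acc * dt   # every body's velocity updated at once
    pos = pos + vel * dt   # every body's position updated at once
    return pos, vel
```

Collision detection is where the branches live; the integration step itself is exactly the kind of uniform maths a vector unit (or a PPU) is built for.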


  • Registered Users Posts: 2,124 ✭✭✭Explosive_Cornflake


    SyxPak wrote:
    As regards the PPU being on a card, I read up on the AGEIA stuff a few months ago when the first rumblings started (Go to their site and download the tech demo - runs on most machines - you'll be fiddling for ages with it).
    Linkage Please.


  • Registered Users Posts: 11,987 ✭✭✭✭zAbbo


    This is a step in the right direction imo.

    Now it will be fun getting the best performance/cost ratio for the mobo/CPU/GPU/SPU/RAM etc.

    Not to mention monitor response times, and all to play on the phat interleaved net connection ;)


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    I remember Falcon 3.0 required a maths co-processor if your CPU didn't have one... as far as I recall they only started being built into the CPU from the 486DX onwards. Seems the industry has come full circle.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    mp3guy wrote:
    I can imagine it now, "I can smell burning, but i'm not playing anything...."

    lmao


  • Closed Accounts Posts: 3,357 ✭✭✭secret_squirrel


    BloodBath wrote:
    Cpu's haven't advanced much in the past couple of years compared to previous years. They do need the help.

    Disagree totally - multi-threading and SMT are the way forward, not another separate accelerator - that's the idea behind the Xbox 360 and the Cell processors. If you read the design thoughts behind the Xbox 360, they envision its 6 threads over 3 cores being used in exactly the way everyone is talking about in this thread - namely running specific threads for specific game functionality such as physics and AI.
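The "one thread per game system" idea described above can be sketched with standard threading primitives. This is a minimal toy (invented names, a single multiplication standing in for a real physics step), just to show the shape of handing physics work to a dedicated thread:

```python
import threading
import queue

def physics_worker(jobs, results):
    """Dedicated physics thread: pull work, push results."""
    while True:
        state = jobs.get()
        if state is None:           # sentinel: shut down cleanly
            break
        dt, velocity = state
        results.put(velocity * dt)  # stand-in for a real physics step

jobs, results = queue.Queue(), queue.Queue()
worker = threading.Thread(target=physics_worker, args=(jobs, results))
worker.start()

for frame in range(3):              # the main "game loop"
    jobs.put((1 / 60, 60.0))        # hand off dt and a velocity
    displacement = results.get()    # collect this frame's result

jobs.put(None)                      # tell the worker to stop
worker.join()
```

On a dual-core machine the worker's physics step genuinely overlaps with whatever the main loop does between the put and the get; on a single core it merely interleaves.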


  • Closed Accounts Posts: 3,285 ✭✭✭Smellyirishman


    I think a PPU would be a great idea in some shape or form. Physics has been the single greatest improvement to games in the last few years.

    With this type of physics processing, you would begin to see intelligent particles (smoke that interacts with its environment, same with fire etc.).

    I know City of Villains is working in partnership with these guys, and the new particle system has already received a LOT of praise (embers from powers bouncing off walls etc.).

    I think it is a good idea, exactly how to implement it is a different story though.

    To give you some sort of idea...
    somebody wrote:
    Supposedly HL2 supports around 100ish physics entities on the screen at a time while the top end AGEIA card can handle 40,000 physics entities on the screen at a time.


  • Registered Users Posts: 999 ✭✭✭cregser


    Download the demo program SyxPak was talking about. You can play jenga-jenga and throw rag dolls around. My favourite demo is the siege. It shows catapults hurling rocks and human rag-dolls at a brick tower. You see each brick move out of place and eventually it slooooooooowly leans over before collapsing. You can even pick up a catapult and throw it at the tower.

    It shows the fps in the corner and my comp runs the demo at high enough rates. But the "Big Bang" demo slows to 3fps on my 3500+ with 1GB of RAM and a 6800GT!



  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Nah, sounds like a load of crap.
    Disagree totally - multi-threading and SMT are the way forward, not another separate accelerator - that's the idea behind the Xbox 360 and the Cell processors. If you read the design thoughts behind the Xbox 360, they envision its 6 threads over 3 cores being used in exactly the way everyone is talking about in this thread - namely running specific threads for specific game functionality such as physics and AI.

    I was talking about pc's.


  • Closed Accounts Posts: 3,357 ✭✭✭secret_squirrel


    BloodBath wrote:
    I was talking about pc's.
    Ok... the PowerPC chips all the next-gen consoles are using have been used in similar form in Macs for years.

    The dualies from AMD and Intel allow you to run multiple threads, and the quad cores due in 2007 will allow you to run more - the basic argument is the same.

    In 2 years' time we are going to have CPUs capable of running up to 8 threads, in the case of a quad-core Pentium with HT. Plenty of spare power for physics and AI.

    And that doesn't even take into account that once most games are properly multithreaded, dual-socket mobos might take off.


  • Registered Users Posts: 4,146 ✭✭✭_CreeD_


    Guys, a lot of you seem to be completely ignoring the points made about the new batch of multi-core PC-based CPUs. All of the reviews have pointed out how current games do not use the extra core capabilities; offloading AI to the 2nd core is a very obvious move, and one Intel and AMD are likely to push to game developers. No one's arguing that a single core can currently handle even today's high-end physics model demands, just that with these latest moves in CPU design the use of a hardware accelerator seems pointless.
    Also there are some downsides to moving to a hardware accelerator - which APIs will they support? Havok? Proprietary? How long will it take for a standard to be drawn up so you can finally use the things with your favourite games (it took YEARS for this to happen with 3D accelerators - remember Glide? Chrome? etc. It wasn't until DirectX 5 that things began to settle)? If they can standardise things before this becomes mass-market then it will have a chance, but I think you'd be nuts to jump on the bandwagon any time in the next 2 years or so.

