PhysX Dedicated Physics Processor Explored
Ned_Network writes "Yahoo! News & Reuters has a story about a start-up that has created a dedicated physics processor for gamers' PCs. The processor offloads physics calculations from the CPU and is said to make gaming more realistic - examples such as falling rocks, exploding debris and the way that opponents collapse when you shoot them are cited as advantages of the chip. Only 6 current titles take advantage of the chip, but the FAQ claims that another 100 are in production."
Physics Good, Fire Bad (Score:5, Insightful)
Re:Is that what I think it is. (Score:5, Insightful)
Anything the programmers can do to examine more moves into the future is a good thing for them. Even Deep Blue couldn't look more than 30 moves into the future. Dunno about the 'son of' Deep Blue.
Animations, etc. consume trivial amounts of CPU/graphics power compared to examining the next XY possible moves in a chess game.
Re:Physics Good, Fire Bad (Score:1, Insightful)
At this point, there's only one game that takes any advantage of dual-core CPUs. Most games are still targeted towards low-end 2 GHz/GeForce MX systems. Seems kind of ridiculous to run headlong into specialized PHYZICKS processors when high-end games already fail to take better advantage of existing hardware.
Re:Cellfactor video looks pretty cool... (Score:3, Insightful)
Here's the problem with this (Score:5, Insightful)
On the other hand, graphics cards make sense for consumers because there are only two graphics APIs, OpenGL and DirectX, and they offer very similar functionality under the hood (but significantly different high-level APIs). So a graphics card can accelerate games written with either OpenGL or DirectX, but that's not the case with the emerging PPU field. In graphics, the APIs developed and converged on common functionality long before hardware acceleration was available at the consumer level, but I don't think the physics API situation is stable or mature enough to warrant dedicated hardware add-in cards at this time.
However, I think there are two possible scenarios that could change this.
1) Havok and Ageia could create open or closed physics API specifications and make them available to chip manufacturers, e.g. ATI and NVIDIA, which have the market penetration and manufacturing capability to make PPUs widely available. I could imagine a high-end PCIe card that had both a GPU and a PPU on-board.
2) Microsoft. Think what you will about them, but DirectX has greatly influenced the game industry and is the de facto standard low-level API (although there are notable exceptions, such as id [idsoftware.com]). Microsoft could introduce a new component of DirectX that specifies a physics API which could then be implemented in hardware.
But unless one of those things happens, I don't think proprietary PPUs are going to make a lot of sense for consumers.
Re:no way in hell (Score:1, Insightful)
Improved physics matter only to "hardcore techies"? I challenge you to explain Half-Life 2's success without including the use of physics in your answer.
Physics is an emerging area in gaming, and huge quantities of resources are being poured into its improvement. A card that not only offloads physics calculations to a separate chip, but as a result enables more and better in-game physics, is absolutely a great idea. Puzzles can become more interesting, visuals can become more immersive thanks to improved particle physics just for starters, you'll have creative ways to destroy your enemies without shooting them directly, destructible environments... and the list keeps going.
It's only a matter of time until these take off. Some folks might have a tough time finding an empty slot for one of these on their motherboard (with all the QUINTUPLE-SLI configs people have nowadays), but they'll just upgrade to a bigger case and a board with more slots, especially if developers keep coming on board.
Games probably won't REQUIRE one for quite some time, but I would expect these will be about as widespread as 5.1+ sound cards in just a few years.
Already exists. Kinda (Score:4, Insightful)
And here's the real sticking point (Score:3, Insightful)
Well, that clearly isn't going to work; not enough people will own one to mandate it.
OK, that means you're stuck using it for eye candy: physics effects that make things look cooler but don't really change gameplay. Hmmm, well, at $300 just for eye candy, you face some stiff competition. I bet $300 spent on a PhysX doesn't make games as pretty as $300 spent on a GeForce 7900 does.
We'll see, but I think your processor argument has a lot of merit. Is this thing going to be far enough ahead to outpace processors for some time to come? Because I don't think it's the kind of thing people will upgrade every year, and I think there's going to be a lot of inertia to overcome. I mean, I'm intrigued, and $300 is not out of the range I'd consider spending for an add-in card if I like what it does. However, I've got to wait and see if it's got any legs and if the difference is big enough for me to care. During that time, I have to guess people will improve physics in software and start using dual cores for that. Right now I have a processor core that sits almost idle during games, just tending to system tasks. I have to ask how much more you could get out of it when it's actually used - how close to the PhysX accelerator can you come? The answer may be close enough that I don't care to purchase one.
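To illustrate what I mean by the idle core - just a rough sketch, nothing to do with any real engine, and every name in it (stepPhysics, the 16 ms tick, the body count) is made up - here's the kind of thing a game could do: push the physics step onto the second core while the main thread keeps rendering:

    // Toy sketch: a fake physics step runs on a second core while the
    // main thread pretends to render. All names are invented.
    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <mutex>
    #include <thread>
    #include <vector>

    int main() {
        std::vector<float> positions(10000, 0.0f);   // stand-in for rigid bodies
        std::mutex physLock;
        std::atomic<bool> running{true};

        // Physics worker: the work that would otherwise fight the renderer
        // for one core, or be handed off to a PhysX card instead.
        std::thread physics([&] {
            while (running) {
                {
                    std::lock_guard<std::mutex> guard(physLock);
                    for (float& p : positions)
                        p += 9.8f * (1.0f / 60.0f);  // stand-in for a real step
                }
                std::this_thread::sleep_for(std::chrono::milliseconds(16));
            }
        });

        // "Main" thread: pretend to render a few frames, then shut down.
        for (int frame = 0; frame < 60; ++frame) {
            {
                std::lock_guard<std::mutex> guard(physLock);
                std::printf("frame %d, first body at %.2f\n", frame, positions[0]);
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
        running = false;
        physics.join();
    }

The synchronization cost there is tiny compared to leaving half the CPU idle, and that idle half is exactly the capacity a $300 PPU has to compete against.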
Multiplayer (Score:5, Insightful)
Re:another flash website... (Score:3, Insightful)
If they can't do animated vector graphics the right way, they shouldn't do them at all!
Re:Great for single player, bad for multiplayer? (Score:4, Insightful)
I can offer an uninformed theory. If an event such as "barrel at position X explodes" is passed to the other players, then the processing is done at the client end for all of the players. If the event is handled deterministically, they should all reach the same conclusion.
Unfortunately, as I'm writing this, I can start to see the problem. Okay, I apologize, but I'm going to do a 180 here. Imagine a car crashes through a brick wall and a hundred bricks go flying away. That alone should work fine. But if another player runs into the path of one of the bricks and it bounces off of him, suddenly it's no longer as predictable. His latency along with everybody else's latency means ONE of the computers has to make the decision of where everything goes. That, in and of itself, is probably okay, but then you have to pass a great deal more data along to let the other clients know what's happening.
So... yeah, I see your point.
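To make the tradeoff concrete, here's a toy sketch of the two cases (everything in it is invented for illustration, not any real netcode): in the cheap case the server only sends the triggering event plus a seed and every client re-simulates the same bricks; once a laggy player deflects one, the authoritative machine has to start shipping corrected brick states instead.

    #include <cmath>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct Brick { float x, y, vx, vy; };

    // Cheap case: the server sends only "wall at (x, y) broke, seed = N".
    // Every client runs this; with a fixed PRNG the 100 bricks come out
    // identical everywhere, so no further traffic is needed.
    std::vector<Brick> spawnDebris(float ox, float oy, uint32_t seed) {
        std::vector<Brick> bricks;
        uint32_t s = seed;
        for (int i = 0; i < 100; ++i) {
            s = s * 1664525u + 1013904223u;       // deterministic PRNG step
            float angle = (s % 6283) / 1000.0f;   // 0..2*pi radians
            bricks.push_back({ox, oy, 5.0f * std::cos(angle), 5.0f * std::sin(angle)});
        }
        return bricks;
    }

    // Expensive case: a laggy player deflected brick #17, determinism is
    // gone, and the authoritative machine has to broadcast corrected state
    // for it (and anything it touches) to every client.
    struct BrickUpdate { uint16_t index; float x, y, vx, vy; };

    int main() {
        std::vector<Brick> bricks = spawnDebris(10.0f, 2.0f, 42u);  // same on every client
        BrickUpdate fix{17, bricks[17].x, bricks[17].y, -bricks[17].vx, bricks[17].vy};
        std::printf("one correction message: %zu bytes, possibly one per brick per tick\n",
                    sizeof(fix));
        return 0;
    }

One small event message versus a stream of per-brick corrections is exactly the "great deal more data" being described above.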
History repeats itself (Score:4, Insightful)
My best guess is that this is going to repeat. AGEIA have now done what 3Dfx did, introducing a dedicated hardware chip for something that until now has been done in software. They even have their own proprietary physics API. Soon ATI and nVidia will incorporate similar features into their GPUs, and Microsoft will create a brand new DirectX subsystem called DirectPhysics. And AGEIA will slowly fade away (if they don't learn from 3Dfx's mistakes).