I voted "not useful", not because it's "not cross platform".
It's not useful because it is not ubiquitous. And so it's hard to actually add anything to gameplay (game mechanics).
If games offloaded stuff that actually mattered from the CPU to PhysX, they would behave very differently on PCs without PhysX. Game producers don't like that. It also makes multiplayer support harder - if PhysX mattered in the game, then both players would need matching PhysX behaviour.
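The matching-behaviour point is easy to demonstrate with a toy lockstep simulation: two peers integrating the same projectile with slightly different integrators (standing in for software vs. hardware physics paths) drift apart measurably. Everything here is illustrative Python, not anything from the actual PhysX SDK.

```python
# Two peers simulate the same falling object, but with slightly different
# integration schemes -- a stand-in for mismatched physics implementations.

def step_a(pos, vel, dt=0.01, g=-9.81):
    # peer A: semi-implicit (symplectic) Euler -- velocity updated first
    vel = vel + g * dt
    pos = pos + vel * dt
    return pos, vel

def step_b(pos, vel, dt=0.01, g=-9.81):
    # peer B: explicit Euler -- position uses the *old* velocity
    pos = pos + vel * dt
    vel = vel + g * dt
    return pos, vel

def simulate(step, frames=600):
    # drop from 100 units up and integrate for 600 fixed-timestep frames
    pos, vel = 100.0, 0.0
    for _ in range(frames):
        pos, vel = step(pos, vel)
    return pos

# the two "peers" disagree on where the object is after a few seconds
drift = abs(simulate(step_a) - simulate(step_b))
```

After 600 frames the two positions differ by well over half a unit - more than enough to put the same grenade on opposite sides of a wall on two players' machines, which is why gameplay-relevant physics must be bit-identical for every participant.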
As it is, game makers will just use PhysX to help simulate eye-candy stuff like dust, flags flapping around, etc.
I'd rather use the spare FLOPS to increase the frame-rate.
The current state of gaming titles and their support looks marginal (I am on a GTX 260), but I hope and expect, if intelligently implemented, one extra difficulty level added to most titles. I'd like to see the environment deciding your skill (I mean, besides extra zombies and their accuracy at different levels).
And I guess PhysX helps in making things more natural. But does it help when you have game titles full of cartoons (no logic)?
I would love to see pressure, gravity, humidity, wind... act interactively on the player.
Hmm, I have a doubt: does PhysX alter the sound processing dynamically? Say I enter a location with slightly higher atmospheric pressure - does the pitch change, like we see in movies? Or will the developers just play a pre-recorded sound for a particular atmospheric pressure?
So, coming to the questions (PhysX acceleration in buying software):
PhysX hardware support + well-implemented titles - most likely I'll buy most of the software (OK, I'll limit myself to FPS and RTS).
About the next question (PhysX in hardware buying decisions):
- I did enjoy Portal and Crysis (among others) - 10/10 for both.
What more could have been added to these titles? (I am not sure if PhysX was implemented in these titles already - I'm too lazy to check now - but if it was, I hardly noticed.)
I guess people on non-PhysX hardware should have enjoyed them equally. So when you have good titles on hand, PhysX hardly matters - maybe a bonus (when intelligently implemented): add an extra level to your game when comparing against non-PhysX hardware.
But that doesn't mean I don't want the effects that are _possible_ with PhysX in my games. If/when anyone makes a fully supported and useful hardware-accelerated physics (or even raytracing) video or add-in card, I'll jump right on it. They're just not there yet.
How do you ask "How important is PhysX when buying software/hardware?" and then not have a simple choice like "Not important; PhysX doesn't factor into my buying decisions"?
Since there was no choice like that, I picked "Not useful" for both. However, I don't care that it's not cross platform; I just don't care about it at all at this point.
I happen to own an 8800gt and a GTX260 core 216. The main way that PhysX has impacted me is that the driver files increased in size and provide no benefit. They should leave PhysX separate. If there's ever a compelling reason to use it I can download and install it myself.
I'm sure there are people who care about cross-platform meaning different OSes. But how about those who don't really care about that (say, they only run Windows) but don't want to be tied to nVidia, and want something that works on all cards, like physics for DirectX?
Well, I have to say I think the PhysX card was doomed to fail.
Being dedicated to such a small thing was a bad idea. I think they should have done it so that, in tandem, it boosts performance and takes some of the load off the CPU and GPU - then it would have done well.
If there were a card that dealt solely with textures, or water effects, that would be almost as ridiculous.
I mean, if it increased frame rates by, say, an additional 10 fps, and had the physics boost as part of it too, it would have been a lot better.
Since ATI's performance and visual quality per dollar are equal to or better than Nvidia's, the marketing benefit of having physics has declined. Nvidia is slowly losing this battle because they were too greedy to go the open-source way. The same happened to Sony; they lost ground to Samsung...
Anyway, if I am to choose between ATI and Nvidia, the leading reason will be future expandability. ATI allows asymmetric CrossFire, and there are more motherboards supporting that than SLI. Physics alone is not a good enough reason to lock myself to Nvidia hardware. The moment an open GPU computation standard similar to CUDA comes to ATI hardware, Nvidia sales will just dive... and that is coming with Win 7.
I don't see why, with all these multicore CPUs sitting there mostly idle during gaming, these calculations can't be offloaded to the sleeping cores.
Heck, even some of the GPU stuff could probably be thrown there.
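A sketch of what that offloading could look like: a batch of independent rigid bodies fanned out over a worker pool each tick. The bodies and integrator are toy stand-ins, not a real engine; and note that in CPython pure-Python threads won't actually occupy multiple cores because of the GIL - a real engine would batch native code across threads - so treat this as structure, not a benchmark.

```python
# Toy physics tick spread across CPU worker threads.
from concurrent.futures import ThreadPoolExecutor

def integrate(body, dt=1/60):
    # body is (position, velocity); apply gravity and advance one frame
    x, v = body
    v += -9.81 * dt
    x += v * dt
    return (x, v)

def physics_tick(bodies, workers=4):
    # each body is independent, so the batch parallelizes trivially
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(integrate, bodies))

# 1000 particles dropped from 10 units up, advanced one frame
bodies = [(10.0, 0.0)] * 1000
bodies = physics_tick(bodies)
```

The point is simply that effect physics of this kind is embarrassingly parallel, which is exactly the sort of work "sleeping cores" could absorb.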
See this thread in their official forums, with lots of testing and screenshots showing that Ageia PPU cards do not function with PhysX software when using any of Nvidia's last few driver sets. Software and games - including Cryostasis, Sacred 2 and Mirror's Edge - will work with the old cards if you install the older 809 drivers. Why would Nvidia render the cards useless? To force the purchase of their GPUs, maybe? Others report the same findings in the Nvidia forum (Hardware, General Discussion, thread titled 'Ageia PhysX PPU'): http://forums.nvidia.com/index.php?showtopic=91172 - zero response from Nvidia. I also reported this via the Customer Feedback option on their site.
This site used to be good; now it's even shittier than Tom's Hardware Guide. Get back on track, guys. BTW, I bought a 4770 - good card. And as always, the drivers are terrible / not there. Also, a huge chunk of MOSFETs, caps, and the fancy GPU and memory cooling are missing in action. But good job, Anand, selling smoke and mirrors - your deceptive 4770 review worked just fine.
My current card is a 9800 GTX; I traded a TRUE for it. Good trade to boot - it's still a good, fast card at 1680x1050. I could have put more money into an ATI card if I wanted to, but I was hoping to wait for Larrabee before putting a lot of money down on new hardware, since I have been waiting for Project Offset since 2005 and I want to play it when it launches. I choose my hardware based on graphical performance. If you (the GPU architects and manufacturers) want me to buy a card to accelerate physics, you damn well better make it cross platform, or else I won't spend a dime extra just to get a proprietary version.
I used to live next door to one of Nvidia's marketing department managers. I can tell you first-hand they are overly aggressive in trying to sell their PhysX API and CUDA support. Not surprising - their closed-mindedness about how they think things should be extends beyond just their own business. We ended up moving out because of them, LOL.
The problem with your cross-platform argument is that it is cross platform already: any game on the Unreal 3 engine uses PhysX. That means UT3, Gears of War and Warmonger all use it (and several others I don't know off the top of my head). What you are asking for is for any game to support hardware acceleration of physics no matter what physics engine it uses, and that just doesn't happen. The API calls have to be specified exactly so that the interaction actually happens - one variable name change, or calling for an answer in float vs. integer, and the game crashes. That's why there are three ways to get physics in a game: 1. put in the money and time and build your own, like Crysis and Far Cry 2; 2. buy Havok for $250,000 and get their scripted physics, where things always blow up the same way and the only thing really random about the interaction is the death physics; or 3. use PhysX, which is given out for free and is hardware accelerated. Yes, you have to use their API calls and all of their specifics, but that's understandable if you are using their engine.
So basically, what you are asking for is for everyone to use the same physics engine. Give it time and a clear winner will emerge.
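The float-vs-integer point above can be made concrete: across a binary API boundary, mismatched types mean the raw bits get reinterpreted rather than converted, so the callee sees garbage (or crashes) instead of the intended value. The `engine_expects_float` name is made up for illustration - this is not a real PhysX entry point.

```python
# Simulate a binary API boundary with raw byte buffers.
import struct

def engine_expects_float(raw):
    # hypothetical engine entry point that decodes 4 bytes as a
    # little-endian 32-bit float
    return struct.unpack('<f', raw)[0]

# caller and engine agree on the type: the value survives the round trip
correct = engine_expects_float(struct.pack('<f', 100.0))

# caller sends a 32-bit int where a float is expected: the bit pattern of
# the integer 100 decodes as a denormal float vanishingly close to zero
garbled = engine_expects_float(struct.pack('<i', 100))
```

`correct` comes back as 100.0, while `garbled` is nowhere near it - the kind of silent corruption that makes physics middleware insist on exact call signatures.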
Unfortunately this poll is heavily biased by whether your graphics card is red or green.
I wonder, if PhysX were an ATI-only thing, how many people flaming it in every forum would be singing its praises, and how many singing its praises would now be the ones saying how useless it is?
If PhysX were ATI-only, I'd be buying ATI cards exclusively. Hell, I bought the P1 card to get PhysX when it was still only used in one or two games.
What I would like to see is ATI stop this stupid-ass partnership with Havok and license PhysX from Nvidia, so that they can both use the same technology and allow consumers to benefit in the long run, rather than having rival technologies all the time. At least with everything leading up to PhysX, the differences between ATI and Nvidia were more performance-based, or small things like ATI's TruForm rendering. Now we are talking about entire game engines not being supported... which will lead everyone who's an ATI fan to miss out on potentially dozens or even hundreds of games in the future as PhysX gains popularity.
Though I would like to see more of what Havok is cooking up with them, as I haven't seen too many reports on that lately. If they are moving towards true hardware-accelerated physics too, then it might come down to who does it better. The problem is Havok only really has the ragdoll death physics going for it, and Nvidia's PhysX can do that too if they wanted to.
I'm more interested in 8-channel audio over HDMI than PhysX. Also, I doubt hardware physics will be a big thing with game studios till it's cross-platform. I see a lot of promise for it, though, in fully destructible environments. FULLY destructible, like in Worms World Party, but 3D, like Halo and Crysis and all that.
Fully destructible - see Warmonger (OK, not 100%, but 75% destructible).
8-channel audio over HDMI... so you'd rather have better onboard sound than more realistic interaction of objects in your games? Are you visually impaired to the point of being legally blind, or just inept?
Funny... I ran that same demo on High today at 12x10 on an i7 920, NOT overclocked, with an 8800 GTX, and got an average of 33 fps with a minimum of 16.9. He's basically getting boned because he's recording at the same time (probably with his webcam). Maybe he has his video card's 3D rendering set to quality mode; maybe he turned off hardware physics in the Nvidia control panel and set the benchmarking program to use hardware - the card does support it, it's just off... Or maybe he needs to update his drivers, I don't know exactly. All I know is, if I run at 10x7 on High I get an average fps in the 40s.
1. Games are usually GPU limited even with SLI setups, so shifting any job from CPU (where there are cores doing nothing) to GPU is a bad thing.
2. Detrimental to people who don't own PhysX hardware, or just don't want to waste GPU power for physics. Games that support hardware physics usually have better physics effects if you use hardware physics. However, if you run PhysX in software mode, they usually don't come close to maxing out all your CPU cores. This is proof that the developers (or Nvidia) are deliberately crippling the software implementation.
3. Detrimental to the advancement of physics technology in games. If PhysX becomes the de facto standard, Nvidia will have no incentive to improve it. It will be like when Creative got a monopoly on the gaming sound market: EAX 6.0 is basically the same thing as EAX 2.0 except it supports more voices.
Hopefully Microsoft puts Physics into the next DirectX. At least then it would be hardware neutral.
- First: last week I put a spare 8800 GTS 512 in my machine to go with my GTX 285, to finish playing Mirror's Edge.
- I did not really see anything improved - it's a fast game, not much time to look around.
- But what came to mind was that most people cite the power of a video card, as seen in benchmarks, as good or bad, and yet install a second or third card that adds no value to the gameplay but ups the power draw by 100 watts plus. So why all the power-draw fuss over a given card, and then they say it supports PhysX?
- Also, why, when these sites compare ATI to Nvidia buy ratings, do they never test e.g. a 4890 + a 4870 [old card] for extra value to the product, if that still works?
- So I think Nvidia could have used the R&D money better by making some old cards help the overall game FPS - e.g. an 8800 GTS + GTX 285 SLI-type thing - instead of the old card sitting in a closet. Or, if it's sold or given away, it's still one less card that someone buys retail.
- Given the two, I would pick a second helper card, not some useless eye candy that might or might not work in any given game.
I've been following the idea of hardware-accelerated PhysX since Ageia announced their idea at E3 in '03. Back then there were no games that utilized the thing except a few here and there. Now anyone who's played UT3 or Gears of War or any of several other games has played a game that can benefit from hardware-accelerated physics. The coolest part is that the acceleration allows unscripted physics with hundreds of objects larger than a paint can at once. For the past decade and a half we've had to put up with thin wooden fences that can't be shot through with a missile launcher or run over with a tank. Hardware-accelerated physics can fix things like that and make worlds more destroyable and more realistic. Now that Nvidia has purchased Ageia and incorporated their software into their cards, we have the added bonus of getting PhysX on the video card itself, eliminating the Achilles' heel of the Ageia P1 card: the PCI bus. So now with every Nvidia card you not only get one of the top-performing cards in its class, you also get hardware physics for added value.
I know most people don't agree with me, and that's fine. The idea has been gaining ground for five years. It's not going away now that Nvidia has bought them out and been pushing it. Not only can you do Havok physics or proprietary physics, you can also do hardware-accelerated physics (which they give away for free instead of charging $250k like Havok does).
If you notice, any software-physics game that tries to come close to what you can do with PhysX is majorly taxing on a system. A prime example is Crysis: great physics, but it abuses whatever hardware you throw at it.
I believe GPU physics is here to stay... but also, I've never seen what OpenCL looks like. The problem with PhysX is that Nvidia is doing it to sell hardware with good marketing. If they wanted it to be adopted, they would make it openly available and free, and there would be no need for OpenCL. But if ATI were going to make it work on their cards, they would have to pay Nvidia - so that's why ATI is going OpenCL. The funny thing is, we consumers have to deal with this nonsense when all we want is pretty graphics with awesome physics... damn the politics. If only Nvidia played nice, physics would boom within six months. Nobody bases their decisions on PhysX when we all know this is a battle between Nvidia and ATI.
Best 3D performance bang for the buck is still king; PhysX is just a bonus - and that's if the games you play use it. UT doesn't benefit from it unless you play the PhysX mod... and nobody plays those online!
I first read about Ageia and PhysX 3 or 4 years ago. When they had a working card out there, I remember reading the reviews, seeing screenshots, and watching the tech videos. I was underwhelmed to say the least. What was most confusing to me at the time was the added performance hit. Wasn't the idea of hw physics acceleration to increase performance?
"But wait!", they said. "Check out these extra cool effects!" I can't speak for anyone else, but the effects weren't anything ground-breaking. Most of what I saw were added particles here and there. Ooh! To top things off, title support was dismal at the time.
Needless to say, I wasn't surprised when they were acquired by Nvidia in 2008. I was excited at the time because I owned an 8800 GTX. Now, I thought, Nvidia will take this potentially great thing and do it right. Accelerate PhysX via the GPU, get it out there, and push developer support.
Well, it's been almost a year and a half, and what do they have to show for it? The only game I can recall doing anything cool with PhysX is Unreal Tournament III, and that implementation consisted of a pack of three add-on maps that play more like a tech demo than anything else. Really? Is this what I was looking forward to?
I am not endlessly loyal to ATI or Nvidia. It so happens that my 8800 GTX died this past month. I bought an HD 4830 (which I am amazed with for $70, but that's another story) to replace it until the DX11 cards start rolling out later this year.
The bottom line is this: When I do buy a new video card at the end of this year, NOTHING about that decision will be based on PhysX support. It's not even a passing thought.
Marginal is the only right answer to the first two questions. If it's not used to do something in software that you use or plan to use, how is it even relevant?
It's basically "it's good if it's useful", but the other choices are either "it's better than good even if it's not useful" or "it's bad because I'm not using it"
So, I guess I am the only one with a 4870 and a PCIe Physx card? To me, it's just like EAX on my sound card. I have it, and it is nice when a game supports it, but I don't buy or avoid games because of their support (or lack thereof) for it.
Why is there no Edit button? Anyway, the other thing I wanted to say is I have never been a fan of using the GPU for other (in-game) things. If you use it to transcode video (and I do), great. But modern games are taxing enough on my system (Q8200@3.2, 4870 1GB, 4GB DDR2@900, 300GB Velociraptor) that I don't want 10% of my video card working on something else. Maybe it's b/c I game on a large CRT that supports very high (2048x1536) resolutions, but I don't think I'm alone on this. As I said, I value Physx, just not enough to sacrifice noticeable performance for it.
I would love to have more realism in the physics of in game environments but find apps like Badaboom encoder and Photoshop CS4 acceleration infinitely more useful.
hw accelerated physics is a nice future option, but the current implementation is still not where it should be.
and when the future is heading towards multi-core CPUs, I don't see a reason to use GPUs for physics acceleration.
I can see some reason to use stream processors in large-scale physics sims (scientific experiments) or large-scale online games that can actually use it (EVE Online). But tying this to any piece of hardware that has limited support is not a good idea.
I upgraded to a 4890 after owning an 8800GT. Before the 8800GT I had an X1800XT.
I couldn't care less about PhysX. When I upgrade next year, if I can find an Nvidia card for cheaper than an ATI card, then I'll switch back to Nvidia. I have no loyalty to either company, and will support whoever has the cheaper card, as long as the cards are equivalent. IMO the 4890 is equal to the 275, but I was able to get my 4890 for $185 after rebate and instant cash back. If the 275 had been cheaper, I would have gotten that card.
I think of PhysX as a bonus, kinda like getting an 8GB USB flash drive when you purchase something else (HD, TV, case, whatever). It's a nice surprise and bonus, but it wouldn't factor into my decision, ya know?
Unless every graphics card supports it, I don't care about it. It's a nice extra, but Nvidia only putting it on their own GPUs makes it nearly totally useless.
Do any sports games make use of PhysX? Or is sport physics just not complex enough to need a separate card?
I'm thinking the impact of a tackle on a football carrier - their weight, their velocity, their balance could all be put together to create a more realistic tackle.
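That tackle could be approximated with nothing fancier than conservation of momentum - a sketch with made-up numbers, not anything from a real sports engine:

```python
# Treat a tackle as a perfectly inelastic collision: both players move
# together afterwards, at the velocity dictated by their combined momentum.

def tackle_outcome(m1, v1, m2, v2):
    # m in kg, v in m/s along the line of contact (sign = direction)
    p1, p2 = m1 * v1, m2 * v2
    return (p1 + p2) / (m1 + m2)

# a 120 kg linebacker closing at -4 m/s meets a 90 kg ball carrier at +6 m/s
v_after = tackle_outcome(120, -4.0, 90, 6.0)
```

Here the carrier's momentum narrowly wins, so the pair keeps drifting slowly in his direction after contact - exactly the weight-times-velocity trade-off a realistic tackle model would build on, with balance then deciding who stays upright.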
So far when I think of PhysX, I think of enhanced particle explosions, wavy fabric, and wavy water, and nothing that actually affects gameplay.
PhysX isn't useful for anything that actually affects game logic because of the time involved in moving data back and forth from the video card.
You could have something similar to Burnout where there's a neat looking tackle simulation after some simpler code determines whether or not there actually was a tackle.
Beyond this, developers aren't ready to tie any required feature to PhysX, or to offer significantly different gameplay modes for it, because they aren't interested in alienating owners of AMD and Intel hardware, or in spending an extraordinary amount of time on things that only a subset of their audience will be able to benefit from.
There are things that could be hardware accelerated that are required for gameplay, but that will have to wait until people really want and need that first.
If I wanted to vote "Not useful," that's not necessarily the reason WHY I would vote "Not useful." You're assuming that is the ONLY reason it is not useful. Otherwise, at least include another option. The way the poll is going, you will get the result that "People want cross-platform PhysX" while that's not necessarily what the voters intended.
Thanks for the feedback -- I just removed the OpenCL reference to try and help balance it out again.
I do understand your sentiment -- the text following the useful-or-not bit was just supposed to be an example of the type of feeling that would be associated with the category, not the only thing that fits ... with no context at all it can be hard to understand what we intended and everyone would have different ideas of what the category would mean to them. At the same time, you are right that it could make people feel like the categories are overly narrow or non-inclusive.
I'll try to be more careful with this in the future. Thanks again.
Well, to be fair, for many of us one of the main reasons it's not useful is because it's not cross platform. I would think the vast majority out there are take-it-or-leave-it types, as it's not really a biggie to get excited about right now.
I agree about the bad form of the voting. I find GPU physics to be a complete waste. The cards are using every last drop of power they have to run the graphics and give me good FPS; the last thing I need or want is resources being sucked away to run physics when I've got 2-3 extra cores in my CPU doing approximately nothing. GPU physics is a waste of time and marketing gimmickry. There is more than enough CPU power to go around these days without needing to run physics on my GPU instead of graphics. I buy a video card to give me graphics, not physics. If I wanted a physics card, I would buy a physics add-in card. So would everyone else. I find running anything but graphics processes on the GPU offensive (in terms of gaming - if you are not gaming and are Folding or whatever, fine, but turn that crap off when I am gaming so I can get the best performance possible).
Completely agree. Even the best single-GPU graphics card, the GTX 285, can't run Crysis at max settings with 4x/8x AA at 1920x1200 at a comfortable 30 fps. Why worry about physics? Besides, physics effects have a very minor impact on gameplay.
Sure, the poll wording might be bad; it made it hard for me to choose.
But if you want to use your GPU fully for rendering, you can put in an extra GPU for PhysX.
If you don't want that, you can optionally go for a faster GPU which has enough power for both tasks.
It's basically a luxury choice. Devs must support the mainstream, but optionally they can scale up to the most extreme game rigs imaginable.
If budget is a problem, then disable that many more of the PhysX options. That's the aim for the mainstream - the budget-restricted people who don't want to invest in extra hardware still get the normal game experience.
But other people appreciate those extra checkboxes or sliders for PhysX: the full PhysX experience with their favourite PhysX game.
It's just like when 1920x1200 is too much, you settle for 1024x768. Or no FSAA instead of 4x FSAA. But if you want 2560x1600x32 with FSAA, AF, motion blur, depth of field, bloom and HDR, then you can go for triple GTX 285. That's luxury.
GPU physics means games can be much richer in physics. Because gameplay physics doesn't go well with this luxury choice, it would mostly be effect physics. So if you want the GPU just for rendering and physics on the CPU, you are choosing, with that, a low physics setting.
That's freedom of choice, just like there are SLI and CrossFire and 30-inch monitors. You can go for that extra hardware PhysX acceleration by GPU or PPU.
PhysX and FPS do influence each other, but they aren't the same thing.
The GPU is for rendering performance, so you choose your GPU to do that at a decent FPS level - at normal settings, that is, if you don't care about PhysX.
If you want more PhysX: PhysX isn't about FPS but physics, and physics can also be very computation-heavy, which means you need extra GPU power to get that extra physics and keep a decent FPS.
So an 8600 GT doesn't render games well; throwing a heavy PhysX load at it gives a slide show. By comparison, even with GPU PhysX disabled, a PhysX AAA game can stress a 9800 GTX+ at decent settings. If you want to run a decently heavy PhysX mode, a GTX 275 gives decent performance, stressed in balance between rendering and physics tasks.
I go for that extra optional physics with decent FPS; that means I must invest in some extra or better hardware. That's choice, and I would like it very much if devs gave us that, instead of only serving the budget-restricted or the FPS-focused.
Don't judge, review or bench PhysX on FPS performance alone. PhysX is about physics: the core of a review should be how physics is used in the game, and how much more hardware it needs.
I do have an 8800 GS in my old machine, which supports PhysX, but that is not my normal gaming rig - that would be my machine with the AMD 4780 and an Intel E6600 overclocked to 3.4 GHz. So it would be better if there were cross-platform support.
I have a PhysX-capable nVidia card (a GeForce 9600 GT), but it was replaced long ago with an ATI card. So I did answer that I own one; I'm just clarifying that it made no difference to me.
There are now a decent number of hardware-PhysX-powered games.
I see the hardware as having high potential, but using it depends on whether the games deliver.
I've had two PPUs since May 2006 and a third since 2007. I played GRAW and GRAW 2 in full glory. The CellFactor demo showed the potential, and GRAW 2's Ageia Island showed the potential in the game genre and theme I like. There are more games, but they are not my thing.
I was also aware very early of the implications physics brings to a game development project - how it can influence AI pathing, net load and game art development - which holds back decent or heavily accelerated PhysX support. Well, a lot more PhysX is also a bigger game-development burden. It doesn't come for free.
Mirror's Edge was a subtle touch of hardware PhysX, partially done right, but it suffered from the cross-platform problem.
Now nVidia is pushing devs through their TWIMTBP division, so more devs, and thus game projects, are actively doing something with it. But just as 9 out of 10 games are average clones, in this case the PhysX feels very forced 9 times out of 10, and often out of place. There could be an exception that shines with PhysX. The flags high on the ceiling are more a distraction than interactable decoration; however, where the player or NPC can run or push through them, they are right in place. But I don't like that kind of game.
We're still waiting for a PhysX killer game, and it's very reasonable that one may never come. So I keep my expectations realistic and enjoy some of the good uses of it in games.
As of now, PhysX hardware isn't a necessity, except for some not-so-popular heavy-physics games like Warmonger and CellFactor, which demand some hardware PhysX support. But no class-AAA title does that. And up till now, a PPU with an ATI card does just fine. Still, I wouldn't mind an AAA game that could optionally use a dedicated midrange-or-higher graphics card and put it to good in-game use.
I did have an 8800 GTS 320. After trying the PhysX demos (the UT3 map pack, Warmonger (meh), and a few others that came with the physics packs), I felt thoroughly underwhelmed - the only really cool one was The Great Kulu, which showed off destructible soft bodies. Most of the others - tornado effects, debris, cloth tearing, and hail - for the most part looked pretty unrealistic and sometimes detracted from the gameplay experience. I can't say that buying my 4870 1GB in February was really affected by the lack of PhysX support; it doesn't seem to be a very killer feature anyway.
I think the poll could have drawn a better comparison if they'd asked whether we had tried any PhysX demos using accelerating hardware, as well as whether we owned supported hardware.
I'm in a similar situation to you, only I had the 640 MB version. Recently went up to a 1 GB 4870 also. Really don't care at all for Nvidia PhysX. If anything it was better before, as a separate PCI card - at least anyone could (in theory) get one and not be tied to a particular graphics vendor.
lyeoh - Thursday, May 14, 2009 - link
I voted "not useful", not because it's "not cross platform".It's not useful because it is not ubiquitous. And so it's hard to actually add anything to gameplay (game mechanics).
If the CPU offloaded game stuff that actually mattered to PhysX it would cause the game to work very differently on PCs without PhysX. Game producers don't like that. Also makes multiplayer support harder - if PhysX mattered in the Game, then both players would need matching PhysX behaviour.
As it is, game makers will just use PhysX to help simulate eyecandy stuff like dust, flags flapping around etc.
I'd rather use the spare FLOPS to increase the frame-rate.
sanjeev - Thursday, May 14, 2009 - link
Current state of gaming titles and thier support looks marginal (I am on gtx260 ), but I hope and expect, if intelligibly implemented - one extra difficulty level added to most of the titles.Like to see environment deciding your skill ( i mean ,besides extra zombies and their accuracy at diffrente levels).
And I guess physX, helps in making things more natural. But does it help when u have game titles full of cartoons (no logic)?.
I would love to see pressure, gravity, humidity, wind ... take on interactively on the user.
hmm. i have a doubt. does physX alter the sound processing dynamically - say i enter a location with slightly high atomspheric pressure - does the pitch change? i mean, like how we see in movies. or will the developers play a pre-recorded sound at a particular Atm pressure?.
So comming to the questions(PhysX acceleration in buying software)
PhysX HW support + good implemented titles - most likey i'll buy most of the softwares (ok . i'll limit myself to FPS and RTS)
about next question (PhysX in hardware buying decisions)
- i did enjoy PORTAL, Crysis (besides others)- 10/10 for both.
what more could have been added to these title. ( i am not sure if physX was implemented in these title already -(i'm lazy to check now ). But, if it was implemented, i hardly noticed ).
i guess people on non-PhysX hardware too should have equally enjoyed. So when u have good titles on hand - PhysX - hardly matters - maybe a BONUS (when intelligibly implemented ): Add an EXTRA LEVEL to your game when you start comparing with non PhysX hardware.
zagood - Wednesday, May 13, 2009 - link
But that doesn't mean I don't want the effects that are _possible_ with physx in my games. If/when anyone makes a fully supported and useful hardware accelerated physics (or even raytracing) video or add-in card, I'll jump right on it.They're just not there yet.
mikepers - Monday, May 11, 2009 - link
How do you ask "How important is PhysX when buying software/hardware?" and then not have a simple choice like "Not important; PhysX doesn't factor into my buying decisions"? Since there was no choice like that, I picked "Not useful" for both. However, I don't care that it's not cross-platform; I just don't care about it at all at this point.
I happen to own an 8800 GT and a GTX 260 Core 216. The main way PhysX has impacted me is that the driver files increased in size while providing no benefit. They should keep PhysX separate; if there's ever a compelling reason to use it, I can download and install it myself.
enki - Sunday, May 10, 2009 - link
I'm sure there are people who care about cross-platform in the sense of different OSes. But how about those who don't really care about that (say, they only run Windows) but don't want to be tied to Nvidia, and want something that works on all cards - like physics for DirectX?
eastyy - Thursday, May 7, 2009 - link
Well, I have to say I think the PhysX card was doomed to fail. Being dedicated to such a small thing was a bad idea... I think they should have done it so that, working in tandem, it boosts performance and takes some of the load off the CPU and GPU - then it would have done well.
If there were a card that dealt solely with textures... or water effects, that would be almost as ridiculous.
I mean, if it increased frame rates by, say, an additional 10 fps... and had the physics boost as part of it too... it would have been a lot better.
Ananke - Thursday, May 7, 2009 - link
Since ATI's performance and visual quality per dollar are equal to or better than Nvidia's, the marketing benefit of having physics has declined. Nvidia is slowly losing this battle because they were too greedy to go the open-source way. The same happened to Sony; they lost ground to Samsung... Anyway, if I am to choose between ATI and Nvidia, the leading reason will be future expandability. ATI allows asymmetric CrossFire, and there are more motherboards supporting that than SLI. Physics alone is not a good enough reason to lock myself into Nvidia hardware. The moment an open GPU computing standard similar to CUDA comes to ATI hardware, Nvidia sales will just dive... and that is coming with Win 7.
araczynski - Thursday, May 7, 2009 - link
I don't see why, with all these multicore CPUs sitting there mostly idle during gaming, these calculations can't be offloaded to the sleeping cores. Heck, even some of the GPU stuff could probably be thrown there.
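For what it's worth, the offload being described is easy to sketch. Below is a toy C++ example - the `Particle` struct and function names are invented for illustration, not taken from any real engine - that fans a simple Euler integration step for effect physics out across however many hardware threads are available:

```cpp
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical effect-particle state: position and velocity.
struct Particle { float x, y, z, vx, vy, vz; };

// Advance one disjoint slice of the particle array; since each worker
// owns its own range, no locking is needed.
static void integrate(std::vector<Particle>& p, std::size_t begin,
                      std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;  // gravity
        p[i].x += p[i].vx * dt;
        p[i].y += p[i].vy * dt;
        p[i].z += p[i].vz * dt;
    }
}

// Split one physics step across the "sleeping" cores.
void step_particles(std::vector<Particle>& p, float dt) {
    unsigned n = std::thread::hardware_concurrency();
    if (n < 2 || p.size() < n) { integrate(p, 0, p.size(), dt); return; }
    std::vector<std::thread> workers;
    std::size_t chunk = p.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t b = t * chunk;
        std::size_t e = (t + 1 == n) ? p.size() : b + chunk;
        workers.emplace_back(integrate, std::ref(p), b, e, dt);
    }
    for (auto& w : workers) w.join();
}
```

A real engine would keep a persistent thread pool rather than spawning threads each frame, but the division of labor is the same.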
DON3k - Thursday, May 7, 2009 - link
See this thread in their official forums, with lots of testing and screenshots showing that Ageia PPU cards do not function with PhysX software if you're using any of Nvidia's last few driver sets. Software and games will work with the old cards if you install the older 8.09 drivers, including Cryostasis, Sacred 2, and Mirror's Edge. Why would Nvidia render the cards useless? To force the purchase of their GPUs, maybe? Others report the same findings - in the Nvidia forum under Hardware, General Discussion, titled 'Ageia PhysX PPU' - http://forums.nvidia.com/index.php?showtopic=91172 - with zero response from Nvidia. I also reported this via the Customer Feedback option on their site.
papapapapapapapababy - Thursday, May 7, 2009 - link
X Not useful; Hardware physics doesn't matter until it's cross platform
X Not useful; PhysX is irrelevant as it is not cross platform
x yes
papapapapapapapababy - Thursday, May 7, 2009 - link
This site used to be good. Now it's even shittier than Tom's Hardware Guide. Get back on track, guys. BTW, I bought a 4770. Good card. And as always, the drivers are fk terrible / not there. Also, a huge chunk of MOSFETs, caps, and fancy GPU and memory cooling is missing in action. But good job, Anand, selling smoke and mirrors. Your deceptive 4770 review worked just fine.
micha90210 - Thursday, May 7, 2009 - link
... without playing a few titles that support PhysX, experiencing the gameplay, and comparing it to non-PhysX titles?!
gamerk2 - Thursday, May 7, 2009 - link
Much as DirectX and Glide helped standardize a 3D API, PhysX will help standardize a unified physics API.
faxon - Thursday, May 7, 2009 - link
My current card is a 9800 GTX. I traded a TRUE for it - a good trade to boot; it's still a good, fast card at 1680x1050. I could have put more money into an ATI card if I'd wanted to, but I was hoping to wait for Larrabee before putting a lot of money down on new hardware, since I have been waiting for Project Offset since 2005 and I want to play it when it launches. I choose my hardware based on graphical performance. If you (the GPU architects and manufacturers) want me to buy a card to accelerate physics, you damn well better make it cross-platform, or else I won't spend a dime extra just to get a proprietary version. I used to live next door to one of Nvidia's marketing department managers; I can tell you firsthand they are overly aggressive in trying to sell their PhysX API and CUDA support. Not surprising - their closed-mindedness about how they think things should be extends beyond just their own business. We ended up moving out because of them, LOL.
shin0bi272 - Friday, May 8, 2009 - link
The problem with your cross-platform argument is that it is cross-platform already: any game on the Unreal 3 engine uses PhysX. That means UT3, Gears of War, and Warmonger all use it (and several others I don't know off the top of my head). What you are asking for is for any game to support hardware acceleration of physics no matter what physics engine it uses, and that just doesn't happen. The API calls have to be specified exactly so that the interaction actually happens; one variable name change, or asking for an answer as a float vs. an integer, and the game crashes. That's why there are three ways to get physics into a game. You can: 1. put in the money and time and build your own, like Crysis and Far Cry 2 did; 2. buy Havok for $250,000 and get their scripted physics, where things always blow up the same way and the only really random part of the interaction is the death physics; or 3. use PhysX, which is given out for free and is hardware-accelerated. Yes, you have to use their API calls and all their specifics, but that's understandable if you are using their engine. So basically what you are asking for is for everyone to use the same physics engine. Give it time and a clear winner will emerge.
sbuckler - Thursday, May 7, 2009 - link
Unfortunately this poll is heavily biased by whether your graphics card is red or green. I wonder, if PhysX were an ATI-only thing, how many people flaming it in every forum would be singing its praises, and how many singing its praises would now be the ones saying how useless it is?
shin0bi272 - Friday, May 8, 2009 - link
If PhysX were ATI-only, I'd be buying ATI cards exclusively. Hell, I bought the P1 card to get PhysX when it was still only used in one or two games. What I would like to see is ATI stop this stupid partnership with Havok and license PhysX from Nvidia, so that both can use the same technology and consumers benefit in the long run, rather than our having rival technologies all the time. At least with everything leading up to PhysX, the differences between ATI and Nvidia were more performance-based, or small things like ATI's TruForm rendering. Now we are talking about entire game engines not being supported... which will lead everyone who's an ATI fan to miss out on potentially dozens or even hundreds of games in the future as PhysX gains popularity.
Though I would like to see more of what Havok is cooking up with them, as I haven't seen many reports on that lately. If they are moving toward true hardware-accelerated physics too, then it might come down to who does it better. The problem is Havok really only has the ragdoll death physics going for it, and Nvidia's PhysX can do that too if they want.
Hrel - Thursday, May 7, 2009 - link
I'm more interested in 8-channel audio over HDMI than PhysX. Also, I doubt hardware physics will be a big thing with game studios till it's cross-platform. I see a lot of promise for it, though, in fully destructible environments. FULLY destructible, like in Worms World Party, but 3D, like Halo and Crysis and all that.
shin0bi272 - Friday, May 8, 2009 - link
Cross-platform - see the Unreal 3 engine. Fully destructible - see Warmonger (OK, not 100%, but 75% destructible).
8-channel audio over HDMI... so you'd rather have better onboard sound than more realistic interaction of objects in your games? Are you visually impaired to the point of being legally blind, or just inept?
san1s - Wednesday, May 6, 2009 - link
PhysX can't run at playable framerates even on an overclocked i7: http://www.youtube.com/watch?v=erAaVDKrRkk&fmt...
shin0bi272 - Friday, May 8, 2009 - link
Funny... I ran that same demo on High today at 12x10 on an i7 920, NOT overclocked, with an 8800 GTX, and got an average of 33 fps with a minimum of 16.9. He's basically getting boned because he's recording at the same time (probably with his webcam). Maybe he has his video card's 3D rendering set to quality mode; maybe he turned off hardware physics in the Nvidia control panel and set the benchmarking program to use hardware - the card supports it, it's just off... Or maybe he needs to update his drivers, I don't know exactly. All I know is, if I run at 10x7 on High I get an average FPS in the 40s.
Psynaut - Wednesday, May 6, 2009 - link
If you want only people who own Nvidia cards to vote, you might want to provide a way for people to view the results without having to vote first.
Sureshot324 - Wednesday, May 6, 2009 - link
Detrimental:
1. Games are usually GPU-limited even with SLI setups, so shifting any job from the CPU (where there are cores doing nothing) to the GPU is a bad thing.
2. Detrimental to people who don't own PhysX hardware, or who just don't want to waste GPU power on physics. Games that support hardware physics usually have better physics effects if you use hardware physics. However, if you run PhysX in software mode, it usually doesn't come close to maxing out all your CPU cores. That suggests the developers (or Nvidia) are deliberately crippling the software implementation.
3. Detrimental to the advancement of physics technology in games. If PhysX becomes the de facto standard, Nvidia will have no incentive to improve it. It will be like when Creative got a monopoly on the gaming sound market: EAX 6.0 is basically the same thing as EAX 2.0 except it supports more voices.
Hopefully Microsoft puts physics into the next DirectX. At least then it would be hardware-neutral.
rgallant - Wednesday, May 6, 2009 - link
- First: last week I put a spare 8800 GTS 512 in my machine to go with my GTX 285, to finish playing Mirror's Edge. I didn't really see anything improved - it's a fast game, not much time to look around.
- But what came to mind is that people judge a video card's power draw in benchmarks as good or bad, yet install a second/third card that adds no value to the gameplay but ups the power by 100-plus watts. So why all the power-draw fuss over a given card, followed by "but it supports PhysX"?
- Also, when these sites compare ATI vs. Nvidia buy ratings, why do they never test, e.g., a 4890 + a 4870 [old card] for extra value to the product - if that even still works?
- So I think Nvidia could have used the R&D money better by making old cards help the overall game FPS - e.g. an 8800 GTS + GTX 285 SLI-type thing - instead of the old card sitting in a closet. Or, if it's sold or given away, that's still one less card someone buys retail.
- Given the two, I would pick a second helper card, not some useless eye candy that might or might not work in any given game.
AznBoi36 - Wednesday, May 6, 2009 - link
I wonder what would happen if a game such as World of Warcraft supported PhysX...
shin0bi272 - Wednesday, May 6, 2009 - link
I've been following the idea of hardware-accelerated physics since Ageia announced it at E3 in '03. Back then hardly any games utilized the thing - just a few here and there. Now anyone who's played UT3 or Gears of War, or any of several other games, has played a game that can benefit from hardware-accelerated physics. The coolest part is that the acceleration allows unscripted physics with hundreds of objects larger than a paint can at once. For the past decade and a half we've had to put up with thin wooden fences that can't be shot through with a missile launcher or run over with a tank. Hardware-accelerated physics can fix things like that and make worlds more destructible and more realistic. Now that Nvidia has purchased Ageia and incorporated their software into their cards, we get the added bonus of PhysX on the video card itself, eliminating the Achilles' heel of the Ageia P1 card: the PCI bus. So now with every Nvidia card you not only get one of the top-performing cards in its class, you also get hardware physics for added value. I know most people don't agree with me, and that's fine. The idea has been gaining ground for five years; it's not going away now that Nvidia has bought them out and been pushing it. Not only can you do Havok physics or proprietary physics, you can also do hardware-accelerated physics (which they give away for free instead of charging $250k like Havok does).
fausto412 - Wednesday, May 6, 2009 - link
If you notice, any software-physics game that tries to come close to what you can do with PhysX is majorly taxing on a system. The prime example is Crysis: great physics, but it abuses whatever hardware you throw at it. I believe GPU physics is here to stay... but then, I've never seen what OpenCL looks like either. The problem with PhysX is that Nvidia is doing it to sell hardware with good marketing. If they wanted it to be adopted, they would make it openly available and free, and there would be no need for OpenCL; but since ATI would have to pay Nvidia to make it work on their cards, that's why ATI is going with OpenCL. The funny thing is, we consumers have to deal with this nonsense when all we want is pretty graphics with awesome physics... damn the politics. If only Nvidia played nice, physics would boom within six months. Nobody bases their decisions on PhysX when we all know this is a battle between Nvidia and ATI.
Best 3D performance bang for the buck is still king; PhysX is just a bonus - and that's if the games you play use it. UT doesn't benefit from it unless you play the PhysX mod... and nobody plays those online!
drwheel - Wednesday, May 6, 2009 - link
I first read about Ageia and PhysX three or four years ago. When they had a working card out there, I remember reading the reviews, seeing screenshots, and watching the tech videos. I was underwhelmed, to say the least. What was most confusing to me at the time was the added performance hit - wasn't the idea of hardware physics acceleration to increase performance? "But wait!" they said. "Check out these extra cool effects!" I can't speak for anyone else, but the effects weren't anything groundbreaking. Most of what I saw were added particles here and there. Ooh! To top things off, title support was dismal at the time.
Needless to say, I wasn't surprised when they were acquired by Nvidia in 2008. I was excited at the time because I owned an 8800 GTX. Now, I thought, Nvidia will take this potentially great thing and do it right: accelerate PhysX via the GPU, get it out there, and push developer support.
Well, it's been almost a year and a half, and what do they have to show for it? The only game I can recall doing anything cool with PhysX is Unreal Tournament III, and that implementation consisted of a pack of three add-on maps that play more like a tech demo than anything else. Really? Is this what I was looking forward to?
I am not endlessly loyal to ATI or Nvidia. It so happens that my 8800 GTX died this past month. I bought an HD 4830 (which I am amazed with for $70, but that's another story) to replace it until the DX11 cards start rolling out later this year.
The bottom line is this: When I do buy a new video card at the end of this year, NOTHING about that decision will be based on PhysX support. It's not even a passing thought.
McRhea - Wednesday, May 6, 2009 - link
+1 to the comment above.
GaryJohnson - Wednesday, May 6, 2009 - link
Marginal is the only right answer to the first two questions. If it's not used to do something in software that you use or plan to use, how is it even relevant? "Marginal" is basically "it's good if it's useful", but the other choices amount to either "it's better than good even if it's not useful" or "it's bad because I'm not using it".
Daeros - Wednesday, May 6, 2009 - link
So, I guess I am the only one with a 4870 and a PCIe PhysX card? To me, it's just like EAX on my sound card: I have it, and it is nice when a game supports it, but I don't buy or avoid games because of their support (or lack thereof) for it.
Daeros - Wednesday, May 6, 2009 - link
Why is there no Edit button? Anyway, the other thing I wanted to say is that I have never been a fan of using the GPU for other (in-game) things. If you use it to transcode video (and I do), great. But modern games are taxing enough on my system (Q8200 @ 3.2, 4870 1GB, 4GB DDR2 @ 900, 300GB VelociRaptor) that I don't want 10% of my video card working on something else. Maybe it's because I game on a large CRT that supports very high (2048x1536) resolutions, but I don't think I'm alone on this. As I said, I value PhysX, just not enough to sacrifice noticeable performance for it.
aguilpa1 - Wednesday, May 6, 2009 - link
I would love to have more realism in the physics of in-game environments, but I find apps like the Badaboom encoder and Photoshop CS4 acceleration infinitely more useful.
Is how I wanted to answer the 2nd question...
haplo602 - Wednesday, May 6, 2009 - link
Hardware-accelerated physics is a nice future option, but the current implementation is still not where it should be. And when the future is heading toward multi-core CPUs, I don't see a reason to use GPUs for physics acceleration.
I can see some reason to use stream processors in large-scale physics sims (scientific experiments) or large-scale online games that could actually use it (EVE Online). But tying this to a piece of hardware with limited support is not a good idea.
McRhea - Wednesday, May 6, 2009 - link
I upgraded to a 4890 after owning an 8800 GT. Before the 8800 GT I had an X1800 XT. I couldn't care less about PhysX. When I upgrade next year, if I can find an Nvidia card cheaper than an ATI card, I'll switch back to Nvidia. I have no loyalty to either company, and will support whoever has the cheaper card, as long as the cards are equivalent. IMO the 4890 is equal to the 275, but I was able to get my 4890 for $185 after rebate and instant cash back. If the 275 had been cheaper, I would have gotten that card instead.
I think of PhysX as a bonus, kinda like getting an 8GB USB flash drive when you purchase something else (HD, TV, case, whatever). It's a nice surprise and bonus, but it wouldn't factor into my decision, ya know?
Repr - Wednesday, May 6, 2009 - link
Unless every graphics card supports it, I don't care about it. It's a nice extra, but Nvidia only putting it on their own GPUs makes it nearly totally useless.
crimson117 - Wednesday, May 6, 2009 - link
Do any sports games make use of PhysX? Or is sports physics just not complex enough to need a separate card? I'm thinking of the impact of a tackle on a football ball-carrier - their weight, their velocity, their balance could all be put together to create a more realistic tackle.
So far when I think of PhysX, I think of enhanced particle explosions, wavy fabric, and wavy water, and nothing that actually affects gameplay.
Slakey - Wednesday, May 6, 2009 - link
PhysX isn't useful for anything that actually affects game logic, because of the time involved in moving data back and forth from the video card. You could have something similar to Burnout, where there's a neat-looking tackle simulation after some simpler code determines whether or not there actually was a tackle.
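The Burnout-style split described here - a cheap, authoritative check on the CPU, with the expensive simulation as pure eye candy - can be sketched in a few lines. All types, names, and thresholds below are hypothetical, and `queue_ragdoll_effect` is a stub standing in for a real effects-physics API:

```cpp
#include <cmath>

// Hypothetical gameplay state; lives on the CPU.
struct Player { float x, y, mass, speed; };

// 1) Cheap, authoritative CPU check: did the tackle connect?
//    A simple distance + momentum threshold -- no GPU round trip.
bool tackle_connects(const Player& tackler, const Player& carrier) {
    float dx = tackler.x - carrier.x, dy = tackler.y - carrier.y;
    float dist = std::sqrt(dx * dx + dy * dy);
    float momentum = tackler.mass * tackler.speed;
    return dist < 1.5f && momentum > 400.0f;
}

// 2) Fire-and-forget cosmetic simulation: the result is never read back,
//    so GPU latency can't stall the game logic.
void queue_ragdoll_effect(const Player& /*tackler*/, const Player& /*carrier*/) {
    // In a real engine this would enqueue work on the GPU and forget it.
}

bool resolve_tackle(const Player& a, const Player& b) {
    bool hit = tackle_connects(a, b);     // gameplay decided here, synchronously
    if (hit) queue_ragdoll_effect(a, b);  // eye candy, asynchronous
    return hit;
}
```

The key property is that nothing downstream ever reads the effect simulation's results, which is why this pattern tolerates GPU round-trip latency while gameplay-critical physics does not.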
DerekWilson - Wednesday, May 6, 2009 - link
I think that might only be part of the issue. Beyond this, developers aren't ready to tie any required feature to PhysX, or to offer significantly different gameplay modes for it, because they aren't interested in alienating owners of AMD and Intel hardware, or in spending an extraordinary amount of time on things that only a subset of their audience will be able to benefit from.
There are things that could be hardware accelerated that are required for gameplay, but that will have to wait until people really want and need that first.
aigomorla - Wednesday, May 6, 2009 - link
Mirror Edge is suposed to support physx, but that game blows.bobsmith1492 - Wednesday, May 6, 2009 - link
Would you please avoid leading questions such as: "Not useful; Hardware physics doesn't matter until it's cross platform (OpenCL) (231 votes)"
If I wanted to vote "Not useful," that's not necessarily the reason WHY I would vote "Not useful." You're assuming that is the ONLY reason it is not useful. Otherwise, at least include another option. The way the poll is going, you will get the result that "People want cross-platform PhysX" while that's not necessarily what the voters intended.
DerekWilson - Wednesday, May 6, 2009 - link
Thanks for the feedback -- I just removed the OpenCL reference to try and help balance it out again. I do understand your sentiment -- the text following the useful-or-not bit was just supposed to be an example of the type of feeling associated with the category, not the only thing that fits... with no context at all it can be hard to understand what we intended, and everyone would have different ideas of what the category means to them. At the same time, you are right that it could make people feel the categories are overly narrow or non-inclusive.
I'll try to be more careful with this in the future. Thanks again.
just4U - Wednesday, May 6, 2009 - link
Well, to be fair, for many of us one of the main reasons it's not useful is because it's not cross-platform. I would think the vast majority out there are take-it-or-leave-it types, as it's not really a biggie to get excited about right now.
icingdeath88 - Wednesday, May 6, 2009 - link
You could also give multiple reasons for the same choice, such as:
no - it's not cross-platform
no - none of the games I'm interested in use it
I put no, but I meant the second one.
ssj4Gogeta - Wednesday, May 6, 2009 - link
And in the hardware question, shouldn't it be "Marginal; PhysX is a bonus if a CARD I like supports it" (card instead of game)?
ssj4Gogeta - Wednesday, May 6, 2009 - link
You could add an "etc." there.
ThePooBurner - Wednesday, May 6, 2009 - link
I agree about the bad form of the voting. I find GPU physics to be a complete waste. The cards are using every last drop of power they have to run the graphics and give me good FPS; the last thing I need or want is resources being sucked away to run physics when I've got 2-3 extra cores in my CPU doing approximately nothing. GPU physics is a waste of time and marketing gimmickry. There is more than enough CPU power to go around these days without needing to run physics on my GPU instead of graphics. I buy a video card to give me graphics, not physics. If I wanted a physics card I would buy a physics add-in card, and so would everyone else. I find running anything but graphics processes on the GPU offensive (in terms of gaming - if you're not gaming and are Folding or whatever, fine, but turn that crap off when I'm gaming so I can get the best performance possible).
joeysfb - Wednesday, May 13, 2009 - link
Completely agree. Even the best single-GPU graphics card, the GTX 285, can't run Crysis at max settings with 4x/8x AA at 1920x1200 at a comfortable 30 fps. Why worry about physics? Besides, physics effects have very minor impact on gameplay.
SuperGee - Thursday, May 7, 2009 - link
Sure, the poll wording might be bad; it made it hard for me to choose. But if you want to use your GPU fully for rendering, you can put in an extra GPU for PhysX.
If you don't want that, you can optionally go for a faster GPU which has enough power for both tasks.
It's basically a luxury choice. Devs must support the mainstream, but optionally they can support everything up to the most extreme game rigs imaginable.
If budget is a problem, then disable that many more PhysX options.
That's the aim for the mainstream - the budget-restricted people who don't want to invest in extra hardware. They get the normal game experience.
But other people appreciate those extra checkboxes or sliders for PhysX: the full PhysX experience with their favorite PhysX game.
It's just like when 1920x1200 is too much, you settle for 1024x768.
Or no FSAA instead of 4x FSAA.
But if you want 2560x1600x32 with FSAA, AF, motion blur, depth of field, bloom, and HDR, then you can go for triple GTX 285. That's luxury.
GPU physics means games can be much richer in physics.
Because gameplay physics doesn't go well with this luxury choice, it would mostly be effects physics.
So if you want the GPU just for rendering and physics on the CPU, you are choosing, with that, a low physics setting.
That's freedom of choice. Just as there are SLI and CF and 30" monitors, you can go for that extra hardware PhysX acceleration via GPU or PPU.
PhysX and FPS influence each other, but they aren't the same thing.
The GPU is for rendering performance, so you choose your GPU to do that at a decent FPS level - at normal settings, that is, if you don't care about PhysX.
If you want more PhysX: PhysX isn't about FPS but about physics. And physics can also be very computation-heavy, which means you need extra GPU power to get that extra physics and keep a decent FPS.
So an 8600 GT doesn't render games well, and throwing a heavy PhysX load at it gives a slideshow.
By comparison, with GPU PhysX disabled, a PhysX AAA game can stress a 9800 GTX+ at decent settings.
If you want to run a decently heavy PhysX mode, a GTX 275 gives decent performance and is stressed evenly between rendering and physics tasks.
I go for that extra optional physics with decent FPS, which means I must invest in some extra or better hardware.
That's choice. And I would like it very much if devs gave you that choice, instead of serving only budget-limited or FPS-focused people.
Don't judge, review, or bench PhysX purely on FPS performance.
PhysX is about physics; the core of a review should be how physics is used in the game, and how much more hardware it needs.
masterbm - Wednesday, May 6, 2009 - link
I do have an 8800 GS in my old machine, which supports PhysX, but that's not my normal gaming rig. That would be my machine with an AMD 4870 and an Intel E6600 overclocked to 3.4 GHz. So it would be better if there were cross-platform support.
Anonymous Freak - Wednesday, May 6, 2009 - link
I have a PhysX-capable Nvidia card (GeForce 9600 GT), but it was replaced long ago with an ATI card. So I did answer that I own one; I'm just clarifying that it made no difference to me.
SuperGee - Thursday, May 7, 2009 - link
There are now a decent number of hardware-PhysX-powered games. I see the hardware as having high potential, but using it depends on whether the games deliver.
I've had two PPUs since May 2006 and a third since 2007. I played GRAW and GRAW2 in full glory. The CellFactor demo showed its potential, and GRAW2's Ageia Island showed the potential in the game genre and theme I like. There are more games, but they are not my thing.
I was also aware very early of the implications physics brings to a game development project - how it can influence AI pathing, net load, and game art development - which holds back decent or heavy accelerated-physics support. Well, a lot more PhysX is also a bigger game development burden; it doesn't come for free.
Mirror's Edge was a subtle touch of hardware PhysX, partially done right, but it suffered from being cross-platform. Now nVidia is pushing devs with their TWIMTBP division, so more devs, and thus more game projects, are actively doing something with it. But just as 9 out of 10 games are average clones, in this case 9 times out of 10 the PhysX feels very forced and often out of place. There could be exceptions that shine with PhysX. The flags high on the ceiling are more a distraction, or interactable decorations; the ones the player or NPC can run or push through are rightly placed. But I don't like that kind of game.
We're still waiting on a PhysX killer game, and it's entirely possible it may never come. So I keep my expectations realistic and enjoy some of the good uses of it in games. As of now, PhysX hardware isn't a necessity except for some not-so-popular heavy-physics games like Warmonger and CellFactor, which demand some hardware PhysX support; no class-AAA title does that. Up till now, a PPU with an ATI card does just fine. But I wouldn't mind a AAA game that could optionally use a dedicated midrange-or-higher graphics card and put it to good in-game use.
Lightnix - Thursday, May 7, 2009 - link
I did have an 8800 GTS 320. After having tried the PhysX demos (the UT3 map pack, Warmonger (meh), and a few others that came with the physics packs), I felt thoroughly underwhelmed - the only really cool one was The Great Kulu, which showed off destructible soft bodies. Most of the others - tornado effects, debris, cloth tearing, and hail - for the most part looked pretty unrealistic and sometimes detracted from the gameplay experience. I can't say that buying my 4870 1GB in February was really affected by the lack of PhysX support; it doesn't seem to be a killer feature anyway. I think the poll could've drawn a better comparison if it had asked whether we'd tried any PhysX demos on accelerating hardware, as well as whether we owned supported hardware.
Laitainion - Thursday, May 7, 2009 - link
Laitainion - Thursday, May 7, 2009 - link
I'm in a similar situation to you, only I had the 640MB version. Recently went up to a 1GB 4870 also. I really don't care at all for Nvidia PhysX. If anything it was better before, as a separate PCI card: at least anyone could (in theory) get one and not be tied to a particular graphics vendor.