PhysX vs PhysX GPU Particles

Discussion in 'Player Support' started by codeForge, Feb 15, 2013.

  1. codeForge

    PhysX is our physics engine. It runs on any hardware. PhysX is owned by nVidia, but it is a software library running on Intel and AMD processors alike.

    PhysX GPU Particles require nVidia cards and newer drivers. PhysX GPU Particles let us do some nifty particle effects on the GPU that would be impractical to do on the CPU. In some cases, when you have GPU Particles turned on, we skip doing a particular particle effect on the CPU entirely and instead do it all on the GPU, which CAN cause a slight speed increase in those cases.
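    Roughly, the split looks like this in the PhysX 3.x API (a simplified sketch, not our actual engine code; everything besides the PhysX API names is invented for illustration):

    ```cpp
    // Simplified sketch of CPU vs. GPU particle setup in PhysX 3.x.
    // Illustrative only; not actual PlanetSide 2 engine code.
    #include <PxPhysicsAPI.h>
    using namespace physx;

    PxParticleSystem* createParticles(PxPhysics& physics, PxScene& scene,
                                      PxU32 maxParticles, bool wantGpu)
    {
        PxParticleSystem* ps = physics.createParticleSystem(maxParticles);

        // Request GPU simulation. If no CUDA-capable nVidia card (or a
        // too-old driver) is present, PhysX simulates on the CPU instead.
        if (wantGpu)
            ps->setParticleBaseFlag(PxParticleBaseFlag::eGPU, true);

        scene.addActor(*ps);
        return ps;
    }
    ```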

    Mostly, however, PhysX GPU Particles is about eye candy, not speed.

    Hope that clears things up!
    • Up x 12
  2. SolLeks

    I have been running (forced) PhysX and I have to say it makes the game look even better!
    I would not suggest doing so without a dedicated card though; it tanks some of my outfit mates' FPS currently.

    (I have a GTX 680 main card, with my old GTX 570 as a dedicated PhysX card, and don't see any difference in FPS with it on or off.)
  3. LordMondando

    I massively, massively appreciate the effort the coding team is putting into giving us timely and informative updates on the forums.
    • Up x 7
  4. Hyperz


    Shame on you for supporting this vendor lock-in gimmick. Nothing it enables couldn't be done on modern multi-core CPUs (or on AMD GPUs using alternatives such as Bullet Physics). Especially since the game itself barely manages to use 2 cores, the idle ones can EASILY handle those effects, which have been done on the CPU for decades anyway. Stop spreading marketing BS.
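    For illustration, the trivially parallel part of a particle update is not much code (a toy sketch; the Particle struct, the thread split, and the physics are all made up, not from any shipping engine):

    ```cpp
    // Toy sketch: spreading a particle integration step across idle cores.
    // Illustrative only; a real system also handles emission and collision.
    #include <algorithm>
    #include <thread>
    #include <vector>

    struct Particle { float x, y, z, vx, vy, vz; };

    void update(std::vector<Particle>& ps, float dt, unsigned nThreads)
    {
        std::vector<std::thread> workers;
        const std::size_t chunk = ps.size() / nThreads + 1;
        for (unsigned t = 0; t < nThreads; ++t) {
            workers.emplace_back([&ps, dt, t, chunk] {
                const std::size_t begin = t * chunk;
                const std::size_t end   = std::min(ps.size(), begin + chunk);
                for (std::size_t i = begin; i < end; ++i) {
                    ps[i].vy -= 9.81f * dt;   // gravity
                    ps[i].x  += ps[i].vx * dt;
                    ps[i].y  += ps[i].vy * dt;
                    ps[i].z  += ps[i].vz * dt;
                }
            });
        }
        for (auto& w : workers) w.join();
    }
    ```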
    • Up x 2
  5. UnrealGaz

    Who are you to demand who Sony works with? As codeForge just said, it's eye candy, nothing more. Stop getting upset over pixels.
    The PhysX GPU particles are great but not a necessity to play. Nice work here.
  6. Hashlak

    I agree! Particle effects are indeed very cool looking, but at the end of the day, PS2 does NOT make full use of any of our systems! We have never had any official word or explanation on that! It's like the engine is broken lol, it was just not designed properly or something, and no one has any explanation...

    The only stupid official responses I've heard about this go along the lines of "Oh, it's not just some DX11 or 64-bit magic that allows us to make the engine use all cores... but DX11 or 64-bit in many cases would help the performance." THEN WHY THE HELL ARE WE USING OLD TECH?

    The SOE team working on PlanetSide should just focus on resolving the issues with the engine and game so that it actually runs smoothly... Even on the highest-end rigs it doesn't run over 60 FPS in big battles...
  7. SolLeks

    There are still a lot of people using Windows XP, 32-bit. That is the only reason I see for them not using DX11 + 64-bit. As for the multi-core issue, IDK why it's like that, since the minimum system requirements say a multi-core CPU.
  8. Hypersot

    So, let me get this clear. If we *do not* turn PhysX on, then we *do not* need a dedicated PhysX card, right?...

    ...I mean, all those effects put the same weight on both AMD and nVidia cards, as long as we don't turn PhysX on... right?

    BTW, what's the difference between Particles and Effects in the game's options?
  9. SolLeks

    Don't quote me on this, but I believe you are correct.

    The difference with having it on or off is this video compared to what you are running right now.
    • Up x 1
  10. DonkeyDoodah

    Would it not be possible to run things on another physics engine that gives all players nice eye candy, or do you just want to look good for the nVidious?

  11. SomeRandomNewbie

    Let's get this straight: you want them to try to do on the CPU something that no other game can manage? Pro tip: the games that do these effects on the CPU have maybe six or seven entities producing particles at any given time, and they still have to de-spawn the particles after a few seconds. Throw 600 entities into it, and even $1,000+ CPUs are going to crawl.
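    Some napkin math to show the scale (every number here is assumed for illustration, not measured):

    ```cpp
    // Back-of-envelope cost of CPU particles at PS2 scale. All numbers
    // are assumptions for illustration, not measurements.
    constexpr double emitters      = 600;   // entities producing particles
    constexpr double perEmitter    = 500;   // live particles per entity
    constexpr double nsPerParticle = 100;   // integrate + collide, per frame
    constexpr double msPerFrame =
        emitters * perEmitter * nsPerParticle / 1.0e6;  // = 30 ms per frame
    // 30 ms is nearly the whole 33 ms budget of a 30 fps frame,
    // spent on particles alone.
    ```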

    Don't believe me? Look at EVE Online. Anytime they get a huge fight going, the server has to slow the action down to 1/10th of its normal speed to keep up. Even running at that rate, clients still regularly get single-digit framerates. Hell, look at any MMO's frame rate when you get a huge raid going.

    I'm going to highlight this next paragraph, because it answers a few other people's questions as well:

    As to why Bullet's not being used, the answer's pretty simple. Bullet was a CPU-only solution at the time PlanetSide 2 was being developed; it didn't support OpenCL until over a year after PS2 was publicly known about, and that support was sketchy (due to GPU vendors initially limiting OpenCL access to certain cards) until quite late in PS2's development. Havok still doesn't support OpenCL well either.

    There's little reason for a game that starts development today to use PhysX over Bullet, unless the developer is getting help from nVidia. But PS2 wasn't developed today, and by the time the options became available, the choices had already been made. Rule 1 of software development: you *don't* go back and change your entire architecture every time something new comes along. See Duke Nukem Forever as a case study in why.
    • Up x 2
  12. DonkeyDoodah

    a) No - he said use Bullet, which runs on a GPU.
    b) Depends on what you are doing. Fiddling Borderlands 2's PhysX to run at full tilt on the CPU apparently adds about 5% CPU load...

    Similarly, I ran Metro 2033 fine most of the time; the only time I got noticeable slowdown was with the smoke from grenades.

    1) PhysX is a CPU-only solution for those without an nVidia card, and always will be.

    2) GPU acceleration came to Bullet in 2009, about the time PS2 started development. Since they have yet to include GPU-enabled physics in the production version, why the hurry anyway?

    3) The reason they used PhysX was because the PS3 has an nVidia GPU, and Sony signed a strategic partnership to use PhysX in their games as a result...

    http://www.prnewswire.com/news-rele...-partner-to-drive-online-gaming-76574017.html

    On the plus side, I guess it is nVidia money that keeps games like PS2 going, so even an AMD owner like me shouldn't really complain.
  13. TomaHawk

    Are you serious or just trolling? Either way, please pull your bottom lip over your head and swallow.
  14. TomaHawk

    I noticed a significant difference in PhysX output in PS2 when switching the Nvidia control panel PhysX setting from GPU to CPU. On GPU, I got the same effects people have posted here already; the air pad elevator effects, for example. And a loss of around 10+ FPS. When switching to CPU and restarting the game, I got my FPS back to normal, but the particle effects weren't going up the length of the pad corridor. Instead, the particles were staying at and near ground level. It was underwhelming compared to the sights when on GPU.

    I hope my GTX 260, installed tonight in tandem with my 670, will give me the FPS with the PhysX eye candy in full effect.
  15. Sliced

    PhysX running 2-3 flags and PhysX running an entire game are two totally different things.
    If 5% load = 3 flags and 2 buildings, then I wouldn't want to know the CPU load that PS2 would cause.

    Personally, I believe PS2 decided to use Nvidia PhysX for the partnership, which you can't fault them for; at the end of the day it is a business, and AMD has nothing better on the table.
    PhysX is also updated every 1-3 months-ish, so if there are problems on the PhysX side then you know they will get fixed soon.
    PhysX is also owned by Nvidia to be used on Nvidia cards; because of that there is no chance of incompatibility or extra installations needed.
    SOE would not need to do much maintenance on PhysX once it is working.
    Who is to say the other companies will stay around in 5-10 years' time (if PS2 lasts that long)?
    How much would the other companies charge for using their software?
    People generally have better GPUs than CPUs.
    The game is already CPU hungry, so why give it more tasks?
    PS2 is a free game and they could get free publicity to all Nvidia users (as the PS2 picture was shown within the newest beta drivers), which means more customers for a F2P game, which is what SOE needs.

    The list could go on, I guess.
    But at the end of it all, AMD offers nothing better; no other company has proven their products would work on a massive scale, and they also have not proven they will be around for the lifetime of this game. With all that taken in, it is far too risky to use another company that really puts nothing on the table.
  16. TheEvilBlight

  17. BenYeeHua

    Is the software library free of bias, unlike Intel's software libraries? Does it support SSE and also AVX on both vendors' CPUs?
    http://www.agner.org/optimize/blog/read.php?i=49
    http://www.agner.org/optimize/blog/read.php?i=25
    Yay!
    It makes it much easier to see an enemy being treated by a medic.
    I just wonder about the OpenCL performance, and the work needed to code with it.
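    The bias those links describe is dispatching on the CPU vendor string instead of on the actual feature bits. A fair check looks something like this (a sketch using GCC-style cpuid; a complete AVX check also needs the OSXSAVE/XGETBV test):

    ```cpp
    // Sketch: choosing a code path from CPUID feature bits, not the
    // vendor string. (GCC/Clang; a full AVX check also needs XGETBV.)
    #include <cpuid.h>
    #include <cstdio>

    int main()
    {
        unsigned eax, ebx, ecx, edx;
        if (__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            const bool sse2 = edx & (1u << 26);  // SSE2 feature bit
            const bool avx  = ecx & (1u << 28);  // AVX feature bit
            // A fair library dispatches on these bits and never asks
            // whether the vendor string says "GenuineIntel".
            std::printf("SSE2: %d  AVX: %d\n", sse2, avx);
        }
        return 0;
    }
    ```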
    • Up x 1
  18. Hydragarium

    Hopefully AMD comes up with a counterattack in the next generation - nVidia is finally starting to reach the point where they can successfully market PhysX as something more than a "moving flags" gimmick. And ideally we'll have compatibility between the two, so it won't matter which brand you own. :)
  19. BenYeeHua

    Yup, I like a fair fight, not a fight like this...
    Let's wait for HSA, and for OpenCL to be used for in-game physics, and then see whether AMD or Nvidia graphics cards are more powerful for in-game calculation. :)
  20. Hyperz

    I'm not demanding anything. I'm merely stating my opinion and some facts.

    If XP is THE reason for not doing DX11, then they have access to all the DX11 stuff through OpenGL 4.2, which works fine on XP, and they already need an OpenGL render path anyway for the Mac version.

    Good sir, what magic box are you pulling your numbers out of? Educate yourself on the subject. Even Mafia 2, with its horribly inefficient PhysX 2.x implementation, can easily run on the CPU at its medium setting of about 3000 (or was it 6000?) particles. Furthermore, AFAIK you can easily toggle between CPU/GPU in the API, so it'd take like 10 minutes to add a checkbox option to let us run the effects on the CPU, multi-threaded.
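    For what it's worth, pointing PhysX at a pool of CPU worker threads is about this much wiring (a sketch against the PhysX 3.x extensions API; the surrounding setup is simplified and the function is mine, not from any game's code):

    ```cpp
    // Sketch: a PhysX 3.x scene whose simulation tasks run on N CPU
    // worker threads via the default CPU dispatcher. Setup simplified.
    #include <PxPhysicsAPI.h>
    using namespace physx;

    PxScene* createCpuScene(PxPhysics& physics, PxU32 workerThreads)
    {
        PxSceneDesc desc(physics.getTolerancesScale());
        desc.gravity = PxVec3(0.0f, -9.81f, 0.0f);

        // Fan simulation work out across 'workerThreads' CPU threads.
        desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(workerThreads);
        desc.filterShader  = PxDefaultSimulationFilterShader;

        return physics.createScene(desc);
    }
    ```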