PhysX vs PhysX GPU Particles

Discussion in 'Player Support' started by codeForge, Feb 15, 2013.

  1. BenYeeHua

    Read the nickname.
    They are different people. ;)
  2. Hyperz

    Yeah, I noticed it, heh. You quoted it before I could edit it. :p
  3. BenYeeHua

    Yup, on another forum a moderator also told me this, when I said
    "The great thing about forums is that you can edit your posts."
    Then he answered
    "But not once you've been quoted. :D"
    PS: Unless you are a moderator and can edit other people's posts too. ;)
  4. ZogrimGamma

    Sigh..
    There is no production-ready GPU acceleration in Bullet, even today.

    A few years ago AMD chose to promote their alternative "yay, open standards" way as opposed to the "meh, NVIDIA's proprietary CUDA" way, but in the case of GPU-accelerated physics in games they have failed completely.

    Neither the OpenCL version of Havok
    http://techreport.com/news/16640/havok-amd-demo-opencl-based-physics-at-gdc

    nor the OpenCL Bullet-DMM Open Physics Initiative "solution"
    http://www.xbitlabs.com/news/multim...s_Open_Physics_Initiative_with_New_Tools.html

    was ever released or, most importantly, used in any games.

    At first, AMD went for aggressive marketing and even black PR, saying that GPU PhysX would eventually die
    http://www.bit-tech.net/news/hardware/2008/12/11/amd-exec-says-physx-will-die/1

    while promoting "Accelerated physics processing" as a feature of their GPUs
    http://www.amd.com/us/products/desk.../Pages/ati-radeon-hd-4870-specifications.aspx

    but now they are completely silent about it, because they have nothing to be proud of.

    So, "stop spreading marketing BS", as you said, would you ? :)
  5. BenYeeHua

    Yup, old news.
    And when did PhysX 3 come out?
    http://physxinfo.com/news/8336/physx-sdk-3-2-final-version-is-available/
    So why is this game using PhysX 3 when its development started in 2009?
  6. Wolvers

    Does anyone know if it's possible to install the latest NVIDIA driver on an AMD system and select the PhysX-on-CPU option to get these extra effects in PS2?

    I've toyed with the idea of a low-end NVIDIA card just for PhysX. Is anyone here doing this 'hybrid' approach?
  7. ZogrimGamma

    That's what I'm talking about. It's old news because there is no newer news.

    3.0 came out in May 2011; alpha/beta versions were available around a year before that.

    Maybe they started with PhysX 2.8 and then switched, like other companies did, or maybe they chose it based on early tests and demos.

    What is your point? That NVIDIA came along in 2012 with bags of money and said "throw away that shiny OpenCL Bullet you've been using since 2009 and integrate our PhysX instead"? :)

    You must understand that PhysX 3 in PS2 is the core of all physics, collision, and hit detection (see the sketch at the end of this post). That's what the developers are most interested in. They don't care about GPU acceleration. GPU particles and the like are just a by-product, and are mostly implemented by NVIDIA.

    The PhysX SDK, as a physics engine, is very popular: around 400 released games. Do you know how many games are using Bullet?
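
    Roughly, the sketch: this is what that "core" looks like when a game integrates it, assuming a PhysX 3.3-era API (scene query signatures changed across 3.x versions, so take the raycast call as an approximation, not PS2's actual code):

    ```cpp
    // Minimal PhysX 3.x CPU setup sketch (assumed 3.3-era API).
    #include <PxPhysicsAPI.h>
    using namespace physx;

    static PxDefaultAllocator     gAllocator;
    static PxDefaultErrorCallback gErrorCallback;

    int main()
    {
        // The SDK core: identical whether or not an NVIDIA GPU is present.
        PxFoundation* foundation =
            PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
        PxPhysics* physics =
            PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

        // The scene simulates on ordinary CPU worker threads.
        PxSceneDesc sceneDesc(physics->getTolerancesScale());
        sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
        sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
        sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
        PxScene* scene = physics->createScene(sceneDesc);

        // Hit detection is a scene query, e.g. a raycast (3.3-style signature).
        PxRaycastBuffer hit;
        if (scene->raycast(PxVec3(0, 10, 0), PxVec3(0, -1, 0), 100.0f, hit))
        {
            // hit.block.position / hit.block.actor identify what was hit.
        }

        // Fixed-step simulation loop: all of this runs on the CPU.
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);

        scene->release();
        physics->release();
        foundation->release();
        return 0;
    }
    ```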
  8. BenYeeHua

    The website will tell you.
    https://en.wikipedia.org/wiki/Bullet_(software)
    But all of them are using it on the CPU only.
    And it shows part of Sony using it, along with some famous movies.

    But yeah, let's stop this war, as this game is using PhysX 3 already. :)
  9. ZogrimGamma

    That's just it: there are only a few games.
    Bullet is better suited for VFX, but it is not very popular among game developers.

    What war, man? :)
    I'm just trying to clarify certain things so there can be a healthy discussion :)

    --------
    Now, there is another question.
    Is it possible to calculate PhysX particles on the CPU (for non-NVIDIA users), and should one do that? Well, it is not that simple.
    I can explain in more detail if anyone is interested :)
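
    A quick taste of what "not that simple" means, as a sketch against the PhysX 3.2/3.3-era particle API (the createParticles helper and preferGpu parameter are illustrative, not SDK names, and the GPU path additionally needs a CUDA context manager / GPU dispatcher attached to the scene):

    ```cpp
    // Sketch (assumed PhysX 3.2/3.3-era particle API): one flag picks GPU or CPU.
    #include <PxPhysicsAPI.h>
    using namespace physx;

    // Illustrative helper, not an SDK function.
    PxParticleSystem* createParticles(PxPhysics& physics, PxScene& scene,
                                      bool preferGpu)
    {
        const PxU32 maxParticles = 10000;
        PxParticleSystem* ps = physics.createParticleSystem(maxParticles);
        if (!ps)
            return NULL;

        // Ask for GPU simulation. If no CUDA-capable device (or no GPU
        // dispatcher on the scene) is available, PhysX falls back to the CPU,
        // where the same effects can get very expensive at high particle counts.
        ps->setParticleBaseFlag(PxParticleBaseFlag::eGPU, preferGpu);

        scene.addActor(*ps);
        return ps;
    }
    ```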
  10. Wolvers

    Well, that is what I asked!
  11. BenYeeHua

    That's why I asked. :)
    If the solver doesn't use the newest CPU instruction sets, it is bound to run slowly on the CPU.
    But what about an AMD 8-core, where nearly 6 of the cores are barely used?
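
    On the core count point: with the PhysX 3.x extensions API, the game decides how many worker threads the CPU solver gets, so idle cores only help if they are handed over. A rough sketch (createDispatcherForIdleCores is an illustrative helper, not an SDK function); and on instruction sets, PhysX 3, unlike 2.x, is reported to use SSE2 on the CPU by default:

    ```cpp
    // Sketch (assumed PhysX 3.x extensions API): give spare cores to the solver.
    #include <thread>
    #include <PxPhysicsAPI.h>
    using namespace physx;

    // Illustrative helper, not an SDK function.
    PxDefaultCpuDispatcher* createDispatcherForIdleCores()
    {
        // Leave ~2 cores for the game and render threads; the rest can run
        // physics tasks. On an 8-core FX chip this yields 6 worker threads.
        unsigned hw = std::thread::hardware_concurrency(); // may return 0
        PxU32 workers = (hw > 2) ? PxU32(hw - 2) : 1;
        return PxDefaultCpuDispatcherCreate(workers);
    }

    // Usage: set sceneDesc.cpuDispatcher = createDispatcherForIdleCores();
    // before calling PxPhysics::createScene(sceneDesc).
    ```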
  12. HellasVagabond

    Some of you may not follow these things, but aside from the cash issues AMD has right now, they are also withdrawing from the high-end GPU market. So why are we even arguing about this?
  13. BenYeeHua

    The last hope is the PS4 and Xbox 720, which allow low-level access to the GPU and make doing things there much easier. :)
  14. Wolvers

    That must be a typo. AMD are not pulling out of the high-end GPU market; it's one of the strongest parts of their business!

    They've stopped trying to fight Intel with top-end CPUs, if that's what you mean.
  15. HellasVagabond

    No. Due to extreme financial difficulties (they owe too much and they keep borrowing money), their intention is to stop high-end GPU development soon. But then again, AMD was always about price/performance, so that shouldn't reflect badly on sales.
  17. Sliced

    AMD lost millions if not billions from "FailDozer" and "Faildriver"; God knows how much they lost in their GPU market as well.
    As they own both, if the CPU market does badly, their GPU market will eventually feel it as well.

    Personally, I would just laugh if AMD failed tomorrow and never got back up.
    Yes, I'm a fan of Intel + NVIDIA and I'm proud of it!
  18. Wolvers

    You wouldn't laugh at how much your hardware would cost. :rolleyes:
  19. BenYeeHua

    How about the 7970M and the 680M? :D
  20. Hyperz

    First of all, I didn't write the stuff in that quote. And secondly, if there's no production-ready GPU acceleration, then how come it's used in products (3DMark, Cinema 4D, Blender, Pixar movies, etc.)? Plenty of games already use the CPU version (GTA 4/5 and Max Payne 3, for example). The reason you don't see it much as GPU-accelerated is because of console ports, not being "sponsored" to add it into ports, and the fact that most games are GPU-limited with CPU cycles to spare. Lastly, AMD has been supporting OpenCL and Bullet to this very day. And CUDA/PhysX will go the way of Glide. It's only a matter of time.