Why is the FPS still an issue? Little to no improvement has been made in 4 months.

Discussion in 'Player Support' started by Paisho, Mar 18, 2013.

  1. JonboyX

    "Enable physics already"
    "I want draw distance so my 12x scope renders all enemies, all of the time"
    "My mid range 2 generation old hardware can't cope at a consistent 60 fps"

    ^ the problems the developers face.

The Witcher 2 is DX9 and it batters my rig far more than PS2 does. I've got 8 GB of RAM, a GTX 570 and a 4.7 GHz i5-2500K. Neither of these is cutting edge, but I've got a trimmed OS installation: Windows 7, latest drivers, antivirus, and Mumble. That's it. No Adobe rubbish. No Apple slowtimeTM. I've also turned off blurring and set shadows to medium, but everything else is on high at 1920x1080. It might be luck, but I've set the machine up as best I can to play games, and the game runs fairly well for me.
  2. ShopTrain

Not just the OP, but all the whiny, crying people who keep complaining about everything in the game and insisting it should be the way they want it.
  3. OutbreakMonkey

    A 5870 is a dinosaur. Keep telling yourself that it's all just marketing. I have a 5870 (gathering dust), a 6870 and a 7970. They are each worlds apart in terms of performance. High end PC gaming has never been and never will be for the gamer with shallow pockets.
    • Up x 2
  4. LordMondando

Great argument: simply restate what you said earlier with no further justification. No discussion of the game engine or the relevant hardware architecture necessary.

    Buddy I'm won over, all those massive architecture improvements that I didn't see before, suddenly expanding before my EYES!!! THE ENLIGHTENMENT. For the card... It hath collected the dust.

    FINALLY I CAN HAVE THE TRANQUIL SOLACE OF MAD FUPUSES AT PS2.

    MONEY = FUPUSES.

    I HATH ATTAINED NIRVANA.


Meanwhile, in the real empirical world: yes, even cards two and a smidgen years old that were previously high end will be fine for PS2, because most if not all of the problems people are experiencing are due to processes that run entirely on the CPU, not the GPU. Hell, the fact that the OP has an i7 2600 makes this entire debate super moot. The CPU-intensive processes (largely to do with tracking other players, checking if they need to be rendered, that sorta jazz) account for a large share of the problems, and a 2600 is, in terms of actual architecture, still one of the best CPUs in the world. The problem lies in the code atm.

And I'm over 9000% confident he could upgrade to a GTX 690 Megatron 5 Billion or a Titan Demigod Maximum Freshness GPU and see little to no improvement.
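
    The per-player bookkeeping described above can be sketched roughly like this. This is a made-up illustration, not PlanetSide 2's actual code; the names and numbers are hypothetical, but it shows why this kind of work lands on the CPU no matter what GPU you have:

    ```python
    # Rough illustration of why per-player visibility checks are CPU-bound.
    # All names and figures here are invented for the example.
    from dataclasses import dataclass

    @dataclass
    class Player:
        x: float
        y: float

    def visible_players(me: Player, others: list[Player], draw_dist: float) -> list[Player]:
        # Every frame the CPU walks the whole player list; players that fail
        # this check never reach the GPU, so a faster GPU can't speed it up.
        limit = draw_dist * draw_dist
        return [p for p in others if (p.x - me.x) ** 2 + (p.y - me.y) ** 2 <= limit]

    me = Player(0.0, 0.0)
    crowd = [Player(float(i), 0.0) for i in range(2000)]  # a big fight
    print(len(visible_players(me, crowd, 300.0)))  # prints 301
    ```

    With 2000 players in the area, that loop runs 2000 distance checks every frame, and doubling the crowd doubles the CPU cost regardless of graphics settings.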
  5. Kanzy

    Well said mate, well said.
    I have a 5850 and I know when to turn my settings down and when not to. As far as PS2 is concerned, I need high FPS, so I turn everything down to low. I get low textures, but I get the FPS I need to not play the game handicapped, which at the end of the day is what's important to me: the smoothness of the gameplay, not how cool the graphics look.
  6. OutbreakMonkey

    Argue and rant like a child, be my guest.
    One google search later.
    • 3DMark11 Performance Preset
    • HD 5870 1GB: 4832
    • HD 7870 2GB: 6601 (+36%)
    • HD 7850 2GB: 5497 (+13%)
    If your CPU, mobo and memory are also two generations old, then that's only going to compound the 36% performance deficit. And the 7870 isn't even a top performer currently.
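
    The percentage gains quoted above can be checked directly from the scores (the quoted +36%/+13% are these values rounded down):

    ```python
    # Sanity check on the 3DMark11 deltas quoted above (scores from the post).
    scores = {"HD 5870 1GB": 4832, "HD 7870 2GB": 6601, "HD 7850 2GB": 5497}
    base = scores["HD 5870 1GB"]
    for card, score in scores.items():
        gain = 100.0 * (score - base) / base
        print(f"{card}: {score} ({gain:+.1f}%)")  # +0.0%, +36.6%, +13.8%
    ```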
  7. LordMondando

    As has been discussed over 9000 times: turning everything to low will more likely than not downclock your GPU and thus make your system perform worse overall.

    With a 5850 you shouldn't be turning anything down beyond medium. Any performance gains you think you're seeing are most likely placebo.

    And again, the GPU is frankly auxiliary to most of the most intensive processes in the game atm. PS2 loves the CPU.
    • Up x 1
  8. LordMondando

    Actually I doubt many children could properly deploy the concept of Nirvana in a joke, but what evs, just mocking you brah.

    I guess that settles it then. Only a 1/3rd increase in performance over 3 years across comparable models. That.. proves.. your.. wait a second.
    And Star Trek: Generations, being three generations old now, was the worst Star Trek.

    He has an i7 2600 and a dedicated graphics card with either 1 or 2 GB of GDDR5, an 850 MHz core clock, and overall computational performance (that is, in GFLOPS) comparable to a GTX 660.

    Problem taint at his end.
  9. Dudster4

    GPU does little to nothing to improve FPS in this game. I'm running a system I built 6 months ago, but sadly I chose AMD, which seems to have been my single worst decision ever. I can play, but I would be miles better off with even 10 more FPS. This game also leaks memory like a bastard: after about an hour and 15 minutes of gameplay everything just turns crappy. Stable, but crappy; my voice doesn't work right and my frames drop by about 10, and I'm running with 32 gigs of RAM. The sad thing is my buddy runs an Intel system, and I know he gets better performance than me but not stability. I don't think I've crashed in months, but he crashes a few times a session. They just need to gather help from some more manufacturers to figure out WTF is wrong.
  10. oLd.Sneakers

    Agree with the OP; it would be good to hear some explanations from the guys actually in charge of the coding, who have an understanding of the issues around the CPU-boundness of the game.

    Anyone who knows a little bit about computers and how the hardware interacts understands that the game is massively bottlenecked by the CPU atm, due to the game only utilizing 2 cores.

    It would also be a good read, and interesting to know, why the game's performance, which on high-end systems was at an acceptable (not awesome) rate pre-GU4, has suddenly deteriorated to the point where many people sporting i7s clocked beyond 4 GHz feel the game is borderline unplayable/unenjoyable due to massive frame drops, especially when flying.
    • Up x 1
  11. Scure

    And who said it's easy to move to DX11 after 9? I just said this game uses DX9 only, because someone said the HD 5870 is outdated.
    The problem is that the developers used DX9 from the start.
    And OFC DX11 is faster than 9. What do you guys think? That DX9 is no better than 7?
    They need to optimize around the API, because this is very important for performance. Never heard of gather4 or local data share in DX10/11? Then ****.
  12. roDDo

    Um... no.
    Go try and tell that to the hundreds of people on this forum (like me) with Sandy and Ivy Bridge quad cores that can handle every single game on the market with ease. Except for PS2.
    Do you expect them to keep their mouths shut when their frame-rate drops to single digits? If you're willing to play at 15 fps, that's your decision. Other people aren't, especially after the game used to run at an acceptable level.
  13. roDDo

    D-D-D-D-D-D-DOUBLE-POST... POST... post...
  14. iccle

    The short answer is that the game is written with a view to supporting future machines, i.e. those that will be coming out in the next year or two. There is no point in spending 3-5 years writing a game engine that targets only technology available at launch, or older tech.

    When PlanetSide 1 first came out there were similar issues: you needed quite a beefy rig to make it at all playable, yet a couple of years later it ran at 200-300 fps.
  15. SooperDog

    I just built my computer 6 months ago. 40-60 FPS in PS2 when not in a battle, 25-30 in battles. Pretty horrible on medium settings.
    Specs:

    i5-3570K
    16GB DDR3
    Sabertooth Z77
    GTX 660 Signature 2 (2GB)
    256 GB SSD
    Running it on a 27" IPS @ 2560 x 1440

    I get 120+ FPS on ultra settings on other FPS games. The problem isn't people's hardware lol...
  16. noobindo

  17. noobindo

    Game use only one core.
  18. XRIST0

    Does not.
  19. oLd.Sneakers

    I'm not gonna argue with you about it, but you can easily test it yourself by shutting off one core at a time through the BIOS and then looking at the performance.

    Going from 2 to 3 cores does nothing for the game; going from 1 to 2 almost doubles the frame rate.

    Anyway, if you feel the game only runs on 1 core, then why don't you turn off the other 3 and enjoy the experience.
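
    That pattern (1→2 cores nearly doubling, 2→3 doing nothing) is what Amdahl's law predicts if the engine only keeps about two threads busy. A toy model of it; the 95% parallel fraction and the two-thread cap are assumptions for illustration, not measurements from the game:

    ```python
    # Toy Amdahl's-law model of the core-scaling test described above.
    # The parallel fraction and thread cap are illustrative assumptions.
    def speedup(cores: int, parallel: float = 0.95, engine_threads: int = 2) -> float:
        # Cores beyond the engine's thread count sit idle, so they add nothing.
        effective = min(cores, engine_threads)
        return 1.0 / ((1.0 - parallel) + parallel / effective)

    for n in (1, 2, 3, 4):
        print(f"{n} core(s): {speedup(n):.2f}x")  # 1.00x, 1.90x, 1.90x, 1.90x
    ```

    Under those assumptions the second core gives a ~1.9x jump and every core after that gives exactly nothing, which matches the BIOS test result described above.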
    • Up x 1
  20. roDDo

    The short-sighted answer, maybe. Even CPUs from three generations ago aren't fully utilized. I'm talking Core 2 Quads here. Quad cores in general are used to about 50% of their capacity.
    The sad reality is that we simply aren't going to see the kind of CPUs that this game currently requires to run well in its lifetime. Going from a Kentsfield/Yorkfield Core 2 Quad to an Ivy Bridge i7 - three CPU generations - you'll see something like 50% higher IPC. These CPUs span about 5 years of rapid development and ever more ridiculous power draw. Things are different today. We probably won't see a 50% increase in IPC within 5 years. The focus is now on efficiency.
    It's even sadder when you consider that, even then, the game would still only run at 45 instead of 30 fps in 2018. We could have better performance today were this game to fully utilize the hardware we already have.
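
    The 45-vs-30 figure is just that IPC estimate applied to a CPU-bound frame rate. As a sanity check, assuming frame rate scales linearly with single-thread throughput (which is the post's premise, not an established fact about the engine):

    ```python
    # Arithmetic behind the 45-vs-30 fps projection above: a CPU-bound frame
    # rate scaled by the ~50% IPC gain the post attributes to one CPU span.
    current_fps = 30
    ipc_gain = 0.50  # ~50% single-thread IPC improvement, per the post
    projected_fps = current_fps * (1 + ipc_gain)
    print(projected_fps)  # prints 45.0
    ```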