AMD: severe performance issues

Discussion in 'Player Support' started by N4poleon, May 26, 2014.

  1. BlackDove

    First off, the OS settings I'm talking about aren't performance related. They're security related. My point is that stuff is rarely set up right by default. Why? Vendors have to cater to the lowest common denominator. Why are TVs almost always miscalibrated? So they look "vivid" in the showroom.

    Second, I never said anything about using MSAA in PS2. I explained how to make it less blurry by adjusting the texture filtering at the driver level.

    And if you use adaptive Vsync, it allows tearing BELOW your refresh rate. Having "more fps than necessary" is actually "more fps than can be displayed", and it's detrimental to gameplay.

    Unless you have G-Sync, your monitor won't display those extra frames, and you'll ALWAYS get tearing, because the GPU just sends frames to the monitor when they're done. You'll be seeing half of one frame and half of another, since the monitor's refresh and the GPU sending frames will ALWAYS be out of sync!
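
    Here's a toy sketch of that timing, if words aren't enough. The numbers are purely illustrative assumptions (a 60 Hz panel and a GPU finishing a frame every 15 ms), not measurements of PS2:

    # Minimal sketch: why an unsynced GPU and a fixed-refresh monitor tear.
    # Assumes a 60 Hz panel (~16.67 ms per scanout) and a GPU swapping the
    # buffer every 15 ms with no Vsync; illustrative numbers only.

    REFRESH_MS = 1000 / 60   # one scanout takes ~16.67 ms
    FRAME_MS = 15.0          # GPU swaps the buffer every 15 ms

    for refresh in range(5):
        scan_start = refresh * REFRESH_MS
        scan_end = scan_start + REFRESH_MS
        # Buffer swaps that land while this scanout is in progress
        swaps = [round(f * FRAME_MS, 1) for f in range(1, 12)
                 if scan_start < f * FRAME_MS < scan_end]
        if swaps:
            # Top of the screen shows an older frame, bottom a newer one: a tear.
            print(f"refresh {refresh}: tear, buffer swapped mid-scanout at {swaps} ms")
        else:
            print(f"refresh {refresh}: whole refresh shows a single frame")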
  2. BlackDove

    I get what you're saying, but this game hasn't gotten "easier to run" in a linear fashion with their optimizations and patches. My suggestion is that with OMFG maybe they tweaked something that's making it run worse on specific CPU architectures. If they made it even more dependent on intra-core bandwidth or cache performance, then you have a situation where you get deteriorating performance.

    The Jaguar architecture doesn't use modules, and the module cores are KIND OF like Hyper-Threading, which also does nothing for PS2. They are optimizing it for the PS4's Jaguar APU too.

    That's mostly my speculation though. I'm not saying that's definitely what's going on. As I said, maybe the OP has failing hardware.
  3. LordMondando


    Ok, first off, I asked for an explanation of something you're touting, not benches: why is single-core performance, and the supposedly massive gulf in it, so critical to playing PS2 at all? Feel free to get technical; I'm a junior software developer and well aware of how the game performs.

    Secondly, I'd look into that data a lot more; it's everything on maximum, which is a bad idea for anyone. Nothing in that data allows the logical inference 'lol AMD no do'. In fact, the data actually shows that if you're on a budget and want to play PS2, you should get an FX-6300 and run at medium settings.
  4. LordMondando

    Mostly due to bugs it looks like.

    My own hypothesis is threefold:

    1) They have introduced far more stringent consistency constraints on the game world. Before, the client and server would happily go well out of sync in terms of where each thought assets were. Consistency is now being enforced to the point where, if there is a significant amount of packet loss, the game will not render frames.

    2) The new Nvidia drivers are wonky as ****, and being DX9, while driver overhead is on a general downward trend, driver batches are sometimes taking far longer than normal to process.

    3) The memory leak is back, yay! These can creep in so easily.

    Yes, but unless they are also writing new compilers, the low-level architecture is actually pretty irrelevant.

    I'm also not on board with the Hyper-Threading statement, sorry. Hyper-Threading is a scheduling system that basically plays Tetris with feeding threads into a pipeline.
  5. Utrooperx

    I've been following this discussion quite closely since my previous post...and took the further step of dropping a note to AMD to see what they had to say...

    Their response is something I'm sure we all can agree with...

    Thanks for your attached files. According to your game issue description, this is more likely an issue related to the game optimization. Just like you say, the A10-5800K is not overclocked; with the frequency of 4.2 GHz it is still under the specification of our A10-5800K design. Besides, some other users with another kind of CPU had the same problem, and your computer had no problem with other games like BF4 and Far Cry 3, so in a word, it doesn't make sense that the APU is the one to blame.
    Thanks for contacting AMD.

    Can we all agree that IF the game were properly optimized, AMD processors would perform immensely better?
    We seem to be debating the wrong point...the real issue is that it is simply a "cop out" on SOE's part to not make the game run properly, whether we are using an Intel or an AMD CPU...
    To put it more simply...we are debating whether a manual or an automatic transmission is better for our car, while completely ignoring the fact that the motor has a rod knocking. Either type of transmission will do, but it doesn't matter if the engine is messed up...
    Fix the engine first.
  6. Kirppu1


    Why would security be tied to performance anyway? Texture filtering is completely different from anti-aliasing. And no, they won't "always" be out of sync.
  7. Dragam

    LordMondando:

    I don't care what you asked for - you're obviously on a crusade for AMD as usual. The benchmark clearly shows what everyone already knows - that the game runs smoothly on Intel Haswell processors with ultra settings, while AMD processors get horrible fps drops.

    I run ultra as well, and my performance is always top notch - I imagine that isn't the case with your bin-can AMD processor.
  8. LordMondando

    I'm asking it because your inability to answer it demonstrates you have no understanding of the underlying technology.

    The cherry picking of what to respond to doesn't look great either btw.

    From which it can only be inferred that you've not actually read most of what I've written.

    No, it shows lower general performance, which I've never contested.

    Good for you.

    And you accused me of bias. I'm glad your computer makes you feel special.

    Now can you please stop presenting your tribal nonsense as 'tech advice'? When someone says 'well, it used to perform at X, but it's now at X-N, what gives, any ideas?', 'lol AMD' is not an appropriate response unless you want to look like an idiot who treats chip makers like football teams.

    Which, alas, is where we are at - yet again.
  9. BlackDove

    Re-read what I said for the first two. You obviously didn't read what I said. I said default settings in software are usually wrong. Why are you confused about what I said about anti-aliasing and texture filtering?

    And yes, if the monitor scans out the image without Vsync enabled it will never be synced, because of how monitors actually display images!

    Even if it's at exactly 60 fps it will NEVER be synced unless the GPU sending the frame is synced to the monitor displaying it!

    Here's an hour-long video with a detailed explanation of how Vsync works. It also covers G-Sync, but it's the best explanation of Vsync I can find.

    http://www.pcper.com/news/Graphics-Cards/PCPer-Live-NVIDIA-G-Sync-Discussion-Tom-Petersen-QA
  10. BlackDove

    The Nvidia drivers only crash for most people when they are in a Scaleform menu and their GPU's load spikes to beyond 100% TDP. It also happened with old drivers. It was not really the drivers. It was the game all along.

    Why aren't you on board with it? I know Hyper-Threading doesn't add separate execution resources, and I know how it works, but I'm saying two AMD cores in a module give you the kind of performance increase Hyper-Threading does.

    Some applications get a HUGE benefit from Hyper-Threading. Others get none. Adding the second of AMD's CMT cores in a module doesn't scale the way adding an Intel or K10 core does. An i5 basically doubles the performance of a Pentium. An i7 barely outperforms an i5 in a lot of cases, but if the application uses Hyper-Threading it can be more significant. The modules behave more like hyperthreaded cores in that way. I know they're architecturally totally the opposite. A rough sketch of what I mean is below.
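
    To put very rough numbers on it (these scaling factors are ballpark assumptions for a well-threaded workload, not measurements):

    # Back-of-the-envelope scaling comparison. The factors are assumptions:
    # an HT sibling adds maybe ~25%, the second core in an AMD CMT module
    # maybe ~70%, and a fully independent core close to ~100%.

    ONE_CORE = 1.00
    HT_SIBLING = 0.25
    CMT_SIBLING = 0.70
    FULL_CORE = 1.00

    configs = {
        "1 core":                   ONE_CORE,
        "1 core + HT sibling":      ONE_CORE + HT_SIBLING,
        "1 CMT module (2 'cores')": ONE_CORE + CMT_SIBLING,
        "2 independent cores":      ONE_CORE + FULL_CORE,
    }

    for name, throughput in configs.items():
        print(f"{name:26s} ~{throughput:.2f}x the throughput of a single core")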
  11. Kirppu1


    When the game runs at 60 fps on a 60 Hz monitor, the monitor gets a new frame with each refresh of the picture, thus making it look smooth. I don't understand how that's not synchronized (unless you are running interlaced, in which case upgrading the monitor is a good idea).

    And you said it was IN operating systems; allow me to quote you: "The vast majority of operating system, driver and game default settings are wrong." And the second quote: "First off, the OS settings I'm talking about aren't performance related. They're security related." You did not say anything about other software such as Game Booster, CCleaner, or anti-virus programs.

    Third quote: "Second, I never said anything about using MSAA in PS2. I explained how to make it less blurry by adjusting the texture filtering at the driver level." This implies that you confused the two.
  12. BlackDove

    No. It will literally NEVER BE SYNCED even if the refresh rate is the same as the fps! Please learn how Vsync and monitors work. That video explained WHY it will never be synced if you watched it.

    I said a lot of default settings are wrong by default. What is confusing about that?

    YOU asked me how to use MSAA in PS2. And you are also confused about Vsync here too.

    "In-game v-sync is no different to adaptive except that it's well adaptive, and please do tell how are you going to force planetside 2 to use MSAA when planetside (Nor any other dx9 game that uses deferred rendering) can have Anti-alias from drivers? Forgelight uses post-process AA which works even with deferred rendering.”

    Idk why you brought up MSAA in PS2. I brought up how to make TEXTURES less blurry by adjusting TEXTURE FILTERING, NOT anti-aliasing.

    And adaptive Vsync is totally different. Watch the video.
  13. Kirppu1

    You brought it up in your thread (the AF isn't going to make any difference to blurry anti-aliasing; blurriness-wise it really shouldn't matter whether it's 4x MSAA or 8x), and you did not say how it isn't synced; instead you insist that I watch a video rather than explaining it in a nutshell.
  14. Octiceps

    FXAA blurs textures universally, whether up close or far away. Low levels of AF blur textures receding into the distance and at oblique angles to the player's perspective. And I don't know how anybody can play a game, especially an FPS, with V-Sync on, as it adds a massive amount of additional input lag on top of an already laggy game and the inherent input and pixel-response lag of your display.
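
    To give a rough sense of the scale (the numbers are illustrative assumptions for a 60 Hz panel with simple double-buffered V-Sync, not measurements of PS2):

    # Extra wait V-Sync can add at 60 Hz: a finished frame sits in the buffer
    # until the next vblank. Illustrative frame times, simplified model.

    REFRESH_MS = 1000 / 60  # ~16.67 ms between scanouts on a 60 Hz panel

    def vsync_wait(frame_ms):
        # Time the finished frame waits for the next refresh to start
        return -frame_ms % REFRESH_MS

    for frame_ms in (10.0, 16.0, 17.0, 25.0):
        wait = vsync_wait(frame_ms)
        print(f"rendered in {frame_ms:5.1f} ms -> waits {wait:5.2f} ms for vblank, "
              f"{frame_ms + wait:5.2f} ms total before scanout even starts")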
  15. BlackDove

    Let's say your monitor refreshes 60 times a second, ok? Or the refresh rate and fps are identical numbers: 120 and 120, or 144 and 144.

    Unless the monitor is synced to BEGIN DISPLAYING THE FRAME AT THE SAME TIME that the GPU sends it from the framebuffer, they will ALWAYS be out of sync.

    You need to understand how monitors display images and how buffering works in order to understand that "in a nutshell" explanation. That's why I said watch the video. I don't feel like explaining ALL OF THAT to you as well.


    And also look up what the High Quality texture filtering setting in the Nvidia Control Panel ACTUALLY DOES TO UNDERSTAND WHY IT MAKES THINGS LOOK BETTER.
  16. BlackDove

    I manage to play pretty well and get headshots with my 280 ping to Briggs.

    I can't understand how anyone with a decent PC DOESN'T use Vsync. Everything looks like **** without it.
  17. entrailsgalore

    Games do look ugly without Vsync, and I am OCD about always using it. However, with a 60 Hz monitor and Vsync on (double-buffered), you are either getting TRUE 60 fps or 30 fps. Even if it says 52 fps or 45 fps, anything below 60 fps really ends up displayed at 30 fps. So recently I started playing PS2 without Vsync, and even though I still see the screen tearing, it is one of those things where you decide what is worth more: no screen tearing at 30 fps (sometimes 60 fps), or screen tearing but allowing your real framerate to be rendered by your GPU to your screen. A quick sketch of why it quantizes like that is below.
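
    A minimal sketch of that quantization, assuming strict double buffering and steady frame times (illustrative numbers only):

    import math

    # With two buffers and V-Sync, a frame can only be shown on a vblank, so
    # the effective frame interval is the render time rounded UP to a whole
    # number of refreshes. Assumption: strict double buffering, steady times.

    REFRESH_MS = 1000 / 60

    for render_ms in (12.0, 16.0, 18.0, 25.0, 34.0):
        interval = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
        print(f"render {render_ms:5.1f} ms -> displayed every {interval:5.1f} ms "
              f"= {1000 / interval:5.1f} fps")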
  18. Octiceps

    Because of the input lag and judder, of course. I can't understand why anyone would use V-Sync in the first place. Is it really worth sacrificing performance and competitive advantage for no tearing? I don't think so.
  19. Dragam

    Octiceps: you could go for the golden middle ground though... adaptive V-Sync :)
  20. Octiceps

    There's nothing golden at all about Adaptive V-Sync. There's still V-Sync induced input lag and judder when FPS pegs refresh rate, and when FPS drops below refresh rate, V-Sync is turned off and there's tearing. Triple-buffered V-Sync is a better solution than Adaptive V-Sync if you truly want the tear-free experience at any frame rate less than or equal to refresh rate with no FPS drops, but again the input lag and judder is not tolerable for me.

    No, the real solution is a variable refresh rate technology like G-Sync or FreeSync, but even that is not perfect. For example, the ideal range for reaping the benefits of G-Sync is 30-60 FPS. Drop below 30 FPS and the experience becomes substantially worse as the display has to duplicate frames, which causes stuttering. Plus, there is a small performance hit with G-Sync due to overhead.