Optimization for AMD Users.

Discussion in 'Player Support' started by Foxknight19, Feb 11, 2013.

  1. Foxknight19

    Will there eventually be optimization for AMD users? I have several friends running Intel who get 60+ FPS even in big battles, but almost all the AMD users I have talked to, and those I've seen on the forums, say they get roughly 30 FPS in big battles.
  2. dr_Fell

    Whether PS2 is well optimized as a whole game is open to debate. But as for AMD - the lower performance is mostly an effect of the clock-for-clock, single-core performance of today's AMD processors being much lower than that of modern Intel processors.
  3. ChadJust

    I think "eventually" is the perfect word here. You can't expect anything, but maybe one day...
  4. LibertyRevolution

    I feel your pain:
    AMD X6 1090T 3.2GHz
    "CPU" bottleneck... 30 FPS in big battles.
    [IMG]

    Well, at least it is using 3 cores..sorta

    Sony Vegas has no problem using all my cores to render videos of their game. :rolleyes:
    [IMG]

    Maybe they should fix their game?
  5. TheAppl3

    Entirely a matter of weaker clock-for-clock performance from AMD chips.

    Eventually people will realize it has little to do with being optimized for either brand. It's solely a matter of AMD having weaker individual cores and compensating for that by having more cores. Unfortunately that "solution" doesn't cut it where most games are concerned.

    Intel i7-3770k 4.3GHz. Same idea:

    [IMG]

    Difference is I get 45+ in big battles. Same problem with core usage, better CPU.
  6. LibertyRevolution

    Yep, my stuff is old, got it in 2010... and Intel has always been better when it comes to FPS.
    [IMG]

    I'm cheap, I go AMD... I got 3 years out of this $200 CPU, not too bad.
    I do video editing on this box, and all that software uses multicore...
    On the bright side, I can render video with the idle cores while playing PlanetSide 2, with no performance hit...

    (shameless pic and video)

    [IMG]

    The new AMD chips seem like a nightmare... I get better numbers than they do.
    I will be going Intel for my next build.
  7. TeknoBug

    You should bump that 1090T to 3.6 or 3.8 GHz (bump the voltage to 1.475 V); just raise the multiplier to 18x or 19x.
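    For reference, the multiplier math on the 1090T works out roughly like this (assuming its stock 200 MHz reference clock - illustrative numbers, your board may differ):
    200 MHz x 16 = 3.2 GHz (stock)
    200 MHz x 18 = 3.6 GHz
    200 MHz x 19 = 3.8 GHz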
  8. dr_Fell

    No, it is unfair to say so. Years ago, with the first Athlons (which were better than the Pentiums of the day) and the Durons (which devastated the comparable Celerons by up to 60%, ESPECIALLY in games), and then with the Athlons X2, which were better than the Pentium Dual Core as well - AMD was the company of choice for games, not Intel. AMD was cheaper and better. Unfortunately, that was many years ago.

    But even the Phenom II, compared to the Core 2 processors, was equal - maybe even better. Since then Intel has made two steps forward with Sandy and Ivy Bridge, and AMD has made a step backward with Bulldozer, which in terms of single-core (clock-for-clock) performance is worse than Phenom II. That's the problem. AMD is three generations behind Intel now.
  9. drednok

    Cheap doesn't mean bad in regards to AMD; Intel is just expensive. I'm sure that eventually, saving a lot of money on a CPU will pay off for this game. BTW, I'm pretty sure you don't ever need more than 4 cores for gaming.

    BTW, 2 years old isn't bad at all nowadays. The days of things going obsolete in a year or 2 are over, unless they make a serious breakthrough. That's still the best AMD processor for gaming, IMO.
  10. LordMondando

    Facts-

    1) The distinction between AMD and Intel optimisations is largely nonsense; the x86-64 architecture is what it is. You don't write a program for a specific processor in this context.
    2) Largely due to their tri-gate transistors, Ivy Bridge chips currently do significantly outperform their Piledriver competitors in per-core performance. However, depending on the workload, that does not translate into better overall performance: in highly multi-threaded tasks the FX chips still tend to beat Ivy Bridge. That said, with 8 cores the game currently puts load on all of them, but mostly on one, with only around 10% on another 3-4 of them (rough illustration after this list).
    3) The FX Piledriver range is drastically better than Bulldozer. I have an FX-8350, currently underclocked @ 4.1, and I get 28 minimum, median 32-35, in large battles (large battles on Miller...) with shadows on - comparable performance to an i5 35xx.
    4) Devs have been saying for a while that they are disappointed with Phenom performance, and have been suggesting that more multi-threading optimization is a big goal. I expect things to get better for Deneb and Thuban owners.
    5) Regardless of what various marketing departments would have you believe, there have not been the big jumps forward in actual CPU performance that are claimed. Tri-gate transistors are a notable exception, but we are talking 15% at most. Big gains in power efficiency are irrelevant to gaming. Moore's law is reaching its limit.
    6) A lot of people claiming "I get 50+ in battles" are, I imagine, playing on low-population servers. I get that in medium battles myself. On Miller, the difference between 30 and 40 FPS in a large battle - at best a 25% performance boost - costs hundreds of pounds.
    7) Given that the game doesn't appear to make use of Hyper-Threading, the best CPU for the game is the i5-3570K.
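    The rough illustration promised in 2): a minimal Amdahl's-law sketch in C. The 80% serial share is an assumption picked for illustration, not a number measured from PS2.

        /* Minimal Amdahl's-law sketch: speedup from N cores when a fixed
           fraction of each frame's work is serial (stuck on one thread).
           The 0.8 serial share below is an illustrative assumption, not a
           PS2 measurement. */
        #include <stdio.h>

        static double speedup(double serial, int cores) {
            return 1.0 / (serial + (1.0 - serial) / cores);
        }

        int main(void) {
            const double serial = 0.8;   /* assume 80% of the frame on one thread */
            for (int cores = 1; cores <= 8; cores *= 2)
                printf("%d cores -> %.2fx speedup\n", cores, speedup(serial, cores));
            /* prints roughly 1.00x, 1.11x, 1.18x, 1.21x - with a mostly serial
               workload, eight slower cores can't catch four faster ones */
            return 0;
        }

    In other words, once most of the frame sits on one thread, per-core speed decides the frame rate and extra cores barely move it.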

    So yeah. Please stop with the misinformation people.
  11. dr_Fell

    I am definitely not an authority in this area, but my opinion differs a bit, so I'd like to state it too :)

    According to most sources, there are some differences. I can't judge whether those are very significant in terms of gaming performance or not. Since I am just a Gentoo Linux user, one who compiles programs rather than writes them, I was curious whether I could find anything on Google about optimizing programs for a certain architecture.
    One of the first things I found (and it seems one of the most valuable too) was this:
    http://www.agner.org/optimize/microarchitecture.pdf
    It seems to describe (I have only taken a quick look, I didn't read it all :) ) differences not only between AMD and Intel 64-bit processors but even between different generations of 64-bit processors within each of the two companies.
    short quote from the introduction:
    You can read more about the author here:
    http://jsandber.wordpress.com/2012/04/29/agner-fog-26-2/

    Also (not as deep of course): http://en.wikipedia.org/wiki/X86-64#Differences_between_AMD64_and_Intel_64

    I agree here - this is mostly visible in 3D rendering tasks, which thread almost perfectly. In other tasks that are also threadable, it is sometimes Ivy Bridge and sometimes Piledriver that performs better. Piledriver's cost is much lower too, so for applications that can be threaded almost perfectly it is usually the best choice.

    Whether we trust benchmarks or not is open to debate (or better not :) ). But almost every benchmark I have seen suggests a 30 to over 50% advantage for the Core i5-3570 over the FX-8350 in single-threaded tasks and in games.

    With a 30-50% advantage in per-core performance, a stock-clocked i5 35xx shouldn't have problems getting 40 or more FPS where an FX-8350 gives 30. Add to this that some of them are overclocked by 20-30%, and 50-60 FPS doesn't seem so unreal (rough arithmetic below).
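    Rough arithmetic (illustrative numbers, assuming the frame rate scales more or less linearly with single-core speed, which it won't do perfectly): 30 FPS x 1.3 ≈ 39 FPS and 30 FPS x 1.5 = 45 FPS at stock; add a 20-30% overclock on top and you land somewhere around 39 x 1.2 ≈ 47 up to 45 x 1.3 ≈ 58 FPS.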

    Hope so, hope so... :) Deneb cores definitely need more devs' love :D

    It seems so :)
    • Up x 1
  12. TheAppl3

    People can spend an entire day repeating "My FX-8350 performs just as well as a 3570k in single-threaded tasks, I get the same framerate" and that won't make it true. Fact is, the 3570k works better even at lower clocks.

    Most 8350 users seem to get 32-40fps in massive battles. Most 3570k users see around 38-47. Before overclocking my 3770k, which is just a 3570k with a 100MHz advantage, more cache, and Hyperthreading, I still only very rarely dipped below 40fps to around 37 in the largest engagements at major facilities.

    Every time someone says that Intel does something better, they're accused of being a fanboy (an incredibly stupid term if you ask me). Guess what - I still have my Athlon 64 3200+, which kicked the teeth out of the Pentium 4s. AMD had its time of being better. Right now, it's Intel. AMD is better for the budget-minded, but trying to state that an AMD chip works through heavily single-core-dependent applications like PS2 just as well as a 3570k is blind misinformation.


    It's not a matter of silly devotion to one brand. It's a fact based on the actual performance of each type of processor. If games had recently jumped to being able to use eight threads with full efficiency, AMD chips would be top dog and only a select few Intels like the 3770k would even be comparable, at nearly double the price. That hasn't happened yet.

    Also, 50 FPS is not that unreal. With a mild-to-moderate overclock to a 4.3 GHz peak from 3.9, I get mid 40s to 50 in large battles. At 4.8 (waiting for the D14 for that) I might be able to sustain 50+.
  13. Foxknight19

    This isn't an "Intel is better than AMD" thread. The truth is they should perform relatively similarly in gaming; the place where Intel outperforms is in sheer multitasking. They are both good processors. Thubans only have SSE4a, whereas the game was designed around SSE4.1, which the Bulldozers do have (that's why not many FX users complain about an FPS drop). That is why I wanted to see if the devs are going to optimize for the slightly older AMD CPUs.
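    If anyone wants to check which of those instruction sets their own CPU actually reports, here is a minimal sketch in C using the CPUID instruction (GCC/Clang on x86; the bit positions are from the public Intel/AMD CPUID documentation - this is just a stand-alone check, nothing from the PS2 client):

        /* Minimal CPUID check for SSE4.1, SSE4.2 and SSE4a support.
           Build with: gcc -o ssecheck ssecheck.c */
        #include <stdio.h>
        #include <cpuid.h>

        int main(void) {
            unsigned int eax, ebx, ecx, edx;

            /* Standard leaf 1: ECX bit 19 = SSE4.1, bit 20 = SSE4.2 */
            if (__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
                printf("SSE4.1: %s\n", (ecx & (1u << 19)) ? "yes" : "no");
                printf("SSE4.2: %s\n", (ecx & (1u << 20)) ? "yes" : "no");
            }
            /* Extended leaf 0x80000001: ECX bit 6 = SSE4a (AMD) */
            if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
                printf("SSE4a : %s\n", (ecx & (1u << 6)) ? "yes" : "no");
            }
            return 0;
        }

    A Phenom II / Thuban will report SSE4a but not SSE4.1; Bulldozer/Piledriver and the recent Intel chips report SSE4.1.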
  14. Prodigal

    So Bulldozers should perform rather well? I have one and it isn't anywhere near good. Can you give any advice? Many thanks!
  15. Foxknight19

    Make sure your power settings are set to high performance; it consumes more power but prevents the CPU from throttling down. See if that helps.
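    (On Windows that's Control Panel > Power Options > High performance. From an elevated command prompt, "powercfg /setactive scheme_min" does the same thing - scheme_min is the built-in alias for the High performance plan.)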
    • Up x 1
  16. NC_agent00kevin

    I'm going to be frank here. This whole "i7s are better than AMD chips" thing is the dumbest argument I've seen yet. Based on the context of this argument, everyone should be running out to buy i7s so they can play this game at an acceptable framerate.

    PS2 players shouldn't be relegated to a single brand of CPU. AMD CPUs perform just fine in every other game. We have 2 gaming PCs that run poor old Athlon II X2 CPUs, and guess what? We can still run every game we have at high settings with a few medium settings sprinkled in here and there. Except PS2.

    You also shouldn't have to OC one of those to a ridiculous 4.8 GHz to attain framerates in the 50s. That is an over-the-top amount of processing power, especially for a free-to-play title. Add to that the gap between the price of a GPU that will get you good frames and a CPU that will do the same, and you can see the imbalance. On the GPU side, all it takes to get frames in the 50s is a GeForce 9800 GT 1GB, yet you need a ridiculously fast CPU to achieve the same thing.

    All that said, they need to make the engine utilize all the cores on any chip more efficiently. Why recommend a 6-core CPU for this game only to get the same FPS as a quad? If this game is supposed to be a 10-year game then it's time to step up and start acting like it. Use all my cores to their maximum capability. :)
  17. TheAppl3

    I didn't say the PS2 situation made sense. The i7/i5 > FX argument isn't stupid, it's a fact for PS2 and other heavily single-threaded applications. That doesn't mean it should be that way. It's dumb that there is so much of a difference, but the argument is sound because all it does is state a fact. Each is superior in its own situation, but overall they're pretty much the same. Then you have the exception situations like PS2.

    PS2 as a whole is borked because, unlike every other game, you're relegated to the CPUs with the absolute best individual core performance just to get acceptable performance. Those are the newer i5/i7 chips. It's stupid, I agree. AMD has a great market in people who are fine with getting 82 FPS instead of 90, because it doesn't matter and for much less money they play the game exactly the same. It just doesn't work in PS2 because performance is so incredibly poor that it unfairly highlights the performance difference between an FX and an i5/i7. A difference of several FPS doesn't matter at 60+; it only matters here because we're straddling the line between barely playable and unplayable.

    I agree it's stupid that I would have to pull a 1GHz overclock on a $300 chip to maintain a frame rate that is still under a standard monitor's refresh rate.
    • Up x 1
  18. Prodigal

    Will I do that in the BIOS or somewhere else?
  19. dr_Fell

    Sorry to say it, but the truth is almost the opposite of what you have written. In multitasking and in applications that can fully utilise 8 or more cores, it is the 8-core AMD that will outperform the Intel CPUs (at least in most cases). In games, since they are not as easily threadable, the CPU that delivers better single-core performance will usually perform better. And today's Intel CPUs have much higher single-core performance.
    Since most games don't need as much CPU power as PS2, in most cases both today's Intel and AMD processors will perform "well enough", so you won't see much difference; you will just be GPU-limited, or limited by refresh rate, etc. In some cases, when the game needs a great deal of CPU processing power, Intel Ivy Bridge processors will perform better.

    And we are not talking about which CPU is "better", we are talking about which one has better PS2 performance. This is exactly the place for doing that.


    If you have money that you want to spend on the CPU that at this moment has the best performance in PS2, then yes, you should buy at least an Ivy Bridge i5. An i7 is not necessary - why has been explained a few posts above.

    But we are talking about PS2 here, not any other game. There are another hundred threads discussing how unoptimized this game is, and it is probably true, at least partially. Maybe, after further optimization, this game will run fine on Bulldozer, Piledriver, Sandy Bridge and Ivy Bridge. For now - look above.

    In other games the difference is usually between getting 70 FPS with Bulldozer and 110 FPS with Ivy Bridge, and you won't notice it anyway. Here it is the difference between 30 and 40 FPS, and it is visible as hell.
    • Up x 1
  20. Hypersot