[BUG] Only 40 FPS with an overclocked GTX970... REALLY????

Discussion in 'PlanetSide 2 Gameplay Discussion' started by SPL Tech, Aug 9, 2016.

  1. travbrad


    The only engine I know of that does it consistently is the Frostbite engine (used for all the Battlefield games since BF3). That maxes out all 4 of my cores, and even people with 6 core Intel and 8 "core" AMD CPUs say it maxes out those too. GTA5 is the only other game I can think of that I've seen make full use of my 4 cores.

    So making games heavily multi-threaded is certainly possible, but it's obviously difficult to do. Both Frostbite and GTA have huge budgets and tons of people working on them. There are still A LOT more games that use about 1.5-3 cores worth of CPU power though.

    Theoretically we should see more games start using multiple cores, since the new consoles have 8 really slow cores and coding games that way is the only way to get good performance out of them. Whether that work gets ported over to PC remains to be seen, though... widespread 4+ core support has been "just around the corner" for the last decade.
  2. Taemien

    Ironically, EverQuest 2 was designed back in 2002-2004 to be 'future proof'. The idea was that it would be so graphically intensive (for the time) that no one could run it at max settings; everyone would be on low and medium, and over time, as people got better computers, they would be able to increase the settings.

    They did not take the GPU into account, nor did they anticipate CPUs gaining more cores instead of higher clock speeds. The game was literally designed around there eventually being a single-core 6.0GHz CPU with a moderate GPU (just enough VRAM to hold textures). For much of EQ2's history it ran like crap on decent machines. They eventually optimized the code a bit, but it's still on the sketchy side.

    I do remember my old laptop, a 1.8GHz dual core with a GeForce Go 7600, running it well; better, in fact, than a 2.2GHz quad core with two GTX 8800s in SLI. How much sense does that make? Well, the laptop had an Intel Core Duo and the PC had an AMD chip. That's why.

    It eventually got better, but like PlanetSide, you need the most efficient CPU you can get your hands on. Clock speed and core count don't matter so much as that per-core efficiency. And for the GPU, about 1-2GB of video RAM will likely do it. I know it sounds stupid, but that's just the way PS2 works, along with a lot of other Daybreak games.

    Most modern games are quite a bit better though, relying more on the GPU than the CPU. Which is why we see many people in PS2 struggling: they have decent GPUs, usually GTX 780s or R9 280/290/390s, but a slower CPU, or an AMD CPU. AMD just doesn't like Daybreak games, in my experience.

    But some have decent i5s and i7s and run the game at over 150 FPS.


    As a bit of an indie developer myself, I can see why they don't use modern technologies. They don't wish to invest in things that most of their end users won't have. Think about it this way: some people are still running XP and Vista, which means DirectX 11 and 12 are right out. Up until two years ago, 32-bit OSes were still a thing.

    Then you've got Intel with Hyper-Threading and whatever AMD's version is (if it's out yet). Which do you code your game for? Some people have dual cores, quad cores, hex cores, and more. What do you go for? It's not easy to code for all of those possibilities, and it takes forever and gets crazy to test.

    For GPU standards it's getting nuts too. DirectX 9 is slowly becoming less of the plateau it once was, so now we can finally focus on 11. But 12 is right around the corner. And what about PhysX and AMD's equivalent? Use both, neither, or something else? Maybe Vulkan? Which one will actually STICK in the industry?

    At the end of the day, you code your game for the lowest common denominator to reach the widest audience. Within reason. Sometimes your game just requires a certain level of oomph. Not every developer is going to make something like Terraria or whatnot.

    But yeah, there you go.. That's why we see games that don't take advantage of advanced hardware technologies. Doesn't mean they won't. Some games do. But this topic is specifically about PS2. Which really doesn't.
  3. SPLTech

    The photo in my OP very clearly shows the game using all four cores. All four are running at near 100% capacity, versus under 10% when the game is closed. So yeah, the game will use as many cores as you have, as does nearly every game out there. I've never opened Task Manager and seen only one or two cores being used while the others sit idle.
  4. ButterNutts

    Eh, I don't know what FPS I get, nor do I care. All I know is that PS2 is smooth for me, even in the largest battles I've ever seen.

    As long as the frame rate is silky smooth (which it always is), then I'm happy.
  5. travbrad


    You have one of those magical CPUs though. Here is what I get on my 4 core (2500K) and GTX 970:

    [IMG]
  6. SPLTech

    That is strange. I would think at least one core would max out. All four of my cores are used in every game I play. Even when I had the E8400, both cores would max out in every game. Are you running Windows 10? As Windows has progressed, it has become better at spreading programs across multiple cores, even programs designed to run on only one.

    If I have a quad-core processor running at 4.6 GHz and I can't even maintain 60 FPS, I can't imagine what you would be running at. 30 FPS? Talk about an extremely poorly written game. Firefall is an MMOFPS just like PS2, but it has far better graphics and I can get 100+ FPS in it; it will max out my GPU. Like PS2, Firefall has battles with 100 players in the same area.
  7. Eyeklops

    Curious to know what your RAM speed is.
  8. orangejedi829

    This list:
    https://www.cpubenchmark.net/singleThread.html
    will tell you, in order, which CPUs are best for gaming. This is because if a game is CPU limited, it will almost always (in a quad-core or higher) be limited by the primary thread, which is limited by the CPU's single-threaded capability. (You'll notice that more cores =/= better performance in this case. Often it means less performance because of power constraints. As time goes on, and things like DirectX 12 improve multithreading in games, having more than 4 cores might become useful. But for now, that's not the case.)
    Anyway, the i5-4570 is #94 on this list.
    So not terrible by any means, but not the biggest and baddest out there either. 40fps in large fights sounds about right to me.
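    The single-thread-bottleneck argument above can be sketched with Amdahl's law. A minimal illustration (the 60% serial fraction is an assumption for the sake of the example, not a measured figure for PS2):

```python
# Amdahl's law: if a fraction of each frame's work is serial
# (stuck on the game's primary thread), extra cores help less and less.
def amdahl_speedup(serial_fraction, cores):
    """Upper bound on speedup with `cores` cores when `serial_fraction`
    of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Suppose 60% of a frame's CPU work lives on the main thread (illustrative):
for cores in (2, 4, 8):
    print(cores, "cores ->", round(amdahl_speedup(0.6, cores), 2), "x speedup")
```

    With a 60% serial fraction, going from 4 to 8 cores buys almost nothing, which is why single-threaded speed dominates once a game is CPU-limited.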
  9. fumz

    This link doesn't list the "best" anything. It's just a list, and it's in no particular order either.

    There's a reason why Passmark is universally despised and never cited by any legitimate reviewer, ever: it's trash and completely unreliable. The sample sizes are all over the place, and within each sample there's zero uniformity. Passmark is paid advertisement, so it shows up as the first Google hit, and a lot of people who have no clue what they're doing download it and submit scores. Again though, there's no standard or baseline. If you're looking for an apples-to-apples CPU test, which is hard because the platform matters, it's best to look elsewhere.

    The simple truth is that the op's 970 is underperforming because of cpu strength. Overclocking the card is pointless for this game. He needs to oc cpu instead... as you said, accurately, to get better single-threaded performance.
  10. fumz

    There's a difference between "using all 4 cores", "4 cores are in use", and "efficiently".

    Earlier you said that with the game closed cpu usage was < 10%. That's actually pretty high. I have Edge and FF open typing this out and cpu usage is 1%, so something else is going on which is restricting game performance.

    You have more than enough cpu strength to run 120fps, at least 60-70fps in large 96+ battles, and your card is more than strong enough. It's something else running that's messing you up.
  11. orangejedi829

    With enough samples, it doesn't matter who downloads the benchmark and whether or not they "know what they're doing". That's basic statistics. And the sample sizes do not have to be uniform either, as long as they are sufficiently large: https://en.wikipedia.org/wiki/Central_limit_theorem
    People who bash Passmark for these reasons do not know what they are talking about.
    Passmark is fine for comparing CPUs and for judging single/multi-core performance, even if it doesn't perfectly represent the "target" workload.
    Plus, the list (which you claim doesn't list the "best" anything) does list the best processors in Intel's lineup, just as one would expect it to. Unless you think CPUs like the i7-6700k, i7-4790k, and latest Xeons don't have the best single-thread performance?
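    The central-limit argument above can be demonstrated with a quick simulation. The score and noise figures below are made up purely for illustration (they are not real Passmark data): each submission is modeled as a "true" score plus submitter-dependent noise, and the average stabilizes as submissions pile up.

```python
import random

# Hypothetical model: each submitted benchmark score is the chip's "true"
# score plus noise from the submitter's setup (background apps, RAM timings).
def mean_of_submissions(true_score, noise, n, seed=0):
    """Average of n noisy submissions; deterministic via the seed."""
    rng = random.Random(seed)
    scores = [true_score + rng.uniform(-noise, noise) for _ in range(n)]
    return sum(scores) / n

# With a handful of samples the average wanders; with thousands it
# pins down the true score even though no single submission is clean.
print(round(mean_of_submissions(2000, 400, 10)))
print(round(mean_of_submissions(2000, 400, 10000)))
```

    This is the sense in which uncontrolled submissions can still produce a usable ranking, provided the per-CPU sample counts really are large.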
  12. SNEAKYSNIPES1

    I feel your pain, dude. I'm rocking a GTX 1080 (which made me broke) and I'm only getting about 100 FPS, whereas Bad Company 2 gives me a solid 144 FPS, which is my monitor's cap.
  13. fumz

    There aren't enough samples. If you click links you can see for yourself.

    Yes, everyone who's been reviewing cpu's for the last 20 years doesn't know what they're talking about... ;) Cause orangejedi said so...

    We actually have no idea what it is exactly that passmark is doing, which is one of the fundamental problems with passmark.

    It lists any CPU that's been submitted, period. Everything else, like clock speed, the platform, or even whether the knucklehead who made the submission ever loaded the optimized defaults, doesn't matter. It's asinine to use it to judge CPUs... which is why nobody does. Well... almost nobody, amirite? o_O
  14. Taemien


    I'll add this to the point.

    I do have a GTX 970 and an i7-4790K, and I can tell you all, there's a correlation between that list and performance in PS2. While I'll agree that Passmark doesn't tell the whole story for most games, it is pretty accurate for PlanetSide 2.

    Building PCs over the last 6 years has been frustrating, to say the least.

    I used to run PS2 on an AMD Phenom II X4 970 @ 3.5GHz and an ATI Radeon HD 5870, and the performance was good. I built that PC back in 2010. But in 2014 I wanted to upgrade; I saw performance starting to drop in many other games and I wanted to keep my smooth gameplay, so I began to look for upgrade options.

    What I saw people having issues with in PS2 was stupid. Absolutely stupid. NEW hardware in 2014 was getting HALF the frames of my 4-year-old parts, specifically AMD/ATI products. It didn't make any damn sense. It's like AMD took a step backward.

    Intel wasn't too much better. I was looking at spending $300 for a CPU that may not be a large upgrade.

    The 900-series GPUs came out at a decent price in 2015, so I jumped on one, liking the performance they could give. I used Passmark to judge the 4790K I now have, and it has performed admirably. From my own research, the only other CPU that can match it is the 6700K (which, ironically or not so ironically, is #2 on that list above). My current system is the GTX 970 and the i7-4790K.

    Here's another interesting tidbit:

    My GTX 970 failed after a month. Under warranty I sent it back for a replacement free of charge (EVGA has decent service for this). During that time I was without a GPU, so I stuck the ATI Radeon 5870 back in. You know what? The performance in PS2 was still about the same. Slightly worse, but still very playable and smooth at 60-70 FPS.

    But yeah.. we get these self-proclaimed 'experts' saying Passmark is bad, but they don't have any counterpoint other than that. Just statements like, "Oh, well, everyone who knows what they're doing knows it's bad." OK, post a different listing site. Wait.. it doesn't exist, does it?

    So you expect everyone else to dig through many pointless reddit posts and other badly formatted forum-type websites to find the diamonds in the rough to see what is good and what isn't? F-ck that sh-t. Here's all the info anyone needs for PS2:

    Get an i7-6700K or an i7-4790K, depending on your preference in motherboard (they use different sockets), and don't look back. They're likely future-proof for the next 5 years, and you don't need to OC.

    Speaking of overclocking: the only ones who should be overclocking CPUs for gaming are those who have old AF hardware and want to squeeze another year out of it. There's no reason to do it with a brand-new processor unless you went cheap. Here's my advice:

    Get one of the above CPUs. Let it ride for 5 years, then upgrade. If your financial situation isn't good, OC for another 2 years. If it's still not good, start turning settings down.
  15. orangejedi829

    There are far more than enough samples for all of the popular (consumer) CPUs.
    *Because high school math says so
    FTFY.
    Doesn't matter, it's still apples-to-apples, and the 'top' CPUs are exactly what you'd expect, so it's clearly doing something right.
    You don't seem to realize that all of this is a non-factor in what is essentially a simple random sample.

    Statistics. Seems like nobody understands it, amirite? ;)

    This. So this.
    Well said.
  16. Rebelgb

    Strange, I have an 8-core CPU (AMD) and I run at 80+ in big fights.

    So say that multiple cores don't matter if you want...
  17. orangejedi829

    Video?
  18. fumz

    /facepalm...

    First, is clicking links really that hard?

    The first two submissions for the Xeon at the top of the list are actually from the same guy; he just tested with 8GB and 16GB:
    http://www.passmark.com/baselines/V8/display.php?id=49312451197
    http://www.passmark.com/baselines/V8/display.php?id=49299948951
    Perhaps you can explain why these two scores are so different from the other two?
    http://www.passmark.com/baselines/V8/display.php?id=43138461196
    http://www.passmark.com/baselines/V8/display.php?id=39900820554

    passmark is easily manipulated... much like 3dmark.

    Second, Skylake has better IPC than any previous chip, so explain why Passmark lists all these CPUs as having better single-threaded performance than the 6600K, even though some of the CPUs Passmark says are "better" are clocked as low as 2.9GHz: i3-4170, i3-4350, Pentium G3470, i7 6700 @3.4GHz, i5 4670, i7 6900 @3.2GHz, i3 4360, i7 4770S, Pentium G3258, i7 4960, i5 5675, i7 4790T, i7 4740MX, i5 4670K, i5 4690S, i7 4771, i7 3740, i7 4690K, i7 5775C.

    Going by this "guide", the ignorant man would purchase any one of these CPUs and expect to get better single-threaded performance than the 6600K, and thus better PS2 performance. That would be a really stupid mistake to make.

    Lastly, the difference in score between the 6600K submissions is pretty big, from 8549 to 10284. How does that happen if this "test" measures a CPU's single-threaded performance?

    Can you explain any of this? Never mind the absurdity of suggesting that Xeons have better single-threaded performance than any gaming CPU Intel has put out in several years...
  19. Gundem

    Well, that could change with the upcoming Cannon Lake CPUs, which will shrink the transistor size from 14nm down to 10nm; that could mean performance improvements like those seen from Maxwell to Pascal. We should be seeing Cannon Lake sometime mid-2017.

    That being said, Intel has been a huge ******** about actually improving CPU speed, focusing instead on temperatures, power consumption, and ******* integrated GPUs, so the improvement of Cannon Lake over Kaby Lake or Skylake might be less significant. But as always, we'll have to wait for the benchmarks to tell the true story.



    I'd beg to differ on that point. A $30 air cooler and a little common sense can bring a Haswell-based CPU all the way up to 4.5GHz with almost no overvolting, a massive performance boost over Intel's built-in Turbo Boost. Sure, it might reduce the lifespan of your hardware somewhat, but not significantly enough to justify not overclocking.

    Of course, that's my personal opinion, so feel free to overclock or not as you wish. But I'd personally do it without a second glance.
  20. orangejedi829

    Facepalm indeed.
    I specifically mention consumer chips, and yet you link to a Xeon that has 4 submitted tests.
    What relevance does that have to this discussion?

    All of those CPUs should have single-thread performance on-par with the i5-6600k.
    Need I really explain? Fine.
    The i7s are all Ivy Bridge and later; it's no real surprise that they're still more than a match for a mid-range Skylake i5, especially since Intel hasn't exactly been making monumental leaps in core performance over the last few generations. Also keep in mind that while the 6600K only 'boosts' by 400MHz, many of the i7s you listed, such as the one running "@2.9ghz", boost by 1000MHz or more. And naturally, they'll be running at the boost clock during a stressful activity like benchmarking.
    The i3s and Pentiums are also all clocked quite high; every single one of them has a higher base clock than the 6600K. And since they're only dual-core, they have half the thermal constraints of the others.
    So no, it doesn't surprise me at all that the CPUs you listed could have a slight edge on the 6600K in single-thread performance (or in overall performance, for many of them), even though it may surprise some out there (*ahem*) since "OMG! But the 6600K is Skylake! It's newer!"
    Is there variance between the individual tests? Yes. Does it make a statistical difference when there are literally thousands of submitted results? Nope.
    Now, if you'd like to provide some actual benchmarks that dispute any of this, be my guest.
    Hang on.. are you for real here? You really think Intel's flagship, extreme-workload CPU line would be worse than their consumer chips which cost 1/10th as much? Sure, the 16+ core variants might have tighter thermal constraints per core, but I can guarantee you they still ain't gonna be no slouch. And the only CPU I know of that Intel has ever specifically targeted at "gaming" is the Pentium G3258 (which you lambasted in your reply) because it has... wait for it... awesome single-threaded performance!
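    The boost-clock arithmetic above can be made concrete. A rough sketch (the 6600K figures match its published 3.5GHz base / 3.9GHz turbo; the low-base i7 is the hypothetical "2.9GHz + 1000MHz boost" chip from the argument, and assuming the chip holds its maximum boost is a simplification):

```python
# Under a sustained single-threaded benchmark, a chip with a low base clock
# but a large boost can land at the same effective clock as one with a
# higher base clock and a small boost.
def boost_clock(base_ghz, boost_ghz):
    """Effective clock assuming the chip sustains its full boost."""
    return base_ghz + boost_ghz

low_base_i7 = boost_clock(2.9, 1.0)  # "2.9GHz" i7 boosting by 1000MHz
i5_6600k = boost_clock(3.5, 0.4)     # 6600K: 3.5GHz base, 400MHz boost
print(round(low_base_i7, 1), round(i5_6600k, 1))  # both land at 3.9
```

    Which is why a benchmark run at boost clocks can rank a nominally "2.9GHz" part alongside a 3.5GHz one.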