Quick Note on Optimization

Discussion in 'Player Support' started by codeForge, Nov 21, 2012.

  1. JakeLunn

    I'm actually kind of surprised. Maybe I don't have much faith in my system, but my performance has been pretty good despite playing on Ultra. Sure, I would prefer an average of about 50+ fps, but I fare well with my current 30fps minimum.

    Just give it some time though. It's a free to play game so it's not like you all invested $60 into something you can't play - yet. Hell, I bought All Points Bulletin for $50 a week before they announced they were shutting down the servers. I also bought Brink for $60 without a clue that it was going to crash into the ground from poor network optimization. You guys have it easy.
  2. Hatamoto

    So when can we hope to see any patches in this department? It's been mighty quiet in the launcher lately :)
  3. RustyShackleford203

    We appreciate the hard work! Thanks!

    I personally have no frame rate issues. I run a GTX480 and it only starts dropping frames in very large battles.

    I feel much more concerned about players ghosting in and out due to server lag. Even from 10 meters away, they sometimes fail to render in and start ghosting. Is this an issue that can be addressed?

    Thanks!
  4. Meganomic

    This could have been a great game. Too bad the developers don't know what they are doing. They should have hired skilled programmers instead of whatever random people they have now. I hereby unsubscribe because of the crappy performance. 20fps is a ******* joke.
    • Up x 1
  5. Kinigos

    It needs major optimization.
    That said, it's a great game, BUT...
  6. JohnnyBftw

    Is there any chance ATI will release new drivers that make CrossfireX more efficient than it already is with the AMD Catalyst 12.11 Beta8?
  7. nitram1000

    This game could have been amazing, but the piss-poor performance even on high-end computers has totally ruined it for me. Firefights are just a laggy mess that feels like slow motion. Your awful engine has cost you subscribers, me being one of them who cancelled within 4 days of playing.

    Protip: if you have any tricks up your sleeve regarding CPU usage, I would reveal them quickly, because the hype for Battlefield 4 will begin soon and you will likely lose a lot of players.

    Also, why in the holy hell did you use DX9 when a DX11 option would have allowed quad core users to get better FPS? Who made that decision?
  8. dr_Fell

    Thank you, I will try to find something similar for NVidia.

    Any ideas that could bring us better FPS are great, though I have some comments on a few of yours.
    If players were assigned automatically it could destroy gameplay - you couldn't play with friends or squadmates... And it seems easy to abuse.

    The scope view has always been much faster for me than the normal view - even with 20 FPS in crowded areas I can snipe without problems.

    If I understand it right, it could bring some problems... If there are more than 96 enemies in the area you could be hit by an enemy you can't see, right? Although I don't remember ever seeing more than 96 enemies nearby, that's for sure.

    EDIT:
    This thread has just disappeared. Does anyone know why? Those settings really helped me bump my fps a bit.
  9. FierceSmurf

    I really hope it does get optimized, because it's kinda sad that my PC can't run this game lol. I get 60 fps out of fights, but in fights it's just horrible lol
    i7 2700k
    gtx 570
    12gb ram
  10. ImNotDrunk

    Then they should have made a new engine. I just don't like how people are saying that a game like this is difficult to make and that it's hard to support this many people playing an FPS. I understand it's hard, but if SOE is going to take on a project like this, it should be playable. The reason I chose BF3 is because it actually uses all my CPU power, while this game, which is wayyyyy more CPU intensive, barely stresses my CPU. If you go back, I had a programmer reply to one of my posts saying that he doesn't see how 80% isn't stressing my CPU. First off, I haven't seen my CPU ever get to 80% yet, but regardless, 80% load may be fine for other MMOs; when the game is an MMOFPS where every frame is valuable, 80% load just isn't good enough.
    • Up x 1
  11. Takoita

    Thank you for your time and effort!

    Slideshow-like FPS drops happen rarely for me, but any news on the lag issues front would be much appreciated :)
  12. blujay42

    This is ignorance. I have played hundreds of games on dozens of computers, often at or near the minimum spec, and the industry standard has always been that the minimum spec represents hardware that can be shoehorned into running the software at about 30 frames per second. Only in recent years has it become a trend that minimum spec is not appropriately handled. Mark my words: if they posted the minimum spec as an E6850, they intend to bring the game to a comfortable level on an E6850. You have no idea of the scope a project like this represents.

    They didn't reach their goal in time. That's all there is to it. I also had people telling me I could never run Natural Selection 2 because their recommended-spec boxes barely could, and five or six builds later I have gained over 40 fps and it never drops below 60, while in the early beta days I was hitting teens. That took over a year to accomplish.

    Remember that I have a number of computers and no self-interest in misrepresenting the facts. I have a computer that can maintain this game above 45 frames. Obviously I would like to have two computers that could play this game, but if it doesn't work out it will not matter to me. I'm only telling the truth. I've been doing this since Voodoo was a major brand.
  13. Adam Chattaway

    This game is mostly processor bound. I went from unplayable with a Phenom II p55 BE to an i5 3570K and the FPS has tripled; it's playable on max.

    You are using a very old processor. My mate with the same one tried to play the game today and got 5-18 fps, so he stopped. You could buy a GTX 690 and your CPU would still give you bad FPS.
  14. CyclesMcHurtz Code Monkey

    Hi. I'm that programmer, and I think there's a bit of context missing here. I was specifically talking about some playing I did on my AMD Phenom II x4 945. I hopped on to the live game (Connery, Indar, mid-day) and had my performance monitoring running. CPU load sat at 65-80% most of the time, with peaks and valleys both above and below. My GPU (a 560) was maxed out. This was recorded and graphed over a 30-minute period.

    That's exactly what a game should do. The GPU is supposed to be pegged (which is reported as anywhere from 90-100%, depending upon the tools you use), which means the graphics system is in control of your frame rate. The CPU needs to have a little bit of down-time, especially in a game with such huge potential for variation. The 65-80% figure is an average over usually 1-5 seconds, so you won't see the moments where the CPU actually takes 100% or more of the time the GPU work does. When the CPU bursts up to take longer than the GPU, it drops back down again afterwards. If it ran at 100% along with the GPU at 100%, then one bad lag spike would kill it and the CPU or GPU would never catch up.
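    To make the averaging point concrete, here is a toy illustration (made-up numbers, nothing to do with our actual profiler or game code) of why a multi-second average can read 65-80% even while individual frames briefly peg the CPU:

        # Toy example only: a short 100% burst disappears into a windowed average.
        frame_loads = [0.70] * 240 + [1.00] * 30 + [0.65] * 30  # 300 frames, one burst

        window = 150  # pretend the monitor averages roughly 150 frames (a few seconds)
        averages = [
            sum(frame_loads[i:i + window]) / window
            for i in range(0, len(frame_loads), window)
        ]

        print("peak frame load:", max(frame_loads))                   # 1.0 -> a real 100% spike
        print("averaged samples:", [round(a, 2) for a in averages])   # ~0.7 -> what the graph shows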

    We've tried to be pretty honest and open with where we stand on performance and optimizations, and that's why codeForge started this thread. We're not done, and we know very well it's not where it should be. We've found several bugs that have affected frame rate, and we have other work to do in order to make the game scale even better from dual-core up to 24 virtual cores. The unfortunate fact is not all optimization can be done overnight.

    (edit: spelling - hoped is not the same as hopped)
    • Up x 7
  15. Drahzar


    We know the following question is nearly impossible to answer, but: do you have a rough approximation of the time scale in which the FPS improvements might reach their goal? (Nothing too specific, but something like "2-3 months", and please no jokes like "from now to doomsday...")


    Thanks in advance!
  16. SQLOwns

    Cycles/Code, thanks for the info. It's a hell of a lot more than most devs communicate to their customers.

    Cycles, what FPS were you getting during your test? I just ask because I am using that same CPU. I typically get between 20-50 fps depending on what is going on. I have all display settings on low, 82 voice channels, with max FOV and Render. I've never seen the GPU bottleneck, according to the in-game FPS display. Do you think it would hurt to turn up some graphics settings, or should I just leave them on low? I don't know that much about how the CPU and video card process the frames. Would increasing the settings hit the CPU at all, or would it not matter since it looks like I'm not fully utilizing the GPU?

    128 GB SSD
    12 GB DDR3
    Phenom II X4
    Radeon 6950 2GB
    1680x1050 res

    Thanks.
  17. Sirez

    I'm sorry, but what? Your belief is that on minimum specs it is a guarantee that a game will run consistently at thirty frames per second? Not even your belief, your claim that it is the "industry standard"? I fear that you have been misinformed; by whom, I am not certain. Minimum spec is just what it sounds like: the bare minimum to have the game be playable. Not consistent, not smooth, not ideal - playable. That is why there is a distinction between "minimum" and "recommended". It is the entire point of the wording.

    At risk of catching ire for using this as an example - most console games run at about thirty frames per second, and those games are being developed on set hardware (i.e. recommended). Ideally, performance should be consistent at thirty frames, but sometimes heavy load will cause them to dip, or in a worst-case scenario, a lot of the time it will cause them to dip (see the Wii U version of Black Ops 2). With direct ports, it is due to inferior hardware and, you guessed it, improper optimization (at which point it likely wouldn't matter what the hardware was, because it wouldn't be properly utilized anyhow)! Thus performance suffers but the game is playable. Recommended specs are there as more of a guarantee that you will have a smooth and consistent experience, not "minimum".

    To give a PC example, back in the day I met the "minimum requirements" for the original Far Cry, and my average FPS was around fifteen. Not thirty. Fifteen. Did I get to thirty sometimes? Sure. Was it consistent? No. But I digress, as I already stated I do not disagree that further optimization needs to occur to the ForgeLight engine.

    But... this IS a new engine. Are you saying they should scrap this engine entirely, close the game down, create a brand new engine from scratch, then re-test all over again just due to your comparison to Frostbite 2?

    facepalm.jpg

    Some of you seem to be suffering from what a lot of people go through during the first few weeks of a new MMO - an inflated sense of entitlement. Sony is providing you with a service; whether or not you choose to utilize that service is entirely up to you, but many of you make it sound like Sony is purposefully singling you out and trolling you with the engine to ensure you have the worst experience with the game possible. Hell, just a couple of posts above me one of the coders talks about the testing that is going on, not to mention the first post in this thread. If they didn't care about the severe performance issues some of you were having, they wouldn't have bothered to collect data and test for it and, y'know, make this thread at all. The pool of players is much larger after release than it is in a more controlled beta environment - this is probably the best time FOR them to collect data in an effort to fix what's plaguing some of you. So instead of using the forums to breathe fire at Sony (******* Sony! **** game!), help them out. It's already been discovered that possibly the biggest culprit is that some CPUs aren't being utilized completely, so obviously effort and progress are being made. But I'm just using logic. What do I know.
  18. Xavstar

    I previously stated I was getting a solid v-synced 60 FPS. After looking at the FPS figure at the bottom left while I was in a populated base, I saw the FPS was at times going down to 40. But it felt like a fluid 40 FPS, which was weird, as when I got 40 FPS in other FPS games I could feel the lag.

    I downloaded a CPU unparking utility from http://www.coderbag.com/Programming-C/Disable-CPU-Core-Parking-Utility (download link at the bottom of the web page). It takes a while to scan the registry; then click 'Unpark All'. I did this and FPS in populated bases seems to have increased to 50-60 FPS. I have an i5 750 OC'd to 3.9GHz and an Nvidia GTX 580 1536MB.
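    For reference, here is a rough sketch of how you could get a similar effect with Windows' built-in powercfg instead of a third-party tool. This is my own guess at an equivalent, not what the coderbag utility actually does, and the GUID is the one I believe belongs to "Processor performance core parking min cores" - verify it with "powercfg /query" and run this from an elevated prompt:

        # Sketch (assumptions noted above): keep 100% of cores unparked in the
        # current power plan, for both plugged-in (AC) and battery (DC) profiles.
        import subprocess

        CORE_PARKING_MIN_CORES = "0cc5b647-c1df-4637-891a-dec35c318583"  # assumed GUID

        for flag in ("/setacvalueindex", "/setdcvalueindex"):
            subprocess.check_call(
                ["powercfg", flag, "scheme_current", "sub_processor",
                 CORE_PARKING_MIN_CORES, "100"]
            )
        subprocess.check_call(["powercfg", "/setactive", "scheme_current"])  # apply the change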

    If you have 8GB+ RAM you could ramp up disk caching. Here are two guides:

    http://en.kioskea.net/faq/7106-windows-7-increase-the-performance-of-disk-cache

    http://www.howtogeek.com/howto/windows-vista/increase-the-filesystem-memory-cache-size-in-vista/
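    If you would rather poke at it directly, my guess is that both guides come down to the same Memory Management tweak; the sketch below reflects that assumption, so read the guides and back up your registry before trusting it (run it elevated, then reboot):

        # Sketch only: favour the filesystem cache via LargeSystemCache.
        # This is my assumption about what the linked guides recommend.
        import winreg

        KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY, 0,
                            winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
            current, _ = winreg.QueryValueEx(key, "LargeSystemCache")
            print("LargeSystemCache is currently:", current)
            winreg.SetValueEx(key, "LargeSystemCache", 0, winreg.REG_DWORD, 1)  # 1 = favour the file cache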

    I use Windows 7 64-bit on an 80GB SSD. I did turn off Superfetch following the advice of an SSD guide, but I turned it back on as I figured, why not, since I have 16GB RAM. My PS2 Steam folder is on a 7200rpm 2TB hard disk. Game loading to the character selection screen and the first spawn on a continent take a while, but subsequent respawns on the same continent take less than five seconds.
    • Up x 1
  19. vonTossis

    Hello,
    Windows 8 64bit
    Phenom II X6 1055T @ 3.2GHz
    8GB RAM
    2x AMD 6870

    Sadly at low settings + render quality 50% = 25fps :(
  20. CyclesMcHurtz Code Monkey

    Overall graphics quality (the drop-down menu on the right side near the top) should really read "what kind of graphics card do you have?" Start with MEDIUM. Always. If you spent any serious money on your GPU and set things to LOW, then yes - you will be under-utilizing your GPU. In some cases it might even under-clock itself. You just told it to do a lot less work, so it has a lot less work to do. Here are the significant exceptions:

    1) Shadow Quality : The work here is on both the CPU and the GPU, so you should notice this change on both.
    2) Terrain/Flora Quality : This is a little bit on the CPU, so you might see a change here.
    3) Texture/Model Quality : This affects some of the data loaded from the disk, so it is mostly I/O, but that shows up in the CPU for older systems.

    There are a couple of other hot-spots as well. Some cards aren't good at huge numbers of overlapping pixels; Particles and Effects quality will help those people. No, I don't know who they all are, sorry. Fog Shadows are expensive, but they look awesome. Ambient Occlusion and Motion Blur are entirely subjective, and they do affect your frame rate.

    Render Quality is basically what quality of sampling you want. 100% is 1x super-sampling; 50% is "smeared pixels all over the screen" mode, but it's required for some of the older cards because it allows you to run with the UI at a reasonable size while still reducing the pixels drawn.
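    To put rough numbers on that for a common resolution (my own back-of-the-envelope arithmetic, assuming the slider scales each axis, the way resolution-scale sliders typically work):

        # Approximate pixels drawn at 1680x1050 for a few Render Quality values.
        width, height = 1680, 1050

        for rq in (1.00, 0.85, 0.50):
            pixels = int(width * rq) * int(height * rq)
            share = pixels / (width * height)
            print(f"Render Quality {rq:.0%}: ~{pixels:,} pixels ({share:.0%} of native)")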

    The best experience will probably come from starting with everything on MEDIUM unless you have an awesome card (I run mostly on HIGH) and setting a couple of the key things above according to how each affects your frame rate locally.

    At the warp-gate, set your Render Quality to 85%, and then fiddle with the settings until you get to the point where the CPU and GPU are trading back and forth. You *WILL* need to exit the game to see some of these changes. Render Quality, resolution, and the check-boxes are fine, but the rest of them might need a complete restart in order to flush out anything strange.

    Once you get to the flip-flopping point, turn up Render Quality until you see GPU all the time. Maybe add a percent or two. This is where you want to be.
    • Up x 7