Quick Note on Optimization

Discussion in 'Player Support' started by codeForge, Nov 21, 2012.

  1. HiFiWiFi

    I wish I had it like that. I get 15-20 FPS in big fights and 25-30 otherwise, on High/Medium settings. CPU: 2670QM, GPU: GT 540M 2GB, RAM: 8GB 1.3GHz, HDD: 5400RPM. This is a laptop, btw, and the res is pretty much 1366x768.
  2. HiFiWiFi

    Wallhax
  3. Xavstar

    Have a look at http://www.notebookcheck.net/NVIDIA-GeForce-GT-540M.41715.0.html. This is a good website for seeing how laptop GPUs fare.
  4. FearTheCow

    Dear SOE,

    I saw the patch notes thread for the upcoming patch; where are the optimizations promised in the OP? It is coming up on two months since your vague promise of optimizations, and yet nothing more has been said or done about how poorly optimized the game is. Were you lying just to try and keep people playing?
    • Up x 1
  5. Hatamoto

    I've been wondering too... it was supposed to be this big optimization patch. It was going through heavy QA and was supposed to be released in January, remember that one? Or is it something I dreamt?
    • Up x 1
  7. Master Mace

    -Try getting another $500 rig. You'd be surprised how much changes every 6 years.
    -People here are pointing out that better processors don't help because the game is not threaded properly. It piles lots of stuff on the first core, and tapers off to the last one.
    -DirectX 11 came out 3 years ago. Your GeForce 8600 is not fine, it is very old. DX11 isn't the "future of gaming", it's already in the past.
    -I agree, there was a lot of hype about 64-bit gaming... back in 1996, with the N64. It's kind of a wonder why companies moved backwards from it...
    -A $2000 gaming computer does do more than your 6 year old gaming rig. Much better images.
  8. Master Mace

    People aren't asking for DX11 drivers only. They are asking for a DX11 option. DX11 came out over 3 years ago, and it's not limited to high end PCs.

    Also http://www.merriam-webster.com/dictionary/loose

    Is that simple enough for you?


    You don't need to dump your paycheck into a graphics card every 3 months; how about upgrading your PC once every 5 years? A GTX 260+ would run it fine, and that came out in 2008. This is 2013.

    Motorcycle fan club that way ------------------->
    Truck building fan club --------------------------->

    You're on a Gaming forum telling people not to game. Get real. This is a new game and the customers are voicing their opinion. They want to be able to get better performance out of their gear.

    Seriously, you sound mad. You should like... go work on your truck.
    • Up x 2
  9. teks

    Haven't checked out the responses for a few days. I'm aware I should consider upgrading my rig. Thing is, I'm buying a house this month to try to take advantage of the low interest rates, so it couldn't be a worse time for me. It's also not a good time because the new consoles haven't rolled out yet; once they do, **** will change. So far it's been pretty much: if you can outperform an Xbox 360, you're golden. This game is an exception, like I said. You'd be really surprised at just what my rig can do. Crysis at high settings and all that jazz. The 8600 has impressed me a lot.
    For those who do want to build a cheap PC that can play this, consider the Trinity A10 AMD processor. For $130 you can play this game just fine without buying a graphics card. People have already posted that the chip handles the game just fine, and it will only get better.
    For me, I scrounged up 70 bucks for an AMD X2 6000 and 2 more gigs of RAM. Call me crazy, but my rig can play this game already, and the patch helped a ton. If I can stall, I want to wait for the third generation of APUs.
  10. Irathi

    We're still in January; let's not start complaining until February?
  11. PresidentFreeman

    I don't know what the devs are doing, but with each successive 'hotfix' the game crashes more and there are more 'flickering screen' issues, normally at the same time.
  12. LordMondando

    I imagine the 'general performance' thing is a case of limiting expectations. Also, the focus on the GPU is interesting. Fingers crossed they have figured out how to move some of the work off the CPU onto the GPU. Though, who knows. Hopefully more details to come.

    Even then, in my position, in large battles even 3-4 FPS would make quite a difference: a 23-24 average is noticeably more choppy than 27-28.
  13. FearTheCow

    Posts like this really boggle my mind; it is obvious you haven't even read the OP, which is the developer stating that SOE has a lot of optimization to do. Yet you post that the general performance is a matter of people expecting too much... wtf? The game does not handle multithreading in a decent manner, so the CPU starves the GPU.

    I mean, how am I CPU-limited even though the game is only utilizing 50-70%? While SOE has come up with some plausible-sounding horse crap, why is their engine so limited when so many other games aren't? The only two games I own that are CPU-limited like this game are Supreme Commander, and that's because it only uses 1 core, and Civ5, in the late game, when the CPU has to crunch huge amounts of information.
  14. LordMondando

    I do not think you understood my point, so allow me to clarify my position. For one, I'm certainly not saying there is not a lot of optimization to be done. I'm simply saying this first proper round is currently very vague, but one would hope, given their statements, that it makes better use of the GPU, which at present is underutilized.

    Indeed, as I've said elsewhere, the focus on the GPU surprised me, given the CPU underutilisation that appears from what limited information I or anyone else can glean about the inner workings of the Forgelight engine. Then again, how much do we really know? Little. And how fair is comparing it to other games? Not clear.

    So... none of that implies the CPU is being perfectly utilized (or even close), or that I deserve a rant, or that I should be accused of poor comprehension or of 'expecting too much'.

    I've simply been saying (and I thought this was clear, then again, people on this forum have a habit of reading their own anger into my posts) that you can expect their statements at this time to try and manage expectations, as far as optimization goes, since promising people the world and not delivering it would be a bad idea on their part.
  15. LabRatTy

    I remember back in the beta, I discovered that there are fully-detailed eyeballs, ears, heads etc etc beneath people's helmets that you never, ever see. It's all still there.

    I'm not an expert on computer graphics, but I'm almost certain that high polygon counts are very heavy on the CPU. And CPU optimizations are what this game needs. Whereas textures, special effects & shaders tax the GPU.

    When you have 20, 50, 100 infantry on your screen, you're rendering twice as many eyeballs, ears, and heads that you never, ever see. And remember - these characters are often being drawn behind walls and hills, too, so there's a lot more infantry on your screen than you think.

    Wouldn't this be a very easy, quick way to jumpstart optimizations? Or am I missing something? If the eyes absolutely must be there, couldn't they be made to be mostly flat & static?
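    The idea in this post can be sketched as an authoring-time flag on sub-meshes. Everything below (the `SubMesh` type, the `always_hidden` flag, and the triangle counts) is hypothetical, purely to illustrate the scaling argument, and is not Forgelight's actual data or API:

    ```python
    # Hypothetical sketch: skip sub-meshes that are permanently hidden
    # (e.g. eyes/ears under a non-removable helmet) before issuing draws.
    from dataclasses import dataclass

    @dataclass
    class SubMesh:
        name: str
        triangles: int
        always_hidden: bool  # authored flag: covered by helmet, never visible

    def visible_triangles(submeshes):
        """Count triangles actually worth submitting to the GPU."""
        return sum(m.triangles for m in submeshes if not m.always_hidden)

    character = [
        SubMesh("helmet", 1200, False),
        SubMesh("body", 4500, False),
        SubMesh("eyes", 300, True),
        SubMesh("ears", 200, True),
    ]

    # With 100 infantry on screen, the saved work scales linearly:
    saved = 100 * sum(m.triangles for m in character if m.always_hidden)
    print(visible_triangles(character), saved)  # 5700 visible, 50000 saved
    ```

    Note that the developer reply further down argues polygon count is rarely the real bottleneck on modern hardware, so the payoff of this kind of trimming may be smaller than it looks.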


    • Up x 1
  16. husse

    I don't get it. Why do you have to have eyes, ears, and even hair when you can't remove your helmet? You just see the helmet, nose and black glasses :D oh, and lips
  17. CyclesMcHurtz Code Monkey

    I see this statement often, but it's not a matter of "moving work to the GPU", since the Graphics Processing Units aren't doing the same jobs as the Central Processing Unit. One reason it's not simple is that the CPU can do things like move memory around and calculate things that need immediate results. The GPU can do lots of things fast (some have more than 400 cores), but that's because they can easily break up the work by the region of the screen they work on, or in similar ways.

    Another thing to consider is how long it takes the GPU to get any kind of result back to the CPU to work with. When a program decides to draw something it doesn't actually trigger the GPU work right away, because it cannot do this on the PC. It first must be prepared in a format that the drivers can use. This is usually called a "batch" and contains a crap ton of data, but it is sent to the driver first and then queued by the driver to be passed to the actual hardware.

    The hardware then does the work, and the results are stored and displayed on the screen. This usually takes 2-4 "frames" of time (this depends upon how the application/game is configured, and what kinds of graphics settings you have used - look up "pre-rendered frames" for some more info on this). This becomes "lag" for the results and if you do the math, this can result in up to 100ms (1/10th of a second) delay between requesting the results and actually GETTING the results from the graphics card. It then takes about half-again as long for the results to get processed and sent back to the screen.
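    The delay arithmetic above works out roughly like this, assuming (illustratively) a 30 fps frame rate and 3 queued pre-rendered frames; both values are just plausible examples, not what the game actually uses:

    ```python
    # Back-of-envelope version of the GPU round-trip delay described above.
    frame_ms = 1000.0 / 30                        # ~33.3 ms per frame at 30 fps
    queued_frames = 3                             # a typical "pre-rendered frames" setting
    gpu_result_delay = queued_frames * frame_ms   # ~100 ms until the GPU result exists
    round_trip = gpu_result_delay * 1.5           # plus half-again to get it back to the CPU
    print(round(gpu_result_delay), round(round_trip))  # 100 150
    ```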

    That's 150ms of lag for the results of any work you've "passed off to the GPU", and the reason that only certain things are off-loaded - those things that don't require any further processing by the CPU. These tasks tend to be drawing, and not much else, and some results require flushing the queue. You can read some interesting technical bits if you are so inclined to learn more:

    http://gamedev.stackexchange.com/qu...atches-the-graphical-card-performance-etc-xna
    http://http.developer.nvidia.com/GPUGems/gpugems_ch28.html

    Many years ago now this was very true, but as graphics systems have improved the number of polygons and vertices has become less and less the issue. The major bottleneck is usually the number of interactions with the graphics system (the "batch count") - which is completely voodoo to most of you (I'm going to assume most of you aren't programmers, which I think is fair) but is really the bane of PC programmers everywhere. Modern graphics pipelines really require very little tweaking on the CPU side of things, so you just pass around a few small bits you change and most (in terms of size) data is kept and calculated on the GPU.
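    A toy illustration of why the batch count dominates: sorting draw calls by shared render state and merging each run into one submission collapses many driver interactions into few. The object names and materials below are made up for the example:

    ```python
    # Illustrative sketch (not Forgelight code): each interaction with the
    # driver is a "batch", so grouping draws that share a material reduces
    # the batch count directly, without touching polygon counts at all.
    from itertools import groupby

    draws = [
        ("rock", "terrain_mat"), ("soldier", "infantry_mat"),
        ("tree", "terrain_mat"), ("soldier2", "infantry_mat"),
        ("rock2", "terrain_mat"),
    ]

    # Naive submission: one batch per draw call.
    naive_batches = len(draws)

    # Sort by material, then merge each run sharing state into one batch.
    batched = [(mat, [name for name, _ in group])
               for mat, group in groupby(sorted(draws, key=lambda d: d[1]),
                                         key=lambda d: d[1])]
    print(naive_batches, len(batched))  # 5 batches down to 2
    ```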
    • Up x 15
  18. LordMondando

    Thanks for the response; I stand corrected, and I will read more. I suppose a large amount of the confusion you'll see comes from people drawing a hard bottleneck distinction between GPU and CPU.

    Given I have your no doubt in-demand attention: would it be possible to give some details on the upcoming performance optimization and why it favors low-end GPUs?
  19. Gribbstar

    Just like to say I really appreciate the long detailed technical responses. Rarely see them these days from developers.
    • Up x 2
  20. eQuistX

    I am really sad. I have about 20-30 FPS in large fights, 40-60 in abandoned places. I'm playing with everything on Low, render quality 90 and 1980x1024. My specs:

    Windows 7 64-bit Home Premium
    GTX 660M (overclocked)
    8GB RAM
    i7-3610QM 3.2GHz
    Any advice? :'(