It points more to the fact that the game is heavily load-dependent, and that people choose settings that lift their lowest FPS, so that their FPS in typical play ends up above 60.
That is a theory, but it's not about how good it looks, it's about how smooth it moves when you're moving fast. High FPS is important for accurate aim. You could play the game on an old TV with a 30 Hz refresh rate and it would still look smooth while aiming at something moving fast in the game. Even though modern monitors are digital, they work much the same way as an analog TV: the display's refresh rate is not as important as the game's frame rate, as far as gameplay (not prettiness) goes.

I know it's a bit hard to wrap our heads around this, but it's true nevertheless. Look at it this way: all monitors (and TVs) scan pixels one by one starting from the top, left to right, then the next line, and every pixel being displayed is the most recent data from the video card. In other words, if your game runs at 120 FPS and your monitor at 60 Hz, each pixel refreshed on the monitor may already come from the next game frame, so overall you always see the most recent frame on your monitor, simply because 60 refreshes per second is still faster than your analog eyes and brain can handle. Therefore your monitor's refresh rate is largely irrelevant as far as gameplay goes.

Source: I actually studied electronics.
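The scanout argument above can be sketched with a few lines of Python. This is a toy model, not a measurement: the numbers (120 FPS game, 60 Hz monitor, 1080 scanlines) are illustrative assumptions, and it ignores frame-render jitter. It just shows that, without vsync, a single 60 Hz refresh samples pixels from more than one game frame, so the bottom of the screen shows newer data than the top.

```python
# Toy model of monitor scanout: which game frame does each scanline
# sample during one refresh? All numbers below are assumptions.

GAME_FPS = 120    # assumed game frame rate
REFRESH_HZ = 60   # assumed monitor refresh rate
LINES = 1080      # assumed scanlines per refresh, scanned top to bottom

refresh_period = 1.0 / REFRESH_HZ

def frame_at_line(line):
    """Index of the newest completed game frame when `line` is scanned out."""
    t = (line / LINES) * refresh_period  # time into this refresh
    return int(t * GAME_FPS)

# During one 60 Hz refresh, the panel shows pixels from two different
# game frames: the top half from frame 0, the bottom half from frame 1.
frames_shown = {frame_at_line(line) for line in range(LINES)}
print(sorted(frames_shown))  # -> [0, 1]
```

With the game at 120 FPS, the second game frame finishes halfway through the scanout, so the lower half of the screen is already drawn from it; that is the tearing-but-fresh behavior the post describes.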
I don't know if it's the new patch or what, but I can't get the .ini tweak to work. I actually get worse FPS when running on "ultra". My specs: GTX 570, AMD FX-6100 @ 3.3 GHz, 12 GB of 1600 MHz RAM. I'm using the latest beta driver for my video card. In UserOptions.ini, I couldn't find a line that said "GpuPhysics=0" (or "=1"); it just didn't exist. If someone has some good ideas what to do, I'd appreciate hearing them!
Running on ultra will give worse FPS, because it's higher quality. Earlier, people were entering 5s instead of 4s, and since that's an invalid value the game was defaulting to lower settings; that's why their FPS improved. IMO, if you want more FPS, disable or lower shadows and lighting.
So I have an i5-3570K and a GTX 670. I'm curious what settings I could change to get the nicest-looking graphics. I'm also wondering what I should change to try to help with the render distance, or is that a server-side problem?
Open UserOptions.ini and set RenderQuality to 2.0 or higher. Be warned, though: it will hammer your PC. Higher RenderQuality makes everything sharper and clearer; it just looks great. If you can, set it higher than 2.
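For reference, the tweak above is an edit to UserOptions.ini in the game folder. A minimal sketch, assuming the key lives in a [Rendering] section and takes a decimal value (exact section and key names can vary by patch, so check your own file and back it up first):

```ini
; Assumed layout of UserOptions.ini -- verify against your own copy.
; RenderQuality is a resolution scale: 1.0 is native, 2.0 renders at
; double resolution and downsamples, which is why it is so GPU-heavy.
[Rendering]
RenderQuality=2.000000
```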
Tried this, lost 20 FPS. W/e. I run everything on low other than medium graphics (so I can actually see cloakers) and average 90-100 FPS in low-action areas, then around 30 in high action. I have an i7, 8 GB RAM, HD 5850, etc. The card is old, I guess, but then again I run BF3 on high and average 50-60 FPS in even the most spammy scenarios. On the plus side, with settings this low, no one can really hide from me, since there are no shadows or even much foliage.
I have tried these so-called "ultra settings", but IMO it looks like ****e. Sure, you get more shiny objects/world, but IMO that's just wrong. On a side note, I have snow spikes on my ATV and they did not render with the "ultra settings", so it's not really ultra.
So you are seriously telling me that my rig is not suitable for ultra? And lowering settings like smoke does nothing: I have set things to the lowest and the highest in game and it had no impact on FPS. The "ultra" settings just made my FPS drop really, really hard. So, in my opinion, this is an optimisation issue, or some of the "ultra" specs info is off!
1. Did you close and re-open the game between changes? Not all settings require this, but some do. 2. The big ones, FPS-wise, seem to be shadows and lighting. Shadows can be set to 0 in the .ini file to disable them completely.
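The shadow tweak in point 2 can be sketched as an .ini edit. The key name ShadowQuality is an assumption based on the in-game setting's name, so verify it against your own UserOptions.ini before editing (and keep a backup):

```ini
; Assumed key name -- check your own file. 0 disables shadows entirely;
; higher values raise shadow quality at an FPS cost.
[Rendering]
ShadowQuality=0
```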
These settings are all very nice, but they're nothing compared to a good overclock. I started in tech test with the rig I have now (stock i5-2500K @ 3.3 GHz, 560 Ti 448 and 8 GB Vengeance RAM). It was hard going at first, and even on (beta) release I was forever seeing my FPS around the 30 mark, with the CPU bottlenecking the game. I decided to use Asus Suite II and OC my CPU on its stock cooler to 4.1 GHz, and the difference was drastic. Warpgate/random terrain/bio-lab indoors, I'm now hovering around 40-60 FPS at high settings, even with the zerg throwing themselves against my 150-round Lasher. When I get the chance I'll dump these lines into my useroptions.ini file and give it a go, but please don't underestimate what a simple overclock can do for you.

Side note: shouldn't GPUPhysics=0 be set to GPUPhysics=1? I have it on 1 in my settings and the option is ticked in game. The option is greyed out as if you can't modify it, but I can still see that the box is ticked, à la Ambient Occlusion.
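On the GPUPhysics question: a minimal sketch of what the line might look like in UserOptions.ini, assuming 1 = enabled and 0 = disabled. The key name and section are taken from the posts in this thread (note the casing differs between posts, and one poster reports the line missing from their file entirely), so treat this as an assumption to verify, not a known-good setting:

```ini
; Assumed: GPUPhysics=1 enables GPU-accelerated particle physics,
; 0 disables it. Key casing varies between reports; check your file.
[Rendering]
GPUPhysics=1
```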
None of those settings have ever gained me any real FPS in situations where there aren't 60+ players around you. I simply went with my own settings: most everything at the lowest graphics.