So along with this patch we had some optimizations, and some changes to the Player Awareness values which result in a higher probability of seeing players at a further render distance. So far, with all our collected data, performance is up slightly. Not enough that I can sleep tonight, but at least it didn't go down dramatically. Several client crashes have also been fixed, which is good. We continue to patch the client often after any new Game Update to address additional client issues. Of course, some percentage of people will have their frame rate reduced, some percent raised, some percent unchanged. Some will crash more, some will crash less. Some will get a call from their girlfriend saying it's over. Sorry about that. You probably should have chosen TR. Anyway, we do continue to work on performance. We have more optimizations coming in the weeks ahead, along with reductions in bandwidth usage. Stay tuned!
I'd like to be able to find that comforting. http://forums.station.sony.com/ps2/...ably-worse-in-major-bases.96548/#post-1279906 My system's gone from a 28fps minimum to unplayable mid-teens slideshows in every fight I've encountered so far. That's a significant performance loss. It also appears to be occurring regardless of settings (shadows off, render distance as low as it can go, everything high, everything medium, resolution up, down; the only thing that is consistent is that it's CPU-bound in these instances). I am also a Miller resident who made a point of testing out the render distance changes on the test server (which I assume these changes are based upon), and the problem was not present there. It is hard, though, to disentangle the weird 'look towards the center of the base' fps drop from a general fps drop under what I describe as 'load' situations (Miller prime time, battles involving hundreds of people). There also appears to be no correlation between the FPS drop and players actually being visible on screen. Not trying to be a dick; I appreciate this is difficult. But something has gone quite badly wrong for a certain proportion of people.
We have a method coming soon for people to actually get a "normalized" frame rate. You'll be able to choose an option from a terminal, appear in a new area, and be whisked around on a roller coaster while things explode and collide around you and we compute your frame rate. Then you can run that every build and note differences. Right now, I certainly admit, it's frustrating to try and "measure" your frame rate because of the constant change.
Thanks, I appreciate you guys are working on it. I'll do some more testing tomorrow if I can. The problem is also (if I might add) that the fps counter is so small, getting accurate readings from the in-game video capture is very hard. Likewise, using something like Fraps (with its big counter) hurts frame rate so much that it's hard to produce videos accurately showing frame rate while reproducing the situations that are causing it. Could I suggest, then, making the FPS counter larger or easier to read from low-res video? I appreciate this is hard; I just want to be able to play again, and I'm happy to help in doing so.
Hey, I've said it before... when people are upset about their performance, it's fine with me. I'm upset too! I have a 5-year-old AMD PC at home with an ATI card and 4GB of RAM. I play on that. When frame rate goes down, it goes down for me, too. (Full disclosure, I also have a pretty hot-sh*t system at work... so... yeah... but hopefully the point is not lost...)
Hehe I'm hesitant to call it a benchmark just yet... all sorts of eyebrows raise... but I believe it's at least a step in the right direction.
HEY SOE PROCUREMENT! YOU OUT THERE? CODEMONKEY NEED BETTER BANANAS! YELLOW AND RIPE, NOT BROWN AND MUSH! CODEMONKEY NEED CRAY BRAND BANANA COMPUTER!
Do you think this may generally improve performance for multi-monitor players at higher resolutions as well? With multi-monitor systems running at 5760x1080 and the like, we're hoping you guys have some triple-screen rigs set up at SOE to test.
Just tested: averaging 12 fps while scoped in the Indar Warpgate after this patch. Any chance dual-cores (over the 3.4GHz threshold) and quad-cores under the 4.0GHz threshold will be getting real help finally? Barring that, how about just some more powerful .INI (file) tweaks in the coming months? IOW: I mean something like client flags to turn OFF all the CPU processing of the extra shaders, filtering, eye-candy/animations, and particles. Not even asking you guys to somehow offload that stuff to the GPU; I mean just reduce the redundancy of this stuff when it's being proc'd 100x for 100 players 10 times every tick/cycle (or however often it's been happening). Not asking for simpler models either; I just want to know if this eye-candy creep that PS1 never had can be turned off before it completely overloads the CPU (instead of waiting until it starts interacting with the GPU drivers, like Cycles is always going on about).
Thank you for this. Speaking of "bandwidth"...

The biggest difference between the Core 2 and the Core i-series is the massive improvement in memory bandwidth, thanks to the use of DDR3 memory combined with the on-die memory controller. When it comes to eking out the most performance possible on a Core 2 Duo/Quad system, it's important to conserve memory bandwidth as much as possible, because the system memory is too slow to feed high-resolution textures into the video card's RAM while also juggling the data used to calculate what is happening in the game.

In my own experience with my Core 2 Quad 2.3GHz / GTS450 1GB GDDR5 / 8GB DDR2 800MHz / OCZ SSD system (1600x900 resolution), I have noticed that I get frame hitching when looking around quickly while using the highest-resolution textures. The system can manage 40-60 FPS even with the high-resolution textures, until it has to move more textures from RAM to VRAM. The most VRAM usage I have seen while playing PS2 is 550MB of the 1GB available on the graphics card, so I am wondering if there is room for improvement in texture caching/prediction that might be able to mask some of the memory latency we're seeing on "front side bus" style computers.

I don't have a lot of hope for being able to use high-resolution textures on this system, as I have experienced the same frame hitching in World of Warcraft, Guild Wars 2, Everquest 2, Firefall, Aion, and even Unreal 2 single player. It just doesn't have the memory bandwidth that games seem to need. It's close, with playable frame rates, but the hitching is way too annoying to ignore. In PS2 I use the lowest texture resolution to mitigate the problem. Objectively, it looks awful, but it solves the hitching.
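The arithmetic behind that hitching can be sketched with a quick back-of-envelope calculation. The bandwidth figures and the 50% efficiency factor below are rough assumptions for illustration, not measurements of any real system:

```python
# Rough dual-channel peak bandwidths in GB/s (illustrative, not measured).
DDR2_800_GBPS = 12.8   # 2 channels x 6.4 GB/s
DDR3_1600_GBPS = 25.6  # 2 channels x 12.8 GB/s

def stream_time_ms(megabytes, bandwidth_gbps, efficiency=0.5):
    """Milliseconds to copy `megabytes` of texture data, assuming the
    copy only achieves a fraction (`efficiency`) of peak bandwidth."""
    return megabytes / (bandwidth_gbps * 1024 * efficiency) * 1000

# Suppose a fast camera turn forces ~100 MB of textures to stream to VRAM:
for name, bw in [("DDR2-800", DDR2_800_GBPS), ("DDR3-1600", DDR3_1600_GBPS)]:
    print(f"{name}: {stream_time_ms(100, bw):.1f} ms")
```

Under these assumptions the DDR2 system spends roughly 15 ms on the copy, nearly a whole 16.7 ms frame budget at 60 FPS, while the DDR3 system spends about half that, which is consistent with the visible hitch on "front side bus" era hardware.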
I just paid a $523.07 cable bill to play PlanetSide 2. The most recent update will not allow me to run the game. From the launcher it goes to the load screen, gets to 98%, and never loads. Anyone else having this issue?
I know it's very unlikely, but one can still hope: any chance in the foreseeable future that this game will be able to run well on an AM2 Athlon CPU? I play at 3 a.m. and get 60 fps despite only having 3 gigs of RAM and a severely under-powered dual-core AM2. I know it runs, so all I can do is hope the game reaches a point where this setup can run it well, even if the polygons are reduced to rectangles.
Is it bad that my first thought after reading this was that you were just being a smart ***? Or is it just an inside joke? Either way, I am all in favor of ANYTHING that can help our FPS. Especially in Labs and such.
Yes, performance is slightly up, and I like the new rendering too. It's not perfect, but it's much better. Good job!
What he means by this is that they will add something like a tech demo to the game. Something that will be static, with a fixed point of view, the same every time you "play" it. It's usually like a short movie, except that it's not prerendered. Every time you log into the game, even if you redeploy to the same fight, variables are constantly changing: number of players, vehicles, positions, location, the direction you are looking, good textures, bugged textures, particle effects, and so on. What he is talking about should eliminate that uncertainty for the purposes of testing our PCs. Every time you go through it, it will put the same demands on your PC. So, when you tweak something and run it again, you have a valid comparison to make. Did my average FPS go up or down? Did my max and min FPS go up or down? Did microstuttering increase or decrease? Input lag? Etc.
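The kind of before/after comparison described above can be sketched in a few lines. The metric definitions here (1% lows, frame-time deviation as a microstutter proxy) are my own assumptions about what such a benchmark might report, not anything SOE has announced:

```python
from statistics import mean, pstdev

def fps_stats(frame_times_ms):
    """Summarize one benchmark run from its per-frame times in milliseconds."""
    fps = [1000.0 / t for t in frame_times_ms]
    slowest = sorted(frame_times_ms, reverse=True)
    one_pct = slowest[: max(1, len(slowest) // 100)]  # slowest 1% of frames
    return {
        "avg_fps": mean(fps),
        "min_fps": min(fps),
        "max_fps": max(fps),
        "1pct_low_fps": 1000.0 / mean(one_pct),
        # Frame-time standard deviation: a crude proxy for microstutter.
        "stutter_ms": pstdev(frame_times_ms),
    }

run_a = [16.7] * 95 + [40.0] * 5  # mostly smooth, occasional spikes
run_b = [20.0] * 100              # slower but perfectly steady
print(fps_stats(run_a))
print(fps_stats(run_b))
```

Because the benchmark scene is identical every run, differences in these numbers between two runs can be attributed to the tweak you made rather than to battlefield randomness; note that run_a has the higher average FPS but run_b has zero stutter, which is exactly the trade-off averages alone hide.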