I have an SLI setup with two GTX 460s. When I run the MSI OSD software, I notice that the load on both GPUs fluctuates between roughly 30% and 60%. So SLI does work in PS2, but it doesn't add any value in this game. In fact, when I disable SLI and play with one GPU, that GPU shows a load of 95-99%, which gives a more stable and often higher FPS than when SLI is enabled. As far as I know this happens because SLI does not scale well in this game (the engine can't handle two GPUs). Will this ever be resolved in the future? Or is PS2 a game that does not and never will fully support SLI due to choices made in building the engine?
The engine, at present, leans almost entirely on the CPU rather than the GPU. Sadly, at the moment a good GPU isn't going to do much compared to a CPU with a higher clock speed.
I believe the game is very heavily dependent on the CPU. Although that's the case, it's not very well optimised for the CPU yet. They need to fix that before they even think about making SLI better. I'm sure they will work with Nvidia to improve it in the future.
The reason this game runs so badly is that it's DirectX 9 on a massive scale, and DX9 sucks **** for multi-threaded applications. Remember that MMO Rift, which ran like crap at launch even on top-tier gear? Guess what: it's DX9 and still runs like crap. Don't hold your breath for this game EVER running a constant 60 FPS. While I like this game, I probably won't play much longer, as 35 FPS gives me motion sickness and the only battles I want to play are the big ones.
The DirectX version has nothing to do with it. I've worked with engines for MMO FPS games that use different versions of DirectX, and they all had different results, but not through any fault of DirectX. The Jindo engine, for example, used DirectX 9 and was capable of large maps with vehicles, much like Battlefield, and still held a high framerate at maximum graphics settings. Threading in applications is not part of DirectX's API.
It's kind of sad. SLI scales so badly in so many multiplayer games. Singleplayer often works fine, but multiplayer games just can't seem to handle multiple GPUs well. Now, this game does not heavily depend on the CPU, let that be a fact. You can prove it yourself by simply checking the load of a single GPU in game, for example with the Afterburner OSD. As I stated in my OP, one GPU shows a load of about 97-99%. Thus the game makes good use of a single GPU and does not lean on the CPU. This game simply doesn't support SLI, and that could be the result of a lack of support on Nvidia's side or on SOE's side. Well, at least I learned my lesson. No more SLI for me. The only multiplayer game for which it works so far is BF3, and even that hasn't been the case for long, only since one of the more recent patches from both DICE's and Nvidia's side.
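If you want to check this without an OSD overlay, here's a minimal sketch that asks nvidia-smi for the load on each card (assuming nvidia-smi ships with your driver and is on the PATH; whether utilization is actually reported for consumer GeForce cards depends on the driver):

```python
# Minimal sketch: spot-check per-GPU load via nvidia-smi while the game is running.
# Assumes nvidia-smi is installed with the Nvidia driver and is on the PATH.
import subprocess

def gpu_loads():
    """Return (index, name, utilization %) for each GPU nvidia-smi can see."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=index,name,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    rows = []
    for line in out.strip().splitlines():
        index, name, util = [field.strip() for field in line.split(",")]
        rows.append((int(index), name, int(util)))
    return rows

if __name__ == "__main__":
    for index, name, util in gpu_loads():
        print(f"GPU {index} ({name}): {util}% load")
```

With SLI on you'd expect both entries to hover around the 30-60% described above; with SLI off, a single card pinned near 99% points at the GPU, not the CPU, being the limit.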
GPU usage varies depending on your video settings in-game. I can confirm, however, that at its core it's more dependent on the CPU than the GPU.
I would like to say that I use SLI and find it helps give me 55-60 FPS, but remember that when using SLI or Crossfire, your frames should not drop below 60, or at least 50; otherwise you will notice the latency from frame to frame, known as microstutter. With one GPU you can play at 24-30 frames and it will look fine, but V-sync should be enabled to keep the same refresh rate as your screen. It should also be noted that you should use BETA drivers if you plan on playing with an SLI setup, as Nvidia seems to have tweaked their SLI profiles for PS2 themselves, so as to make SLI run a bit better than on the current release drivers.

Regardless of these options, the game engine seems to rely on CPU power during large base raids or battles, but my monitoring results are the inverse of what others have seen. Where others say their rigs are CPU bottlenecked, mine is GPU bottlenecked. In fact it reports a GPU bottleneck at all times, even at 60 frames with nothing going on. I see a drop to about 40 frames in bases with many people fighting, but only when facing a certain heading; otherwise it's 60 frames. Sometimes I'll be looking at a terminal and have 45 frames, then look to the left or right of it and get 60 frames. Overall the game engine still needs to be improved a bit before you could call it polished, but I figure it will be worked out.
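To put rough numbers on the frame-to-frame latency point: frame time is simply 1000 / FPS, so 60 FPS is about 16.7 ms per frame and 30 FPS about 33.3 ms. With alternate-frame-rendering SLI, microstutter shows up as uneven spacing between frames even when the average FPS looks fine. A quick sketch of that arithmetic (the frame times below are made up purely for illustration):

```python
# Sketch: average FPS can hide microstutter; frame pacing is what you actually feel.
frame_times_ms = [10, 23, 11, 22, 12, 24, 10, 23]  # hypothetical uneven AFR frame times

avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000 / avg_frame_time            # what a simple FPS counter reports
worst_fps = 1000 / max(frame_times_ms)     # what the slowest frames feel like

print(f"average frame time: {avg_frame_time:.1f} ms (~{avg_fps:.0f} FPS)")
print(f"slowest frame:      {max(frame_times_ms)} ms (feels closer to {worst_fps:.0f} FPS)")
```

The gap between the reported ~59 FPS average and the ~42 FPS of the slowest frames in that example is the stutter people complain about, which is why staying above 50-60 FPS matters more with multi-GPU setups.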
Hey KolourBlynd (are you color blind, by chance? We need that feature added too! I love it in BF3 and struggle mightily in PS2 - some of the specific shade choices are terrible), when you say latest drivers, have you seen benefits in the SLI department lately? I have a similar setup to yours (an i7 and 560 Ti 2GB cards in SLI being the chief differences), and I'm on the 310.61 release (also on Windows 8). Did you, by chance, move from 310.61 to the latest 310.70 and notice an improvement? I hate upgrading drivers, since on rare occasions it can make everything worse (a bad install, or it just doesn't play nice with the hardware/driver configuration of the system as a whole). I've had a finicky experience with BF3 framerates being much worse, the same, or a little better after some upgrades (regardless of the promises for that specific driver release), and I also hate upgrading for one game in particular only to find out later that a few older games suffered a lot from the driver. But if it sounds like SLI was improved a fair bit between the two drivers noted above, I might have to give them a try.
Can anyone else confirm something I just tested on my rig? While reading this thread, I thought I would try disabling one of my cards to see if I would get any performance change (running two 560 Ti 2GBs). Normally, at the Warpgate, I see about 70 FPS with both cards at about 65% usage, and "Alt F" reports "GPU". When venturing into a medium to all-out battle, I normally saw 18 to 35 FPS with both cards at about 55% usage, and "Alt F" would flip-flop between GPU and CPU, but mostly sat on CPU. Instead of disabling SLI, I decided to disable CUDA on both cards (just out of curiosity). I then jumped in game; at the Warpgate, nothing seemed to have changed from the above. Deploying to a hot spot, I noticed "Alt F" stayed on GPU, my FPS was pretty steady, averaging 40 FPS, and both cards averaged 90% usage. I know both cards were predominantly being used because temps on both cards were about 10 F higher as well. Running the 310.61 drivers on Win7 64-bit, game at "high" settings with 1920x1200 res, AO on, motion blur off. Nvidia users, see if this makes a difference for you. CUDA (I believe) ties into physics computing, which, if it's somewhat broken and disabled right now, may account for some of the poor performance... I don't know.
Er, wait, the game doesn't properly support multithreading AND SLI? Reminds me of the issues I had with the original Rome: Total War, released in '03.
So you're saying that disabling CUDA appears to have improved GPU utilization and overall performance in the game? Anyone else want to give this a try? I might try it later; I figure it can't hurt. I don't have GPU Physics enabled (it might be possible to force it in the .ini file, though the entire option appears to be disabled in the actual menu), so there shouldn't really even be a need for CUDA to be enabled.
I haven't tried it, but I don't have CUDA activated anyway. CUDA is a fun project, and imo really one we could benefit from, but there is almost no support from parties other than Nvidia. Really a waste, imo.

Still, the situation here is strange yet partially understandable. SLI is not supported (or, well, supported very badly). In the current PC market, and especially looking at the consumers playing PS2, most people are experienced with hardware and a fairly large part of the community owns a multi-GPU PC. That is a reason to blame SOE for the lack of multi-GPU support. However, there are two parties involved, Nvidia and AMD, each with its own approach to multi-GPU usage. So I can understand that it can be hard to build a game on two different extended bases (single GPU is of course the actual base), especially when there is a risk of handing the advantage to only one of the two groups (AMD users OR Nvidia users). The best option is to give Nvidia and ATI everything they need to develop drivers that improve multi-GPU performance. However, drivers only work on the GPU side, not the game client side, so Nvidia and ATI cannot change the game itself; only SOE can. That brings limits.

Now, I wanted to try SLI and read amazing things about the support nowadays. But all I can say is that I only benefit from a multi-GPU setup in singleplayer games. Multiplayer games are a pain in the ***, especially since MMO games often tend to lean a bit more on the CPU (Guild Wars 2 is another example of that) so that people do not need a PC or laptop with an expensive GPU to play. If only hardware sites would pay more attention to the aspects noted above, there would be fewer people like me who think it's worth going SLI or Crossfire when they mainly play online multiplayer games.
Set render quality to 1.41 in the UserOptions.ini and watch your sweet SLI setup cry. Since 1.41 is roughly the square root of 2, the game renders about twice as many pixels and downsamples them, which essentially gives you 2x supersampling, and it does look amazing.
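If you'd rather script the change than hand-edit the file, here's a rough sketch in Python. The install path and the exact RenderQuality key name are assumptions on my part; back up your own UserOptions.ini and check what it actually contains first.

```python
# Rough sketch: bump RenderQuality in UserOptions.ini to ~1.41 (about 2x supersampling).
# The path and key name below are assumptions; back up the file before touching it.
from pathlib import Path

ini_path = Path(r"C:\Program Files\PlanetSide 2\UserOptions.ini")  # hypothetical path
new_line = "RenderQuality=1.410000"

lines = ini_path.read_text().splitlines()
replaced = False
for i, line in enumerate(lines):
    if line.strip().startswith("RenderQuality="):
        lines[i] = new_line          # overwrite the existing value
        replaced = True

if not replaced:
    lines.append(new_line)           # naive fallback; it really belongs under [Rendering]

ini_path.write_text("\n".join(lines) + "\n")
```

Setting the value back to 1.000000 should undo it if the performance hit is too much.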
I have GPU physics forced in the ini; it doesn't work, though, other than adding a nice little grayed-out check-mark to the grayed-out check-box.
This may have been just another fluke, as I have done a bit more testing... Initially, disabling CUDA does seem to have an effect.

Here is a screen cap with CUDA enabled (the default setting in the Nvidia Control Panel). You'll notice my FPS and GPU usage are all over the place. Generally, in a contested deployment, an average of 18 to 40 FPS with an average GPU usage of 40% to 60%. In a heavily contested deployment (near the end of the screen cap) you can see pretty erratic GPU usage.

This image shows where and how I disabled CUDA. After checking "Use these GPUs", I unchecked the checkboxes associated with my cards and applied.

Now, here is a screen cap with CUDA disabled. The recording was done in the same deployment area, which was heavily contested. As you can see, in the first half of the pic GPU usage was high, FPS was about 10 to 15 frames higher, and I can tell you GPU temps were up about 10 F (but still well within safe operating temps). Something obviously happened about halfway through the recording that dropped usage (I didn't really notice anything while playing other than a drop in FPS, which averaged around 30). When going back into the Nvidia Control Panel, I noticed something odd with the "SLI rendering mode" setting: it showed "Custom", and there is no "Custom" setting in the dropdown box. Clicking on it automatically returned it to the "NVidia recommended - SLI" setting.

The one positive thing that remained with CUDA off is that usage was less erratic, even though usage and FPS (for whatever reason) dropped off. This is likely a driver issue, but it is interesting to see the effect it has had on my older rig. I would imagine single-card users might see similar effects. Anyone else care to test?
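For anyone who wants to reproduce this kind of comparison without screen caps, here's a small sketch that logs per-GPU load and temperature to a CSV once per second, using the same nvidia-smi query as the earlier snippet, so a CUDA-on session and a CUDA-off session can be laid side by side afterwards (again assuming nvidia-smi is on the PATH and reports these fields for your cards):

```python
# Sketch: log per-GPU load and temperature to a CSV once per second, so two play
# sessions (e.g. CUDA on vs. CUDA off) can be compared afterwards.
# Assumes nvidia-smi is on the PATH and reports these fields for your cards.
import csv
import subprocess
import sys
import time

def sample():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    return [[field.strip() for field in line.split(",")]
            for line in out.strip().splitlines()]

def main(path="gpu_log.csv", interval_s=1.0):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "gpu_index", "load_percent", "temp_c"])
        start = time.time()
        while True:                       # stop with Ctrl+C when the session is done
            elapsed = round(time.time() - start, 1)
            for index, load, temp in sample():
                writer.writerow([elapsed, index, load, temp])
            f.flush()
            time.sleep(interval_s)

if __name__ == "__main__":
    main(*sys.argv[1:2])  # optional: pass a CSV filename as the first argument
```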
Have you tested again after returning the control panel to the proper SLI setting, while keeping CUDA disabled? I'd be curious. I can say I had a similar issue, though I can't remember the exact sequence of events that led up to it (possibly a driver upgrade combined with disabling and re-enabling SLI system-wide, since I switch between multi-monitor without SLI (3 distinct monitors) and Maximum Performance (SLI enabled), which then only allows the monitors connected to one of the cards to be active: either the one display on the first GPU, or one or both of the other two displays on the second GPU). In the end, the result was the same: the SLI setting in the profile got switched to custom, which I discovered after seeing the framerate fall way below what I had been achieving previously. I went into the profile, changed it back to the true default SLI setting, and performance in game returned to what I expected. It seems that if the SLI setting gets set wrong, SLI technically remains enabled, but it goes haywire and efficiency drops drastically. I haven't installed one of those monitoring apps, as I had used a few previously but uninstalled them all while troubleshooting a Battlefield 3 issue. It would be great if you re-tested with the proper SLI setting and CUDA off, so we could compare that with your first benchmark.
I say people should give the no-CUDA tweak a try; it is only a tick box, after all. Personally, I would say it's more to do with the CPU. I have a 2500K @ 4.801 GHz (102.15 x 47) and 1905 MHz RAM, but only a single 480 running the game. Although it's heavily OC'ed, it still doesn't match the 500/600 series, yet I get better FPS than people with better cards, or even SLI. On medium I sit around 100-130 FPS doing nothing, and around 80 in battles, but NEVER below 65 FPS no matter what's happening. On high I would NEVER go below 50-55 FPS, no matter what happens. This is with a constant 99% GPU usage.
So my 2600K at a similar clock rate, with 560 Ti 2GB cards in SLI, should be able to provide similar results, I reckon. Granted, I am in fact playing with basically full Ultra settings in the ini file (and it looks pretty damn good!).