please get rid of the scaleform ui menus!

Discussion in 'Player Support' started by BlackDove, May 15, 2014.

  1. Lavans


    Then why does Nvidia list the Tesla K40 as GDDR5 SGRAM, but advertise the GTX 780 as being simply GDDR5? If GDDR5 is natively SGRAM, then why the redundancy when advertising the Tesla K40?

    All you have done is confirm what was already said - GDDR5 SGRAM exists. You have not proven that the GTX 780 uses SGRAM, nor have you proven that VRAM (as a technology) is no longer in circulation.

    I don't care if it peeves you to see people refer to the integrated memory of a GPU as VRAM. The simple fact that you are derailing the thread is a clear indicator that you are desperately grasping at straws to validate your competence.

    To the point of the thread, it doesn't matter whether a GTX 780 uses SGRAM or VRAM as a memory technology. Proving it either way has no bearing on PS2's UI being a "power virus". As such, get back on topic.
  2. Dragam

    Back on topic, as to PS2 being a power hog:

    After a session of PS2 with stock clocks:
    http://i.imgur.com/GrqOAVi.jpg

    After a session of Crysis 3 with stock clocks:
    http://i.imgur.com/kykEpl0.jpg

    After a session of Crysis 3 with OC:
    http://i.imgur.com/4i7h2p7.jpg


    I used Crysis 3, as that's another pretty big power hog (but unlike PS2, it has the visuals to justify its power consumption).

    And it's quite evident that PS2's power consumption is out of proportion, even in comparison to a game like Crysis 3.

    Just to summarize: PS2 with standard clocks and 93% GPU load topped out at 123% TDP, whereas Crysis 3 with standard clocks and 99% GPU load topped out at 111% TDP.
    Overclocked, Crysis 3 used 126% TDP with 99% GPU load... meaning almost the same as PS2 with standard clocks, and PS2 wasn't even fully loaded.

    Obviously most games won't use nearly as much TDP as Crysis 3, and will usually stay at a 1:1 ratio or lower (100% GPU load = 100% TDP).
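
    For anyone who wants to reproduce these load/TDP ratios without eyeballing a monitoring overlay, here is a minimal sketch that reads GPU load and power draw as a percentage of the board's default power limit via NVML. It assumes the pynvml Python bindings are installed and the card exposes power readings; the sample count and interval are arbitrary.

        # Minimal sketch: print GPU load and power draw as "% of default TDP",
        # roughly what the overlays in the screenshots above report.
        # Assumes pynvml is installed (pip install pynvml).
        import time
        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
        default_limit_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)

        try:
            for _ in range(30):  # ~30 seconds of samples while the game runs
                util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu  # % GPU load
                power_mw = pynvml.nvmlDeviceGetPowerUsage(handle)        # current draw, mW
                tdp_pct = 100.0 * power_mw / default_limit_mw
                print(f"load {util:3d}%  power {tdp_pct:6.1f}% of default TDP")
                time.sleep(1.0)
        finally:
            pynvml.nvmlShutdown()

    Running the same script during a PS2 session and a Crysis 3 session at the same settings would give directly comparable numbers.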
  3. BlackDove

    I literally proved that ALL modern GPUs use SGRAM, and have for many years, in this post:

    https://forums.station.sony.com/ps2...scaleform-ui-menus.186679/page-3#post-2705694

    As I explained, literally ALL GDDR RAM is SGRAM, as is clearly spelled out in the JEDEC standard. If it says GDDR of any kind, it is GDDR SGRAM. They probably don't list it for the reason I wrote in the post you obviously didn't read!

    I think the desperate attempt was you trying to make it look like I didn't know what MCU referred to in Nvidia Inspector. I gave you the correct answer. Then you brought up VRAM - a technology that's no longer used in consumer or pro GPUs - while trying to make me look like I didn't know what a memory controller is.

    And now there's apparently a secret standard for a mysterious second type of GDDR RAM out there... or do they just leave out the longer name for less technical customers?

    I guess if you ask someone for some salt on your food, you need to specify that you want the sodium chloride and not magnesium sulfate?

    A chemist (pro GPU customer) is that specific, while a guy eating in a restaurant (gamer) doesn't need to be. It's understood what you mean by "salt" in that case.

    Now you're telling everyone else who notices the excessive load in the Scaleform UI menus that their PCs are junk.

    Thanks for constructively adding so much factually correct input to the thread!

    Point remains that the 2D menus load the GPU more than the game seems to.

    Rendering behind a transparency that's obviously not well optimized might explain it, but you'd need a dev to comment on how they coded it to know for certain.

    What is certain is that people with good-quality components have had issues with this game's 2D elements for over a year, and there are plenty of threads about it.
  4. Lavans


    It's because of how Nvidia's boost technology works. In an environment where the GPU is running cool, the drivers will automatically increase the TDP for more boost and (on paper) better performance. The hotter the GPU runs, the lower the drivers will set the TDP. The fault isn't with PS2, but rather with Nvidia's drivers and their boost technology. I wouldn't consider it a problem, though. If you allowed your GPU to run as hot in PS2 as it does in Crysis 3, I would be willing to bet the numbers would be a lot closer than they are now.
  5. Dragam

    Lavans: I understand what you mean, but the numbers contradict that.

    I can indeed let PS2 use all the power it wants and run as hot as it wants, but then it simply goes CRAZY with the TDP, regularly hitting 150%, which means my GPUs also get quite a bit hotter than they do in Crysis 3.

    Here is a screenshot of me using unlimited power in PS2, with the TDP at around 130%:
    http://i.imgur.com/L9Jq9rp.jpg

    The TDP will spike well beyond 140%, and the cards will get into the high 80s (°C)... so I'll have to disagree with you and say that it doesn't have anything to do with boost technology, as this is the only game I've encountered that goes crazy with the TDP (and thus heat) like this.

    But it may still be on Nvidia's part - only with the driver, rather than the boost technology.
  6. Lavans


    It might also have something to do with having an unlimited power target and your awesomely low temps.

    I did a couple of tests myself. Using the fan curve that was in the video I posted earlier (this being the curve: http://i.imgur.com/ngN3GEz.png - not too different from vendor default), I set the Power/Temp Target to 132% and ran PS2 on full Ultra, then used the same Power/Curve settings in Sonic Generations maxed out with 8xSGSSAA.

    Here are the results
    http://i.imgur.com/8YeyI2E.jpg
    http://i.imgur.com/UOB8Lis.jpg

    As you can see, PS2 and Sonic Generations almost perfectly mirror each other. Both games topped out at ~73°C, ~110% power, and ~73% fan speed.
  7. Dragam

    Lavans : very strange...

    I just took a screenie where I let it run crazy - unfortunately I didn't screenshot the spikes at 180% TDP (yes, 180%, not a typo); I was too slow on the button... but a steady TDP of 138% is still baffling.

    http://i.imgur.com/J61Quh2.jpg

    (TDP is the second % in the GPU info)
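
    Since those spikes are too short to catch with a screenshot button, a small logging script is another option. Below is a minimal sketch, again assuming the pynvml bindings are available: it polls the power reading about 20 times a second and keeps the peak, so even brief excursions get recorded. The 60-second duration and 50 ms interval are arbitrary.

        # Minimal sketch: poll power draw at a high rate and remember the peak,
        # so short TDP spikes that are too fast to screenshot still show up.
        # Assumes pynvml is installed (pip install pynvml).
        import time
        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)
        default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)

        peak_pct = 0.0
        end = time.time() + 60.0  # log for one minute of gameplay
        try:
            while time.time() < end:
                pct = 100.0 * pynvml.nvmlDeviceGetPowerUsage(handle) / default_mw
                peak_pct = max(peak_pct, pct)
                time.sleep(0.05)  # ~20 samples per second
            print(f"peak power draw: {peak_pct:.1f}% of default TDP")
        finally:
            pynvml.nvmlShutdown()

    Logging every sample to a file instead of just the peak would also substitute for recording a video.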
  8. Lavans


    What results do you see if you disable SLI and run a single 680 at stock settings?
  9. Dragam

    I suspect that TDP and temps would be fine, but that the GPU won't be able to load to 100% - let me check :)

    Edit: that was the case, plus the GPU doesn't fully boost due to the TDP limit.

    http://i.imgur.com/as3R8PU.jpg
  10. Dragam

    In continuation of the TDP conversation, I took a screenshot in Skyrim (which is one of the many games that has a 1:1 or lower load/power ratio).

    http://i.imgur.com/4yhVHZl.jpg

    As you can see, the GPUs are fully loaded, but the TDP is below 100%.

    The above is the case in most games I've tried :)
  11. Lavans

    Dragam: So basically, unless you change the TDP in the driver settings, you never see it hit above 100%?
    The voltages seem consistent between Skyrim and PS2 as well. It seems that the GPU only starts bouncing off the walls in PS2 with overclocking profiles set, which most average gamers don't do.
  12. Dragam

    Lavans :

    Well, if the power limit is set to 100% (as it is by default), there will be 0.01-second spikes where the TDP goes up to 130% (just like the 180% TDP spikes I get with unlimited power), and then it immediately jumps down to 99% power again.

    But you are right that the TDP isn't a very big issue as long as you are using default settings... but if you use an OC, then PS2 will load your card like no other application - even FurMark doesn't come close in terms of TDP load... it maxes out at 130%.

    I had taken a series of screenshots from various games showing the TDP, but Imgur is messing up for me atm, so I guess it will have to wait.
  13. Dragam

    Okay, Imgur seems to work now... so here are screenies displaying the average TDP with the OC applied in several different games.

    Skyrim
    http://i.imgur.com/dpRORpF.jpg

    Crysis
    http://i.imgur.com/3VNz4XN.jpg

    BF4
    http://i.imgur.com/jhHicC3.jpg

    GTA 4
    http://i.imgur.com/5QTxrF4.jpg

    None of the screenshots are particularly good, as they were taken fairly quickly just to show the TDP, which I wasn't monitoring until yesterday... but they do show that none of those games have nearly as high TDP usage as PlanetSide 2.

    What conclusions can be drawn from this, I don't know... but something is obviously a bit off when the game has such high TDP usage compared to other games (even the newest and most demanding ones).
  14. BlackDove

    So a game that has pretty simplistic graphics, and doesn't even use "real" antialiasing, loads your GPU more than games that are more "difficult" to run?

    That leads me to believe that some aspect of its code loads the GPU in an atypical way, similar to a power virus.

    As the spikes to 130% TDP when limited to a 100% power target show, PS2 will use as much power as it can. Other games with IMPROPERLY coded Scaleform UI menus exhibit this behavior.

    Other games with these menus typically get FIXED. It's been a year and a half and a few hundred GB of updates.
  15. BlackDove

    Except that a properly coded game that isn't graphically demanding won't exhibit this behavior, and it can't be blamed on Nvidia or AMD for implementing boost clocks or power targets.

    Boost clocks allow GPUs to overclock themselves if they have a WORKLOAD that requires it. Turn on TF2 or some other old game and you'll see your fan and temps barely above idle, even with 8xMSAA on. Even if you have a 100% power target, a NORMAL GPU WORKLOAD won't cause the kind of excessive load that a bugged menu exhibits.

    Why doesn't a workload like that, even with 8xMSAA on, load the GPU excessively? By your logic, setting the power target higher or overclocking should mean that your GPU loads THE SAME WAY that PS2 does, right?

    Wrong. Turn your detail settings down in most games, or play a game which isn't that demanding, and it won't boost itself to 130% TDP.

    PS2 is a pretty simplistic game graphically, and doesn't even use middle-of-the-road antialiasing like multisampling. It uses edge blur, which is not demanding (also why it doesn't look crisp), and doesn't use PhysX or anything that's particularly stressful for a mid-range GPU.

    Yet it loads GPUs unlike any other game. Why? What element makes it so demanding? Those 2D menus sure are complex... about as complex as a web page from 1998. So spiking to 130% TDP at stock clocks on a 680 is totally normal and good, right?

    Like I've said before, you shouldn't have to underclock your GPU, set custom fan curves, remove the HUD and modify all this stuff just to get the game to behave as if it were coded right, all because of badly coded menus.
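
    To illustrate what an uncapped menu could be doing - and this is purely a generic sketch, since nobody outside the dev team knows how PS2's Scaleform menus are actually coded - here is a toy render loop in Python. The draw_menu() function is a hypothetical stand-in for whatever draws the UI; the point is just that a 2D screen redrawn as fast as the GPU allows behaves like a stress test, while a simple frame cap lets the GPU idle between redraws.

        # Generic illustration only - not PS2's actual code.
        import time

        MENU_FPS_CAP = 60.0
        FRAME_BUDGET = 1.0 / MENU_FPS_CAP  # ~16.7 ms per frame

        def draw_menu():
            pass  # hypothetical placeholder for the actual UI draw call

        def uncapped_menu_loop(running):
            # Redraws a trivially cheap scene as fast as possible: the GPU stays
            # saturated (and boosts toward its power limit) even though nothing
            # on screen is changing.
            while running():
                draw_menu()

        def capped_menu_loop(running):
            # Same work, but the loop sleeps away the rest of each frame budget,
            # so the GPU sits mostly idle between redraws.
            while running():
                start = time.perf_counter()
                draw_menu()
                elapsed = time.perf_counter() - start
                if elapsed < FRAME_BUDGET:
                    time.sleep(FRAME_BUDGET - elapsed)

    Whether PS2's menus are missing a cap like this, or the load comes from rendering the world behind the transparent menu, is exactly the kind of thing only a dev could confirm.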
  16. Dragam

    BlackDove :

    That is correct - PS2 uses significantly more TDP on my cards than games such as Crysis 3 and BF4.

    Regarding the spikes, I believe they occur because the cards can't load to 100% in PS2 and stay within the 100% TDP limit... so it loads to 100%, spikes to 120-something percent TDP, and then gets throttled back... this repeats every couple of seconds.

    This is apparent from this screenshot:
    http://i.imgur.com/as3R8PU.jpg

    But I should probably record a video of this.
  17. Lavans


    Yes, which it does. Reference the Sonic Generations vs PS2 screenshots, where the GPU is set to a power target of 132%. Neither of the two games hit 132%, and both were nearly identical in terms of clocks/power target/heat/fan speed.


    Have you been paying attention to the thread at all? It's been said time and again that TDP is only an issue, at the very least for Dragam, when overclocking. PS2 loads exactly the same as any other game under stock settings. Overclocking always has been, and always will be, user beware territory.

    I wonder. What kind of results do you get, BlackDove? Can you provide comparison screenshots of PS2 and various other games, both with stock settings and with your GPU set to 132% power target?
  18. BlackDove

    Yes, I've been paying close attention to all of your posts.

    I don't know why you seem to be ignoring my posts when I've answered all of your "questions" about memory controllers and even took the time to explain the difference between VRAM and SGRAM.

    I've also REPEATEDLY addressed the TDP issue.

    It's NOT ONLY WHEN OVERCLOCKING, as Dragam JUST posted.

    The TDP spikes to 130% and then drops at STOCK power target and clocks for him, as he just posted.

    He, the other people whose computers you claim are improperly set up, and I have repeatedly stated that this is the power-virus-like behavior that PS2 (apparently the Scaleform UI elements) exhibits.

    That's NOT the same behavior as any other game.

    Have you paid attention to all the other overheating threads? In most of them you come in and tell the OP, or anyone else having the issue, that their PC sucks and they don't know what they're talking about... just like this thread.

    Yeah, I pay attention. Read what Dragam actually said:

    "That is correct - ps2 uses significantly more TDP on my Cards, than games such as Crysis 3 and BF4.

    Regarding the spikes, i believe they occur, cause the Cards cant load 100% in ps2, and stay within the 100% TDP limit... so it loads 100%, and spikes to 120 something % TDP, and then gets throttled back... this gets repeated every couple of sec.This is apparent from this screenshot

    http://i.imgur.com/as3R8PU.jpg

    But i should record a video of this probably."

    And no, I will not put my GPU's power target above 80% while playing PS2 for you. Lol, sorry.
  19. Lavans

    If you've been paying attention, then you would have seen the discussion Dragam and I had outlining that it's normal for GPUs to exceed 100% Power Target, even on stock settings, due to boost technology. A game loading a card beyond 100% Power Target does not imply a power virus.


    Exactly. You have nothing to contribute to this thread.
  20. BlackDove

    Lol... yes, it's normal for it to exceed 100% TDP if you have the voltage and thermal headroom AND THE LOAD DEMANDS IT.

    Not all games cause a TDP spike to 130% at stock clocks and then rapidly throttle in a repeated cycle. Power viruses would cause such spiking and throttling on cards with boost.

    So you either ignore the behavior that does indicate a problem, or you don't get it - just like you don't know what kind of RAM GPUs have used for the past decade - and your misconceptions about how GPU boost works are NOT helpful or constructive for people who have this issue.