Newest GPU could add lag to EQ.

Discussion in 'The Veterans' Lounge' started by Strawberry, Sep 22, 2022.

  1. Strawberry Augur

    You argued the latency was tiny.

    You said: "delay would probably be a tiny fraction of the time the server and client communicate with each other."

    I showed you this is clearly false. As Digital Foundry pointed out, the latency is significant enough to be noticeable.

    At 30fps you have a frame every ~33 milliseconds, so with DLSS 3.0 that's an extra ~33 milliseconds of delay.

    At 60fps you have a frame every 16 milliseconds, with DLSS 3.0 that's an extra 16 milliseconds of delay.

    Only at 120+ fps would the latency not be noticeable, but there's little point in interpolating frames at such high framerates to begin with.

    A good round trip ping to a server is 60 milliseconds. Adding 33 or 16 milliseconds is noticeable either way; that is not a "tiny fraction". To me, playing on a server with 60 ping versus 100 ping is a world of difference. Studies show latency of 15ms or higher is noticeable.
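
    A rough back-of-the-envelope in Python, assuming (as above) that frame generation holds back one source frame; the framerates and the 60 ms ping are just the examples from this post:

    # Frame-time arithmetic behind the latency claim.
    RTT_MS = 60  # example "good" round-trip ping

    for fps in (30, 60, 120):
        frame_ms = 1000 / fps   # time between source frames
        added_ms = frame_ms     # assumed extra delay from holding back one frame
        print(f"{fps:>3} fps: frame time {frame_ms:.1f} ms, "
              f"added delay {added_ms:.1f} ms "
              f"({added_ms / RTT_MS:.0%} of a {RTT_MS} ms ping)")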
  2. MyShadower All-natural Intelligence

    The ability to output a game at a higher resolution than it is rendered, on the fly, using your very own video card is a technological leap, and it is being reduced to "what about my framerates".

    All this sounds like is marketing hype to boost sales being called out, and more hype from those calling it out to boost ad revenue. If the technology really does worsen game performance, no amount of marketing will fix that, but a simple DLSS 3 toggle will. People are still going to pay more than retail to have it, and there is still going to be more demand than supply.
  3. Jumbur Improved Familiar

    Nvidia has a history of "cheating" at benchmarks with their GPU releases. I suspect this is part of the reason. :p

    That said, framerate interpolation would probably be a success with "in-engine cutscenes", allowing the devs to temporarily crank up the detail in those circumstances, where interactivity is not a concern.
  4. Smokezz The Bane Crew

  5. Svann2 The Magnificent

    The client isn't waiting for data to go to the server and back before it shows a new frame. It would be ridiculous to do that. The client shows one frame after another without waiting. If you are getting 140fps now, that's about 7ms between frames, so the latency added would be about 7ms. Not a big deal, especially not in an MMO.
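
    A toy sketch of that point in Python (hypothetical names, not EQ's actual client code): the render loop drains whatever server updates have already arrived and never blocks on a round trip, so network latency and frame time are independent.

    import queue

    net_inbox = queue.Queue()   # hypothetical: filled by a separate network thread

    def game_loop(world):
        while True:
            # Apply any server updates that already arrived; never wait for them.
            while not net_inbox.empty():
                world.apply(net_inbox.get_nowait())
            world.simulate()    # advance local state
            world.render()      # present the next frame immediately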
    Waring_McMarrin likes this.
  6. Jumbur Improved Familiar

    I'm not sure how the framerate interpolation actually works, but they don't have to finish rendering the newest frame before predicting intermediate frames. If they do a superfast pass that only updates a subset of vertex coordinates, they can use that to compute intermediate frames without having to finish texturing and shading first.
    So they don't have to wait a whole frame before spitting out the interpolated frames.
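
    For contrast, a toy Python sketch of the simplest "hold one frame back" interpolation that the latency argument assumes (render_frame, blend and display are hypothetical; real frame generation reportedly uses motion/optical-flow data rather than a plain blend):

    def present_with_interpolation(render_frame, blend, display):
        # To show a frame halfway between N and N+1, the GPU must already
        # have frame N+1, so frame N reaches the screen roughly one frame late.
        prev = render_frame()
        while True:
            nxt = render_frame()
            display(blend(prev, nxt, 0.5))   # interpolated in-between frame
            display(nxt)                     # then the real frame
            prev = nxt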
  7. Zalamyr Augur


    Ok, let's start with the fact that there's absolutely no reason to believe this technology will even work with all games, yet you've taken it a step further and are somehow fearmongering that it's going to be forced on at all times with no option to turn it off.
  8. Coagagin Guild house cat

    No, many reputable labs have tested these cards, and most everyone who posted their findings feels the card(s) are hyped at this point.

    https://www.pcgamer.com/nvidia-rtx-40-series-let-down/
  9. Strawberry Augur

    You can turn DLSS off, I think everyone knows this.

    No one has suggested this will be "forced" on you or that you "can't turn it off". I have not suggested this either, so don't accuse me of doing so.

    What is true is that people should be aware that DLSS 3.0 does not magically add frames and there is a cost in latency you pay for this.

    There is reason to believe this tech will be deployed much more widely than DLSS 2.0.

    The technology of interpolating frames is quite content-agnostic, a lot more than DLSS 2.0 was. TVs employ this trick on all content. Deploying DLSS 3.0 can in theory be done without developer input.

    Nvidia is in charge of the framebuffer, not the game developer. The developer can access the framebuffer through libraries like DirectX, but it is Nvidia who has control over it with the driver. If Nvidia wanted to deploy DLSS 3.0 on every game tomorrow without any developer input, they could.
  10. Strawberry Augur

    This has pretty much been my conclusion when I went over the numbers. Many of the graphs in the Nvidia presentation included RTX comparisons and extra frames from DLSS 3.0, which tell you nothing about rasterization performance.

    Nvidia made some bold claims about Flight Simulator without RTX, but those numbers look off, because they don't add up when you compare the clocks and CUDA core counts against the 3000 series.
    Coagagin likes this.
  11. Zalamyr Augur

    If you aren't suggesting it will be forced, then who cares even a little bit whether this option exists?

    Of course, immediately after you say you're not suggesting it will be forced, you doomsay about the fact that it *could* be forced.

    It's not going to be forced.
    Duder likes this.
  12. I_Love_My_Bandwidth Mercslayer

    That's a link to release coverage. Nowhere in Laird's anti-Nvidia escapade does it indicate they have access to any independent testing or benchmarks. Please link to a single one dated yesterday, when I made that post. I'll wait.

    Are the cards hyped? Yes. They always are.
  13. I_Love_My_Bandwidth Mercslayer


    You're completely ignoring architecture and IPC improvements. But oh well.
  14. Strawberry Augur

    The core architecture of a GPU doesn't really change. It's highly predictable parallel hardware doing matrix math and dot products. It's not like a CPU, where branch prediction is very important. IPC is already accounted for; it's the number of CUDA cores and the clockspeed.
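
    As a rough first-order sketch of that claim (each CUDA core can retire one fused multiply-add, i.e. 2 FLOPs, per clock; the core counts and clocks below are illustrative ballpark figures, check the spec sheets for the real ones):

    def peak_fp32_tflops(cuda_cores, boost_ghz):
        # Peak FP32 throughput ~= cores * clock * 2 FLOPs (one FMA) per cycle.
        return cuda_cores * boost_ghz * 2 / 1000

    # Illustrative only (roughly a 3090-class vs a 4090-class part):
    print(peak_fp32_tflops(10496, 1.70))   # ~35.7 TFLOPS
    print(peak_fp32_tflops(16384, 2.52))   # ~82.6 TFLOPS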

    The article from PCGamer is pretty spot on. I wouldn't go as far as to argue these GPUs "suck" for rasterization like he did, but the small gains in rasterization don't justify the crazy price bumps.
  15. I_Love_My_Bandwidth Mercslayer

    SER isn't a change?

    Are you lost or something? o_O
  16. Strawberry Augur

  17. I_Love_My_Bandwidth Mercslayer

    omg
  18. Strawberry Augur

    What's SER? Did you invent the word and not expect me to ask what it is?
  19. I_Love_My_Bandwidth Mercslayer

    It's apparent to me you never bothered to research the technology you're trying to argue about. Done with you.
  20. Svann2 The Magnificent

    Why would you buy a bleeding edge card and not go with high framerates?? No one running one of these is ever going to be running it at 30fps.