Performance when using Nvidia SLI

Discussion in 'Player Support' started by indian spice, Nov 24, 2014.

  1. BlackDove

    That's why I said "likely".

    Nvidia is actually a better deal in many cases. AMD may be cheaper, but you also get something that is less efficient, runs at 95C, has terrible drivers, and lacks PhysX, Adaptive Vsync, G-Sync, DSR, and the free equivalent of Fraps (Shadowplay). Nvidia's multi-GPU and multi-monitor support actually works, whereas AMD's either doesn't work or doesn't work as well.

    Let me find you the links to some official press releases, since you can't use Google.
  2. Aldaris

    Likely implies a high chance. No such info exists to provide that chance.

    Incorrect. Less efficient in the current generation; it has been more efficient in the past. It swings around. 95C? Looks like someone doesn't know what they're talking about. The stock coolers of the 290X were bad, no denying that, but custom coolers are fine and don't get anywhere near that hot. The drivers run fine; you're listening to old-news nonsense, it seems. PhysX is a gimmick, not a selling point. Adaptive Vsync is more important, but not a significant selling point. G-Sync is a better argument, but again shows Nvidia's greed, tying you to specific GPU and monitor vendors and the premium you pay for that. Compare that to VESA adopting the AMD FreeSync standard to do the exact same thing, without dicking over their customers. DSR, otherwise known as downsampling, otherwise known as just turning on AA. Shadowplay is a niche product.

    Your last points are really lol-worthy and, again, show your bias.

    This is still beside the point. We can both play tit-for-tat fanboyism all day. My basic point was you've given advice and made a suggestion based on absolutely nothing but speculation. That's a bad thing to do. You have nothing concrete to go on for price or performance. Nothing.
  3. BlackDove

  4. Aldaris

    Thank you for continuing to prove my point, seeing as there's no objective information on either AMD's or Nvidia's next cards in terms of performance or pricing.

    Oh, look, another link that doesn't do as I asked. You obviously can't read, but I've already dealt with any spec leaks from anyone that doesn't have a card in their hand or from Nvidia/AMD: they all come with the same warnings and language, namely that it's alleged, speculated, and potentially faked.

    So, my point still stands: You've given out advice based on bias and complete and utter speculation. That's a bad thing to do.
  5. BlackDove

    Actually, you don't have a point.

    You keep trying to make it sound like I said that they released definite pricing or specs.

    I never did though. I used words like "probably" and "likely".

    However, there is plenty of intentionally leaked and unofficial info about products like GP100, GM200, and Knights Landing. Micron and Nvidia have released enough info about their stacked DRAM to know that GP100 will have about 1TB/s of memory bandwidth.
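
    That ~1TB/s figure is easy to sanity-check from the stacked-DRAM material they have put out. A back-of-the-envelope sketch, where the stack count, interface width, and pin speed are my illustrative assumptions rather than confirmed GP100 specs:

    ```python
    # Stacked-DRAM bandwidth, back of the envelope. All three inputs are
    # illustrative assumptions (HBM-style numbers), not confirmed GP100 specs.
    stacks = 4             # DRAM stacks on the package (assumed)
    bits_per_stack = 1024  # interface width per stack (assumed)
    gbps_per_pin = 2.0     # data rate per pin (assumed)

    bandwidth_gb_s = stacks * bits_per_stack * gbps_per_pin / 8
    print(bandwidth_gb_s, "GB/s")  # 1024.0 GB/s, i.e. about 1 TB/s
    ```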

    They released their target FLOPS and GFLOPS/W, so we do actually have a good idea of their performance.

    You seem to think that unless I say "every manufacturer makes equally good things" I'm being biased.

    Which is nonsense. AMD has been making terrible chips for years, and I'm not going to say "oh, they make great stuff" when they make garbage. I'm also not going to sit here and tell you that car makers that make ****** cars make good ones, just to appear "unbiased" to people like you who don't even use the word correctly.

    Some things are objectively better than others. It's being biased to pretend they're all equal.
  6. Aldaris

    I never said the word "definite", but hey, let's play the semantics game for you if you feel like it:

    prob·a·bly
    /ˈpräbəblē, ˈpräblē/
    adverb
    1. almost certainly; as far as one knows or can tell.

    like·ly
    /ˈlīklē/
    adjective (comparative: likelier; superlative: likeliest)
    1. such as well might happen or be true; probable.
    Oh, look, you pretty much did use definite and yet you cannot do that with the information available.

    You're still talking about leaks while completely ignoring what I've written about them. You obviously have problems reading: leaks are not official. They are rumours; they are alleged. Any credible website will include a caveat that they should be taken with a pinch of salt. There are no confirmed specs for either AMD's 300 series or Nvidia's GM200 chip. There are certainly no real-world confirmed benchmarks for game performance. This is the last time I will make that point.

    Oh look, a straw man argument. I never said that, nor did I imply it. Manufacturers quite clearly do not produce equal products. What I did say is that you shouldn't give advice out based on speculation, rumour, and clear bias toward your preferred product.

    Oh look, displaying your bias against AMD quite clearly. Well done. Terrible chips would imply clearly, objectively inferior chips by every measure for several years. That isn't true. For example, AMD's current lineup is only clearly inferior in power consumption. Every other measure makes it comparable, especially in a price/performance comparison, but you refuse to acknowledge that. Bias confirmed.
  7. BlackDove

    I think it's you who can't read. Probably is not definitely. You yourself admit that, so between history (GF110 to GK104, GK110 to GM204), the literature that Nvidia and TSMC have been releasing about upcoming products, and the DEFINITE specs of GM204, combined with the leak about a large chip with approximately 3000 cores and a 384-bit bus that's approximately 50% more powerful than a full GK110, we have a pretty good idea of the Titan 2's and 1080's performance in a synthetic benchmark.

    Of course that's not extensive real-world testing of final products, but you must realize that these chips are in development for years, and engineering samples of GM200 and Broadwell have existed for a long time now. AIB makers kind of need those lol. You think they get a batch of chips and design custom PCBs and build GPUs in a couple of days? Weeks? Lol

    If you'd like, I'll go get some links to back up the objective fact that AMD's current products suck. Let's just go over the facts first.

    AMD's multi-GPU and multi-monitor support hasn't worked for years, and they ignored experts who told them this because it was too expensive to fix. After a couple of years of this, AMD finally released software frame pacing, except for DX9 games, which is a lot of games, including PS2. Talk about a ripoff! Buy two GPUs to get lower performance than a single one.
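
    For anyone wondering what frame pacing actually does: with two GPUs in alternate-frame rendering, frames tend to complete in bursts, and pacing delays some presents to even out the cadence. A minimal sketch of the idea, with invented numbers and a deliberately simple rule, not AMD's actual driver logic:

    ```python
    # Minimal illustration of frame pacing for alternate-frame rendering (AFR).
    # Two GPUs often finish frames in bursts (micro-stutter): a pair a few ms
    # apart, then a long gap. Pacing holds presents back to even the cadence.

    def pace_frames(completion_times_ms, target_interval_ms):
        paced, last = [], None
        for done in completion_times_ms:
            # Present once the frame is done, but never sooner than one
            # target interval after the previous present.
            t = done if last is None else max(done, last + target_interval_ms)
            paced.append(t)
            last = t
        return paced

    raw = [0.0, 3.0, 33.0, 36.0, 66.0, 69.0]  # bursty AFR completion times
    print(pace_frames(raw, 15.0))             # [0.0, 15.0, 33.0, 48.0, 66.0, 81.0]
    # Raw intervals of 3/30/3/30/3 ms become 15/18/15/18/15 ms: roughly the
    # same average frame rate, delivered far more smoothly.
    ```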

    Then there's Hawaii. Basically a refresh of their pretty decent (only in single-card configurations) 7970 from 2011, with a bunch of promised features that didn't pay off, like Mantle, XDMA (which did nothing to fix Crossfire), and a 512-bit bus. The full GK110 from 2012 beat AMD's new architecture without the throttling or excessive power consumption.

    With Nvidia you do pay more initially, but you get tons of free features that actually work (G-Sync, adaptive vsync, DSR, Shadowplay) and excellent power efficiency, which means you spend less on electricity and cooling.

    Ok, now let's talk about CPUs. AMD uses a two-core-per-module architecture for their CPUs, which basically does the opposite of hyperthreading. Instead of putting each core on a bidirectional ring bus and allowing each physical core to work on two threads, access the cache without a bottleneck, and behave as two logical cores, AMD wraps two physical cores in an interconnect and bottlenecks them.

    That's why an 8-core 9590 at 5GHz performs worse than an i5 but consumes almost three TIMES the power! Yeah, that's a great architecture, isn't it?

    Once again, an Intel CPU costs more initially, but you don't need liquid cooling or a massive power supply, and you get better performance than AMD's ancient architectures that they keep re-releasing.

    AMD's APUs are actually kind of decent because they ditch the two-core-per-module nonsense.

    Ignorant people like you throw the word "bias" around like it can automatically win an argument, by pointing out the fact that the person they're arguing with has a preference for one thing rather than saying "all manufacturers make good things and you're biased if you have a preference".

    Even though my preference (bias lol) is based on objective criteria. Keep saying the word "bias" as if it changes any of the benchmarks or numbers.

    The 9590 (AMD's best desktop CPU) can't even keep up with an old Ivy Bridge or Sandy Bridge CPU.

    http://www.pcper.com/reviews/Proces...and-FX-8370e-Review/Results-3D-Mark-Fire-Stri

    The 8350 can't keep up with an i3, and even with Mantle an i5 outperforms it lol.

    http://techreport.com/review/26996/amd-fx-8370e-processor-reviewed/4

    It's also ironic that AMD-optimized games usually perform better on their competitor's hardware.
  8. Aldaris

    Read the definitions. Again, I never said definitely. You're still failing to read.

    Aka complete and utter speculation. Once again, you can't read. I've already told you why leaks are useless. You have no actual proven synthetic or real world benchmarks for either Nvidia or AMD. You cannot give advice as you did without anything backing it up. This is a basic fact. End of discussion. You have no ground to stand on with this point.

    Irrelevant to the discussion.

    And incorrect, outright fabrications and more bias. Yawn. You're getting tiresome, BlackDove.

    Multi-monitor and multi-GPU work just fine. DX9 games are in the minority, so that's irrelevant. XDMA was never meant to fix Crossfire, so more irrelevance from you.

    The only thing the GK110 'beat' AMD on is power. Everything else is comparable while being cheaper than Nvidia's overpricing nonsense. Paying more for similar performance AND a bunch of niche, useless, or proprietary versions of things that already exist? G-Sync: proprietary, forces you to buy Nvidia AND a supporting monitor, when FreeSync does the same for any monitor with DP 1.2a without tying you to Nvidia or certain monitors. Looks like G-Sync just got spanked. Adaptive vsync: a niche product not important enough to shift units. DSR: renamed downsampling, which only achieves a form of AA while costing more resources than AA. ShadowPlay: a niche product when there are free, non-Nvidia-only products out there. (I note that you've ignored me on these points once already.) Wow, that sure makes me want to spend money on them. Not.

    We never talked about CPUs, and if someone doesn't have specific reasons for wanting AMD, such as heavy multi-threaded use or budget constraints, I wouldn't suggest them over Intel. I haven't had an AMD processor in a long time, so more irrelevance from you. I use the word bias, well, because you clearly are biased. I'm not using it to try and win an argument. I'm using it because you meet the definition.

    Bias is an inclination of temperament or outlook to present or hold a partial perspective, often accompanied by a refusal to even consider the possible merits of alternative points of view.

    The fact you're advising someone on a product without concrete details of the product's specs or its rivals', and the price of the product and its rivals, means you're biased.
  9. BlackDove

    Now you're doing what you just accused me of lol.

    So FreeSync does exactly the same thing as G-Sync and doesn't require compatible hardware? Lie number one.

    Hawaii performs about equal to the much older GK110? Lie number two. Speaking of lies, let's examine some of AMD's blatant marketing lies.

    http://www.pcper.com/reviews/Graphi...ges-Performance-Fan-Speeds-R9-290X-and-R9-290

    AMD sent cards with a special BIOS to the press to make their benchmarks look better, making Hawaii, with its proprietary BS like Mantle, even more of a failure.

    And where do you get the idea that DX9 is so obsolete that getting NEGATIVE SCALING with AMD GPUs doesn't factor in? Got some numbers to back that up? Of course not. Especially ironic since you're debating me on the forum of a current triple-A DX9 game. Oh, I guess Landmark and all those other DX9 Forgelight games are irrelevant too.
  10. BlackDove

    http://m.hardocp.com/article/2014/0...ctcu_ii_oc_overclocking_review/6#.VH6cLWJOnJs

    And the 290X and 780 Ti perform about equally? Actually, the 290X performs closer to the 780 than to the 780 Ti.

    AMD's GPUs have a lot of theoretical FLOPS but don't actually manage to translate that into real-world performance. They also have an architecture that's objectively less efficient.
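
    The paper number comes from simple arithmetic: shaders x 2 ops per clock (one fused multiply-add) x clock speed. A quick sketch using the published reference clocks (sustained clocks in practice vary, especially with the 290X's throttling):

    ```python
    # Peak single-precision throughput: shaders x 2 ops (FMA) x clock.
    # Published reference clocks are used; real sustained clocks vary.
    def peak_sp_tflops(shaders, clock_ghz):
        return shaders * 2 * clock_ghz / 1000.0

    print(f"R9 290X:    {peak_sp_tflops(2816, 1.000):.1f} TFLOPS")  # ~5.6
    print(f"GTX 780 Ti: {peak_sp_tflops(2880, 0.928):.1f} TFLOPS")  # ~5.3
    ```

    The 290X is ahead on paper; the benchmarks are the real-world part.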

    You can say that you're being objective by saying that they're competitive when they're not, but you're just being "biased" lol.

    Why else would Nvidia be able to release their small chip and beat the 290X?
  11. Aldaris

    I didn't say compatible, I said proprietary. Man, you really are bad at reading. FreeSync is AMD's implementation of the VESA Adaptive-Sync standard. It's an open standard, and it only requires a recent GPU and a DP 1.2a monitor. Anyone can implement that standard without royalties. It means no one has to pay the Nvidia tax on both GPU and monitor. G-Sync is already out of date and it's barely started.
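
    Conceptually the two schemes do the same thing, which is exactly why the proprietary tax matters. A rough sketch of the shared idea, with an invented 40-144Hz panel range:

    ```python
    # Variable refresh (G-Sync / Adaptive-Sync), conceptually: the panel
    # refreshes when a frame arrives, clamped to its supported range,
    # instead of on a fixed clock. The 40-144Hz range here is invented.
    def effective_refresh_ms(frame_time_ms, panel_min_hz=40, panel_max_hz=144):
        fastest = 1000.0 / panel_max_hz  # ~6.9 ms: panel's fastest refresh
        slowest = 1000.0 / panel_min_hz  # 25.0 ms: must refresh by this point
        return min(max(frame_time_ms, fastest), slowest)

    for ft in (5.0, 16.7, 30.0):         # very fast, ~60fps, and slow frames
        print(f"{ft} ms frame -> {effective_refresh_ms(ft):.1f} ms refresh")
    ```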

    http://www.bit-tech.net/hardware/graphics/2014/06/05/sapphire-radeon-r9-290-vapor-x-oc-review/1

    My card. Comparable performance to 780 Ti while being cheaper. You're welcome.

    Oh wow, another irrelevant link from you, as if anyone really uses the stock 290X in the real world.

    I get the idea that DX9 is irrelevant because it is. DX11 is now the standard for pretty much any AAA release. Even for the few DX9 games still out there, as most gamers are still on 1080p or less, there's barely any scaling from any multi-GPU scenario. Specific to this forum, a game that is strongly CPU-limited and therefore has no need for multi-GPU, it's even more irrelevant. You're really good at making irrelevant points.

    I'm being a damn sight more objective than you are.

    I love the fact you're now making irrelevant straw man arguments to win this discussion, when the discussion is about unreleased graphics cards. My original point still stands. End of discussion.
  12. BlackDove

    Oh, no wonder you call all those features irrelevant. You own a 290X and must therefore be BIASED toward AMD. Lol

    Your initial point was that the GM200 specifications haven't been officially released, and you objected to the fact that I said it will PROBABLY have about 3000 cores for its first (probably partially disabled) release, a 384-bit bus, 6GB for GeForce and 12 for Titan 2, 12 or 24 for Quadro, 2-3 TFLOPS DP, and 6-7 SP.

    GM200 will ALMOST CERTAINLY be compute-focused compared to GM204 (which has almost no DP performance) as well. It will likely be around 50% more powerful than the full GK110.
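
    Those numbers are at least self-consistent: GK110's compute parts run DP at 1/3 of the SP rate, and if GM200 kept that ratio the speculated SP and DP figures line up. A quick check (the 1:3 ratio is GK110's; whether GM200 keeps it is my assumption):

    ```python
    # GK110 Titan/Tesla parts run double precision at 1/3 the SP rate.
    # If GM200 kept that ratio, 6-7 TFLOPS SP implies 2-2.3 TFLOPS DP,
    # matching the speculated 2-3 TFLOPS range.
    for sp_tflops in (6.0, 7.0):
        print(f"{sp_tflops:.0f} TFLOPS SP -> {sp_tflops / 3:.1f} TFLOPS DP")
    ```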

    The only thing I have no idea about is whether it's 28nm planar or 20 or 16nm FinFET.

    Now that this forum prevents editing, we can see if I'm right when the Titan 2 and 1080 (or whatever they call it) are released.

    Oh, and PROBABLY and LIKELY do leave room for error, so your objection to my good advice to wait for the big chip, because it will PROBABLY be better for a guy doing compute work on his GPU, was a pointless objection to begin with, especially since you basically told him to rush a purchase on stuff that sucks for his requirements. It's just bad advice to tell him to get any GM204.

    So what's your actual advice to the guy wanting to play games and do compute work with a budget for two 970s?

    Oh, and please provide more than one benchmark. Here are some more of the 290X being outperformed by the 780 Ti.

    http://www.pcper.com/reviews/Graphi...X-780-Ti-3GB-Review-Full-GK110-Crashes-Hawaii

    http://techreport.com/review/25611/nvidia-geforce-gtx-780-ti-graphics-card-reviewed/12

    Even in AMD-optimized games.

    http://techreport.com/review/25611/nvidia-geforce-gtx-780-ti-graphics-card-reviewed/9

    Let's not forget that the small GM204 chip destroys the 290X while using half the power in games.

    And your excuse-making for Crossfire not working with DX9, and AMD ignoring the problem until enough people started realizing they were being ripped off, is hilarious.

    What's your defense for them sending cards with a special BIOS to the press to make their cards perform better in benchmarks than real retail cards do? Oh, a link to a single benchmark where a 290X did OK. Lol
  13. Aldaris

    I call those features irrelevant because they are. They're either being replaced or niche ideas. Mantle is actually a bigger deal than anything you listed other than G-Sync. It has a real, measurable performance impact. The only issue is that it will be replaced by DX12 in time, although we don't know whether that'll be Windows 10 only or not.

    Trying to claim I'm biased just makes you petty. I've acknowledged AMD's weaknesses in current gen. You refuse to acknowledge any strengths.

    Speculation on how powerful it'll be, how that will translate into real performance in games, its price, and how it will perform against the competition. As I've said several times now. You really don't get that idea, do you?

    My god. You really don't read. I objected to this line: "Two 970s or 390s would be more expensive than a GM200 based card and likely perform worse in games or other applications", because you cannot state that at all. It's impossible, which makes your advice certainly not good at all and makes your use of those words pretty dumb. As for DP, AMD do that better on all their cards compared to Nvidia except the Titans, which all cost a small fortune and are out of his budget, as he's already stated.

    A Titan would probably be the best balance, or two 290Xs if he wants more gaming performance with some DP capability. Either of those if he wants it right now. He can wait for new GPUs, but I wouldn't state which one to get without some actual facts, unlike you.

    Oh, look, you're providing links with benchmarks using the stock 290X. Surprise, surprise. Whereas mine provides a benchmark with a card people actually use. Yet more irrelevant nonsense from you. You also have a strange definition of "destroys". Only the 980 outperforms it, and it costs nearly twice as much. You fail. Your own link even says to get the 290X because its price/performance is better than Nvidia's offering: "For most folks, though, forking over 700 bucks for the GTX 780 Ti will seem like madness when the Radeon R9 290 offers 90% of the performance for $300 less. Yeah, the Radeon is noisier, but I'm pretty sure $300 will buy a lifetime supply of those foam earplugs at Walgreens. Heck, throw in another hundred bucks, and you could have dual R9 290s, which should presumably outperform the GTX 780 Ti handily."

    I'm sorry you find reality hilarious. DX9 is not relevant anymore. Deal with it.

    I don't have one. It was a crappy move, but completely irrelevant to this discussion. You're really piss-poor at producing a coherent, relevant argument.
  14. BlackDove

    Actually, calling you biased was my way of pointing out that you have an opinion and purchased things based on your values. Which is fine, as long as you admit that your values often determine your purchases.

    And both AMD and Nvidia nerf double precision enough to ignore it on their GeForces and Radeons. An ancient 580 has more because it wasn't nerfed as much. I agree about the Titan, except for the fact that GM200 is coming and it's going to be the compute chip that replaces GK110. Do you disagree with that statement?

    Titan has never been significantly discounted, and IF Titan 2 comes out at around the same price as Titan 1 did (they tend to keep their pricing pretty consistent), then my advice still stands: wait for GM200. Do you disagree with that? Why would you say that it's a good idea to spend $1000 on a GK110 now?

    And they were reference 780 Tis as well. So, you want links to benchmarks of highly overclockable 780 Tis?

    And where do you get the idea that a 290X is half the price of a 980 lol? Most 290Xs with decent coolers are between $350 and $400. That is cheaper than a 980, but not half the $550 price.

    Oh look, you're quoting prices from an old review when you can get 780 Tis for under $500 and 980s for $550. A $300 difference lol. Right...

    By that logic you should choose a 970 anyway. They're $330-350 and can easily compete with a 290X, even beating it in some games, while consuming 100W less.

    You keep saying DX9 is irrelevant, yet here we are on the forum of an AAA title using an engine that's also being used in several other games from Sony.

    So I guess PS2, Landmark, and all the other DX9 games that are still hugely popular are irrelevant because you said so? In fact, Rift just released a massive new expansion, and PS2 and Landmark are basically still in development. So how do you figure DX9 is IRRELEVANT?
  15. Aldaris

    That isn't bias. Preferences are not bias. Go look up the definition.

    Never said they didn't, but Radeons are better for DP than Nvidia's cards outside of the Titans. It may be coming, but he doubts he'll be able to afford it, as he's said.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814125697&cm_re=Titan-_-14-125-697-_-Product

    Admittedly refurbished, but it's not much more expensive than a 980.

    I never said anything about overclocking. I simply said stock. It was well known the stock 290X had the world's crappiest cooler design, causing throttling issues. Why AMD ever released it like that, god knows. It was pure crap.

    Er, because I checked? The cheapest 290X with a non-stock cooler is about $30 away from being half the price of the cheapest 980 on Newegg.

    Nope. Current prices.

    Because the 290X is still cheaper and comes with 3 free games compared to Nvidia's one. Only relevant if you want the games on offer, mind, but that's a significant saving in itself.

    The forum we're posting on has absolutely no bearing on anything. I said so because it's the truth. The only games releasing on DX9 only, or still running DX9 only, are free-to-plays or non-AAA titles, none of which require anything like Crossfire to get decent frames, making your frame pacing argument irrelevant. Anything else being released that's true AAA that you pay a price for runs DX11. DX9 is past its time.
  16. BlackDove

    Lol comes with three free games.

    So Shadowplay, DSR, adaptive vsync, PhysX, CUDA, and G-Sync are irrelevant, but three free games (which are definitely going to appeal to everyone and last in terms of usefulness)!

    That seals the deal!

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814125489&cm_re=780ti-_-14-125-489-_-Product $469 for a 780 Ti, which we were comparing to the 290X. Not sure why you're comparing the 980 to the 290X, since the 980 is newer and more powerful.

    But anyway, the 970 competes nicely with the 290X, and the 970s are cheaper and use about half the power, saving even more money.

    OK, so you agree that PS2 and Landmark and all those other new DX9 games are irrelevant.

    And if he can afford about $700 for two 970s, he can definitely afford a GM200 GeForce lol.

    So your best advice is to get an overpriced 290X or a refurbished Titan 1? Why not just wait for GM200 or get a single 970 now?

    My advice is:

    A single 970 gives him about the same performance as a 290X, costs less, uses less power, and supports CUDA development.

    If he wants to wait and invest about the same money as two 970s, which he can apparently afford, he can get a single GM200 GeForce and have more compute and DP than GM204, or wait for full DP with Titan 2 if he wants to spend as much as three 970s lol.
  17. f0d

    now getting back on topic

    both SLI and Crossfire are horrible with PlanetSide 2. I had the chance to test 2x GTX 670s and 2 R9 290s, and you are better off just using a single card because of the hitching you get from them
    sure, I can get 200fps with 2 cards, but they have the smoothness of trying to play at 20fps, which also hitches down to 2fps. it's freaking awful

    I'm eBaying my 670s now as I have a single 970. I might try 2x 970s in the future, but I doubt it would work very well

    FIX SLI/CROSSFIRE IN YOUR GAME SOE
  18. BlackDove

    Tell AMD to fix Crossfire in DX9. That's not SOE's fault. It doesn't work in any DX9 game.
  19. f0d

    that doesn't explain SLI, which is also horrible and jerky with loads of hitching
    a single card is the best way to go with PS2 atm
  20. BlackDove

    That's true, but there's nothing that anyone but AMD can do to fix Crossfire.