Yeah, I got the Gigabyte GTX 660 Ti OC and it topped out at around 50C, I believe, and idles around 43C. This was the memory I got, 32 GB: http://www.kingston.com/en/memory/hyperx/blu/ (is that ECC RAM?). Tried the old RAM disk just for wits and giggles but couldn't figure it out; no luck. I used one called VirtualRAM, FYI; have already deleted it. Re-downloaded the whole game and tried the new Nvidia drivers, then read the forums and found that the driver is actually damaging GPUs, so I'll definitely roll that one back. Also, apparently my motherboard, the Asus PZ77-V Plus, is very weak on overclocking, so that's next, and then to put it all in an aquarium full of mineral oil. I understand SSDs will work submerged. Don't know about GPUs.
My 660 Ti idles at about 39C and hits 60C under heavy load, even with the aggressive fan curve I have. The 2D menus are what make it spike that high; usually it's in the 50s in PS2.

No, ECC RAM is error-correcting, and it requires that the CPU have an ECC-capable memory controller, so you'd need a Xeon instead of an i5 or i7. There are a few basic types of RAM. Non-ECC unregistered DIMMs are what most desktops use: no ECC and no register/buffer. ECC RAM has extra bits for error checking and correcting; most servers and workstations use it, but you need a CPU that supports it. It actually detects when errors have been introduced into memory, from things like cosmic rays striking individual memory cells and flipping a bit, or from other interference or damage that can flip bits. Registered memory and "fully buffered memory" buffer the data going to and from the RAM, to allow for massive amounts of memory in a single system. I believe the new Ivy Bridge Xeons support 3TB of ECC registered memory per CPU, scaling to 12TB in a single four-socket system.

You need to get Gigabyte's monitoring tool instead of EVGA's, if you're using Precision. It's called OC Guru 2: http://www.gigabyte.us/products/product-page.aspx?pid=4320&dl=1#utility (first one in the Utility section). There could be some conflict in the card's reported clock speed vs load or something when using a different vendor's monitoring tool. I think we have the exact same graphics card, the Gigabyte 660 Ti 2GB? My card basically idles in terms of load in most games. PS2 will load it, but only REALLY highly in the 2D menus, because of the glitch they seem to be unable to fix or even address.

I hope you're joking about the liquid submersion cooling for your PC. It's totally unnecessary and will just ruin your components, but yes, SSDs work when submerged. HDDs don't, unless they have a snorkel.
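To make the ECC bit concrete, here's a toy Python sketch of a Hamming(7,4) code, the same single-error-correcting idea ECC DIMMs are built on. Real modules use a wider SECDED code over 64-bit words and the details live in the memory controller, so treat this purely as an illustration of how a flipped bit gets detected and fixed:

# Toy Hamming(7,4) single-error-correcting code. Purely illustrative;
# not how any specific memory controller is actually implemented.

def encode(d):
    """d is a list of 4 data bits; returns 7 bits laid out [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Recompute the parity checks; a non-zero syndrome points at the flipped bit."""
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    error_pos = s3 * 4 + s2 * 2 + s1   # 1-based position of the bad bit, 0 = clean
    if error_pos:
        c = list(c)
        c[error_pos - 1] ^= 1          # correct the single-bit flip
    return [c[2], c[4], c[5], c[6]], error_pos

word = [1, 0, 1, 1]
stored = encode(word)
stored[4] ^= 1                          # simulate a cosmic-ray bit flip in "memory"
data, pos = decode(stored)
print(data, "- corrected bit at position", pos)   # [1, 0, 1, 1] - corrected bit at position 5

The point is just that the extra check bits let the controller notice a single flipped bit and put it back, instead of silently handing corrupted data to the CPU.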
Wrong. The 7990 was released, and is in competition with the GTX Titan. Since the performance of a GTX Titan is either below or around a GTX 690, in a way you are correct--I will give you that much, but motivation-wise, MALTA was released against the Titan. You don't take into account that if the 7990 outperforms the GTX 690, it outperforms the GTX Titan as well. The 7990 outperforms the GTX 690 in most third-party benches that are unbiased towards either brand. Your Nvidia-fanboy colors are showing... Since I'm the owner of both cards (an Asus ARES II and a Titan), I like to think I am somewhere in between...

Woulda, coulda, shoulda... I think you forget that Nvidia is a business. The top priority of a business is to increase revenue. The 680 is just another GK104 refresh to milk the Nvidia base. They coulda used a GK110 on the 680. There woulda been a GK110 on the 680, but it never happened. The 680 shoulda been the Titan... Ya. I don't think it takes a rocket scientist to realize that Nvidia was holding back as much as they possibly could at the time, and the Nvidia base would have brainlessly bought more GK104 refreshes in upcoming generations... The GTX 770 is a good example of that... "Since they needed a performance boost, they used GK110 for the 780" is an understatement. Like I said earlier, the only real perk of the GTX 780 is the frame latency. It's a sub-par Titan with better frame latency. Ya, you're right, it's a performance boost against Nvidia's competitor: AMD's 7000 series... The performance boost is underwhelming against a Titan or GTX 690. You haven't even taken into account that the AMD 8000 series, the same generation as the GTX 780, hasn't been released yet... Why would Nvidia design a card to beat its competition when the competition hasn't released its new series? Realistically, it's a bad move on Nvidia's end.

Look, I mean no offense, and I can see you've been hurt by AMD in the past. That much is certain in your post. Being an owner of both cards, I don't like it when Nvidia fanboys talk down about AMD. As of right now, AMD is trying to push the evolution of gaming to the next level with the PS4 and Xbox One that are being released. These two consoles are just the beginning. I don't think it's appropriate for any Nvidia fanboy to talk smack about them while they do all the heavy lifting... If one assumes that the PS4 and Xbox One are a big hit, AMD will start to gain ground toward its goal. Besides, Nvidia has placed its faith in its discrete graphics cards and a 2013 Game Boy wannabe (Shield) with cloud... Doesn't seem like a lot of "innovation" is taking place at the Green Camp... Maybe Nvidia should venture into SSDs and 7.1 headsets...

As far as Malta goes, you can voice your negative replies about it; you're entitled to that. It's almost irrelevant, because the consumers will still consume. As long as third-party benches show that it outperforms its competition by a small percentage, CF working or not, people will still buy it. To argue, throw a fit, or pull your hair out at the thought, motivation, and reasoning behind consumers buying the AMD 7990 is in vain... From here on out, I'm not replying to this topic any further.
Ok, where to begin on your wall of BS text. First off, a single-GPU card does not usually compete with a dual-GPU card. The most similar product to the 7990 from Nvidia is the 690, since it's also a dual-GPU card with a PCI-E bridge chip built in, and is always running in SLI (like the 7990 is always running CrossFire, which doesn't work). The 7990 only beats the 690 if you count all the dropped/runt frames that you never actually see as part of the benchmark, meaning they use a Fraps-like tool instead of a hardware frame-rating system that gives you relevant performance information.

Now to the issue of CrossFire not working. It's been discussed everywhere on the internet, and AMD has even tried to fix it with prototype drivers. Here are a few articles discussing how CF is broken, and demonstrating it empirically and irrefutably, from several different independent expert sources: http://techreport.com/review/24553/inside-the-second-with-nvidia-frame-capture-tools/9 http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Improves-CrossFire-Prototype-Driver http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Testin http://forum.beyond3d.com/showthread.php?p=1689708

In reality, CF gives you worse performance than a single card by introducing what's called frametime variance (a runt/dropped frame briefly shown, followed by a frame that is on the screen for a long time). This is where the term "microstutter" comes from. As you can see from the huge orange line in that graph (and I highly recommend you watch all the videos on frame rating and FCAT), CrossFire doesn't work, and the charts show that you get about the same observed FPS as a single card in a CrossFire setup, while adding massive frametime variance. Pretty much every review site is starting to get FCAT-capable hardware to do their GPU reviews. It's very expensive, but it gives you an actual idea of how the GPU performs, unlike overlay software that reports FPS (like Fraps). It requires PCI-E SSDs and very expensive capture cards, but it is the only way to see if multi-GPU setups actually work as advertised. That prototype driver AMD's been working on will add frame metering in software (which has overhead and is not as good as SLI's hardware-level frame metering). There's a rough numeric illustration of the runt-frame problem below.

Heavy lifting my ***. The PS4 and Xbox deal will keep AMD from going bankrupt, which is good, because there needs to be competition for Nvidia. There's nothing revolutionary about either console's architecture, and the only interesting things about them are the integrated SRAM and the unified memory. Memory on the same silicon as the GPU is coming in two generations from Nvidia, offering 1TB/s memory bandwidth (superior to the SRAM on die), and the unified memory is coming with the 800 series GPUs. Basically, the consoles have two neat features to make up for their lacking performance, and they have to seriously turn down the effects compared to, say, a mid-range PC with a 660 Ti. Here, if you don't believe me, read this: http://www.pcper.com/news/Editorial/Epic-Games-disappointed-PS4-and-Xbox-One

I'm an Nvidia fanboy because I say that AMD is releasing essentially defective hardware, demonstrated to not function as advertised, and selling it for the same price as something that works? Ok... No innovation from Nvidia? Apparently you don't bother to do any reading at all, do you? http://www.gputechconf.com/page/home.html Might want to read about the hundreds of innovations that Nvidia has developed for the computing industry.
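To put the runt-frame point in numbers, here's a rough Python sketch of why a Fraps-style average overstates alternate-frame-rendering performance. The 1 ms runt cutoff and the frame times are made-up numbers, and real FCAT analysis works on captured video output rather than software timestamps, so this only shows the arithmetic, not the actual tool:

# Toy illustration: counting runt frames (frames on screen for only a sliver
# of time) inflates the average FPS, but the "observed" rate after dropping
# them is much lower, and the frame-to-frame variance is the microstutter.

RUNT_THRESHOLD_MS = 1.0   # made-up cutoff for what counts as a runt frame

# Alternating long/runt frame times (ms) -- the classic broken-CrossFire pattern.
frame_times = [28.0, 0.5, 27.5, 0.7, 29.0, 0.4, 28.5, 0.6]

def fps(times):
    return 1000.0 * len(times) / sum(times)

real_frames = [t for t in frame_times if t >= RUNT_THRESHOLD_MS]

print(f"Fraps-style FPS (all frames):  {fps(frame_times):.1f}")
print(f"Observed FPS (runts removed):  {fps(real_frames):.1f}")

mean = sum(frame_times) / len(frame_times)
variance = sum((t - mean) ** 2 for t in frame_times) / len(frame_times)
print(f"Frametime variance: {variance:.1f} ms^2")

With that made-up alternating pattern the raw average comes out around 69 FPS, but only about 35 FPS worth of frames actually occupy meaningful screen time, which is roughly what a single card would deliver, plus a huge frametime variance on top.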
You're right about one thing though: the ignorant consumers who don't bother to do their research will continue to buy junk products.
Yes, I had OC Guru 2 but seem to have misplaced the disc. My GPU is this one, but with three fans: http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_660_ti_windforce_oc_review,1.html Cosmic rays!? Oh ****! It came factory overclocked. Can I still bump it up? Also, which one would work best: https://www.google.co.th/search?q=s...Qek5oCoBg&sqi=2&ved=0CFEQsAQ&biw=1686&bih=927
I would not overclock your GPU for PS2, especially since it's factory overclocked already. You can easily run this game on Ultra with all the settings I recommended as far as texture filtering. I reduce the power target to 80% on my 660 Ti so it stays clocked lower than a stock one most of the time, because of the 2D menu glitch. http://www.gigabyte.us/products/product-page.aspx?pid=4320&dl=1#utility You can get OC Guru there. LOL, you're not really planning on liquid submersion, are you? It's really only necessary for ultra-high overclocking and power-dense data centers.
Yep, going to Bali for a week to train some cooks, then looking into aquariums. Going with an 1150-socket i7 and an MSI monster power something-or-other mobo. FYI: http://www.invadeit.co.th/product/v...ce-gtx660-ti-3gb-gddr5-gv-n66toc-3gd-p015060/ and it's running an idle temp of 29C. About to start playing, so I'll let you know how the temp plays out with OC Guru 2. If my aquarium rig can't play PS2 properly I will be calling Smedley; he may need my snorkel. Hope to kill you soon! Oh, will look for spikes at map screens as well!
Well played, sir. I was reading through to see if anyone was going to correct matters, but you cleaned house pretty well. All accurate, multi-sourced, and reputable. Could not have put it better myself.
We are all missing the bigger picture here! And that is that the 780s don't come with the $75 bonus cards. Wish I had known that before buying a few of them and thinking the cards would be in the box.
How does your GPU stay at 41C in a battle? That's crazy. Mine will hit 50-55C in battles, and 60 in the 2D menus. That's after reducing the power target to 80% too, lol. Thanks. Unlike Serpent, I actually care about facts and scientific data (inside joke with me and XRIST0). I don't just pretend I'm "not biased because I have a card from each brand".
There isn't much you should have to tweak in your BIOS unless you're overclocking. I don't think the card has the ability to monitor memory voltage; they just left it in the software tool. Do you mean XMP, for overclocked memory that's beyond DDR3-1600? I believe you only need to enable that if you have memory rated faster than the 1600 specification.
I found going CPU STRONG is the best. I left my system at default clocks on the CPU and got 40-75 FPS in the WG. I cranked it to 4.7 GHz and BAM, 120-160 in the WG. So the CPU is the key.
Wtf, 41C at full load? Too hard to believe. OK, I have a different GPU, but I also used to have a major arsenal of video cards (ATI and Nvidia). I could never keep them under 80C no matter what I did. Also, every single friend of mine reports 80C as the default load temp, even with fan speed adjustments. If the room is hotter, as it is now in the summer, the fan noticeably spins up, but all in all I have never run cooler. I even did a test in the winter by leaving two windows open in my room until I got back from work, to test the system under cold conditions. It was 4 degrees in my room, and after 3 minutes of Battlefield 3 it went straight to 80C. Cable management OK, airflow OK. This is how it still looks from the inside... except the old PSU: What did you do to it?
https://www.dropbox.com/s/nwzl6zm4twyfvg0/003.JPG https://www.dropbox.com/s/b3xejvup9hfb4nd/005.JPG Sorry, could not figure out the proper screenshot procedure with the new keyboard, but if you look through the glare in the first photo you will see 41C. It was up to 42, and we were in heavy battles tonight. In the BIOS I changed all fan settings to turbo and set an aggressive fan curve through OC Guru II, but the fan never got above, I believe, 50%. My case also has two large fans on the top panel that are turned up to high. It is a Cougar case. My GPU has three fans, and I believe the Gigabyte 660 Ti that is not overclocked at the factory has two. I also live in Thailand, where it is hot and muggy as it is monsoon season, but I like to keep my office nice and chilly. Everything seems nice and cool in there. I also keep my case closed; I only took off the side panel for the photo, but it has a clear side to see through. Peace