Message boards : Number crunching : Building a Desktop for GPUGrid
tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Message 34074 - Posted: 29 Nov 2013 | 19:10:14 UTC

I would like to have a go at replacing my four-year-old+ Dell XPS 435 with a home-built (i.e. me-built!) system. The Dell has but one PCIe x16 socket and I have a spare GPU that's sitting in its box doing nothing.

This is new territory for me but here is what I think I need:


  • A motherboard that supports two wide Nvidia GPUs and a fast Intel i5 processor. Initially the GPUs will be modest but, as funds become available, I would replace them with the latest whiz-bang Nvidia offerings; GPUGrid is my priority.
  • Two hard drives, two DVDs and an SSD for Windows (7...)
  • A PSU to support the above
  • A box to put it all in, with minimum two USB ports on the front (lots on the back!) and great cooling!


I live in France, so parts sourcing would have to be in Europe.

I do hope this hits many of you who have been down the home-grown path before!

Thanks, Tom


____________

microchip
Joined: 4 Sep 11
Posts: 110
Credit: 326,102,587
RAC: 0
Message 34076 - Posted: 29 Nov 2013 | 22:17:25 UTC

I just built a new system, based on the AMD FX 8350 8-core CPU. It uses an ASRock 990FX Extreme3 with 3 PCIe slots (two @ x16 and one @ x8). I use my older GTX 560 and GT 440 cards, but only the GTX 560 crunches for GPUGRID as the 440 is somewhat underpowered, though it crunches Milkyway and Einstein. I live in Belgium and paid 100€ for the mobo, while the CPU cost me 188€. The PSU, HDD, case and DVD come from my old system, so I just reuse them.

I have little experience with modern Intel CPUs so can't comment on them much. What I do know is that the AMDs are better at multi-threaded tasks. So if you use them, like I do for audio/video encoding besides number crunching, AMD will not only be 10-15% faster but also a lot cheaper.
____________

Team Belgium

TJ
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 34082 - Posted: 30 Nov 2013 | 0:53:40 UTC
Last modified: 30 Nov 2013 | 1:03:40 UTC

Here is a lot of information, if you like some reading.

BTW: Amazon.de has some nice prices for parts. I bought a few there, and via the link on this site you fund the project a little as well.
I guess you have a link to Amazon.fr?
____________
Greetings from TJ

Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 34085 - Posted: 30 Nov 2013 | 10:04:24 UTC - in response to Message 34074.

We can give better advice if we know what you intend to use your new system for. So far you have told us you will be using it to crunch mostly GPUgrid. What else do you intend to use it for? Do you do video/photo editing? Gaming? Develop software? Compose music (MIDI stuff or whatever they do these days)?

I ask because if it's for general web surfing, email, skype then why do you need 2 HDD, 2 DVD and an SSD? Unless you intend to keep lots of files and need super fast access maybe you don't need that. Do you intend to burn lots of music files and duplicate DVDs? If not then why 2 DVD?

Cooling.... If you install a video card that vents hot air into the case you create big cooling headaches. Select a GPU that vents the hot air out of the case. You'll spend a lot less on additional case fans and other cooling solutions if you do it that way. If you decide to go with a card that vents air into the case, then buy a case that already has holes drilled and grilles for additional fans, because you'll need them. Or maybe you have the tools to cut and drill the holes yourself.

____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Message 34199 - Posted: 11 Dec 2013 | 7:35:42 UTC
Last modified: 11 Dec 2013 | 7:39:00 UTC

Humble apologies! I subscribed to this thread but have yet to get any email notification of your postings! Sorry.

First, your questions / comments:

Yes - I use Amazon France (free shipping!) but I'll certainly check out Amazon Germany.

No - nothing I run, apart from BOINC, puts a great demand on the processor. I do a bit of Office, email and lots of surfing. I like two hard drives so I can clone C: every night but two DVD drives is a bit over the top. And an SSD is a nice-to-have only.

I came to the conclusion that AMD offers a better bang for the buck so that's what I've looked at.

OK, here's what I've been thinking these past 10 days:

I want to do more for GPUGrid. Right now my ASUS GTX 660 does two long runs every 24 hours. I have a GTX 460 sitting in its box that would do one in 24 hours, but I only have one PCIe x16 slot. Over time I want to invest in more powerful GPUs, so the plan is to build a system that has enough oomph for that future.

My thoughts at the moment are:

1. AMD rather than Intel; better bang for the buck.
2. Mobo with four PCIe x16 slots
3. A case that fits under my desk and will support water cooling (if needed...)
4. 1000W PSU

Here's my first stab at a configuration with parts from Amazon France:



Not cheap!

I have a GSO 9600 somewhere. Perhaps, for starters, I could use that for the video and, if I can keep the total spend to €1,000 I could add a second GTX 660 (€159). That would let me do five WUs/24 hours. To do that I would need to cut back on the configuration:

1. The case is a given, methinks
2. A more modest CPU?
3. Fan rather than water cooling?
4. A less powerful PSU?
5. Slower RAM?
6. Perhaps there's a more modest mobo that would still deliver for three top-end (future...) GPUs.

Note above that I have no idea about the paste to use between the CPU and its cooler...

Tom

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Message 34200 - Posted: 11 Dec 2013 | 7:47:36 UTC

Just checked out Amazon Germany prices.

Total is €981.47 vs. Amazon France's €918.52. And I doubt I'd get free shipping from Germany!

Tom

TJ
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 34204 - Posted: 11 Dec 2013 | 17:13:48 UTC - in response to Message 34200.

Hi Tom,

I get free shipping from Amazon.de to the Netherlands.
A mobo with 4 PCIe buses is expensive and, in my opinion, not practical. If you put 4 GPUs in, they will almost touch each other, with heat build-up as a result.
I have a PC with the same AMD processor but used this mobo:
ASUS SABERTOOTH 990FX R2.0 - Socket AM3+ - ATX, currently priced around 145 euros in the Netherlands.
I have never seen one of my rigs use more than 5GB of RAM, so I now use only 8GB (2x4GB) 1600, which is around 85 euros, so you could save there.
A 1000W PSU sounds good to me.

But wait for what others advise you and choose what you like best.
____________
Greetings from TJ

microchip
Joined: 4 Sep 11
Posts: 110
Credit: 326,102,587
RAC: 0
Message 34205 - Posted: 11 Dec 2013 | 17:45:23 UTC
Last modified: 11 Dec 2013 | 17:48:29 UTC

You may consider the ASRock 990FX Extreme3. It costs 99.95 euros (in Belgium) and has 3 PCIe slots. It's a very decent mobo in my experience, despite its "cheaper" price. Its UEFI BIOS is also nicely laid out. That mobo unofficially supports up to 64GB RAM. I say unofficially because ASRock says it needs to bring out a new BIOS version to support that much memory, but it's only willing to do so when 16GB modules become available so it can verify they are compatible. Without a new BIOS, the mobo officially supports 32GB.

Also, good choice on the FX 8350. I have exactly the same one :)
____________

Team Belgium

Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 34207 - Posted: 11 Dec 2013 | 21:50:22 UTC
Last modified: 11 Dec 2013 | 21:51:48 UTC

Tomba,

I'm not sure how much of the following info you are already aware of, but I have explained in some detail so that other crunchers who are as new to this game as I am can learn too. As always, I hope the gurus are checking the details so that I can verify whether or not I have actually figured this stuff out correctly, especially how the PCIe lanes work.

The easy stuff first... the paste. AFAIK, the best paste these days apparently contains silver. Silver is expensive but since it's the least expensive part of building a rig and a small tube goes a long way it probably makes sense to buy the best. I use whatever I can find when I need it even if it's the cheapest stuff on the market because it works well enough for me. Remember the best paste in the world is no better than cheap paste if your cooling system is poorly designed to begin with. If you understand and use the implications of the Laws of Thermodynamics you'll get excellent results even with cheap paste.

I was preaching the benefits of a future proof system a couple weeks ago but as usual skgiven has given me good reason to rethink that. I didn't realize just how expensive future proof can be. Also, since then I have been monitoring resource usage on my two rigs and I see now that what skgiven has repeated frequently is true... today's GPUgrid tasks don't need massive CPU power, fast PCIe or lots of RAM on the GPU card. What they do need is good cooling. The two mobos I am using are old and have only PCIe generation 1.0 but my tests have shown that two GPUgrid tasks running simultaneously on my GTX 660TI pin the GPU to 99% usage continuously but never use more than 40% of the PCIe capacity! So, IMHO, unless you plan on 3 X GTX 690, PCIe gen 2.0 is more than adequate and PCIe gen 3.0 is expensive overkill at this time and may very well be for several (?) more years. When PCIe 2.0 becomes inadequate and PCIe 3.0 is a necessity, mobos with 3.0 will probably be much cheaper than they are now. But who can be certain of the future?

Here is the important part for you (possibly)

You've stated you want a mobo with 4 X PCIe X16 slots. The ASUS Crosshair V Formula-Z you are considering has four slots that will accept PCIe x16 cards but only three are double width slots which means you would be able to put only three GTX 660 (or similar vintage) cards on that mobo. If you were to put a GTX 660 in the third slot it would cover the fourth slot. That is apparent from the picture of the mobo at the above link.

Not sure if you're aware of this next part. If you are then maybe it will help somebody else. The PCIe specs for the ASUS Crosshair V Formula-Z state:

3 x PCIe 2.0 x16 (dual x16 or x16/x8/x8)
1 x PCIe 2.0 x16 (x4 mode)
1 x PCIe 2.0 x1
1 x PCI


The dual x16 means if you install only two x16 cards then those two slots will be assigned 16 lanes each. The x16/x8/x8 means if you install three x16 cards then one slot will be assigned 16 lanes but the other two will be assigned only 8 lanes each. A PCIe lane is similar to a lane on a roadway. A road with one lane can carry a certain amount of traffic, a road with four lanes can carry (theoretically) 4X more traffic, sixteen lanes means ~16X more traffic. More PCIe lanes means more data can flow from the GPU to RAM/CPU/disk in the same amount of time. The point is this... though that mobo can accept four cards that can use sixteen lanes each, if you install three such cards only one will be able to move data at x16 speeds, the other two will be reduced to half speed (x8). Pray the gurus correct me if I am wrong about that.
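To put rough numbers on the lane arithmetic above, here is a small Python sketch. It assumes the commonly quoted PCIe 2.0 figure of roughly 500 MB/s per lane per direction (after 8b/10b encoding overhead); real-world throughput will be somewhat lower.

```python
# Rough one-way bandwidth per PCIe 2.0 slot, assuming ~500 MB/s per lane
# per direction after 8b/10b encoding overhead (illustrative figure).
MB_PER_LANE_PCIE2 = 500

def slot_bandwidth_mb(lanes):
    """Theoretical one-way bandwidth in MB/s for a slot running at x<lanes>."""
    return lanes * MB_PER_LANE_PCIE2

# Two cards installed: dual x16, each slot gets the full 16 lanes.
print(slot_bandwidth_mb(16))  # 8000 MB/s per slot
# Three cards installed: x16/x8/x8, so the two x8 slots move half the data.
print(slot_bandwidth_mb(8))   # 4000 MB/s per slot
```

So dropping from x16 to x8 halves the available bandwidth, which only matters if the task actually saturates the link; the 40% utilization measured above suggests current GPUgrid tasks do not.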

You said eventually you want to put high power GPUs on the mobo. The question is... How "big" or "powerful" could you go with that mobo. The future is hard to predict but with the demands of current GPUgrid tasks, IMHO, you could install 3 X GTX 660TI or even 3 X GTX 680 cards and get excellent results. If you want to eventually put 3 X GTX 690 then you might need all slots to run at x16 speed to prevent a bottleneck on the PCIe "bus". Probably someone with more experience should comment on the feasibility of 3 X 690 on that mobo, current feasibility as well as future feasibility.

There are mobos that have four double-width x16 slots that will all run at x16 speeds but they are bloody expensive. IIRC, over $400 CDN for such a mobo. On top of that you need a CPU that has, IIUC, 4 X 16 = 48 PCIe lanes else some slots will run at less than x16 speed. As you can imagine, the more lanes a CPU has the more it costs. Intel CPUs with 48 lanes are bloody expensive and limited only to Haswell (?), not sure about equivalent AMD models.

The CPU

IMHO, a four core CPU running at a reasonable clock speed is more than sufficient for 3 X GTX 680 running GPUgrid tasks. Again, not sure about 3 X GTX 690.

RAM on mobo

I also doubt you need 16 GB, but RAM is relatively cheap so maybe 16 is not a bad idea. Remember, though, that the mobo has four RAM slots. You could start with 8 GB and wait for a really good deal on another 8, if you think you need it.

Cooling rant continued

Liquid cooling isn't worth the price or hassle, IMHO. Pumps wear out, hoses leak, if you use the wrong fluid or combination of metals the radiator corrodes and leaks... who needs that? Also, there was a cruncher here a little over a year ago who thought he had it all wrapped up with expensive liquid cooling but he put the machine under his desk as you plan to do, Tomba. Bad move. He fried the CPU (possibly GPU too, can't remember) because the hot exhaust built up under his desk and the whole works overheated. Apparently the thermal protection system in the GPU/CPU didn't function correctly or something like that. The good news is it kept his toes warm, for a while. Anyway, forget all the hype in the ads, a shiny new "game changing" liquid cooling system with "jaw dropping" cooling specs won't work worth a damn if the ambient temperature is too high. In geographical locations that get fairly warm in summer, the smart way to cool rigs that generate a lot of heat is to expel the hot exhaust to the outdoors or at least into a different room in the house. Or, if you can afford it, crank up the air conditioner. If I were to put liquid cooling on 3 GPUs I would run the coolant lines out of the house and situate the radiator outside otherwise it's a bad waste of good money. I would also devise a way to move the radiator indoors for the winter so the rig can help heat the house. That additional flexibility requires only two additional hose couplings. Under no circumstances would I situate the radiator under my desk without an additional, large, mains powered fan (about 25 cm?) to move the hot air out from under the desk.

PSU

Google 'psu calculator'. I've heard the "eXtreme site" is very good, YMMV. Enter your system specs and get the answer.

Case

Think outside the box, literally. Three cards on a mobo that is oriented vertically is not a good idea, IMHO, unless you're willing to install several additional case fans. If you orient the mobo horizontally and leave the top open, you get the benefit of convection (hot air rises all on its own) which means fewer additional fans and less noise. Also, with three GPUs on one mobo I would go for cards with the single blower type fan which are louder but move more air than blade fans. The main benefit of the blower fans is that they push the air out the back of the card and away from the other cards rather than out to the edges of the card and onto adjacent cards as is the case with dual bladed fans. Others may disagree but I think for a three card setup that is very important.
____________
BOINC <<--- credit whores, pedants, alien hunters

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34221 - Posted: 11 Dec 2013 | 23:46:31 UTC - in response to Message 34207.

A good summary, Dagorath, but I have some comments.
A GTX 690 has 2 processors, so it is in fact a dual card. With Windows you can only run 2 of these (4 GPUs in total). For more you need to go to Linux or something else.
With 3 cards you need to alter the BIOS first on most mobos.

A lot of people suggest using an open case. I do not agree. A closed case (side panels on) will work as a funnel. Dell workstations even have an extra shroud with a fan over the processor; it is a closed system and works as a funnel with a sort of Venturi effect. There is only a heat sink mounted on the CPUs (mostly two Xeons) with a single fan, and it runs cooler and more quietly than some rigs I built myself. I have checked it with a special thermometer with multiple sensors. A closed system is cooler, especially when the ambient temperature is high.
If you open the case and point a blower at it, then it becomes another story.
By the way, most of my rigs stand under my desk with no problems.

Finally, a lot has been written about single versus dual (or triple) GPU fans. I have a 770 which has two fans, so one throws heat out of the case and the other pushes it into the system. This card, however, runs GPUGRID at 63°C (or lower).
In the same system I have a 780 Ti with one fan. This one runs at 72°C because I have set the maximum temperature to 72°C with GPU Tweak. If I don't do that it easily goes to 80°C (which I don't like). So from experience, next time I will buy an EVGA 780 Ti with two fans.

This is, however, my opinion. Like you, I want to keep learning, so I will read other posts with great interest.
____________
Greetings from TJ

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34224 - Posted: 12 Dec 2013 | 5:38:47 UTC - in response to Message 34221.
Last modified: 12 Dec 2013 | 5:48:01 UTC

Good points TJ :)

Funny, as I was composing my previous post and recommending an open case, I was thinking of Dell machines. I have peeked inside several different Dells and have come to admire the engineering in their cooling system. I agree, the funnel/venturi is an excellent idea and so simple. I don't know why other manufacturers haven't imitated it. Patent maybe?

A similar idea to Dell's... back in the mid 1990's one could buy well made, flexible plastic duct with flanges, mounting holes and screws all correctly sized for standard computer fan dimensions. The idea was to attach one end to the fan on the CPU heatsink and the other end to the case to provide a constant flow of cool(er) air directly onto the CPU with no mixing of the cool(er) air with hot air in the case. For some reason that idea didn't seem to catch on and I haven't seen those ducts anywhere for years which is sad. I would buy several if I knew of a source. I don't like having my case open either but I prefer to build my own rigs as opposed to buying pre-built so an open case is an easy/inexpensive option.

Thanks for your comments on GTX 690. I knew they were a dual GPU design (2 X 680 I believe) but that's about all I knew. Your comments fill in some missing information which is good because my ultimate goal was to have 4 X 690 on a single mobo. Seems I must lower my expectation to only three though they'll probably be up to 990 or 1090 before I can afford it.
____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Message 34227 - Posted: 12 Dec 2013 | 8:25:35 UTC

Many many thanks for all the insights you guys have provided. I'm grateful.

My planned configuration has changed! See the blue rows. Only the case and PSU are as before.



Mobo: I took heed of the warnings about four GPU slots; their performance and cooling. I'm now at three.

CPU: I'm down from 4.0 GHz to 3.5 GHz. The price drop was worthwhile.

CPU Cooler: Now a fan rather than water cooling. The one I chose was not the cheapest but it gets excellent ratings.

RAM: Down from 16GB to 8GB.

Paste: Added the most popular one.

GPU: I ADDED a new GTX 660.

I ran the PSU calculator (good tip!). With three modest GPUs I got 700W. Replacing two of them with top-end GPUs I got 876W and a recommendation for a 1000W PSU. Spot on!
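For anyone curious what such a PSU calculator does under the hood, here is a minimal Python sketch of the idea: sum the component draws and add headroom. All wattages and the 25% headroom factor are illustrative assumptions, not figures from any specific calculator or spec sheet.

```python
# Hypothetical component draws in watts -- real numbers should come from
# manufacturer specs or an online PSU calculator.
components = {
    "cpu": 125,            # assumed 125 W class AMD CPU
    "gpus": 3 * 140,       # three GTX 660-class cards at ~140 W each (assumed)
    "mobo_ram_drives": 80, # motherboard, RAM, HDDs, SSD (assumed)
    "fans_misc": 30,       # case fans, USB devices, etc. (assumed)
}

load = sum(components.values())  # steady-state crunching load in watts
recommended = load * 1.25        # ~25% headroom, a common rule of thumb

print(load, round(recommended))  # 655 819
```

Sizing the PSU above the steady-state load also keeps it in its most efficient operating range.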

So - I went from €925 to €845 AND I get another GTX 660! Can't be bad...

Thanks again everyone. Tom

TJ
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 34232 - Posted: 12 Dec 2013 | 10:48:23 UTC - in response to Message 34224.

An air duct! That was the term I was trying to think of but could not bring to mind, so thanks Dagorath. Dell's Alienware line now has such an air duct over the GPUs. Only two cards will fit, and a front fan blows directly into the duct while another fan cools the rest of the case.
____________
Greetings from TJ

TJ
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 34233 - Posted: 12 Dec 2013 | 10:51:36 UTC - in response to Message 34227.

One more thing Tom,

Amazon.fr is indeed a nice place to buy, but I have seen that some things at Amazon.de are more expensive than at some dedicated computer hardware internet shops. So I compare with a few large ones in the Netherlands. I do not know about those in France, but they must exist. Perhaps you can check a few; a 660 Ti might be available for the same price and is faster.

Good luck with your new machine.
____________
Greetings from TJ

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Message 34234 - Posted: 12 Dec 2013 | 11:15:04 UTC - in response to Message 34233.

One more thing Tom,

Amazon.fr is indeed a nice place to buy, but I have seen that some things at Amazon.de are more expensive than at some dedicated computer hardware internet shops. So I compare with a few large ones in the Netherlands. I do not know about those in France, but they must exist. Perhaps you can check a few; a 660 Ti might be available for the same price and is faster.

Good luck with your new machine.


Hi TJ,

Thank you for that. Good idea to check other suppliers in France and I will, though there are pretty good reasons to give Amazon France the business.

Except for the CPU cooler, which will come from an Amazon trader, I can put all the bits on one order. I can specify on that order to ship only when everything is available. That means they won't take my money till then, and my 30-day "Return to Amazon" guarantee starts at the same time for all components.

Tom

Retvari Zoltan
Joined: 20 Jan 09
Posts: 2353
Credit: 16,375,531,916
RAC: 5,811,976
Message 34237 - Posted: 12 Dec 2013 | 13:37:25 UTC - in response to Message 34221.
Last modified: 12 Dec 2013 | 13:44:39 UTC

A GTX 690 has 2 processors, so it is in fact a dual card. With Windows you can only run 2 of these (4 GPUs in total). For more you need to go to Linux or something else.
With 3 cards you need to alter the BIOS first on most mobos.

You should ask Firehawk on how to put 3 GTX 690's in a single PC with Win7.

Finally, a lot has been written about single versus dual (or triple) GPU fans. I have a 770 which has two fans, so one throws heat out of the case and the other pushes it into the system. This card, however, runs GPUGRID at 63°C (or lower).
In the same system I have a 780 Ti with one fan. This one runs at 72°C because I have set the maximum temperature to 72°C with GPU Tweak. If I don't do that it easily goes to 80°C (which I don't like). So from experience, next time I will buy an EVGA 780 Ti with two fans.

Blower-type (single radial fan) GPUs are better when there is more than one GPU in a closed case, as the blower fan pushes the hot air directly out of the back of the case. The exception is the GTX 690, which has its fan in the middle of the card, so half of the heat stays in the case (unless you have some special front fans). Multi-fan (axial fan) coolers always blow the hot air inside the case (only a small amount goes out through the rear grille), so if you put several such cards in a closed case they will heat each other (the upper ones especially get hotter). I recommend not putting any GPU right next to another; at least one slot should be left empty between the cards to have adequate airflow. So you should buy a MB with 2 slots between the PCIe x16 slots (the Asus Sabertooth 990FX R2.0 is such an MB). I'm using motherboards with 4 (equally spaced) PCIe x16 slots, but I use only 2 of them, so I have 3 slots between the used PCIe slots, giving 2 slots of room for airflow between the cards.

Dagorath
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Message 34253 - Posted: 12 Dec 2013 | 17:36:21 UTC

@TJ:

That's another good example of good Dell innovation and engineering. I hope it sells well for them. I agree with your recommendation of 660TI over 660. For just a little more money you get a big increase in performance.

@tomba:

Consider saving the cost of Windows and install Linux instead. Linux will give you 11% shorter run times compared to Win 7/8. If you intend to keep your current Windows rig you could use it for web surfing and non-crunching activity. I'll guide you through the installation, setup and configuration every step of the way and I'm sure other Linux users will help too. It's not nearly as difficult as many people think. Eventually you'll get rid of Windows completely and wonder why you didn't do it sooner. You might be able to get a nice UPS for what you would spend on Windows.

@Retvari:

Thanks for sharing your considerable wisdom with 690 cards and the problems associated with different fan types. I spent many years as a welder where I built numerous different things, from metal and glass furniture to bridges. I love building custom solutions for problems that don't have a solution one can purchase off the shelf. If I had 4 X 690 on one mobo, each with a radial fan between the two GPUs I would fashion a custom duct out of thin cardboard that would fasten to the back side of each card to collect the hot air from all of them and carry it outside the case. The proper name for such a structure is "exhaust manifold" and of course every gasoline/diesel powered vehicle has at least one. A manifold could also be made out of the thin, flexible kitchen cutting boards one can find in stores. I am a fanatic about not allowing hot air to escape into the case. Collect it ASAP and send it out the case. Balsa wood is another material that is very easy to cut and fashion into small shapes. My friend makes amazing shapes out of paper that he later coats with epoxy to give it strength and rigidity.

Send me a photo and I'll tell you what to measure and how to measure it. Or send me a dimensioned drawing. I'll make you a custom manifold out of plastic, steel, aluminum, wood, leather, fiberglass, epoxy reinforced paper, anything except concrete. I hate concrete.

____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Message 34261 - Posted: 12 Dec 2013 | 22:38:49 UTC - in response to Message 34253.

@TJ:

I agree with your recommendation of 660TI over 660. For just a little more money you get a big increase in performance.

Here I read that the 660 Ti is only about 4% better than the 660:



Given that Amazon France's 660 is €165 and the Ti version is €255, for a 4% improvement I'll stick with the non-Ti version (methinks...). It's hardly a big performance increase, and it's not a little more money.
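Tom's reasoning boils down to price per unit of performance, which can be sketched as follows (using the quoted prices and the assumed ~4% performance gap):

```python
# Price per unit of relative performance, using the figures quoted above:
# GTX 660 at 165 EUR (baseline), GTX 660 Ti at 255 EUR and ~4% faster.
def cost_per_perf(price_eur, relative_perf):
    return price_eur / relative_perf

plain = cost_per_perf(165, 1.00)
ti = cost_per_perf(255, 1.04)
print(round(plain, 1), round(ti, 1))  # 165.0 245.2
```

On these numbers each unit of crunching performance costs roughly 50% more with the Ti, which supports sticking with the plain 660.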

Tom




Jim1348
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Message 34262 - Posted: 12 Dec 2013 | 22:55:07 UTC - in response to Message 34261.

Here I read that the 660 Ti is only about 4% better than the 660:

I think that was quite accurate about a year ago. But since then, the work units have gotten harder. My own tests on the GTX 660 Ti indicate that it is maybe 15% faster than the 660 now. And the power usage is slightly less, so it has close to a 20% improvement in points per day per watt. I don't think that is necessarily enough to overcome the big price advantage that the 660 enjoys (about $100 in the U.S. at the moment, maybe more), but it is worth mentioning.
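As a sketch of that points-per-day-per-watt comparison (the baseline PPD and the wattages are assumed for illustration; only the ~15% speed gap comes from the tests described above):

```python
# Assumed baseline: points/day for a GTX 660 and rough card power draws.
ppd_660 = 100_000                  # assumed baseline, arbitrary units
ppd_660ti = ppd_660 * 1.15         # ~15% faster per the tests above
watts_660, watts_660ti = 140, 134  # assumed draws; Ti slightly lower

eff_660 = ppd_660 / watts_660
eff_660ti = ppd_660ti / watts_660ti
improvement = eff_660ti / eff_660 - 1
print(round(improvement * 100, 1))  # 20.1 -> close to a 20% improvement
```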

TheFiend
Joined: 26 Aug 11
Posts: 100
Credit: 2,557,052,477
RAC: 2,225,950
Message 34267 - Posted: 13 Dec 2013 | 3:28:31 UTC

I often go looking for bargains - earlier this year I acquired 2 Crosshair IV Formula motherboards for less than the price of one more recent Crosshair V.

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Message 34270 - Posted: 13 Dec 2013 | 7:50:59 UTC

Nearly there! Funding approved by she who must be obeyed...

My biggest concern has been mounting a CPU cooler but I found this video, a step-by-step guide to installing my chosen cooler. I noted the materials used and bought them yesterday. See the blue rows in the table. I think I got a great deal on the actual cooler; €28.87 with free shipping, from a French supermarket chain - here.



Somehow an SSD crept in there but, with tongue in cheek, I can claim the total is under €1000...

A niggle is that the mobo and case must come from Amazon traders. I'm thinking about ordering them first, followed by the Amazon order when I see the whites of their eyes.

Tom

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Message 34271 - Posted: 13 Dec 2013 | 8:51:51 UTC - in response to Message 34253.

@tomba:

Consider saving the cost of Windows and install Linux instead. Linux will give you 11% shorter run times compared to Win 7/8. If you intend to keep your current Windows rig you could use it for web surfing and non-crunching activity. I'll guide you through the installation, setup and configuration every step of the way and I'm sure other Linux users will help too. It's not nearly as difficult as many people think. Eventually you'll get rid of Windows completely and wonder why you didn't do it sooner. You might be able to get a nice UPS for what you would spend on Windows.

Thanks for that, Dagorath. I like the idea of shorter run times but, this being my first build, I think it unwise to implement a new opsys now. And I do have a "free" copy of Win7 available, i.e. paid for long ago and written off.

Perhaps you could point me at a rundown on the differences between your preferred Linux and Win7?

John C MacAlister
Send message
Joined: 17 Feb 13
Posts: 181
Credit: 144,871,276
RAC: 0
Level
Cys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwat
Message 34273 - Posted: 13 Dec 2013 | 9:34:11 UTC

Lovely machine, Tomba.

The only changes I would make would be using Ubuntu and installing an FX-8350 CPU. Alas, funding authority has nixed the proposed new build for now. :(

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34275 - Posted: 13 Dec 2013 | 11:40:27 UTC - in response to Message 34273.

The only changes I would make would be using Ubuntu and installing an FX-8350 CPU.


Linux is not on, yet... If I went for the FX-8350, what difference would I see for the extra €40?

Tom

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34277 - Posted: 13 Dec 2013 | 13:18:08 UTC - in response to Message 34271.
Last modified: 13 Dec 2013 | 13:23:37 UTC

Thanks for that, Dagorath. I like the idea of shorter run times but, this being my first build, I think it unwise to implement a new opsys now. And I do have a "free" copy of Win7 available, i.e. paid for long ago and written off.

Perhaps you could point me at a rundown on the differences between your preferred Linux and Win7?


Oh, well if you already have a Windows and that's the OS you know best, then that is definitely the way to go to get it all working. You can experiment with Linux later when the time is right. And since you have a legal Windows disk you have a lot of nice options for implementing a dual-opsys system. You can go dual-boot, which allows booting either Linux or Windows. Or you can install Linux, then install VirtualBox (free) on Linux and use your Win7 disk to create a virtual Win7 machine that runs on the host Linux opsys. That way you can have both opsys running simultaneously. You can also do the opposite: install VirtualBox on Win7 and create a virtual Linux machine running on the host Win7 opsys. This latter scenario is a very convenient way to explore Linux without having to boot back and forth between Win and Lin. People who have never experienced a virtual machine are usually surprised at how well they work and how robust they are. I use them frequently.

My favorite Linux is Ubuntu, and one of the reasons (not necessarily the main reason) it is my favorite is that its lead developer, Mark Shuttleworth, is wealthy and doesn't mind spending the money required for development. Probably the biggest reason it is my favorite is that Shuttleworth has abandoned many of the ideas that have prevented more people from adopting Linux. He has made Ubuntu very easy to use, easier than Windows IMHO, while retaining all of Linux's traditional stability and open source concept.

I could point you at various rundowns of the differences between Ubuntu and Windows, but I think you would view them the same way I do. They hype either Linux or Windows and resort to niggling little details about both. For me there are only three important differences: Linux is free, more stable and more user friendly. All other differences grow out of those three top-level differences, again IMHO. Others present differences that are correct but so irrelevant that, in the end, they are always overridden by which opsys you are most familiar with, so what's the point of worrying about them?

I like Linux because it's free and there is nobody saying...

I agree Bob, your team's code would improve our opsys, good job. To keep revenues up we cannot offer that as a free update/upgrade so we'll hold it back for now and include it in the next version along with other things our loyal suckers... ooops! I mean loyal users... have been brainwashed to want, and we'll charge them ummm... errrr... hmmmm... call in the marketing people and lawyers and ask them what they think our loyal suckers... ooops! I mean loyal users... will be able to afford next time then add 25% to that. Huh? The new code hasn't been fully debugged and tested? What does that have to do with anything? If we hear any complaints about that we'll send McAfee, Norton and their crew on a free Mediterranean cruise and layout how they will tell the world all the trouble is due to... ummmm... hmmmmm.... who is most unpopular these days... Al Qaida? that nutbar in North Korea? those holier than thou Canucks? bahh it doesn't make any difference and yes I know it's Canadonians not Canucks, the point is we'll pull a name out of a hat and blame it all on the cyber-terrorists they sponsor. It works every time, even Jobs gets away with it and everybody knows what a poofter he is. Which reminds me... how is our campaign to associate turtlenecks with homosexuality and AIDS progressing? Are our loyal suckers... ooops! I mean loyal users... buying that nonsense yet or do we have to give away a truckload of cheap turtlenecks away at the upcoming Gay Pride parade and snap pictures of them with the turtlenecks on with Gay Pride banners clearly visible in the background and plant the pics right beside ads about AIDS?


You know all the above is true, extrapolate from there. You don't need my help with that.
____________
BOINC <<--- credit whores, pedants, alien hunters

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34283 - Posted: 13 Dec 2013 | 17:36:23 UTC

Hi Tom,

You made the system even nicer.
However, if you buy an SSD of 100 or 120 GB (which is enough for Win7 and a lot of programs) and use the hard disk for data, you save about €80 and could buy the faster AMD CPU or a 660Ti.
The CPU cooler comes with thermal paste, so you don't need to order that extra.
I have several anti-static wrist bands but don't use them. What I do instead is lay the MOBO on the anti-static bag it is shipped in. And if you read the manual of the CPU cooler, you will see which nuts to place in which hole for the AMD build, and then you can fit the backplate more easily than was shown in the film. You don't need to touch the MOBO so much in that case.
For cleaning the heat pipes that touch the CPU, and the CPU itself, you can use isopropyl alcohol (also known as rubbing alcohol); you can buy a bit at a local chemist or pharmacy, way cheaper than the special stuff hardware stores sell.
____________
Greetings from TJ

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34285 - Posted: 13 Dec 2013 | 19:40:42 UTC - in response to Message 34283.

My friend installed an SSD in his computer. Now that the initial Wow! factor has worn off he says he wishes he had bought a UPS instead. The main benefit he sees is that the machine boots extremely fast but if the machine is a dedicated cruncher it doesn't shutdown and reboot often. He claims the SSD makes little difference to his normal computer activities like web surfing, listening to music, word processing, and CAD (he's a draftsman). Sure, those files load and save a little faster but there's not much practical difference between 1 second and 5 seconds. If you save 4 seconds 50 times a day that's 200 seconds or less than 4 minutes which is nothing I would worry about.

A UPS can save you hours of messing around with restores and redoing work that got destroyed by a power failure before it got saved or backed up.

____________
BOINC <<--- credit whores, pedants, alien hunters

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2353
Credit: 16,375,531,916
RAC: 5,811,976
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34286 - Posted: 13 Dec 2013 | 22:36:47 UTC - in response to Message 34285.
Last modified: 13 Dec 2013 | 22:50:30 UTC

My friend installed an SSD in his computer. Now that the initial Wow! factor has worn off he says he wishes he had bought a UPS instead. The main benefit he sees is that the machine boots extremely fast but if the machine is a dedicated cruncher it doesn't shutdown and reboot often. He claims the SSD makes little difference to his normal computer activities like web surfing, listening to music, word processing, and CAD (he's a draftsman). Sure, those files load and save a little faster but there's not much practical difference between 1 second and 5 seconds. If you save 4 seconds 50 times a day that's 200 seconds or less than 4 minutes which is nothing I would worry about.

In my experience it is very important that the user interface reacts immediately and visibly to the user's actions (not just by changing the shape of the mouse pointer), or the user will click again and again and again. Having an SSD and a lot of RAM (so you can turn off virtual memory) can enhance the user experience so much that the wow factor becomes permanent. In this area mobile devices have always had the benefit of an SSD built in right from the start. (I had an old "tablet" PC with Win98 on a 2.5" HDD - it was a completely pointless device.)

EDIT: I have a Core i7-980X (6 cores, 2 threads on each core) and I used to crunch on it for rosetta@home. Their workunits read and write so much at startup, that they keep on failing when the BOINC manager is trying to start 8-9-10 of them at once (with "no heartbeat from client" error). This error could be avoided by suspending all rosetta@home workunits before shutdown, and restarting them one by one after startup, or having an SSD drive.

A UPS can save you hours of messing around with restores and redoing work that got destroyed by a power failure before it got saved or backed up.

Sure it can. A UPS is highly recommended for workstations and servers. But if you don't have one, you can turn off write caching on your drive(s) in Device Manager; this will reduce the chance of data loss in case of a power failure, and it won't degrade the performance of the SSD much. If you turn off virtual memory as well, it will reduce the chance of data loss even more.

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34287 - Posted: 13 Dec 2013 | 23:11:41 UTC - in response to Message 34285.
Last modified: 13 Dec 2013 | 23:14:45 UTC

During the last week I tested Linux vs W7, and Linux was 12.5% faster on a GTX770 (exact same system). XP is about as fast as Linux.

Having an SSD on my Windows system has been of great benefit to me over the last year or so. I hate waiting 10 minutes for a system to restart, and installing drivers (especially NVidia) is so much faster; 3 or 4 minutes compared to about 30 minutes if you include the restarts. My Linux systems still use HDDs, but boot time matters less for dedicated crunching systems and I don't reboot too often.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34288 - Posted: 14 Dec 2013 | 1:26:23 UTC - in response to Message 34261.

@TJ:

I agree with your recommendation of 660TI over 660. For just a little more money you get a big increase in performance.

Here I read that the 660TI is but 4% points better than the 660:



Given that Amazon France's 660 is €165 and the TI version is €255, for a 4% improvement I'll stick with the non-TI version (me thinks...). It's hardly a big performance increase and it's not a little more money.

Tom

That is a 4 percentage point delta when comparing cards against what was the top GPU for GPUGrid, the Titan (now overtaken by the GTX780Ti). Relative to each other, ((55/51)*100%)-100% = 8%; a reference 660Ti at reference clocks is 8% faster than a reference 660.
Most GTX660Ti's typically boost up to ~1200MHz and here are around 20% faster than a reference 660, as suggested. However, the theoretical performance difference between the GTX660 and 660Ti is 40%. The 660Ti is somewhat bandwidth-constrained compared to the GTX660; both have the same memory bandwidth, but the 660 has fewer CUDA cores to feed.
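The percentage arithmetic above is easy to check with a couple of lines of Python; the 51 and 55 relative scores are the reference-clock figures read off the comparison chart discussed in this thread.

```python
def pct_faster(a, b):
    """Percentage by which score a exceeds score b."""
    return (a / b - 1.0) * 100.0

# Relative scores for reference-clocked cards, as read off the chart:
gtx660, gtx660ti = 51.0, 55.0
print(f"660Ti vs 660: {pct_faster(gtx660ti, gtx660):.1f}% faster")  # → 7.8% faster
```

Rounded to the nearest whole percent that is the 8% quoted above; the 4-point gap on the chart itself looks smaller only because both cards are expressed as a fraction of the Titan.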

Obviously I can't make a performance table that includes each and every manufactured version of every card, so I just went by reference models. A few people did chip in with their cards boost capabilities.

NVidia GPU Card comparisons in GFLOPS peak
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34289 - Posted: 14 Dec 2013 | 4:13:23 UTC - in response to Message 34288.

The 660Ti is somewhat bandwidth constrained compared to the GTX660; both have the same bandwidth, but the 660 has less CUDA cores to feed.


I believe you're referring to GPU memory bandwidth, correct? And I believe that restriction either decreases or disappears on the GTX 670, yes?

Regarding comments extolling SSDs... don't take this as mindless Windows bashing or a Linux plug, but I had no idea it takes 30 minutes and several reboots to install or upgrade an NVIDIA driver on Windows. On Linux an NVIDIA driver installs in about 7 minutes for me. The way I do it requires a reboot, but only because that's the only way I can find to shut down and restart X. If the preferred method worked for me I wouldn't have to reboot at all, and installs and updates would take about 5 minutes. OK, that's a compelling reason for a Windows user to have an SSD.

So I'll expand on that a little, because tomba asked for a list of differences between Lin and Win. I'm not sure of the technical reason, but driver installation in general, not just NVIDIA drivers, as well as opsys updates, is generally much faster and easier on Lin than on Win. Anybody who has used both opsys' can verify that. Devoted Win proponents counter that Win has drivers for more devices than Lin, and that is true, but not nearly as true as it was 10 years ago. As more device manufacturers have come to realize that complying with open standards has a positive influence on their revenue, Linux devs have followed up with drivers for those devices. Frequently you bring home a new camera, printer or whatever, connect it to the USB port, and 10 seconds later it's working; no need to put a disc in the DVD drive, download something, or even reboot. Now that is the way Bill Gates intended Plug 'n Play to work, I think. And that's one of the reasons I feel my claim that Linux is more user friendly is not just fanboy talk; it's real.

____________
BOINC <<--- credit whores, pedants, alien hunters

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34291 - Posted: 14 Dec 2013 | 8:39:54 UTC - in response to Message 34289.
Last modified: 14 Dec 2013 | 8:40:11 UTC

I like to read all these Linux things, as I still have issues with my Linux, or better said, with Linux and me.
But it would be better to make a new thread for that, as this one is more about building a new PC.
____________
Greetings from TJ

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34293 - Posted: 14 Dec 2013 | 11:34:54 UTC - in response to Message 34291.
Last modified: 14 Dec 2013 | 11:39:24 UTC

If we had a nice Linux forum area, all things Linux could be shifted to there... Alas we don't so all things Linux are strewn across all threads in all forums...

I now find setting Linux up faster than setting Windows up.
My only gripe is the lack of tools/apps to control fan speed (especially on multi-GPU setups), but it can be done and there are plenty of work-rounds.
I find that Linux sees some devices that Windows struggles with, especially LAN cards and the like. The chip-set drivers are just there, whereas setting up Windows is often a case of restarting about 7 times.

I expect I will eventually move to Linux for every system and just run W7 in a VM; that way I can crunch without the 12.5% performance hit, and it's also much easier to back up and restore VM images.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34295 - Posted: 14 Dec 2013 | 12:00:21 UTC

Again, thanks everyone for your input.

I listened. The SSD is gone and the FX-8350 is in. Also in are a hard disk (500GB is more than enough) and a DVD/CD RW (Blu-ray is of no interest to me).

The items in blue are on order.



That leaves the question of GTX 660 vs. 660TI. Price difference is €116. But what's the performance difference in the real, GPUGrid, world?

I keep a log of WUs. This past month I've done 27 NATHAN_KIDKIXc22 WUs. The CPU run times are remarkably consistent, averaging 43298 seconds; 12.027 hours. That's on my 2.67GHz i7 and mildly-OCed ASUS GTX 660 (GPU: 1097MHz, Shader: 2194MHz, Memory: 3004 MHz).
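The conversion quoted above is simple arithmetic; a quick Python sanity check of the figures:

```python
runtime_s = 43298          # average NATHAN_KIDKIXc22 run time reported above
hours = runtime_s / 3600
print(f"{hours:.3f} hours")        # → 12.027 hours

wus_per_day = 24 / hours           # upper bound if the card crunches non-stop
print(f"about {wus_per_day:.1f} WUs per day")  # → about 2.0 WUs per day
```

So the comparison to make against a 660Ti is how far below those 12 hours its times come in.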

If those of you that run the 660TI could post how long your rig takes to process one of those Nathans, that would be most helpful (me thinks...). Thanks!

Tom

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34296 - Posted: 14 Dec 2013 | 12:10:42 UTC - in response to Message 34291.

I like to read all these Linux things as I still have issues with my Linux or even better Linux and I.
But it would be better to make a new thread for that as this is more about building a new PC.


You are correct and I have tried to restrain myself from going too far off-topic. My apologies if I have done so but I got the feeling tomba wanted to hear a little about Linux. Again, sorry if I've gone too far.

@skgiven

An area for Linux would be nice. As for an app to control fan speed on Linux, try my gpu_d script. If there is sufficient interest I'll improve it to handle all the GPUs in a single PC and, if the backend nvidia-settings binary supports LAN, I'll make the script support that too. Or, if you would be more interested in re-flashing to fix the self-serving temperature vs. fan speed curve some manufacturers seem to build into their cards, I can give links to what I've read about that. The only reason I haven't posted those links so far is that they're scattered all through my browser's bookmarks and it would take some time to round them up and put them in a post here. I'll do it if anybody indicates they would find them useful; otherwise I won't waste the time. Also, I thought most everybody here already knew about re-flashing their GPU BIOS.
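For anyone curious how such a script drives the fans, here is a minimal sketch (not gpu_d itself, which isn't posted in this thread). It assumes the nvidia-settings attribute names GPUFanControlState and GPUTargetFanSpeed, which apply to some driver generations with Coolbits enabled; others use different names, so verify against `nvidia-settings -q all` on your own driver.

```python
def fan_command(gpu_index, fan_index, speed_pct):
    """Build the argv for forcing a fan speed via nvidia-settings.
    Attribute names are assumptions; check your driver's supported attributes."""
    return [
        "nvidia-settings",
        "-a", f"[gpu:{gpu_index}]/GPUFanControlState=1",   # enable manual control
        "-a", f"[fan:{fan_index}]/GPUTargetFanSpeed={speed_pct}",
    ]

# A real script would loop: read the GPU temperature, pick a speed from a
# curve, then run subprocess.run(fan_command(...)) for each GPU in the box.
print(fan_command(0, 0, 60))
```

The point of building the argv in one place is that extending it to every GPU in a multi-card box, as discussed above, is then just a loop over GPU indices.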

____________
BOINC <<--- credit whores, pedants, alien hunters

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34298 - Posted: 14 Dec 2013 | 13:05:27 UTC - in response to Message 34296.

You are correct and I have tried to restrain myself from going too far off-topic. My apologies if I have done so but I got the feeling tomba wanted to hear a little about Linux. Again, sorry if I've gone too far.


No, no, you are not going too far. But after a while it will be difficult to find all the bits and pieces of information. With more "dedicated" threads that will be easier.
____________
Greetings from TJ

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34299 - Posted: 14 Dec 2013 | 13:15:17 UTC

Hi Tom,

The AMD CPU will come with a stock cooler. It is very easy to install and cools well. My system does 4 Rosettas and 2 GPUGRID tasks on two 660's and runs at 48-54°C with the stock cooler. It can make a lot of noise, though, if the room temperature gets high, and during summer it gets even higher in France than in the Netherlands; but you can try it first. The thermal paste is already applied to that cooler.

I would install an SSD though, of 100 or 120 GB. Insight (a European company) now has a 40% discount on a Kingston SSD. If you can find that company in France too, you could have one for around 50 Euros.
It will be such a joy to install drivers and boot, upgrade BOINC and boot, de-install a driver, boot, and use another set of drivers. And don't forget that you have to reboot Windows many times due to security updates and such, or because a WU has down-clocked the GPU, a thing that often occurs on my 660. The investment is worthwhile, just as Zoltan explained.

Good luck with you choices.
____________
Greetings from TJ

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34306 - Posted: 14 Dec 2013 | 15:01:39 UTC - in response to Message 34299.

Hi TJ,

The AMD CPU will come with a stock cooler. It is very easy to install and cools well. My system does 4 Rosettas and 2 GPUGRID tasks on two 660's and runs at 48-54°C with the stock cooler. It can make a lot of noise, though, if the room temperature gets high, and during summer it gets even higher in France than in the Netherlands; but you can try it first. The thermal paste is already applied to that cooler.

I think I'm committed to the already-purchased cooler but, if there's a decent cooler in the mobo box, I might try installing it in my i7 Dell, which makes too much CPU fan noise for me to run any Rosettas.

I would install an SSD though, of 100 or 120 GB. Insight (a European company) now has a 40% discount on a Kingston SSD. If you can find that company in France too, you could have one for around 50 Euros.
It will be such a joy to install drivers and boot, upgrade BOINC and boot, de-install a driver, boot, and use another set of drivers. And don't forget that you have to reboot Windows many times due to security updates and such, or because a WU has down-clocked the GPU, a thing that often occurs on my 660. The investment is worthwhile, just as Zoltan explained.

I'm convinced. I looked at Insight France but there was no deal like the one you found locally. So I went for a 120GB SSD, and cradle, from Amazon France.

Tom

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34308 - Posted: 14 Dec 2013 | 15:22:54 UTC - in response to Message 34296.

I got the feeling tomba wanted to hear a little about Linux.

Absolutely right, Dagorath! In fact, I decided to have a go at Linux on my Win7 PC while waiting for my PC build pieces to come together.

I installed and ran Virtualbox, which soon wanted a Ubuntu CD.

I downloaded the 12.04.3 ISO and inserted a new 700MB RW CD for the burn. "Not enough space on the CD"; the ISO is 724,992KB.

This is no place to solve my problem but can you point me at a list I can cry on? Thanks.

Tom

Profile Damaraland
Send message
Joined: 7 Nov 09
Posts: 152
Credit: 16,181,924
RAC: 0
Level
Pro
Scientific publications
watwatwatwatwatwatwatwatwat
Message 34310 - Posted: 14 Dec 2013 | 18:19:15 UTC - in response to Message 34308.

I downloaded the 12.04.3 ISO and inserted a new 700MB RW CD for the burn. "Not enough space on the CD"; the ISO is 724,992KB.

I had that problem too; I downloaded an older version and updated.
Another option is to put it on a USB stick.
There are plenty of forums talking about this (I find it very silly that they did this). Search for "ubuntu size 700MB" and you'll find more info.

captainjack
Send message
Joined: 9 May 13
Posts: 171
Credit: 4,011,060,309
RAC: 16,248,628
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34311 - Posted: 14 Dec 2013 | 18:51:59 UTC

Tomba,

When you create a new machine in Virtualbox, it will step you through the parameters to create a new virtual machine. When it gets to the step where it asks for the install *.iso file, you should be able to point it to the file location on your hard drive.

Hope that helps.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34312 - Posted: 14 Dec 2013 | 19:47:04 UTC - in response to Message 34311.

When you create a new machine in Virtualbox, it will step you through the parameters to create a new virtual machine. When it gets to the step where it asks for the install *.iso file, you should be able to point it to the file location on your hard drive.

Thanks! Did that on my 64-bit Windows 7 PC. Got this:



Not easy!!

Profile Damaraland
Send message
Joined: 7 Nov 09
Posts: 152
Credit: 16,181,924
RAC: 0
Level
Pro
Scientific publications
watwatwatwatwatwatwatwatwat
Message 34313 - Posted: 14 Dec 2013 | 20:01:35 UTC - in response to Message 34312.

I didn't pay much attention to this, but I would advise you to look into issues with GPUs in VirtualBox. I'm not sure it will go smoothly.
http://www.youtube.com/watch?v=69bRWxg2_QE

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34315 - Posted: 14 Dec 2013 | 20:51:19 UTC - in response to Message 34306.

Hi TJ,

The AMD CPU will come with a stock cooler. It is very easy to install and cools well. My system does 4 Rosettas and 2 GPUGRID tasks on two 660's and runs at 48-54°C with the stock cooler. It can make a lot of noise, though, if the room temperature gets high, and during summer it gets even higher in France than in the Netherlands; but you can try it first. The thermal paste is already applied to that cooler.

I think I'm committed to the already-purchased cooler but, if there's a decent cooler in the mobo box, I might try installing it in my i7 Dell, which makes too much CPU fan noise for me to run any Rosettas.

Tom

I'm afraid that is not going to work. The cooler delivered with the AMD CPU is specifically for AMD, and a fitting with backplate is already mounted on the MOBO for it. Just push a handle and it locks tight. You can try it first and, if you like its performance, put your new cooler on the i7.
____________
Greetings from TJ

captainjack
Send message
Joined: 9 May 13
Posts: 171
Credit: 4,011,060,309
RAC: 16,248,628
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34318 - Posted: 14 Dec 2013 | 23:00:16 UTC

Tomba,

I just downloaded a fresh copy of Ubuntu 13.10, created a new VM, and told it to install from my hard drive. It seems to be working just fine (no error message like yours). The only thing I know to tell you is to check and make sure that virtualization is enabled in your BIOS. Many computers come with virtualization disabled and you have to enable it in the BIOS.

BTW, when you choose to burn the Ubuntu *.ISO file to an optical disk, you can burn it to a DVD and it will fit just fine.

Let us know how it turns out.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34320 - Posted: 15 Dec 2013 | 7:56:04 UTC - in response to Message 34318.

The only thing I know to tell you is to check and make sure that virtualization is enabled in your BIOS.

No mention of virtualisation in my BIOS. Did a BIOS update. Nothing changed...

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34325 - Posted: 15 Dec 2013 | 10:45:05 UTC - in response to Message 34310.

I downloaded the 12.04.3 ISO and inserted a new 700MB RW CD for the burn. "Not enough space on the CD"; the ISO is 724,992KB.

I had that problem too, downloaded a older version and updated.
Another option is to put it into a USB stick.
There's plety of forums talking about this (I find it very silly, that they did this). Search "ubuntu size 700MB" and you find more info.



They didn't do anything silly. It just looks silly because so many newbies are unfamiliar with the terms they use and get confused. I will attempt to explain.

You can download 2 different kinds of ISO files from which to install Ubuntu:

  1. The plain vanilla Ubuntu installer ISO fits on a CD. It boots, asks you some questions about configuration options and installs Ubuntu. That's all it does. To run Ubuntu you must complete the installation and then reboot; this is the way a Windows install disk works. If you are already sure you want to install to HDD then just download the plain vanilla installer.

  2. The Ubuntu Live ISO does not fit on a CD because it has extra code the plain vanilla installer does not have. That extra code gives a Live DVD the ability to boot directly into a fully operational Ubuntu (something the plain vanilla ISO does not do). That allows you to test-drive Ubuntu before installing it on your HDD. If you decide to install Ubuntu to your HDD after the test drive, you don't need to download the plain vanilla ISO, because the Live DVD can also install Ubuntu.



The next big surprise tomba is going to run into is that there is no C: to be found anywhere, and no D: either. Don't worry, tomba, Linux has not destroyed your disks; they still work.

Once you get a virtual Ubuntu created, the software updater will run and tell you there are a hundred or more updates. The first update is equivalent to a service pack in Windows, but we don't call them service packs. The second update may be fairly big too, but the third will be small. Do the updates before you start exploring and playing. After the updates are done you should shut down the VM, open the VBox manager and clone the first VM. Give the cloned VM a name that has the word "virgin" in it, and never use the virgin. That way, if you destroy the other VM by doing something dumb, you just tell the VBox manager to delete the destroyed one and clone a new one from the virgin. That gives you a fresh, already-updated Ubuntu to start over with. Actually, I like to do the updates, then install BOINC, and then clone the virgin.

____________
BOINC <<--- credit whores, pedants, alien hunters

captainjack
Send message
Joined: 9 May 13
Posts: 171
Credit: 4,011,060,309
RAC: 16,248,628
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34329 - Posted: 15 Dec 2013 | 15:08:58 UTC

Tomba,

Are we still talking about your Dell XPS 435? It's possible that your motherboard/BIOS/CPU/chipset combination does not support virtualization. I did some searches on the Internet and found a post saying Dell sometimes blocks virtualization in the BIOS on their desktop computers. I also found a thread saying some CPUs lack the hardware virtualization capabilities required to run a virtual machine.

You can do a search on the Internet with your exact configuration and see if it is supposed to support virtualization.

Hope that helps.

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34332 - Posted: 15 Dec 2013 | 18:01:26 UTC - in response to Message 34329.

Regarding necessity for hardware virtualization, see Hardware vs. software virtualization from the official VirtualBox documentation. To summarize, you don't need hardware virtualization if the guest OS is 32 bit. If the guest OS is 64 bit then your CPU must support hardware virtualization.

It's true, Dell has in the past disabled hardware virtualization on some models and made it impossible to turn on. If you look at BOINC's Event Log, about 20 lines from the top BOINC lists the hardware extensions your CPU supports. For Intel CPUs it will show VT-x. For AMD CPUs it will show AMD-V. The catch is that the list of extensions can be very long and may not fit on the screen you are using. In that case BOINC does not line-wrap the list; it just chops it short. The bottom line is that if you see VT-x or AMD-V in the list then you can be sure your CPU has that feature. It might be disabled in BIOS, but at least it is there and you might be able to turn it on in BIOS. If you do not see VT-x or AMD-V in BOINC's list, it might be that BOINC has simply chopped off the portion of the list where the feature appears. In that case you need to find stdoutdae.txt in the BOINC data directory, open it with a text editor and search for VT-x (if you have an Intel CPU) or AMD-V (if you have an AMD CPU). If you find it then your CPU supports it but, again, it might not be enabled. If you don't find it then your CPU doesn't support it, and in that case you cannot install a 64 bit guest OS, only a 32 bit guest OS. (The guest OS is the OS you install in the VM.)
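If you'd rather not eyeball the log by hand, a small Python sketch can scan the text of stdoutdae.txt for those two flags (the log path varies by OS, and the sample line below is illustrative, not a real log excerpt):

```python
import re

def cpu_virt_extensions(log_text):
    """Return the set of hardware-virtualization flags ('VT-x',
    'AMD-V') that appear in the given BOINC event-log text."""
    return set(re.findall(r"VT-x|AMD-V", log_text))

# Illustrative, abbreviated "Processor features" line:
sample = "Processor features: fpu vme est tm2 vmx VT-x pni"
print(cpu_virt_extensions(sample))
```

An empty result means the flag genuinely isn't in the file, so the CPU (or BIOS) doesn't expose hardware virtualization.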

I could be wrong but I think tomba's problem was that he tried to save the VM image to the Ubuntu install DVD instead of HDD.


____________
BOINC <<--- credit whores, pedants, alien hunters

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34333 - Posted: 15 Dec 2013 | 21:10:28 UTC

Wow, this thread surely developed quickly! I hardly managed to catch up with new posts over the week, let alone post myself. I suppose you haven't ordered yet, Tomba? Anyway, here are my few cents:

GPU: back when the GTX660Ti wasn't much more expensive than a GTX660 it was the obvious choice for crunchers. But now the markup on those cards is far too large. You should go straight for a GTX770 (280€) if you want more than a GTX660; anything in between is too expensive for the performance.

Memory: I agree with 2x4 GB, unless you know you need more (you don't). What's the clock speed and timing of your kit?

DVD: you don't need this if it's going to be a dedicated cruncher (install from USB and later on use another drive over the network).

If it's going to be your new main rig I strongly suggest going for the SSD. The "wow" factor may disappear after a short time, but once you go back to an HDD-equipped machine again it becomes totally unbearable and you wonder how this has ever been "OK".

SSD: I recommend the Samsung 840 Evo 120 GB at 85€. It's got one of the best controllers and firmware for a very good price and enough capacity for most of your stuff.

HDD: you should be able to get 1 TB for ~50€. My experience with HDD space has been "don't skimp on it if it just costs a few € more; it will come in handy at some point".

Fan: excellent choice!

Heatsink: the stock cooler doesn't cut it for 24/7 crunching if you're concerned with noise.

Number of GPUs: I won't put more than 1 high performance card into my main rig, since I couldn't cool that much heat silently with air. Two cards is still a manageable setup: you can have enough space between the cards, and the PCIe connections are not really bottlenecking yet (I'm not only considering the current GPU-Grid app; it's better to err on the safe side here). At 3 cards things become challenging: you need a significantly stronger PSU and you're almost forced into blower-style coolers, which are powerful but inherently loud. Also the PCIe configurations, and on AMD the link between northbridge and CPU, become important to consider.

I'm asking because if you build for 2 GPUs you can choose a smaller PSU (600 - 700 W 80+ Gold should be ideal) and the mainboard requirements are reduced, which could again save a bit. Also consider the running costs if you're shooting for 500+ W of load with that 1 kW PSU. At 20+ cent/kWh the rule of thumb is "each W drawn 24/7 costs you 2€/year". Don't build too large of a system and be surprised by the cost later!
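That rule of thumb is easy to verify with a line of arithmetic; a minimal sketch (the electricity price is an assumption you'd swap for your own tariff):

```python
def annual_cost_eur(watts, eur_per_kwh=0.23):
    """Cost of drawing `watts` continuously for one year
    (24 h * 365 d = 8760 h), at the given price per kWh."""
    return watts * 24 * 365 / 1000 * eur_per_kwh

# One watt drawn 24/7 at ~23 cent/kWh is about 2 EUR/year,
# so a 500 W cruncher costs on the order of 1000 EUR/year.
print(round(annual_cost_eur(1), 2))   # ~2.01
print(round(annual_cost_eur(500)))    # ~1007
```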

CPU: if you're actively working with the system and plan to keep it for a longer time (maybe passing it along in the family once it's not fit for crunching any more) I'd go for the slightly faster CPU as well, although the difference is not yet in the "clearly noticeable" range.

I would also favor Intel due to a few factors:
- far better power efficiency, unless you find a project where AMD excels - but with Intel you're still more flexible. We're talking about >50 W difference here, i.e. 100€/year higher running costs for the AMD system (unless your electricity is significantly cheaper)!

- even the idle power consumption is better than for AM3+ mainboards, which matters if the system is used for general web/office after its active crunching career (again saving money, just not as much as before)

- free PCIe 3, with even basic boards providing 2 x 8x lanes for 2 GPUs -> performance like 2 x 16x PCIe 2, whereas AMD will typically provide at best 1 x 16 and 1 x 8 PCIe 2 on more expensive mainboards

- "free" credits from the Haswell GPU: ~10k RAC at Einstein or ~50k at Collatz (additional power draw 20 - 30 W depending on settings and tuning)

- performance for general usage with an i5 is about equal to the top AMD - so the AMD doesn't have an initial price advantage either

If you still want to go with AMD I'd at least eco-tune the setup: AMD pushes their top chips quite hard to reach high frequencies. If you disable turbo you gain some power efficiency, and if you additionally reduce the stock voltage (a bit is always possible) you can reduce the power consumption difference compared to Intel significantly. The 8320 could go lower due to lower stock speeds, but you could underclock & undervolt the bigger CPU as well if you wanted to.
Of course you could do so with an Intel as well, but here most choose to go for higher clocks since the starting point is already comparably efficient. For my 3770K I'm choosing both: OC to 4.1 GHz at reduced voltage of 1.03 V. The AMD 8350 will need massive 1.35 - 1.4 V to reach its top turbo bin of 4.2 GHz...

MrS
____________
Scanning for our furry friends since Jan 2002

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34343 - Posted: 16 Dec 2013 | 10:51:26 UTC - in response to Message 34333.

Thanks for chipping in MrS, and a special thanks for bringing this thread back on topic!

I suppose you haven't ordered yet, Tomba?

Yes I have! Here 'tis:



GPU: back when the GTX660Ti wasn't much more expensive than a GTX660 it was the obvious choice for crunchers. But now the markup on those cards is far too large. You should go straight for a GTX770 (280€) if you want more than a GTX660; anything in between is too expensive for the performance.

Now there's a thought... I just emailed this link to my son with a gentle hint that Xmas is just around the corner :). If that works, I have the potential of a 770 and 2x660 for the new rig, with my spare 460 going into the current one. Perhaps that's OTT?

What's the clock speed and timing of your kit?

I have no idea :(

DVD: you don't need this if it's going to be a dedicated cruncher (install from USB and later on use another drive over the network).

No. It's not to be dedicated. It replaces the old one, which goes to my daughter.

SSD: I recommend the Samsung 840 Evo 120 GB at 85€.

That's the one coming!

Number of GPUs: I won't put more than 1 high performance card into my main rig, since I couldn't cool that much heat silently with air. Two cards is still a manageable setup: you can have enough space between the cards, and the PCIe connections are not really bottlenecking yet (I'm not only considering the current GPU-Grid app; it's better to err on the safe side here). At 3 cards things become challenging: you need a significantly stronger PSU and you're almost forced into blower-style coolers, which are powerful but inherently loud. Also the PCIe configurations, and on AMD the link between northbridge and CPU, become important to consider.

Hmmm. Food for thought there, especially if my 'cunning plan', aka Baldrick, for a 770 comes off...

100€/year higher running costs for the AMD system (unless your electricity is significantly cheaper)!

The electric tariff I'm on gives me 22 days a year when the price is 10x the normal tariff. No crunching on those days! However, for the remaining 343 days I pay half the normal tariff, so electric usage is not a problem.

Many thanks again for a fascinating rundown on crunching considerations. I'm grateful.

Tom

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34344 - Posted: 16 Dec 2013 | 11:05:25 UTC

Re my plan to use the third of the three WIN 7 licences I bought ages ago, I just noticed it's the "Update" version.

Does this mean I must first install XP or Vista on the SSD, and then install WIN 7?

Will the WIN7 install complain that the opsys it finds on the SSD is unverified, or should I buy a full XP or Vista CD?

(No - I don't want to start with a Linux install...)

Tom

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34345 - Posted: 16 Dec 2013 | 12:09:00 UTC - in response to Message 34344.
Last modified: 16 Dec 2013 | 12:19:28 UTC

Re my plan to use the third of the three WIN 7 licences I bought ages ago, I just noticed it's the "Update" version.

Call off the dogs! Just ordered WIN7 Professional SP1 with COA on eBay UK for £37. Excellent feedback.

Tom

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34351 - Posted: 16 Dec 2013 | 21:45:43 UTC

Hi Tomba, have fun with your new system :)

Although I did write a lot (which might help others here and there as well) and you can easily tell I would not have suggested an AMD.. the system doesn't look that different from what I was suggesting - very nice! Let us know how it works out. And if you have the time and nerve: experiment with energy saving on that AMD ;)

BTW: regarding that update version, I think you only need to provide a valid key for a prior version. Not entirely sure, though. When upgrading from XP the disk is completely wiped anyway. Give it a try when you need another license!

MrS
____________
Scanning for our furry friends since Jan 2002

TheFiend
Send message
Joined: 26 Aug 11
Posts: 100
Credit: 2,557,052,477
RAC: 2,225,950
Level
Phe
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34359 - Posted: 17 Dec 2013 | 19:00:18 UTC - in response to Message 34345.

Re my plan to use the third of the three WIN 7 licences I bought ages ago, I just noticed it's the "Update" version.

Call off the dogs! Just ordered WIN7 Professional SP1 with COA on eBay UK for £37. Excellent feedback.

Tom


I also recently bought Win7 Pro on eBay, got it for £35 with COA. When it was delivered it turned out to be a Dell disc that came with an unused COA. I didn't use the disc but the COA worked, although I did have to use phone activation to verify it.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34373 - Posted: 18 Dec 2013 | 14:23:54 UTC



Well, everything is shipped, and some bits have been delivered. I'm getting excited... :)

In my current rig I switched the old GTX 660 for the new GTX 660 OC, to make sure it was not DOA. It's totally silent, and it's faster: GPU from 1097 to 1149, Shader from 2194 to 2299. Memory is the same at 3004.

I await calls from more delivery men.

Tom

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34390 - Posted: 19 Dec 2013 | 9:42:02 UTC

I don't have a cc_config.xml file, which I shall need for multiple GPUs.

I search these fora for cc_config.xml but no post told me its destination folder.

Where does it go?

Tom

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34392 - Posted: 19 Dec 2013 | 10:32:39 UTC - in response to Message 34390.

The cc.config goes into the BOINC Data folder.
____________
Greetings from TJ

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34396 - Posted: 19 Dec 2013 | 15:18:49 UTC

The case arrived this morning so I made a start.

I don't have the earthing strap yet so I passed on the mobo. However, I installed the SSD and a disk drive. Easy. Disk drives go in screwless. Two bays have cradles for SSDs, so my on-order cradle is redundant.

The DVD was a bit of a challenge. There are five bays and the five front grills pop off easily. But behind them are five metal plates with no visible means of removing them. A quick Google advised breaking them off, which I did. Then the DVD drive went in easily, and screwless.

I am more than a little worried about the myriad of cables coming from the front control panel. All gauges, all connectors, all colours. The case's Mickey Mouse guide shows a few of the key ones but that's it. No definitions of which colours are what :(

Tom

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34398 - Posted: 19 Dec 2013 | 18:46:10 UTC - in response to Message 34392.

The cc.config goes into the BOINC Data folder.

Thanks TJ. Noted. Tom

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34399 - Posted: 19 Dec 2013 | 19:13:40 UTC - in response to Message 34396.

The connectors on the ends of the cables usually have names such as USB, HDD LED, PWR, RESET, etc. stenciled on them. Your mobo's manual will tell you where each connector goes. The connectors are usually keyed, which means they can fit only one way. Where there is any doubt about polarity, the mobo manual will probably show + and - on a diagram. Red is +, black is -, white can also be + and green is sometimes -.
____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34421 - Posted: 21 Dec 2013 | 19:17:04 UTC - in response to Message 34399.

The connectors on the ends of the cables usually have names such as USB, HDD LED, PWR, RESET, etc. stenciled on them. Your mobo's manual will tell you where each connector goes. The connectors are usually keyed, which means they can fit only one way. Where there is any doubt about polarity, the mobo manual will probably show + and - on a diagram. Red is +, black is -, white can also be + and green is sometimes -.

Thanks for that, Dagorath. You were right. The mobo manual told me exactly what to do with all those pretty cables!

The earth strap arrived at 11:00 this morning so I continued the build. Google was a great help in sorting out information missing from some of the Mickey Mouse instructions. Only the mobo instructions were comprehensive. Thanks, ASUS!

I chickened out on the Hyper 212 Evo CPU cooler and installed the stock cooler that came with the CPU. Time will tell if that was a bad decision.

The build is done. Five hours, allowing for a quick lunch and my standard two-hour nap! I decided to leave the christening till the morning, after a thorough check that everything's in the right place. Fingers crossed...

Tom




Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34422 - Posted: 21 Dec 2013 | 20:10:43 UTC - in response to Message 34421.
Last modified: 21 Dec 2013 | 20:14:51 UTC

You're welcome.

I bet the stock CPU cooler will be adequate when the rig crunches GPUgrid tasks, because GPUgrid tasks don't work the CPU extremely hard; they work the GPU hard instead. For that reason I would watch the GPU temperature very carefully.

The stock/default GPU cooling mode for my 660ti is Automatic and it sucks, because it allows the temperature to hover between 80C and 84C, which is far too high. I have to put the fan control in manual mode and set it to a high RPM manually in order to keep the temp no higher than 65C. Actually I have a script that takes care of that for me, but the point is that you might want to have your GPU temperature monitoring software installed and running before you start the first GPU task, and be prepared to either suspend the task if the temperature soars too high, or set up manual fan control, or whatever method your software allows.
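The script itself isn't posted here, but the core of any manual fan curve is just a temperature-to-speed mapping. A hedged Python sketch of that idea (the thresholds are illustrative, picked to keep the card near 65C; you would feed the result to whatever fan-control tool your driver offers):

```python
def fan_speed_percent(gpu_temp_c):
    """Map a GPU temperature (deg C) to a target fan speed (%).
    Thresholds are illustrative, not from any real fan curve."""
    if gpu_temp_c >= 80:
        return 100   # far too hot: maximum cooling
    if gpu_temp_c >= 70:
        return 85
    if gpu_temp_c >= 60:
        return 70
    return 50        # cool enough: keep the noise down

print(fan_speed_percent(84))  # 100
print(fan_speed_percent(63))  # 70
```

A real script would poll the temperature every few seconds and apply the returned speed through the vendor tool.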
____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34423 - Posted: 21 Dec 2013 | 20:42:15 UTC - in response to Message 34422.

I would watch the GPU temperature very carefully.

you might want to have your GPU temperature monitoring software installed and running before you start the first GPU task and be prepared to either suspend the task if the temperature soars too high or setup manual fan control or whatever method your software allows.

Another good tip. Thanks!

Currently copying my GPU temperature monitoring software onto a USB stick.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34427 - Posted: 22 Dec 2013 | 10:14:38 UTC - in response to Message 34421.

I decided to leave the christening till the morning, after a thorough check that everything's in the right place. Fingers crossed...

With some trepidation I powered up this morning. Post screen, followed by BIOS screen. Checked first boot was the CD, popped in the Win 7 CD and it booted.

Win 7 installed and now installing a humongous number of Windows updates.

I'm amazed! It works!! :)

Tom

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34428 - Posted: 22 Dec 2013 | 11:32:07 UTC - in response to Message 34422.

I would watch the GPU temperature very carefully.

Both PCs are running Santi WUs on GTX 660.

-------------Temp C------Fan RPM
Old------------65---------2130
New-----------60---------1470

Looking good for New, methinks...




Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34431 - Posted: 22 Dec 2013 | 12:03:25 UTC - in response to Message 34428.
Last modified: 22 Dec 2013 | 12:04:41 UTC

It looks very good for New. Congratulations! Your first build appears to be a success :-)

Now, in your opinion:

1) was it a difficult process?
2) did you save enough money to justify the effort?
3) would you build your own again in the future and would you recommend it to others?
____________
BOINC <<--- credit whores, pedants, alien hunters

John C MacAlister
Send message
Joined: 17 Feb 13
Posts: 181
Credit: 144,871,276
RAC: 0
Level
Cys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwat
Message 34432 - Posted: 22 Dec 2013 | 13:21:54 UTC

And congratulations from me, too: I may even try such a project in 2014....

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34434 - Posted: 22 Dec 2013 | 17:15:50 UTC - in response to Message 34431.

The new beast is now running GPUGrid with both GTX 660s, so I retired the old PC with its GTX 460. I had planned to include the 460 in the new rig but I had forgotten it too is double width. There's no room in this case for three double-width GPUs.


1) was it a difficult process?
2) did you save enough money to justify the effort?
3) would you build your own again in the future and would you recommend it to others?


1) Stepping into the unknown is always intimidating so I took my time, double checking as I went along and using Web searches for clarification when in doubt. No. It was not a difficult process even for this old guy!

2) Oh yes. Up to now I had always bought Dell and my opening bid was to check their top-of-the-line desktop; €2000. I spent a tad over half that for better function and expandability. She who holds the purse strings is delighted :)

3) Definite 'Yes' on both counts. If I can do it, anyone can.

Tom

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34436 - Posted: 22 Dec 2013 | 18:35:14 UTC - in response to Message 34434.

Congratulations Tom!

You now have the same rig as I have, with two 660's on a Sabertooth MOBO and an FX8350. I also still use the stock CPU cooler. However, it will become noisy if the ambient temperature rises and 6 cores are doing Rosetta.

It is indeed nice to build your own rigs; you can put in whatever you like.
What you can also do is watch for offers at PC shops and buy parts as they come up; then you can build a second rig over time. It took me 5 months to collect all I needed for a new rig, including a WD Black 1TB HD for €40. That's more than half off the normal price.

One thing though, and a lot of people will not agree, but Dell builds very good silent cooling systems.
____________
Greetings from TJ

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34439 - Posted: 23 Dec 2013 | 10:59:32 UTC

Had a fright last night. Got home to find the new PC off!

Found that the default power setting in Windows 7 is to sleep after 30 minutes.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34444 - Posted: 23 Dec 2013 | 16:47:02 UTC

BOINC Notices tells me:



My cc_config.xml file contains:

<cc_config>
<use_all_gpus>1</use_all_gpus>
</cc_config>

I'm happily using two GPUs. Perhaps this parameter is now redundant?

Tom

Richard Haselgrove
Send message
Joined: 11 Jul 09
Posts: 1626
Credit: 9,295,466,723
RAC: 18,414,230
Level
Tyr
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34446 - Posted: 23 Dec 2013 | 17:04:10 UTC - in response to Message 34444.

Always check with the documentation - in this case, client configuration.

I think your tag would be valid if you put it inside an <options> block.

By default, BOINC will use the 'better' of two dis-similar GPUs. Since your two GPUs are the same, there is no 'better' or 'worse'. BOINC will use both of them without complaint.
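If I'm right about the <options> block, the whole file would look like this (see the client configuration page for the full list of tags):

```xml
<cc_config>
   <options>
      <use_all_gpus>1</use_all_gpus>
   </options>
</cc_config>
```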

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34449 - Posted: 23 Dec 2013 | 18:59:09 UTC - in response to Message 34446.

I think your tag would be valid if you put it inside an <options> block.

That fixed it. Thank you! Tom

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34462 - Posted: 24 Dec 2013 | 15:23:31 UTC - in response to Message 34446.

By default, BOINC will use the 'better' of two dis-similar GPUs.

The Boinc manual merely says,

    <use_all_gpus>0|1</use_all_gpus>
    If 1, use all GPUs (otherwise only the most capable ones are used).


That isn't enough information and is vague.
While not applicable to this situation, it would be useful to clarify what you mean by dis-similar GPUs (Fermi vs Kepler, or AMD vs NVidia vs Intel), as that's not mentioned in the manual.

If you just have NVidia cards then Boinc reads what's in the NVIDIA library, and then proceeds to report the 'lesser' card(s). Also, if it reports 2 GPUs then Boinc will use both of these - you don't need <use_all_gpus>1</use_all_gpus> in a cc_config.xml file.
It's been like that since around the time of NVidia's 175 drivers (years ago).

IIRC when you have an AMD attached to the monitor and an NVidia in another slot both are used, but not the other way around without using cc_config, though that might depend on 'better', and it's not clear whether one ATI would be used or two NVidia's in the case where each NVidia is only slightly less powerful than the AMD, and vice versa...

While Boinc might use the better, dissimilar card, Boinc doesn't report the better similar card to the project:

Intel(R) Xeon(R) CPU E3-1265L V2 @ 2.50GHz [Family 6 Model 58 Stepping 9]
(8 processors) [2] NVIDIA GeForce GTX 670 (2048MB) driver: 331.93 Microsoft Windows 7
Professional x64 Edition, Service Pack 1, (06.01.7601.00) 24 Dec 2013 | 10:01:16 UTC
http://www.gpugrid.net/show_host_detail.php?hostid=139265

A Stderr output from that system.

<core_client_version>7.2.33</core_client_version>
<![CDATA[
<stderr_txt>
# GPU [GeForce GTX 770] Platform [Windows] Rev [3203] VERSION [55]
# SWAN Device 0 :
# Name : GeForce GTX 770

http://www.gpugrid.net/result.php?resultid=7582795

Before encouraging people to use the manual, wouldn't it be wise to make sure it was up to date and contained sufficient information to be of use by crunchers?

http://boinc.berkeley.edu/wiki/GPU_computing,

"GPUgrid.net (Linux 32 & 64bit and Windows) Bit slow but a GT220 works."

Firstly, GPUGrid is Linux x64 only, not Linux x86.
Secondly, it's Windows x64 or x86.
Thirdly, it's CC1.3 and above only (GT220 is CC1.2), and a GT220 was never recommended.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34463 - Posted: 24 Dec 2013 | 17:06:13 UTC - in response to Message 34462.

@skgiven,

I fixed the info regarding Linux 64bit, Windows and CC1.3 at http://boinc.berkeley.edu/wiki/GPU_computing#Attach_to_projects_with_GPU_applications. If you wish to see other changes PM me.

____________
BOINC <<--- credit whores, pedants, alien hunters

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34464 - Posted: 24 Dec 2013 | 17:18:34 UTC - in response to Message 34463.

Excellent!
Thank you,
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34468 - Posted: 24 Dec 2013 | 18:28:56 UTC

Well, my new rig is busy making a bigger contribution to GPUGrid. I even have the old rig upstairs running with a GTX 460…

I have a problem. The new rig is fan-noisy with just two CPU cores active, running the acemd.814 code. An ear inside the case tells me it’s the stock CPU cooler. I guess that chickening-out on the EVO cooler was not a good idea!

ASUS doc tells me the CPU warranty is void with a non-stock cooler. What to do? …

Tom

Richard Haselgrove
Send message
Joined: 11 Jul 09
Posts: 1626
Credit: 9,295,466,723
RAC: 18,414,230
Level
Tyr
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34470 - Posted: 24 Dec 2013 | 21:27:40 UTC - in response to Message 34462.

By default, BOINC will use the 'better' of two dis-similar GPUs.

The Boinc manual merely says,

    <use_all_gpus>0|1</use_all_gpus>
    If 1, use all GPUs (otherwise only the most capable ones are used).


That isn't enough information and is vague.
While not applicable to this situation, it would be useful to clarify what you mean by dis-similar GPUs (Fermi vs Kepler, or AMD vs NVidia vs Intel), as that's not mentioned in the manual.


Sorry, I tend to vary my answers - both style of writing, and depth of technical detail included - according to my perception of the needs of the questioner (and how energetic I'm feeling at the time...). Here are a couple of previous attempts at the same subject - feel free to grab either of them.

SETI message 1085712 (technical, quotes source)
BOINC message 42194 (interpretation)

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34473 - Posted: 24 Dec 2013 | 22:12:24 UTC - in response to Message 34468.

Try ducting cool air from outside the case directly to the CPU fan. Here are a bunch of pictures of ducts in use. Notice the pics of a pop bottle with both ends cut off. Now that's my kind of modding... cheap and recycling stuff. I'll give you a link to a tool you can use to cut holes easily in cases, the tool is called a nibbler.

Checkout The Effectiveness of Air Ducts in CPU Cooling.

Google for pc cooling duct for more links. I don't know how expensive those ducts are these days but I used one years ago and it gave amazing results.

Or just put the other cooling solution on and don't tell ASUS. They'll never know. Keep the original, and if you ever need warranty service put the original back on and send it in. However, if you keep the CPU cool and power it through a good spike protector, the chances of it failing are very slim.

____________
BOINC <<--- credit whores, pedants, alien hunters

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34474 - Posted: 24 Dec 2013 | 23:36:58 UTC

Nibblers:

I've had one of these for years. I like it because you can cut a hole in the middle of a piece of material without starting the cut at the edge and working over to your hole, although you do have to drill a small hole inside the perimeter of your cutout to start the cut.

Amazon also sells the AK327 and this one probably others too. A rotary saw in a Dremel works well too but it's more expensive.

____________
BOINC <<--- credit whores, pedants, alien hunters

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34479 - Posted: 25 Dec 2013 | 18:27:53 UTC - in response to Message 34468.

Well, my new rig is busy making a bigger contribution to GPUGrid. I even have the old rig upstairs running with a GTX 460…

I have a problem. The new rig is fan-noisy with just two CPU cores active, running the acemd.814 code. An ear inside the case tells me it’s the stock CPU cooler. I guess that chickening-out on the EVO cooler was not a good idea!

ASUS doc tells me the CPU warranty is void with a non-stock cooler. What to do? …

Tom

You can experiment a bit with the fan settings in Thermal Radar or choose another preset cooling mode. Or turn the "turbo" mode off in the BIOS. As you can see, the AMD is running at a higher speed; if that is lower it runs cooler and uses less power, and the little fan doesn't have to spin as fast.

You can put on another cooler if you want; the new CPU coolers from Be Quiet! are very good.
ASUS doesn't care what you cool with; it's the manual of the CPU that says you need the stock cooler to keep the warranty. But if you keep the stock cooler, then there's no problem: you can send it in with the CPU if needed.
The chance that you will ever need to do that is very, very small.
____________
Greetings from TJ

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34487 - Posted: 27 Dec 2013 | 17:04:53 UTC

I shall be back here soon with a plea for help on the BIOS of this beast I've bought...

In the meantime, if you've been following this thread, you will know I suggested to my son that his Santa contribution could be a GTX 770. Didn't work! I got an Amazon gift certificate for €50... She who holds the purse strings saw how disappointed I was and has agreed to fund the balance! Bingo!!!

My first thought is to put the 770 in my old i7 PC. It has a 620 watt PSU - is that enough? But I don't know how long I can keep up the pretense that it must be on for file transfer purposes...

Has anyone had experience of squeezing three wide PSUs into a Sabertooth 990FX R2.20 mobo (four PCIe slots) inside a Cooler Master HAF 932 Advanced case?



tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34488 - Posted: 27 Dec 2013 | 18:27:33 UTC - in response to Message 34487.

Has anyone had experience of squeezing three wide PSUs

Make that GPUs...

____________

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34489 - Posted: 27 Dec 2013 | 20:07:25 UTC - in response to Message 34487.

620W is absolutely enough for one GTX770.
It should be possible to put 3 GPUs on the Sabertooth, but with very little space in between, and the fan of one GPU will suck in warm air from the other card.
I have space for it and my case is somewhat shorter than yours. However, it could become very warm, and the CPU cooler will make even more noise.
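TJ's "620W is absolutely enough" can be sanity-checked with a rough power budget: sum the worst-case draw of each component and keep the total under a margin of the PSU rating. A minimal sketch — the wattage figures are nominal TDP assumptions for illustration, not measurements from any machine in this thread:

```python
# Rough PSU headroom check: sum assumed component draws and compare
# against the supply rating with a safety margin.

COMPONENT_WATTS = {          # assumed worst-case draws (illustrative)
    "GTX 770": 230,          # roughly NVIDIA's board power spec
    "i7 CPU": 130,
    "motherboard + RAM": 60,
    "drives + fans": 40,
}

def psu_ok(psu_watts, components, margin=0.8):
    """Return (ok, total): ok is True if the summed draw stays
    under `margin` of the PSU rating."""
    total = sum(components.values())
    return total <= psu_watts * margin, total

ok, total = psu_ok(620, COMPONENT_WATTS)
print(total, ok)   # 460 True -- a 460 W load fits a 620 W unit with room to spare
```

The 80% margin is a common rule of thumb for leaving headroom for transient spikes and PSU aging, not a hard specification.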
____________
Greetings from TJ

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 34490 - Posted: 28 Dec 2013 | 1:43:21 UTC

Tomba, if you need help optimizing your BIOS settings on the Sabertooth MB, just let me know. I'm running 4 990FX chipset motherboards (2 Asus M5A990FX Pro R2.0 and 2 Asus Crosshair V Formula-Z). I'm going to be upgrading the other 2 to Formula-Z MBs soon because they let me run 32GB of 1866 RAM with lower settings, and they are much more stable because the memory slots have their own 4-pin power connection straight from the PSU. The PCIe slots also have their own 4-pin Molex connection; my TDP hardly moves on my GPUs. If you're interested, just let me know. I have a tremendous amount of experience with that chipset.

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34491 - Posted: 28 Dec 2013 | 3:48:58 UTC - in response to Message 34489.

TJ is right... with 3 cards in the case you're going to have a cooling problem and/or a noise problem. That's why Retvari recommended cards with radial fans if you intended to put 3 on 1 mobo. Cards with radial fans blow the heat out the back of the case. IIRC, your cards have axial fans, which blow the hot air at the card beside them or at the CPU and leave the hot air inside the case.

Fortunately there are various solutions:


  1. wear ear plugs
  2. banish the rig to a different room and access it remotely
  3. increase the airflow into the case and between the cards
  4. decrease the temperature of the air going into the case
  5. a combination of 3 and 4



Try removing the case side cover and laying the case down on the closed side. Less hot air will collect in the case that way. Or remove the side cover, leave the case standing up and position a large, mains powered fan so it blows air into the case.

After experimenting with PC cooling for many years, I guarantee to you that the most effective way to increase cooling is to reduce the temperature of the air going into the case. Nothing else even comes close.
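Dagorath's point can be put in back-of-the-envelope terms: at steady state, case exhaust temperature is roughly intake temperature plus the heat load divided by (mass flow × specific heat of air), so every degree you shave off the intake air shifts all component temperatures down by about the same degree. A sketch with illustrative numbers:

```python
# Simple steady-state energy balance for case airflow.
# Figures (400 W load, 100 CFM) are illustrative assumptions.

def exhaust_temp_c(intake_c, watts, airflow_cfm):
    """Estimated case exhaust temperature in deg C."""
    m3_per_s = airflow_cfm * 0.000471947   # CFM -> cubic metres/second
    mass_flow = m3_per_s * 1.2             # kg/s, air density ~1.2 kg/m^3
    cp = 1005.0                            # J/(kg*K), specific heat of air
    return intake_c + watts / (mass_flow * cp)

print(round(exhaust_temp_c(25, 400, 100), 1))  # 32.0 at a 25 C intake
print(round(exhaust_temp_c(15, 400, 100), 1))  # 22.0 -- cooler intake, same drop everywhere
```

The model ignores heatsink efficiency and recirculation, but it shows why no amount of fan shuffling beats feeding the case colder air.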

____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34494 - Posted: 28 Dec 2013 | 17:58:03 UTC - in response to Message 34490.

Tomba, if you need help optimizing your BIOS settings on the Sabertooth MB, just let me know

Thanks for that, flashawk. I think I need your help!

Found some ASUS Windows apps, one of which gives me this opening bid:



Looks like the stock cooler fan revs are high. I was wondering if I could down-clock the CPU to reduce the heat but I have no idea how to do that.

Note that fan 4 --- the top fan --- is not running. I managed to break a blade with my fingers while it was running (ouch!). A replacement is on its way.

I was a bit surprised to see the temp difference between my two GTX 660s; 57C vs. 43C. One is brand new and short, the other 2+years old and long. I think the old one is in PCIE-1.

Do you see anything here I should worry about?

Thanks, Tom



tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34495 - Posted: 28 Dec 2013 | 18:22:47 UTC - in response to Message 34491.

TJ is right... with 3 cards in the case you're going to have cooling problem and/or a noise problem. That's why Retvari recommended cards with radial fans if you intended to put 3 on 1 mobo.

Please tell me about cards with radial fans. Who makes them?

I did reduce the fan noise by installing the case-supplied cover that funnels air from the front fan, through the hard disks and onto the GPUs. And I reckon that if I put the PSU at the top of the case I may just have room for a third GPU. We shall see, and hear!!

Tom

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34498 - Posted: 28 Dec 2013 | 19:46:34 UTC - in response to Message 34495.

In the following pictures, the axial fans are those with propeller type blades, radial fans are those with squirrel cage impellers.

axial fan vs. radial fan

Radial fans generally move more air for the same size fan but they're usually louder. I prefer them because they blow the air through the heatsink fins then immediately out of the case. Axial fans are usually quieter but if they were installed to blow air out of the case they would take up a lot of room.

Also, axial PC fans suffer from a cooling shadow directly behind the motor whereas radial fans are able to deliver airflow all over the cooling fins which tends to make them more efficient wrt size.

All of the video card manufacturers offer models with radial fans as well as models with axial fans.
____________
BOINC <<--- credit whores, pedants, alien hunters

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 34500 - Posted: 28 Dec 2013 | 22:27:06 UTC

Hi Tomba, most of us AMD guys who have experience with the Asus software don't use it. It reads the motherboard thermistor built into the AM3+ socket, which is not accurate and reads 10°C to 12°C too high. Programs like AMD Overdrive and HWiNFO64 are both great programs for accurate readings. In fact, the guy who wrote HWiNFO32/64 crunches here at GPUGRID.

As for the video cards, you might want to look at the "blower" type fans, they do a great job blowing all the hot air out the back of the case. This is an example of a blower card, it's like a squirrel cage or what you might find in a swamp cooler. Anyway, there's a bunch of stuff we should go over for the BIOS settings that will really help your crunching times. You can PM me if you like and take a look at my crunching times, my GTX780 times are pretty good, I keep getting errors from the SANTI's if I don't watch them like a hawk.

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34503 - Posted: 29 Dec 2013 | 0:01:27 UTC
Last modified: 29 Dec 2013 | 0:04:03 UTC

I use the Asus software. The AI Suite 3.0 for Intel-based CPUs is quite handy, with predefined fan-control settings or the option to change them manually, per fan.
Indeed, I have heard that Asus reads the temperatures too high, but I have installed and tried everything I could find to measure temperature and they all differ, so it is hard to decide which has the proper reading. The differences, however, are no more than 3 or 4 degrees Celsius. If the builder of the MOBO can't be trusted, who can?
But if Asus reads 10°C too high, then Tomba's and my systems are running cool. Even 17°C cooler than my system with liquid cooling.

flashawk, your times are good (your 780 is faster than my 780Ti), so it couldn't harm if you quoted some AMD BIOS settings for us, to see if they could help others as well.
____________
Greetings from TJ

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 34504 - Posted: 29 Dec 2013 | 4:09:02 UTC

Hey TJ, the AMD software I linked in my previous post is made by the manufacturer of the CPU, so I don't see how you could go wrong. HWiNFO32/64 gives me identical temps to AMD Overdrive; they are both reading the "on die" thermistor that is built into the CPU, and you can't get more accurate than that. The Asus software doesn't read the on-die temperatures for each core; one way of telling what's being read is whether you see 8 different temps along with 8 different speeds, multipliers and voltages.

I was able to change my multipliers through the Overdrive software and raise my speed to 4400MHz without raising the voltage, I kicked it up and ran the AMD stability test for 3 hours without issue. I then ran Intel's CPU/RAM burn in test for 2 hours with the same results, pretty nice. If you guys want advice on BIOS settings, I'll do it for you but only through PM's, it keeps trouble makers and trolls from butting in and causing distractions. Don't get me wrong, I'm not mad or bitter, in fact, this is a pretty easy going forum with a lot of affable folks, it's just something I learned over a decade ago to make things go smoother.

Besides, I'm not into handing out information to folks who hide their computers and don't post in the forums to give feedback. I'm not trying to offend, it's just the way I feel. I will help anyone if they ask for it, and if I can't, I'll point them in the right direction. I've never been much good at putting together a "How to" for that stuff. I guess I should learn, but one thing I have done is very heavily research almost all of the BIOS settings. One good resource is the ROG forum at Asus, because all of the 990FX chipset motherboards have the same BIOS settings; the lesser boards just omit certain things.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34507 - Posted: 29 Dec 2013 | 16:17:20 UTC - in response to Message 34500.

As for the video cards, you might want to look at the "blower" type fans, they do a great job blowing all the hot air out the back of the case. This is an example of a blower card

OK. I got it now!! Thanks !!!

Today I ordered from Amazon France the PNY "blower" version of the GTX 770, and I persuaded them to take back my ASUS 660 and replace it with the PNY "blower" version. They even sent a no-cost-to-me mailing label for the ASUS return! What service!!

I have to say that I've been very happy with ASUS GPUs over the years and I'm sorry to now be switching to another vendor. I hope it works out...
____________

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34509 - Posted: 30 Dec 2013 | 0:23:15 UTC - in response to Message 34504.

I believe you, flashawk, and I understand your ideas about the BIOS.
I tried HWiNFO64 at once, and it indeed reads many more temperatures than I have ever seen, including for Intel CPUs.
And I know that AMD Overdrive is made by AMD and can be trusted.
But Thermal Radar is made by ASUS, who also made the MOBO, and its temperature readings are a bit different (higher). That gave me worries, as I know too little about it. All I want is for my systems, which run 24/7, to do so for hopefully 5 years without overheating. Perhaps I am over-cautious with high temperatures.
____________
Greetings from TJ

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 34511 - Posted: 30 Dec 2013 | 4:39:35 UTC

I don't think you understood me, TJ, or maybe you did. The Asus AI Suite reads the temperature from a probe in the "socket" below the CPU; it's attached to the motherboard outside the CPU chip, so it will never give you an accurate CPU temperature unless you're interested in the dead air space under the CPU. If you remove the CPU, it's under the sliding bed that you drop the processor into. Overdrive reads the 8 thermal probes built into the silicon of each processor die.

Like this

This shows HWiNFO32/64 CPU-RAM information

That was one of my air cooled rigs with 2 GTX780's, these next 2 are liquid cooled (same rig)

This shows power consumption, HWiNFO64 CPU temp, PSU voltages



TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34512 - Posted: 30 Dec 2013 | 9:20:32 UTC - in response to Message 34511.

Yes, I thought I understood you, flashawk, but with your latest technical explanation it is totally clear to me now. Thanks for that.
____________
Greetings from TJ

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 34513 - Posted: 30 Dec 2013 | 12:20:02 UTC

I can't remember who it was that said knowledge is power; the more accurate information you have, the smoother the decision process is, and you won't second-guess yourself. Software like AI Suite (which isn't all bad) is 90% responsible for giving OEM air coolers a bad name (AMD and Intel); people think they're junk when in reality they are adequate in a positively ventilated, well engineered case.

If you trust your tools, you can make split-second, well-informed decisions and settle into a routine that will help in the long run. Sorry about being so pushy about that software; Asus really should put a pop-up disclaimer in there and explain exactly what's being measured. Years ago, the CPU makers didn't put thermistors (for lack of a better word) in their processors; the motherboard manufacturers put them in the middle of the ZIF socket. You could see them, they were usually white or powder-blue little beads. I thought they were the coolest things since canned beer.

Us older folks remember them and recognize them as a holdover and don't give them a second thought; the younger crowd thinks of them as something they can use. I guess the MB makers just don't want to let go of them because they had temp probes first. I remember Abit and Iwill both dropped them (those were my favorite MB makers, and I think Intel did too with some 440BX boards). Now everything is point and click. I still keep forgetting I can use my mouse in the BIOS; I still use the arrow keys.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34515 - Posted: 30 Dec 2013 | 15:16:46 UTC

My - this has been a busy thread! Since I started it, here's where I'm up to...

My new build is chugging away merrily 24/7, with 2xASUS GTX 660s, and upstairs my old rig is, sometimes, getting bonus credits via its GTX 460.

Tomorrow is promised my PNY GTX 770, which I shall install in my old rig for the time being, replacing the GTX 460.

The replacement for my broken case fan arrived today, and the PNY 660 is slated for arrival Jan 3. By then I will have the PNY 770 and two 660s, one (new) PNY and one (old) ASUS, the PNYs exhausting out the back of the case as recommended here.

Plan then is to move the PSU to the top of the case and see if I can fit three wide GPUs in there. After all, the mobo has four PCIe slots. We shall see what transpires....

Wish me luck :)

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34517 - Posted: 30 Dec 2013 | 19:54:47 UTC - in response to Message 34515.

My - this has been a busy thread!


Because we've covered many of the things one needs to consider when building a computer for crunching. There is no single recipe to follow when building your own. There are lots of options, lots of things to consider. In exchange for doing that extra work you end up with a high-performance rig at a very attractive price.

And thanks to input from flashawk we will now avoid the "bad" temperature reading software and not become worried over what are basically inaccurate temperature readings. That's essential stuff when building your own. Thanks flashawk :) BTW, the expression "knowledge is power" probably grew from Diderot's, "To ask who is to be educated is to ask who is to rule."

Plan then is to move the PSU to the top of the case and see if I can fit three wide GPUs in there. After all, the mobo has four PCIe slots. We shall see what transpires....

Wish me luck :)


It's not a matter of luck, it's a matter of skill and knowledge, but good luck to you anyway.

____________
BOINC <<--- credit whores, pedants, alien hunters

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 34519 - Posted: 30 Dec 2013 | 21:41:30 UTC - in response to Message 34517.
Last modified: 30 Dec 2013 | 21:46:52 UTC

And thanks to input from flashawk we will now avoid the "bad" temperature reading software and not become worried over what are basically inaccurate temperature readings. That's essential stuff when building your own. Thanks flashawk :) BTW, the expression "knowledge is power" probably grew from Diderot's, "To ask who is to be educated is to ask who is to rule."


Ya, the deal with those temp readings really throws a wrench in people's plans and pushes them to throw money at something that could be better spent on other components. It's like teaching yourself to let off on the brakes when you're doing 360's in the snow so you can regain control; some people just can't let go of it (some won't either). "Diderot", that's it! Or him. It sucks getting old, I'm starting to forget stuff. I missed you last summer Dagorath, where did you disappear to? We missed you stirring things up around here, our discussions about cheaters on points and which BOINC application gives the most points. I'd better not get too far off topic; the mods do a good job giving out tech advice too. This is actually one of the better GPU forums, and we don't do gaming (or at least we don't talk about it much).

Edit: I like your new quote, it's deep.

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34523 - Posted: 31 Dec 2013 | 3:41:34 UTC - in response to Message 34519.

I'm basically retired but I have mineral claims. To make a long story short, some claim jumpers tried to move in. Things got ugly. The leader went to jail but kept giving me grief. Then someone burned his property to the ground, they arrested me for it and refused bail. In jail I had a "talk" with the claim jumper and convinced him to never even think of setting foot on one of my claims again. See what happens in jail pretty much stays in jail as long as you make a reasonable effort to not put the guards in a position where they have to do their job and report stuff. It couldn't have worked out better vis a vis stopping the claim jumpers if I had planned it that way.

The Crown thought they had enough to put me away for 2 years but a week before the matter went to trial the Crown persecutor finally admitted he had nothing so he dropped all but one of the charges and asked for my immediate release in return for a guilty plea on one of the lesser charges. I was in for 4 months and 2 days. A week after I was released I broke my shoulder and had some nasty complications as it healed. Spent 2 months in lala land stoned on morphine for the pain. Around September I healed properly, kicked the morphine and started rebuilding.

____________
BOINC <<--- credit whores, pedants, alien hunters

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 34525 - Posted: 31 Dec 2013 | 5:14:49 UTC

You're blowing my mind over here. I grew up on a mining claim in the central Sierras and the foothills (49er country). My family were all hard-rock and hydraulic miners on my father's side. We had a 1944 Dodge Power Wagon we drove in 32 miles to where we had a tarpaper shack, tack room, feed shed, lunging stable and stock pens with 2 quarter horses, 1 thoroughbred, 2 appaloosas, 1 dun and a surrey, plus 8 mules and 2 burros. All 6 of us kids (me and 5 sisters) would sandbag the creek; he had a 4" floating dredge, and the slurry got pumped into a 6-stage sluice box with carpeting in it.

The Forest Service dynamited our 87-year-old cabin in '66, and he bought 2 14-man GP tents. We built wooden floors for them, put up perimeter shock wire (appaloosas are the nosiest creatures God created), and that was our life in the summertime. The ride on horseback was 7 1/2 miles from the end of the dirt road. I hope you've managed to settle down after your excellent adventure. I know about claim jumpers; we always had a caretaker around to keep an eye on things in the winter.

You always manage to get me laughing with the predicaments you end up in. At least you had three hots and a cot; I hope there weren't too many battle royales over the clicker.

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34526 - Posted: 31 Dec 2013 | 7:18:50 UTC - in response to Message 34525.

The TV clicker? There wasn't one because it would have batteries in it and batteries can be rigged to light cigarettes, joints or start fires.

You were lucky to grow up there and that way. Not many of us get to live like that anymore. Gold is hard work requiring a big initial investment, the way I see it. You need your dredge, sluice box and all that and the law can blow it up if they don't want you around. I'm after gemstones. It's hard work too but all I've got invested is the cost of filing the claims, a 20 lb. sledge hammer for breaking rock plus smaller hammers for lighter work, a shovel, pry bar and a backpack. My claims are in a very geologically active area between 2 dormant volcanic mountains, 2,000 ft. above the valley floor. I don't like digging and the terms of my claim don't allow me to dynamite or use motorized equipment. I can dig by hand, pan creeks or smash rock with a hammer. My way is the latter. I have a few rock outcrops that have been productive so I hammer away at them. When I get tired of that I crawl through spring runoff chutes turning over boulders.

Yah, I get into some pickles but it beats the heck out of being normal. Glad you can get a chuckle out of my adventures, I do too :-)

____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34552 - Posted: 2 Jan 2014 | 15:58:07 UTC

Well - the PNY GTX 770 idea was a pig's breakfast. Should have been with me two days ago but...

1. Amazon France did not put my phone number on the address label
2. TNT "Express" were too idle to look in the phone book
3. TNT dropped it off this morning at a "Relais", a package distribution office 10 miles away.
4. Picked it up. WRONG GPU!! I ordered the one-fan rear-exhaust version. They shipped the three side-fans version and, judging from the packaging, I was a second user... What to do?

I installed the new beast on my old rig 'cause I noticed the three-fan version sells on Amazon France for twice what I paid. Did I get a steal (early data says a Nathan will complete in eight hours but 80C temp)? You be the judge:



Yes --- that's a de Havilland Mosquito, the only one flying today!!


Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34555 - Posted: 2 Jan 2014 | 17:12:04 UTC - in response to Message 34552.

Try turning the fan speed up!
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34556 - Posted: 2 Jan 2014 | 17:15:07 UTC - in response to Message 34555.

Try turning the fan speed up!

I think I might need help on how to do that...:)

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwat
Message 34557 - Posted: 2 Jan 2014 | 17:45:48 UTC
Last modified: 2 Jan 2014 | 17:56:41 UTC

In the big controller box that says "Precision", at the bottom is the fan speed controller. Click on "Auto" to switch it over to manual, then slide it to 100% or however high it will let you go. Let the temperature settle down, then start working it back down in increments until you find a balance between cooling and noise that suits you.

Edit: Make sure "Apply at Windows start up" is checked or lit up, I'm not completely familiar with that version of Precision X, they have a much newer version (or one I recognize) at evga.com, creating a forum account is required though.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34559 - Posted: 2 Jan 2014 | 18:19:49 UTC - in response to Message 34557.

In the big controller box that says "Precision", at the bottom is the fan speed controller. Click on "Auto" to switch it over to manual, then slide it to 100% or however high it will let you go. Let the temperature settle down, then start working it back down in increments until you find a balance between cooling and noise that suits you.

Edit: Make sure "Apply at Windows start up" is checked or lit up, I'm not completely familiar with that version of Precision X, they have a much newer version (or one I recognize) at evga.com, creating a forum account is required though.

That "Apply at Windows start up" fixed it"! After the boot I could drag the slider. Great! Many thanks for the heads-up.

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34560 - Posted: 2 Jan 2014 | 18:51:18 UTC - in response to Message 34552.

It was a pig's breakfast but don't deal with it by creating another pig's breakfast.

If you have cooling problems now then remember they are only going to get worse in the summer unless you have air conditioning and can afford to run it 24/7. If you have 3 rear-exhaust GPUs in 1 rig, I can show you how to build a very simple and inexpensive duct to collect the hot air at the back of the computer and duct it out a window, into the attic or even into another room, but the first step in that process is to have all the hot air coming out of the rear of the computer where you can easily collect it and deal with it.

If you leave the 3 fan GTX 770 in your old rig you can solve the cooling problem NOW by increasing the fan speed BUT unless you have air conditioning, that's only going to bite you in the a** when summer comes because then you'll need to vent 2 rigs outside instead of just 1. In the long run the best solution is 3 rear-vent cards in 1 rig with 1 duct to carry the hot exhaust outside. And that can scale up to 4 cards in 1 rig if you can overcome the rumored driver issues.


____________
BOINC <<--- credit whores, pedants, alien hunters

Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34573 - Posted: 4 Jan 2014 | 12:39:10 UTC
Last modified: 4 Jan 2014 | 12:42:29 UTC

For reference about high GPU temps...

I have 3 GPUs in my main rig, which runs quite hot.

The only way I can keep my Kepler 660Ti at maximum-boost clock speed, is to set a custom fan curve in Precision-X. So, I have "Auto" checked there, but I've gone into the options to set a custom curve, where I have set it to use maximum fan (80% for this GPU) before reaching the downclocking temperature threshold (70*C for a GPU Boost v1 GPU).

If your card has GPU Boost v1 (where Precision-X lets you adjust Power Target), I recommend creating a custom fan curve that will keep the temp below 70*C for maximum crunching performance.

If your card has GPU Boost v2 (where Precision-X lets you adjust both Power Target and Temp Target), I recommend creating a custom fan curve that will keep the temp below 80*C for maximum crunching performance. (I don't have such a card, but I'm told by others that the GPU doesn't start downclocking in 13 MHz intervals until 80*C on those GPUs.)
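A custom fan curve of the kind described above is just piecewise-linear interpolation between (temperature, fan %) points, pinned to maximum fan before the card's downclocking threshold. A minimal sketch — the points below are illustrative, not anyone's actual Precision-X settings:

```python
# Piecewise-linear fan curve: below the first point and above the
# last, the fan % is clamped; in between, it is linearly interpolated.
# The 70 C knee mirrors the GPU Boost v1 downclocking threshold.

CURVE = [(30, 30), (50, 50), (65, 80), (70, 80)]  # (deg C, fan %) -- illustrative

def fan_percent(temp_c, curve=CURVE):
    """Fan duty cycle (%) for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(40))  # 40.0 -- halfway between the 30% and 50% points
print(fan_percent(69))  # 80.0 -- already at max fan just under the 70 C knee
```

Utilities like Precision-X apply exactly this kind of mapping in the driver; the point of flattening the top of the curve is to reach maximum fan before the throttle temperature, not at it.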

Regards,
Jacob

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34621 - Posted: 10 Jan 2014 | 18:18:45 UTC

Thread starter here...

I have to say that my new rig is a big disappointment.

1. The mobo (ASUS Sabertooth 990FX R2.0) boasts four PCIe slots. Yep. They're there! My plan was to fill them eventually with four top-of-the-line GPUs so I bought a big PSU. What do I find? I can only get two wide GPUs in there. Yes, there's room for two single-width GPUs too, but I know of no single-width card that is useful for GPUGrid (perhaps I'm mistaken...). Here's the pic (red lines are single PCIe slots):



2. The case (Cooler Master HAF932 Advanced) does not help with the PCIe problem. It has but seven PCI slots:



The left-most is useless since it aligns with the mobo. There was a chance that, by moving the PSU, I could get a third wide GPU in there on the right, but there's no eighth slot to accommodate half of the double.

3. GTX 770s were recommended here. I had one for a few days. Its GPUGrid performance was 1.4x a 660's. At twice the price, I passed.

I'm still within Amazon's return window for the mobo and the case. If you have recommendations, Tom is listening!!

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34622 - Posted: 10 Jan 2014 | 20:39:25 UTC - in response to Message 34621.

Yikes! I thought the space between the slots was discussed upthread before you bought the board. Yes, that definitely is an issue. My advice at this point would be to search for a board with adequate spacing and if the price is acceptable return yours.

Another thing to be wary of is aftermarket cooling solutions, as they often increase the width of the card to the point that it occupies 3 slots.

Regarding the case, you might have to modify the case by cutting a hole in the bottom and raising it up on blocks if you can't find one that accommodates 4 double-wide cards. Or just forget about the case and leave the boards sitting on a shelf. Or build a custom case.

____________
BOINC <<--- credit whores, pedants, alien hunters

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2353
Credit: 16,375,531,916
RAC: 5,811,976
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34623 - Posted: 10 Jan 2014 | 23:32:16 UTC - in response to Message 34621.

Thread starter here...

I have to say that my new rig is a big disappointment.

1. The mobo (ASUS Sabertooth 990FX R2.0) boasts four PCIe slots. Yep. They're there! My plan was to fill them eventually with four top-of-the-line GPUs so I bought a big PSU. What do I find? I can only get two wide GPUs in there. Yes, there's room for two single-width GPUs too but I know of no single that is useful for GPUGrid (perhaps I'm mistaken...).

You have three options:
1. install water blocks on your GPUs, and they will fit in single slots.
2. Use two top-end GPUs (at the moment: GTX 780Ti)
3. Use a different motherboard. I've checked the ASUS website, and I didn't find any AMD motherboards which could accommodate four double-slot GPUs. But I've found a Gigabyte MB which does: GA-990FXA-UD7

2. The case (Cooler Master HAF932 Advanced) does not help with the PCIe problem. It has but seven PCI slots:

The left-most is useless since it aligns with the mobo. There was a chance that, by moving the PSU, I could get a third wide PSU in there on the right but there's no eighth slot to accommodate half of the double.

The ATX standard specifies only 7 expansion slots. 8-slot-wide ATX cases are very rare, mostly custom-made (for example the one mentioned in this post). If I were running a host with four air-cooled GPUs, I wouldn't put it in a case at all. It is very hard (and loud) to dissipate 1kW from a PC case with air cooling while keeping component temperatures low. Higher temperatures shorten the lifetime of the components. I do not recommend four GPUs on a single MB.

3. GTX 770s were recommended here. I had one for a few days. Its GPUGrid performance was 1.4x the 660s. At twice the price I passed.

The prices of the cards are *not* in direct proportion to their performance. You always have to pay extra for the top-end cards. The 770 is 68% faster than a 660 - in theory. The faster the GPU, the harder its performance is hit by the WDDM overhead of Windows 7 (Vista, 8, 8.1), depending on the workunit batch. To fully utilize the performance of a fast GPU, you should run that card on an OS which does not have the WDDM overhead (Windows XP, Linux), and you have to leave one CPU core free from CPU tasks per GPU. Besides, in the long term it is much better to have two 770s in a single PC than four 660s. The cooling of these cards is *not* made for four-way crunching 24/7. As time goes by new (faster) GPUs will be released and workunits will get longer, so the 770 is more future-proof than a 660. BTW I think that the new Maxwell series GPUs (to be released later this year) won't have that WDDM overhead, or not as much.
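The price-versus-performance argument above boils down to simple arithmetic. The prices and speedups below are illustrative placeholders, not measured GPUGrid numbers:

```python
# Rough perf-per-euro arithmetic behind the 770-vs-660 debate.
# Both the relative speeds and the street prices are guesses for
# illustration only.
def perf_per_euro(relative_perf, price_eur):
    return relative_perf / price_eur

gtx660 = perf_per_euro(1.0, 180)  # baseline card, hypothetical 180 EUR
gtx770 = perf_per_euro(1.4, 360)  # observed ~1.4x speed at roughly 2x price

# gtx660 > gtx770 here: the cheaper card wins on raw perf/euro.
# The counterpoint is that slots, PSU load, heat and future-proofing
# also carry a cost that this ratio ignores.
```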

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34624 - Posted: 10 Jan 2014 | 23:51:19 UTC - in response to Message 34621.

Yes, we told you that 4 GPUs would be a problem, not only for fitting on the MOBO but also for heat.
There are bigger cases but they cost (a lot) more; the Cosmos II is huge. MOBOs with more space between PCIe slots are hard to find. Boards with 4, 5, 6 or 7 of those slots do exist, but you have to pay between 400-700 for such a MOBO.

Skgiven has said many times in many threads that 660s or 660Tis are the best for performance/cost, and that more than 2 GPUs in a case may result in several issues.

Building one's own PC is fun, but you need to make a list of the specifications you want and then look at the costs. It is not possible to build an "ideal" system cheaply, though others have other ideas about that.

I would suggest you put the 770 in as the primary GPU and one 660 as the second GPU. Save some money per month and build a second rig later in the year.
____________
Greetings from TJ

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34625 - Posted: 11 Jan 2014 | 1:34:25 UTC - in response to Message 34624.

It is not possible to build an "ideal" system cheap, but others have other ideas about that.


I think you can build an "ideal" system cheaply and cool it very well with air alone. You might have to think outside the box and build some of the components yourself, but it's doable.


____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34682 - Posted: 15 Jan 2014 | 18:09:45 UTC

I had my hand slapped on another thread for not using my spare CPU threads on another worthwhile BOINC project. So I'm now running six Rosettas. With the two GPUGrids, the eight CPU cores are committed.

The effect is that the CPU temperature has risen from 55C to 64C and the CPU fan speed has risen from 3700 to 4300 RPM. Is that a problem?

The CPU fan noise is a bit higher but acceptable.

My case has four big fans: front, back, side & top. They are connected only to the mobo. I did not take the option to power them from the supplied Molex connectors, which would make them run at 100%.

Here are the fan RPMs:



I've never seen fans 1-4 vary from these numbers.

How can I involve these fans in doing a better job of cooling the inside of my rig?




Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34683 - Posted: 15 Jan 2014 | 18:49:01 UTC - in response to Message 34682.
Last modified: 15 Jan 2014 | 19:21:41 UTC

I didn't mean to slap your hand :-p.

My 5-year-old water-cooled quad-core hyperthreaded Intel i7 965 Extreme Edition (XE)... runs 77-87*C under full load.

Without Precision-X fan control and additional system fan control, my 3 GPUs would typically run 75-87*C. But one of them is a GTX 660 Ti, and I don't want it to downclock, so I strive to keep it below 70*C.

With Precision-X fan control (using a fan curve such that GPU reaches maximum fan before 70*C) and additional system fan control (via Dell XPS Thermal Monitor), my GPUs typically run 65-70*C.

And I run BOINC on all of this (3 GPUs, 8 logical CPUs), 24/7. The fans are loud, but the music is louder.

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34687 - Posted: 15 Jan 2014 | 23:51:30 UTC - in response to Message 34682.

Hi Tom,

I have the same MOBO and CPU with AMD stock cooler. It can go to 6300rpm in summer here in the Netherlands.
I have two top fans, one front fan and one side fan connected to the MOBO connectors. I have the back fan connected to CHA_FAN4, as Asus recommends.
The RPM values vary almost constantly.

Have you made any settings for the fans in Thermal Radar?
For each connector on the MOBO, and thus each fan, you can set a scheme. I have used standard for all fans, as it is now winter, or it should be winter. In summer I use turbo and user.

Hope this helps a bit.
____________
Greetings from TJ

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34688 - Posted: 16 Jan 2014 | 0:59:02 UTC - in response to Message 34682.
Last modified: 16 Jan 2014 | 1:01:15 UTC

If your CPU's max temp is 100*C then 64*C is a pretty decent CPU temperature. If you aren't bothered by fan noise, and your GPUs stay below 70*C and you can maintain those temperatures in the summer then I don't see much point in trying to improve anything.

If you want to make some inexpensive improvement then get some cable ties and tie every cable back to a wall inside the case to reduce turbulence and improve air flow. The best is if you have absolutely no cables running through the middle of the case. Bunch up the slack and tie it up with a cable tie then tie it back to a wall. Even better is to cut the cables to the exact length required and reposition the connectors so there is no slack at all.

Your 4 big fans are an example of the "high horsepower" mentality. If it works then it works, but I've always preferred "smart horsepower" over "high horsepower" because it costs less and frequently does a better job. One school of thought says 4 fans pointing in 4 different directions create a lot of turbulence in the case and don't improve cooling much. Experiments I have done, and experiments others have performed and documented, suggest that theory holds true in many cases; not necessarily your case, but many. You have to remember that every case is different, so what works very well for my case may not work as well for yours. Still, there are a few good principles that seem to work for everybody. The first is: if you are not going to use internal ducts, keep the airflow going in one direction, preferably along the longest case dimension. That means fans on the front and rear only, not on the sides or top. Fans in the front usually pull air into the case while fans on the rear push hot air out, but if the reverse direction works better for you then that's what you should do.

If you use an internal duct that directs cool air from outside the case directly onto the CPU or GPU then the game changes completely. The presence of the duct alone causes turbulence but since the cool air goes directly onto the heat source before it has a chance to heat up by mixing in with hot air, you get much better cooling performance from fewer fans. Less noise too.
____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34690 - Posted: 16 Jan 2014 | 18:53:30 UTC - in response to Message 34687.

Thanks for the response, TJ.

I have the back fan connected to CHA_FAN4 as Asus recommend.

I missed that. Tomorrow morning I'll check that the rear fan is on CHA_FAN4.

Have you made any settings for the fans in thermal Radar?

No! I had thought I had to get into the BIOS to do that. Nice to know I can do it in Windows!!

For each connector on the MOBO and thus fan, you can set a scheme.

Not so in my version of Thermal Radar; 1.01.29. #4 I can set individually, but the other three are in one group. I set the standard scheme for all three options; fan4, fans1-3 and the CPU fan.

When I get in there tomorrow I shall also try Dagorath's suggestion (thanks, Dagorath!) to disable the side and top fans so I have an uninterrupted cool air path from front to back. That makes sense since the case came with a funnel that directs air to the GPUs, both of which vent out the back.

Then I shall play with Standard vs. Turbo on the CPU and front and rear fans and see if I can optimize CPU core usage. (Yesterday I went to bed with all eight cores active and this morning I found an ASUS warning message: CPU temperature 65C.)

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34692 - Posted: 16 Jan 2014 | 21:39:30 UTC - in response to Message 34690.

That warning from Asus might not be cause for concern. There is probably a setting in the Asus software that allows you to specify the temperature at which it issues the warning, and it's probably set to an arbitrary default of 65*C by Asus (assuming you haven't adjusted it yourself). 65 may or may not be appropriate for your CPU. What you need to do, since you are now officially a certified system builder :-), is find the specifications for your CPU and see what the official maximum operating temperature is. You don't want to set the Asus alarm at that temperature, because that is the temp where serious damage can start to occur. I like to keep my gear running no higher than 30*C below the official max, while others say 10*C is good enough for them. The decision is yours, but if I were you I would set the Asus alarm at 20*C below max and strive to keep the temp 30*C below max. The point is you don't want the alarm going off when the CPU is not really in danger, but you don't want it to wait until the very brink of meltdown either; somewhere in between. Also, as flashawk pointed out a while ago, it depends on which sensor(s) the Asus software is reading. Complicated? Yes, but that's the life of a system builder :-)

Glad to hear your case has a funnel! That alone will make a big difference. If the side fan is pulling air into that funnel then leave it on. You can buy (or even make) ducts that direct air to your CPU too if you think it's running too close to the max operating temp specified by AMD.

Getting good cooling requires a little experimenting on your part since every case and the components inside have a unique physical geometry only you can work with directly. So yes, experiment with different fans being on and off and see if that increases the temps or lowers the temps.

If some of your fans don't seem to be affected by settings in the software or don't have RPM readings then they might be fans that don't have a tachometer in them and don't allow speed control (they run at one speed). You can tell by counting the wires leading to the fan. Every fan has at least 2 wires for power, black is usually -, red is usually +. If it has only 2 wires then it has no tachometer and no speed control. If it has a third wire (usually white or yellow but could be other colors too) then that wire is usually the PWM speed control wire. If it has a 4th wire (usually blue) that wire is usually the tach wire. If it doesn't have a tach or PWM wire then any numbers you see in the software for that fan are bogus, ignore them.

____________
BOINC <<--- credit whores, pedants, alien hunters

Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34693 - Posted: 16 Jan 2014 | 21:52:06 UTC - in response to Message 34692.

More reference:
I run a program called Core Temp to monitor CPU temps, and it says that my Tj. Max is 100*C. I routinely run all 4 cores 75-87*C.

And a story: About a week ago, somehow Precision-X got set at a hardcoded low-fan-value, while BOINC was running. I had noticed the GPU was severely downclocked, and when I saw why, my jaw dropped to the floor. The 660 Ti was literally at 99*C, and still chugging, despite a ridiculous 40% fan value. I think I got lucky that it survived.

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34694 - Posted: 16 Jan 2014 | 22:29:15 UTC - in response to Message 34693.
Last modified: 16 Jan 2014 | 22:34:25 UTC

The 660 Ti was literally at 99*C, and still chugging, despite a ridiculous 40% fan value. I think I got lucky that it survived.


Ouch! Mine have spiked above 80°C on occasion but never that high. My gpu-d script monitors GPU temps and controls the fan to maintain a target temperature. I just added a feature that suspends crunching if the temp somehow gets out of control and goes above 80°. It runs only on Linux at the moment, but yesterday I found Python bindings to NVML (NVidia Management Library) which will allow the script to work on Windows too. It could solve some of the problems with the SANTI tasks if I added a feature to downclock the GPU when it sees a SANTI task come in and restore the clock to normal for tasks that don't need downclocking. It could tweak voltage too.
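The over-temperature guard described above can be sketched with those NVML Python bindings plus boinccmd. This is a minimal sketch of the idea, not Dagorath's actual gpu-d code; the thresholds and the hysteresis band are illustrative:

```python
# Sketch: suspend BOINC GPU work when a GPU overheats, resume when it cools.
# Assumes a local BOINC client reachable by boinccmd and the "pynvml"
# NVML bindings installed.
import subprocess

SHUTOFF_C = 80  # suspend GPU crunching above this (per the post above)
RESUME_C = 70   # resume once the card has cooled back down

def next_gpu_mode(temp_c, currently_suspended):
    """Pure decision logic: returns 'never', 'auto', or None for no change."""
    if not currently_suspended and temp_c > SHUTOFF_C:
        return "never"
    if currently_suspended and temp_c < RESUME_C:
        return "auto"
    return None

def apply_gpu_mode(mode):
    # Tells the local BOINC client to stop/resume GPU tasks.
    subprocess.run(["boinccmd", "--set_gpu_mode", mode], check=True)

def poll_temperature(index=0):
    # Imported lazily so the decision logic stays testable on
    # machines without an NVIDIA GPU.
    import pynvml
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(index)
    return pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
```

A real daemon would call `poll_temperature()` in a loop, feed the reading to `next_gpu_mode()`, and call `apply_gpu_mode()` only when the decision changes.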
____________
BOINC <<--- credit whores, pedants, alien hunters

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2353
Credit: 16,375,531,916
RAC: 5,811,976
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34695 - Posted: 16 Jan 2014 | 22:38:00 UTC - in response to Message 34692.
Last modified: 16 Jan 2014 | 23:10:51 UTC

If some of your fans don't seem to be affected by settings in the software or don't have RPM readings then they might be fans that don't have a tachometer in them and don't allow speed control (they run at one speed). You can tell by counting the wires leading to the fan. Every fan has at least 2 wires for power, black is usually -, red is usually +. If it has only 2 wires then it has no tachometer and no speed control. If it has a third wire (usually white or yellow but could be other colors too) then that wire is usually the PWM speed control wire. If it has a 4th wire (usually blue) that wire is usually the tach wire. If it doesn't have a tach or PWM wire then any numbers you see in the software for that fan are bogus, ignore them.

Actually the 3rd wire is the signaling wire for the tachometer, and in that case RPM control is done by applying PWM to the + wire, or by lowering the voltage on the + wire. The 4th wire is the PWM control wire. You can connect more than one (4-pin) fan to such a fan connector, provided that only one of them is connected to the 3rd (signaling) wire, so the system can monitor only that fan's RPM but can control the RPM of all fans connected to that single connector through the 4th wire.
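The corrected pinout above can be summarized as a quick lookup, so you can tell from the wire count alone what a fan supports (a sketch following the standard 3-/4-pin header convention just described):

```python
# Pin roles per the standard fan-header convention described above.
FAN_PINS = {
    1: "GND",
    2: "+12 V (voltage- or PWM-modulated to control 2/3-wire fans)",
    3: "tachometer signal (RPM sense)",
    4: "dedicated PWM control input (4-wire fans only)",
}

def can_report_rpm(n_wires):
    return n_wires >= 3  # needs the tach wire; 2-wire fans report nothing

def has_dedicated_pwm(n_wires):
    return n_wires >= 4  # 3-wire fans are speed-controlled via the + wire
```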

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34698 - Posted: 17 Jan 2014 | 10:37:36 UTC

I got in there this morning. Moved the rear fan cable to CHA_FAN4 (it was on #2), and disconnected the top and side fans.

I've been running all eight CPU cores at 100% for two hours and the fan noise is fine! Here are the numbers:



Regarding the CPU temperature: instead of reporting the actual temperature, the latest AMD Overdrive reports "Thermal Margin", which indicates how far the current operating temperature is below the maximum operating temperature of the processor. My numbers are below.

Right now (I think) I'm a happy bunny!

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34699 - Posted: 17 Jan 2014 | 15:01:34 UTC

About 30 minutes ago I put the tower back in its hole. The thermal radar CPU temp has dropped from 61C to 59C and the AMD thermal margins are up to 20C.

Note in my last post that the two thermal radar PCIe temps are 11C apart. They still are. The two GPUs are identical. Any thoughts?

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34700 - Posted: 17 Jan 2014 | 15:21:40 UTC - in response to Message 34698.

Tomba, your "ASUS warning message: CPU temperature 65C" may be a Bios setting. If so, you can probably set it to a higher level or disable the warning in the Bios.
You can often control the CPU fan from the Bios. Settings are typically, Active, Passive/Silent, or Targeted (you set a temperature target and the fan adapts towards that). In the past I occasionally saw some OS software conflict with the Bios settings. Resolved using software or Bios updates.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34701 - Posted: 17 Jan 2014 | 16:16:43 UTC - in response to Message 34699.

Note in my last post that the two thermal radar PCIe temps are 11C apart. They still are. The two GPUs are identical. Any thoughts?


What exactly are "PCIe temps"? Are those the temps reported by sensors situated on the mobo close to the PCIe slots? Or are they temps reported by sensors on the GPUs in the slots?

If the latter then I would say: if the target temperature is the same on both cards (and I would not just assume that it is), then the hotter one is either starving for cool air or its fan is defective and isn't spinning as fast as it should. By "starving for cool air" I mean either the air it's receiving is too hot to allow the card to maintain the target temp, or the airflow is somehow restricted.

____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34703 - Posted: 17 Jan 2014 | 16:30:08 UTC - in response to Message 34701.

Note in my last post that the two thermal radar PCIe temps are 11C apart. They still are. The two GPUs are identical. Any thoughts?


What exactly are "PCIe temps"? Are those the temps reported by sensors situated on the mobo close to the PCIe slots? Or are they temps reported by sensors on the GPUs in the slots?


I think they are mobo sensors, thus:



Tomorrow I'll get in there and check the airflow and perhaps switch the GPUs round to see if there's any difference.

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34704 - Posted: 17 Jan 2014 | 18:23:56 UTC - in response to Message 34699.

About 30 minutes ago I put the tower back in its hole. The thermal radar CPU temp has dropped from 61C to 59C and the AMD thermal margins are up to 20C.

Note in my last post that the two thermal radar PCIe temps are 11C apart. They still are. The two GPUs are identical. Any thoughts?


One GPU is always hotter; I have seen that in all my rigs with two GPUs. It's the primary one that is hotter, whether the cards are the same type/brand or different.

I have two top fans blowing air out and 1 side fan pulling air in, with the CPU at 55°C and the PCIe sensors at 48 and 43°C. Same MOBO and CPU, and the case closed.
____________
Greetings from TJ

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34705 - Posted: 17 Jan 2014 | 18:42:22 UTC - in response to Message 34623.

Use a different motherboard. I've checked the ASUS website, and I didn't find any AMD motherboards which could accommodate four double-slot GPUs. But I've found a Gigabyte MB which does: GA-990FXA-UD7

Now there's a thought! Well spotted!!!

That would let me add the GTX 660 that's upstairs, running in my old rig, to my new rig, and would give me the expansion capability I was looking for originally.

I'm still within the Amazon 30-day return window. I'm tempted... :)

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34706 - Posted: 17 Jan 2014 | 21:34:48 UTC - in response to Message 34703.

Tomorrow I'll get in there and check the airflow and perhaps switch the GPUs round to see if there's any difference.


You might notice a difference but my bet is that you won't. I think the difference is more likely due to some warm-running chip/component on the mobo close to the sensor near the hotter slot. Switch the cards if you want to verify, but I wouldn't worry about one temp being higher than the other.

As for TJ's report that one GPU is always hotter than the other: I have noticed the same thing, and like he says it's always the one in the primary slot. That's most likely due to heat from the card in the slot below it being sucked into the primary card's fan. Rear exhaust fans reduce that problem significantly but don't eliminate it completely, because there is still radiant heat emanating from the secondary card. I have the same problem with 2 of my cards, and when I have spare time I am going to build a custom intake duct that fits between the 2 cards and pushes lots of cool air between them as well as into the intake fans. But first I have to replace the mobo those cards are on, because my cat murdered it yesterday by dragging its little catnip-scented toy into the custom cabinet I am building and dropping it on the mobo. I checked it with my ohm meter and the ribbon on the toy is very conductive! He went in through a temporary opening I had not blocked because I thought he would never be able to reach it. I think he must have leapt off the TV, which is amazing because the TV is about 1 meter away and the hole is pretty small. I found him there crying because he couldn't figure a way to get down, lol.

____________
BOINC <<--- credit whores, pedants, alien hunters

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34707 - Posted: 17 Jan 2014 | 21:47:42 UTC - in response to Message 34705.

Use a different motherboard. I've checked the ASUS website, and I didn't find any AMD motherboards which could accommodate four double-slot GPUs. But I've found a Gigabyte MB which does: GA-990FXA-UD7

Now there's a thought! Well spotted!!!

That would let me add the GTX 660 that's upstairs, running in my old rig, to my new rig, and would give me the expansion capability I was looking for originally.

I'm still within the Amazon 30-day return window. I'm tempted... :)


I wouldn't plan on putting 4 GPUs on that mobo. They'll fit, but I've been reading reports and info indicating those mobos can't supply the current required by some video cards through the PCIe slot. The fact that the online stores I use here in Canada seem not to be restocking those boards kind of corroborates the notion that they can't handle a big load through the PCIe slots. It might not even be able to handle 3 cards. Retvari and I were discussing it in my "What to build in 2014?" thread.

____________
BOINC <<--- credit whores, pedants, alien hunters

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34708 - Posted: 18 Jan 2014 | 8:56:47 UTC - in response to Message 34707.

I wouldn't plan on putting 4 GPUs on that mobo. They'll fit but I've been reading reports and info that indicate those mobos can't supply the current required by some video cards through the PCIe slot.

Is it not the case that today's GPUs are powered from the PSU, not the mobo?

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34709 - Posted: 18 Jan 2014 | 9:14:35 UTC

I have a problem.

Normally I do a Noelia in 10-11 hours. This morning I found this Noelia had been running for 22+ hours and was only 55% complete. I aborted it and got another. This one is forecast to run for 50 hours.

Below, EVGA Precision X shows a marked difference between my two 660s. I'd welcome some advice and guidance on what to do about this. Thanks.





Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2353
Credit: 16,375,531,916
RAC: 5,811,976
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34710 - Posted: 18 Jan 2014 | 9:38:03 UTC - in response to Message 34709.
Last modified: 18 Jan 2014 | 9:40:41 UTC

I have a problem.

Normally I do a Noelia in 10-11 hours.This morning I found this Noelia had been running for 22+ hours and was only 55% complete. I aborted it and got another. This one is forecast to run for 50 hours.

Below, EVGA Precision X shows a marked difference between my two 660s. I'd welcome some advice and guidance on what to do about this. Thanks.

It can be clearly seen from the GPU clock reading on the lower EVGA Precision display that your GPU is downclocked to 324MHz. This is a safety feature of the GPU (driver), and it can be reset back to normal only by a system restart. BTW in that case you don't have to abort the workunit; a system restart will restore its original processing speed. To avoid such a situation in the future, you need to lower your GPU clock slightly (or increase the GPU voltage slightly).
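One way to spot this kind of safety downclock without watching Precision-X is to compare the current graphics clock against the card's maximum via nvidia-smi. A sketch; the 324MHz value above is this driver's fallback clock, and the 50% threshold here is an arbitrary choice:

```python
# Sketch: flag a GPU that is running far below its rated clock,
# which (under load) suggests the driver's safety downclock kicked in.
import subprocess

def graphics_clocks(index=0):
    """Returns (current_mhz, max_mhz) for one GPU via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(index),
         "--query-gpu=clocks.gr,clocks.max.gr",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    cur, mx = (int(x) for x in out.strip().split(", "))
    return cur, mx

def looks_downclocked(cur, mx, threshold=0.5):
    """Arbitrary heuristic: busy GPU running below half its max clock."""
    return cur < mx * threshold
```

For example, 324MHz against a ~1GHz-class maximum trips the heuristic, while normal boost clocks do not. An idle GPU also clocks down, so only apply the check while a task is running.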

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34711 - Posted: 18 Jan 2014 | 11:23:13 UTC - in response to Message 34710.

I have a problem.

Normally I do a Noelia in 10-11 hours.This morning I found this Noelia had been running for 22+ hours and was only 55% complete. I aborted it and got another. This one is forecast to run for 50 hours.

Below, EVGA Precision X shows a marked difference between my two 660s. I'd welcome some advice and guidance on what to do about this. Thanks.

It can be clearly seen from the GPU clock reading on the lower EVGA Precision display, that your GPU is downclocked to 324MHz. This is a safety feature of the GPU (driver), and it can be reset back to normal only by a system restart

Thanks for responding, Retvari. I rebooted. No change. I reinstalled the Nvidia driver. No change.

you need to lower your GPU clock slightly (or increase the GPU voltage slightly).

Help me out on how to do that please...

Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34712 - Posted: 18 Jan 2014 | 12:40:05 UTC - in response to Message 34709.

Are you sure the 41*C GPU is actually loaded? It looks like it isn't.
You can double-click one of the graphs in the "Performance Log" at the bottom, to open the full list of graphs, so that you can view GPU Usage for both GPU1 and GPU2.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34713 - Posted: 18 Jan 2014 | 13:22:27 UTC - in response to Message 34712.

Are you sure the 41*C GPU is actually loaded? It looks like it isn't.
You can double-click one of the graphs in the "Performance Log" at the bottom, to open the full list of graphs, so that you can view GPU Usage for both GPU1 and GPU2.

Thanks for responding, Jacob. After working out how EVGA Precision X logging works, I constructed this snapshot:



So GPU 1 is loaded, but not very heavily!! It would be nice if there's a way to clone GPU2's settings to GPU1...

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34716 - Posted: 18 Jan 2014 | 18:17:06 UTC - in response to Message 34715.

I'm beginning to think my GPU1 has lost it and needs to be replaced. It's still running at 10% of its potential.

Tomorrow I'll install it on my old rig and see what happens...

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34718 - Posted: 18 Jan 2014 | 20:23:38 UTC - in response to Message 34708.

I wouldn't plan on putting 4 GPUs on that mobo. They'll fit but I've been reading reports and info that indicate those mobos can't supply the current required by some video cards through the PCIe slot.

Is it not the case that today's GPUs are powered from the PSU, not the mobo?


Check out the thread titled "What to build in 2014"; Retvari explains. The PCIe bus itself (separate from the GPU's external power connectors) requires power too. I'm not sure exactly why it requires that many amps, but I know from other reading that it does, and I am confident Retvari has done the calculations in that other thread correctly. The pins on the 24-pin connector can supply that much amperage for a little while, but they will run hot and eventually deteriorate due to the high temperature to the point that they melt and/or catch fire. Remember the connector is only a mechanical connection, not a soldered one, and therefore it cannot carry as much current as the wires leading to and from it. Hmmm. That makes me think a good upgrade would be to cut the connector off and solder the wires directly to the mobo/pins, but that marries the PSU to the mobo; not convenient.
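To put rough numbers on that (a back-of-the-envelope sketch only: the 66 W / 5.5 A per-slot figure is the PCIe spec's 12 V allowance for a x16 slot, but the 6 A per-pin rating and the two +12 V pins of a standard ATX 24-pin connector are assumptions about typical hardware, so check your own board's documentation):

```python
# Back-of-the-envelope: current the motherboard's 12 V input must carry
# just to feed the PCIe slots, before any CPU/chipset load is counted.
SLOT_12V_WATTS = 66.0   # PCIe spec allows up to 5.5 A @ 12 V per x16 slot
PIN_RATING_A   = 6.0    # assumed rating of one 24-pin connector pin
PINS_12V       = 2      # a standard ATX 24-pin connector has two +12 V pins

def slot_current_amps(num_gpus, volts=12.0):
    """Worst-case 12 V slot current if every card draws the full allowance."""
    return num_gpus * SLOT_12V_WATTS / volts

amps = slot_current_amps(4)
print(f"{amps:.1f} A vs {PIN_RATING_A * PINS_12V:.1f} A of pin capacity")
# → 22.0 A vs 12.0 A of pin capacity
```

Which is the overload being warned about here: four fully loaded slots can ask for nearly twice what those two pins are comfortably rated for.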
____________
BOINC <<--- credit whores, pedants, alien hunters

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2353
Credit: 16,375,531,916
RAC: 5,811,976
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34723 - Posted: 19 Jan 2014 | 10:49:53 UTC - in response to Message 34716.

I'm beginning to think my GPU1 has lost it and needs to be replaced. It's still running at 10% of its potential.

Tomorrow I'll install it on my old rig and see what happens...

That's a good idea. If the card is OK in your old rig, I suggest you try it in the original PC again, but with the second card removed, for a day.

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34724 - Posted: 19 Jan 2014 | 11:33:19 UTC - in response to Message 34723.

Tomba, NVidia Control Panel - Prefer maximum performance. If that doesn't work, tweak the clocks slightly (to try to force them to stick), suspend CPU work (in case the CPU is saturated and preventing the GPU from drawing enough power, which causes it to run in low-power mode), and shut down and check the power connectors/do a cold restart. If it does start running normally, work out what caused the downclock, and make sure the GPU fans are working properly...
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34729 - Posted: 19 Jan 2014 | 15:59:57 UTC - in response to Message 34716.

I'm beginning to think my GPU1 has lost it and needs to be replaced. It's still running at 10% of its potential.

Tomorrow I'll install it on my old rig and see what happens...

Removed the offending PNY GPU and replaced it with the ASUS GPU from my old rig. Both are running full speed ahead.

Put the offender into my old rig. The Nvidia driver kept stalling and the rig died. Tried a few times. Nada. Printed the Amazon returns doc and parcelled it up. Bye-bye tomorrow...

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34730 - Posted: 19 Jan 2014 | 18:09:44 UTC - in response to Message 34724.

Tomba, NVidia control panel - Prefer maximum performance.

Thanks for responding, skgiven. There was a time when I had the Nvidia control panel on the task bar, but those days are gone. And I could not find it anywhere else:



Richard Haselgrove
Send message
Joined: 11 Jul 09
Posts: 1626
Credit: 9,295,466,723
RAC: 18,414,230
Level
Tyr
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34731 - Posted: 19 Jan 2014 | 19:06:25 UTC - in response to Message 34730.

It's a control panel:



You may have to switch the main Control Panel out of categorised view in order to see it; that's why I like the handy alphabetical menu from the Start button.

Profile skgiven
Volunteer moderator
Volunteer tester
Avatar
Send message
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34745 - Posted: 21 Jan 2014 | 0:19:18 UTC - in response to Message 34731.

If you right click on your desktop screen...
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34855 - Posted: 31 Jan 2014 | 19:30:03 UTC

Thread starter here again!

The last time I posted I reported that I had disconnected the side and top case fans, and was running all eight CPU cores; two for GPUGrid and six for Rosetta.

For a couple of days I've seen this:



I run two identical, non-overclocked PNY GTX 660s, but it looks like the 16°C difference in temperature between PCIE-1 and PCIE-2 could be an issue.

What, if anything, to do?

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34856 - Posted: 31 Jan 2014 | 20:13:14 UTC - in response to Message 34855.

I have the same mobo, same CPU, and two EVGA 660s, but with 5 fans and the case closed (despite what others are saying), and all my temperatures are lower than yours. Nothing is over 56°C, running 24/7 at an ambient temperature of 23.4°C.

So I would close the case and connect more fans (to the mobo if you have to), and let the ASUS software manage the fan speed.
____________
Greetings from TJ

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34861 - Posted: 2 Feb 2014 | 8:25:58 UTC - in response to Message 34856.

I have the same mobo, same CPU, and two EVGA 660s, but with 5 fans and the case closed (despite what others are saying), and all my temperatures are lower than yours. Nothing is over 56°C, running 24/7 at an ambient temperature of 23.4°C.

So I would close the case and connect more fans (to the mobo if you have to), and let the ASUS software manage the fan speed.

Thanks for responding TJ :)

I connected the two missing fans and closed the case, as always (broke a top fan blade with my finger some time ago with one side off...). That got rid of the PCIe warning:



This morning I found an ASUS warning "CPU at 65C". In fact it was at 67°C, with its fan on turbo. Stopping two Rosetta threads brought the CPU temp down to 63°C. Looks like I cannot keep all my CPU cores busy...

What case are you using?




tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34862 - Posted: 2 Feb 2014 | 8:41:27 UTC

On 18th January I reported that one of my two PNY 660s appeared to have gone on the blink, running very slowly. I replaced it.

It's happened again:



I powered off and waited 30 minutes. No change.

I don't see a way with EVGA Precision X to attempt to breathe new life into this GPU. Is there other software I could try to do that?

Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34864 - Posted: 2 Feb 2014 | 12:40:29 UTC - in response to Message 34862.
Last modified: 2 Feb 2014 | 12:44:23 UTC

Tomba:

Have you tried:
1) The latest drivers? 334.67 were recently released
2) Setting change (as skgiven previously recommended): Right click Desktop -> NVIDIA Control Panel -> Manage 3D settings -> Global Settings tab -> Power management mode -> Set to "Prefer Maximum Performance" -> Click Apply -> Restart computer -> Reopen NVIDIA Control Panel to verify setting stuck -> Verify results

Those are all I can think of. Remember to restart the PC after changing that setting in suggestion 2.

Any luck?

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34865 - Posted: 2 Feb 2014 | 13:16:57 UTC - in response to Message 34864.

Tomba:

Have you tried:
1) The latest drivers? 334.67 were recently released
2) Setting change (as skgiven previously recommended): Right click Desktop -> NVIDIA Control Panel -> Manage 3D settings -> Global Settings tab -> Power management mode -> Set to "Prefer Maximum Performance" -> Click Apply -> Restart computer -> Reopen NVIDIA Control Panel to verify setting stuck -> Verify results

Those are all I can think of. Remember to restart the PC after changing that setting in suggestion 2.

Any luck?

Thanks for responding Jacob :)

1) I've been running 334.67 for four days.
2) Did that previously. Just checked that it stuck. It did.
3) No luck!!


Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34866 - Posted: 2 Feb 2014 | 13:18:39 UTC - in response to Message 34865.

Hmph. You might consider opening a ticket with NVIDIA support, while you continue researching it. Could very well be a driver bug.

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34867 - Posted: 2 Feb 2014 | 14:47:07 UTC - in response to Message 34866.
Last modified: 2 Feb 2014 | 14:47:32 UTC

Hmph. You might consider opening a ticket with NVIDIA support, while you continue researching it. Could very well be a driver bug.

Could be, but almost once a day a Santi WU downclocks my 660. After half a year of experimenting I came to the conclusion that Santi WUs and 660s don't work well together. So I have to reboot that system once a day.

@Tomba: it is an older and smaller case I have. I'll have to check exactly what type, but it is roughly similar to a Cooler Master 6xx series.
I don't have the CPU fan in turbo mode, and the CPU is running 7 cores continuously at 54-55°C.
____________
Greetings from TJ

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34868 - Posted: 2 Feb 2014 | 15:33:47 UTC

Here's another view of my problem, courtesy of ASUS GPU Tweak. I show the two GPUs side by side.

Perhaps there's a clue in there somewhere?...

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34869 - Posted: 2 Feb 2014 | 16:14:48 UTC

I just submitted a problem report to PNY. Let's see what happens...

Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34870 - Posted: 2 Feb 2014 | 16:21:43 UTC - in response to Message 34869.

It's more likely to be an NVIDIA driver problem. You should contact NVIDIA.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34871 - Posted: 2 Feb 2014 | 16:41:42 UTC - in response to Message 34870.

It's more likely to be an NVIDIA driver problem. You should contact NVIDIA.


OK! Did that...

Jim1348
Send message
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34872 - Posted: 2 Feb 2014 | 21:19:01 UTC - in response to Message 34862.

On 18th January I reported that one of my two PNY 660s appeared to have gone on the blink, running very slowly. I replaced it.

It's happened again:

I don't see a way with EVGA Precision X to attempt to breathe new life into this GPU. Is there other software I could try to do that?

Try increasing the "Power Target" to something like 110%. That has fixed it for me for two GTX 660s on one PC running Win7 64-bit (331.65 drivers), and another PC running one GTX 660 on WinXP (332.21 drivers). Otherwise, the card hits its power limit on difficult work units, and down-clocks to protect itself. In hard cases you might have to increase the core voltage a little and downclock some (reduce the GPU core frequency by 50 MHz or so). As long as the temperature does not get too high, you should not have a problem.

Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34873 - Posted: 2 Feb 2014 | 21:48:00 UTC - in response to Message 34872.
Last modified: 2 Feb 2014 | 21:48:15 UTC

Jim,
5 posts up, he shows that the power % is at something like 24%. He's not hitting the 100% power limit.

Jim1348
Send message
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34874 - Posted: 2 Feb 2014 | 21:58:03 UTC - in response to Message 34873.

Jim,
5 posts up, he shows that the power % is at something like 24%. He's not hitting the 100% power limit.

OK, thanks. Maybe his card is bad as he suspects. I was wondering how long they last on average; Maxwell may be delayed a little, and we are running them harder than the average gamer would.

TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34876 - Posted: 3 Feb 2014 | 0:34:18 UTC - in response to Message 34874.

Jim,
5 posts up, he shows that the power % is at something like 24%. He's not hitting the 100% power limit.

OK, thanks. Maybe his card is bad as he suspects. I was wondering how long they last on average; Maxwell may be delayed a little, and we are running them harder than the average gamer would.

Tomba needs to reboot his system to get the cards running at full clock speed again; they are running in down-clocked mode now!
____________
Greetings from TJ

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34877 - Posted: 3 Feb 2014 | 11:04:28 UTC

I removed the offending PNY 660. Switched on and the second PNY ran full belt.

Removed the ASUS 660 from my old rig and installed the "offending" PNY. It is running full belt:



Conclusion? Problem with PCIe slot #1 in the new rig. To prove it I installed the ASUS 660 in that slot. Wrong conclusion! Both 660s are running full belt!!

So - I'm back in full production but I've no idea how!!



TJ
Send message
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Level
Met
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34878 - Posted: 3 Feb 2014 | 11:55:02 UTC - in response to Message 34877.

So - I'm back in full production but I've no idea how!!

Simply by rebooting the system. That is what you should do at once when you have a down-clocked GPU. All other work is unnecessary.
____________
Greetings from TJ

Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34879 - Posted: 3 Feb 2014 | 12:20:37 UTC - in response to Message 34878.

Rebooting the system is a workaround, not a fix.

The real problem is ??? .... Drivers?

Jim1348
Send message
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34880 - Posted: 3 Feb 2014 | 13:58:00 UTC - in response to Message 34879.
Last modified: 3 Feb 2014 | 14:02:34 UTC

Rebooting the system is a workaround, not a fix.

Yes, the problem will reoccur when he hits another difficult portion of the work unit.

But I don't see that he has reduced the GPU clock yet. The clock is set to 1110 MHz, and running at 1123 MHz as I read it; most 660s won't run stably when the base is set much above 1000 MHz in my experience (though the boost will increase it above that, but I would let the on-chip circuitry handle that). And the voltage might have to go up a little; 1175 mV should be about right.

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34881 - Posted: 3 Feb 2014 | 17:10:33 UTC - in response to Message 34878.

So - I'm back in full production but I've no idea how!!

Simply by rebooting the system. That is what you should do at once when you have a down-clocked GPU. All other work is unnecessary.

TJ - in the couple of days since the problem arose I must have rebooted six times...

tomba
Send message
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34882 - Posted: 3 Feb 2014 | 17:12:38 UTC - in response to Message 34880.

But I don't see that he has reduced the GPU clock yet.

A couple of times in this thread I've asked what app to use for doing just that...

Jacob Klein
Send message
Joined: 11 Oct 08
Posts: 1127
Credit: 1,901,927,545
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34883 - Posted: 3 Feb 2014 | 17:17:09 UTC - in response to Message 34882.

You can use Precision-X to change the "GPU Clock Offset" to not only raise it... but lower it.
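For clarity, the offset is relative to the card's shipping base clock, so "lowering the clock slightly" just means entering a negative number. A trivial helper makes the arithmetic explicit (the function name is mine; the example values come from the 1110 MHz base clock and the 50 MHz reduction mentioned elsewhere in this thread):

```python
def offset_for(current_base_mhz, target_base_mhz):
    """Clock offset (MHz) to enter in Precision X; negative values downclock."""
    return target_base_mhz - current_base_mhz

# e.g. a factory-overclocked GTX 660 with an 1110 MHz base clock,
# aiming for a more conservative 1060 MHz:
print(offset_for(1110, 1060))  # → -50
```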

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34884 - Posted: 3 Feb 2014 | 17:37:28 UTC - in response to Message 34881.

So - I'm back in full production but I've no idea how!!

Simply by rebooting the system. That is what you should do at once when you have a down-clocked GPU. All other work is unnecessary.

TJ - in the couple of days since the problem arose I must have rebooted six times...


What do you mean by reboot? Do you mean a shutdown->poweroff->restart or simply shutdown the OS and restart without a poweroff? Those are 2 different things. Your last reboot when everything started working must have involved a poweroff, IIUC, as it sounds like you switched some cards around. If previous reboots were not from a powered off condition then that's probably why they did not make any difference.

Also, powering off and letting it sit for 15 minutes sometimes makes a difference too. Some components fail when warm but work again after they are powered off and allowed to cool. They fail again when they run for a while and warm up again but that pattern is a clue to the solution. Also, certain settings and/or conditions in some hardware components don't reinitialize unless you leave the power off long enough for the capacitors to drain down (lose their charge). That can take a few minutes. So if all else fails do an OS shutdown and poweroff for 30 minutes, everything, even the monitor and modem.

____________
BOINC <<--- credit whores, pedants, alien hunters

Jim1348
Send message
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Level
His
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 34885 - Posted: 3 Feb 2014 | 19:22:42 UTC - in response to Message 34882.
Last modified: 3 Feb 2014 | 19:25:06 UTC

But I don't see that he has reduced the GPU clock yet.

A couple of times in this thread I've asked what app to use for doing just that...

As Jacob said, you should be able to use Precision-X, though I have never used it. But MSI Afterburner also works on any card, with a little learning curve on how to save your settings and how to automatically re-enable them after a reboot (it is slightly obscure).

My favorite for Nvidia cards, since that is all we are talking about here anyway, is Nvidia Inspector
http://www.guru3d.com/files_details/nvidia_inspector_download.html

Just select "Show Overclocking", and change the Base Clock, Voltage Offset and Power Target as you wish. Then, click on "Apply Clocks & Voltage", and finally save your settings by right-clicking on the "Create Short Cuts" button, and create a Clock Startup Task for Win7 (the "Create Clocks Shortcut" is for WinXP, and you then move that shortcut from the desktop to the startup folder).

Occasionally I find a card where one or the other of these parameters can't be controlled by software and I have to modify the BIOS, but that is another subject that hopefully you can avoid.

Lluis
Send message
Joined: 22 Feb 14
Posts: 26
Credit: 672,639,304
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwat
Message 35614 - Posted: 12 Mar 2014 | 11:02:09 UTC - in response to Message 34885.
Last modified: 12 Mar 2014 | 11:07:09 UTC

Until three weeks ago I had a very old computer, so I hadn't thought about joining the BOINC projects, still less GPUGRID. As I'm not a gamer, when I bought a new computer for general use I settled for a GTX 630 graphics card. After a couple of days on GPUGRID I decided to upgrade to a better card for crunching, so I changed to a GTX 660 OC. Now, a couple of weeks later, I'm impressed: if my calculations are correct, my computer (mostly my graphics card!!!) has done more than 1.3 x 10^18 floating point operations (3 million credits).

I want to help more with biological research, and for me GPUGRID is a good choice, so I'm thinking of buying a second graphics card, but I need advice. Any help will be welcome, because I am a novice in these matters.

Motherboard: GIGABYTE B85M-D3H (socket 1150; the specs say "Multi-GPU Support")
CPU: Intel i5-4570; 8 GB DDR3; 1 TB hard drive; 1 DVD; 1 card reader
PSU: Cooler Master 600 W
Windows 7 64-bit

Is it possible to add a second graphics card to my current computer?
Is 600 W enough?
Is it worth it? Would I double my crunching capacity?
Is another GTX 660 OC a good choice, or are there better cards for crunching at similar prices?
And finally, is it necessary to change the software configuration? Is it automatic, or do I have to do it manually (because I don't know how)?

Thank you for your attention,
Lluis
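On the 600 W question, a rough budget check is easy to sketch. The two TDP figures below are the published ones (GTX 660 ≈ 140 W, i5-4570 = 84 W), but the 100 W allowance for the rest of the system and the 80% sustained-load rule of thumb are assumptions, not measurements:

```python
# Rough PSU budget check for a crunching box (a sketch, not a guarantee).
GPU_TDP_W   = 140   # published GTX 660 TDP
CPU_TDP_W   = 84    # published i5-4570 TDP
OVERHEAD_W  = 100   # drives, fans, RAM, mobo, PSU losses (assumed)
LOAD_FACTOR = 0.8   # keep sustained draw under ~80% of the PSU rating

def psu_ok(psu_watts, num_gpus):
    """Return (fits, estimated_draw_watts) for the given PSU and GPU count."""
    draw = num_gpus * GPU_TDP_W + CPU_TDP_W + OVERHEAD_W
    return draw <= psu_watts * LOAD_FACTOR, draw

ok, draw = psu_ok(600, 2)
print(draw, "W estimated;", "OK" if ok else "marginal")  # → 464 W estimated; OK
```

By this estimate a quality 600 W unit has just enough headroom for two 660s; whether the motherboard has a usable second slot is the separate question the replies address.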

mikey
Send message
Joined: 2 Jan 09
Posts: 298
Credit: 6,601,400,210
RAC: 16,435,860
Level
Tyr
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 35615 - Posted: 12 Mar 2014 | 11:48:28 UTC - in response to Message 35614.

Until three weeks ago I had a very old computer, so I hadn't thought about joining the BOINC projects, still less GPUGRID. As I'm not a gamer, when I bought a new computer for general use I settled for a GTX 630 graphics card. After a couple of days on GPUGRID I decided to upgrade to a better card for crunching, so I changed to a GTX 660 OC. Now, a couple of weeks later, I'm impressed: if my calculations are correct, my computer (mostly my graphics card!!!) has done more than 1.3 x 10^18 floating point operations (3 million credits).

I want to help more with biological research, and for me GPUGRID is a good choice, so I'm thinking of buying a second graphics card, but I need advice. Any help will be welcome, because I am a novice in these matters.

Motherboard: GIGABYTE B85M-D3H (socket 1150; the specs say "Multi-GPU Support")
CPU: Intel i5-4570; 8 GB DDR3; 1 TB hard drive; 1 DVD; 1 card reader
PSU: Cooler Master 600 W
Windows 7 64-bit

Is it possible to add a second graphics card to my current computer?
Is 600 W enough?
Is it worth it? Would I double my crunching capacity?
Is another GTX 660 OC a good choice, or are there better cards for crunching at similar prices?
And finally, is it necessary to change the software configuration? Is it automatic, or do I have to do it manually (because I don't know how)?

Thank you for your attention,
Lluis


When I looked at what Newegg says is a picture of your motherboard, I did not see a second PCIe slot, just some PCI slots. If that is true then NO, a 2nd 660 will not work. Older computers often advertised multi-GPU slots, but often MEANT multiple PCI slots, as that is what the older GPUs used. In truth very few PCI GPUs can crunch; I personally know of only one, and it is VERY hard to find as they haven't made it in years.

Lluis
Send message
Joined: 22 Feb 14
Posts: 26
Credit: 672,639,304
RAC: 0
Level
Lys
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwat
Message 35616 - Posted: 12 Mar 2014 | 13:00:18 UTC - in response to Message 35615.


I did not see a second PCIe slot, just some PCI slots. If that is true then NO, a 2nd 660 will not work.


Thank you for your quick answer. I've also checked with the technical service where I bought the computer, and they told me more or less the same.
As the specifications say:
1 x PCI Express x16 slot, running at x16 (PCIEX16)
* For optimum performance, if only one PCI Express graphics card is to be installed, be sure to install it in the PCIEX16 slot.
(The PCIEX16 slot conforms to PCI Express 3.0 standard.)
1 x PCI Express x16 slot, running at x4 (PCIEX4)
2 x PCI slots

They told me something like the second PCIe slot runs at x4 and a minimum of x8 is necessary. I don't fully understand, but it seems very similar to what you say.
It's a pity, but better to know before acting.
Again, thank you very much.

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2353
Credit: 16,375,531,916
RAC: 5,811,976
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 35617 - Posted: 12 Mar 2014 | 13:29:15 UTC - in response to Message 35615.
Last modified: 12 Mar 2014 | 13:30:54 UTC

There is a second PCIe slot on the GA-B85M-D3H motherboard:



However, it's limited to PCIe 2.0 x4.

This motherboard can accommodate a second graphics card, but it's not the best choice for doing so: there is only a 3-slot separation between the two PCIe slots, so only 1 slot of space will be left between the two GPUs, and the upper one will get hotter.
It is always better to have one faster GPU than several slower ones. Overclocked cards usually have coolers that dissipate the heat into the case, while standard cards vent the heat directly out of the case through the rear grille. The latter is better when there is more than one GPU inside the PC's case. Overclocked cards are also more prone to crunching errors, as they are made for gaming, and gaming has lower stability standards than crunching.

Dagorath
Send message
Joined: 16 Mar 11
Posts: 509
Credit: 179,005,236
RAC: 0
Level
Ile
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 35618 - Posted: 12 Mar 2014 | 15:32:32 UTC

Hi Lluis and welcome to GPUgrid,

One of the biggest problems with GPUs is keeping them cool. As Zoltan said, if you put 2 cards on your motherboard you could have a problem keeping the top one cool due to heat rising into the card from the lower card. There are ways to overcome that problem but it requires some work on your part. And since you live in Spain where it can be very hot in the summer, cooling issues are worse for you.

The easiest option for you would be to sell your 660 and replace it with a single 670, 680 or 690, to avoid having 2 cards in the computer. Unfortunately that is also the most expensive solution. If you want to do some work customizing and modifying your computer, there are ways you could add a second card and have a good chance that both will run cool, but there are no guarantees.

Your 660 OC probably has axial fans which look like this or like this. As Zoltan said, those cards dissipate the heat into the case, which adds to the cooling problem. Cards with radial fans like this blow the heat out of the rear grille instead of into the case.

If a card like yours with axial fans is in the lower slot, it will blow heat into the fan of the card in the upper slot and cause it to run hotter. That would not be a problem for me in Canada, because here the ambient air temperature (the temperature of the air in the room) is much lower than for you in Spain. One solution is to put the card with axial fans in the upper slot, but then the hot air blows into the CPU fan. You can solve that by putting a very large heatsink and fan on the CPU, which is additional work and expense. A cheaper solution is to make a barrier from cardboard or other non-conducting material and place it above the card with axial fans; the barrier would direct the hot air away from the other GPU or the CPU.

Another solution is to add a high-volume fan to blow or suck the hot air from the axial fans out of the case before it reaches the other GPU or CPU. You can remove the side of the case and place a mains-powered fan to blow air into the case, or you can cut a hole in the case side and install a 12 volt fan. You can make 12 volt fans much more effective by adding ducts to put the airflow exactly where you want it. You can make ducts with a little work, or you can buy ducts that have the correct hole dimensions for attaching easily to computer fans. Ducts are very effective and relatively inexpensive.

So the choice is add a second mid-range card and do some additional work or replace your mid-range card with a single high-end card.
____________
BOINC <<--- credit whores, pedants, alien hunters

Werkstatt
Send message
Joined: 23 May 09
Posts: 121
Credit: 339,775,664
RAC: 496,012
Level
Asp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 35871 - Posted: 24 Mar 2014 | 22:00:26 UTC - in response to Message 35618.

As Zoltan said, if you put 2 cards on your motherboard you could have a problem keeping the top one cool due to heat rising into the card from the lower card.


One way to solve this problem is to use a SilverStone Raven case.
Google it and check the pictures. The motherboard is mounted rotated 90° clockwise, so the cards sit vertically. This avoids overheating of the 'top' card because there is no top card.

Alexander

Profile Retvari Zoltan
Avatar
Send message
Joined: 20 Jan 09
Posts: 2353
Credit: 16,375,531,916
RAC: 5,811,976
Level
Trp
Scientific publications
watwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwatwat
Message 35877 - Posted: 25 Mar 2014 | 6:10:06 UTC - in response to Message 35871.
Last modified: 25 Mar 2014 | 6:10:43 UTC

As Zoltan said, if you put 2 cards on your motherboard you could have a problem keeping the top one cool due to heat rising into the card from the lower card.


One way to solve this problem is to use a SilverStone Raven case.
Google it and check the pictures. The motherboard is mounted rotated 90° clockwise, so the cards sit vertically. This avoids overheating of the 'top' card because there is no top card.

Alexander

Not all SilverStone Raven cases have such an arrangement, only the RV03, the RV02-E, and the RV01, and the latter can accommodate only 3 double-wide GPUs, as there are only 7 expansion slots at the top. You have to be careful when choosing the PSU for the RV03, as not all high-powered PSUs fit sideways (the PSU depth limit is 180 mm). However, these three are excellent choices (especially the RV03 and the RV02-E).
