
Message boards : Number crunching : Nvidia Advice Sought

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 6833 - Posted: 21 Feb 2009 | 11:05:27 UTC

I'm looking at CUDA cards for BOINC; I'm not a gamer.

Given my rig, the right card is the GeForce 9600 GSO. One of the XFX offerings takes my fancy. They come in a standard-clocked 768 MB version or an overclocked 384 MB version. Which should I go for?

In other words, is BOINC CUDA processing happier with RAM, or cycle speed?

Another consideration is noise, since my rig is in our living space.

Thanks, Tom

ms113
Joined: 6 Feb 09
Posts: 19
Credit: 1,281,738
RAC: 0
Level: Ala
Message 6836 - Posted: 21 Feb 2009 | 12:42:23 UTC - in response to Message 6833.

I would suggest at least a 9800GT.

But if you plan to work on the PC while crunching, then I would really go for a GTX 260 or above (because of the desktop choppiness!).

That's the awful truth.

I use a 9800GT and it is still perceptibly choppy.

Clock speed and the number of shader processors (SPs) matter more than memory, as long as the minimum requirements are met.
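
To put rough numbers on that, here is a minimal device-query sketch against the CUDA runtime API (compile with nvcc). The 8-SPs-per-multiprocessor factor applies to these compute-capability 1.x parts, and the ~3 FLOPs per shader per clock figure is the usual marketing estimate, so treat the GFLOPS value as a ballpark only:

    /* query_gpu.cu -- report the properties that matter most for crunching */
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void) {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            fprintf(stderr, "No CUDA-capable device found\n");
            return 1;
        }
        for (int dev = 0; dev < count; ++dev) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, dev);

            int shaders   = p.multiProcessorCount * 8;  /* 8 SPs per MP on compute 1.x (G8x/G9x/GT200) */
            double ghz    = p.clockRate / 1.0e6;        /* clockRate is reported in kHz */
            double gflops = shaders * ghz * 3.0;        /* ~3 FLOPs per shader per clock (rough) */

            printf("Device %d: %s (compute %d.%d)\n", dev, p.name, p.major, p.minor);
            printf("  Shaders      : %d\n", shaders);
            printf("  Shader clock : %.0f MHz\n", ghz * 1000.0);
            printf("  Memory       : %lu MB\n", (unsigned long)(p.totalGlobalMem >> 20));
            printf("  Peak (est.)  : ~%.0f GFLOPS\n", gflops);
        }
        return 0;
    }

On a 96-SP 9600GSO this lands somewhere in the 400-500 GFLOPS range depending on the factory clock, which is why two cards with identical memory sizes can differ so much in crunching speed.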

Best regards,
Martin

Scott Brown
Joined: 21 Oct 08
Posts: 144
Credit: 2,973,555
RAC: 0
Level: Ala
Message 6837 - Posted: 21 Feb 2009 | 15:42:28 UTC

I like my 9600GSO from ASUS. It is factory overclocked, very comparable to a stock 9800GT (the overclock puts it over 500 GFLOPS), and sold for less than $90 US. I work on it all the time, and really the only significant choppiness occurs when I work in any 3-D apps. The large stock fan keeps it reasonably cool and is extremely quiet.

I'd also add that the 8800GS is essentially the same card. But be aware that any 9600GSO cards with 512 MB or 1 GB are actually based on the G94 chip and have only 48 shaders (and will therefore be MUCH slower).
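
To illustrate why the shader count dominates, here is a quick back-of-envelope comparison in plain C. The shader clocks below are assumed typical retail figures rather than checked specs, and the 3-FLOPs-per-shader-per-clock multiplier is the usual rough estimate, so the absolute numbers are only indicative:

    /* gso_compare.c -- rough theoretical throughput of the two cards sold as "9600GSO" */
    #include <stdio.h>

    /* peak GFLOPS estimate: shaders x shader clock (GHz) x ~3 FLOPs per shader per clock */
    static double est_gflops(int shaders, double shader_clock_mhz) {
        return shaders * (shader_clock_mhz / 1000.0) * 3.0;
    }

    int main(void) {
        /* assumed clocks: ~1728 MHz for the factory-OC ASUS TOP, ~1625 MHz for a stock G94 part */
        printf("9600GSO (G92, 96 SP, factory OC): ~%.0f GFLOPS\n", est_gflops(96, 1728.0));
        printf("9600GSO (G94, 48 SP, stock)     : ~%.0f GFLOPS\n", est_gflops(48, 1625.0));
        return 0;
    }

The 96-shader G92 card comes out at roughly twice the theoretical throughput of the 48-shader G94 card, which matches the "MUCH slower" warning above.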

Paul D. Buck
Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Level: Val
Message 6840 - Posted: 21 Feb 2009 | 17:21:40 UTC

Three choices, 9800GT, GTX 260/280, or GTX 295 ...

Sadly, all three are going to increase the noise because the fan IS going to run ... the slower the card, the slower the fan and the lower the noise. So the question is: less noise or better production?

The 9800 GT I have takes about 12-20 hours per GPUGRID task ...
The GTX 280 and the GTX 295 I have take about the same time per task, 4-6 hours ...

The difference is that the GTX 295 does two tasks at the same time.

The other difference between the cards is cost. My opinion is to buy the best you can afford, or even a little more as the card remains "usable" for a longer time.

You can look at my computers for more if you like ... I am happy with all the cards to this point, I have both PNY and EVGA types. YMMV

J.D.
Joined: 2 Jan 09
Posts: 40
Credit: 16,762,688
RAC: 0
Level: Pro
Message 6842 - Posted: 21 Feb 2009 | 18:58:53 UTC - in response to Message 6840.

Three choices, 9800GT, GTX 260/280, or GTX 295 ...

Sadly, all three are going to increase the noise because the fan IS going to run ... the slower the card, the slower the fan and the lower the noise.


The GTX 295 is definitely not the most quiet choice.
Chances are that you'll find the GTX 260 has the lowest cost per performance, while also being the quietest choice of the three.

You can get away with running the relatively noisy GPU fan less in exchange for installing quieter case fans. Lower the temperature inside the case and the GPU temperature follows. For example, if you have room for another 120mm fan, you could install a quiet 800 RPM fan like the Scythe SY1225SL12L. To take cooling a step further, you can even add ducting to the intake fan to pipe cool outside air directly to the GPU. :-)

Dieter Matuschek
Joined: 28 Dec 08
Posts: 58
Credit: 231,884,297
RAC: 0
Level: Leu
Message 6843 - Posted: 21 Feb 2009 | 19:16:54 UTC - in response to Message 6842.
Last modified: 21 Feb 2009 | 19:17:39 UTC

One remark to add.

Running GTX 295 card(s) requires, IMHO, an open case because the cards blow a lot of warm air out of their sides.
Then fan noise is no problem, at least in a sufficiently cool room, like now in the winter.
In the summer I will probably have to shut down GPUGRID. :-(

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 6861 - Posted: 22 Feb 2009 | 7:49:11 UTC

Thank you all for your responses.

I should have explained up front that "Given my rig..." means that its warranty runs for another 18 months, and that going for something more powerful than the 9600 GSO would require a power supply upgrade, putting the warranty at risk.

Given all your dire warnings about noise, I'm tempted by Scott's ASUS, which "is extremely quiet".

Scott - can I please ask you to identify which of these you have? Thanks.

Tom

Scott Brown
Joined: 21 Oct 08
Posts: 144
Credit: 2,973,555
RAC: 0
Level: Ala
Message 6871 - Posted: 22 Feb 2009 | 14:45:39 UTC - in response to Message 6861.


Given all your dire warnings about noise, I'm tempted by Scott's ASUS, which "is extremely quiet".

Scott - can I please ask you to identify which of these you have? Thanks.

Tom


I have the EN9600GSO TOP/HTDP/384M (the factory OC equals about 490 GFLOPS). Any with the ASUS "Glaciator" fan are very quiet (at least they don't add any perceptible noise over the power supply fan, which runs most of the time given that I crunch BOINC CPU projects 24/7 -- I also have a 9500GT TOP with the same fansink).

Regarding the PS, the 9800GT may still be a possibility for you. I am running the 9600GSO with a 375W PS, which is below the recommended 400W, and the 9500GT with a 250W PS, which is well below the recommended 350W. I just picked up an 8800GS on eBay for a deal to try in a machine with a 300W PS. Depending on what else you do with the system (especially since you said you are not a gamer), you should be able to run successfully below the recommended PS wattage, within reasonable limits. The GPUGRID apps do not max out the power draw of the card (at least not yet).
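
As a sanity check on how much margin a 375W supply actually has, a rough power-budget tally can help. The wattages below are illustrative estimates for this class of hardware, not measured values:

    /* psu_budget.c -- illustrative power budget for a mid-2000s desktop plus a 9600GSO/9800GT */
    #include <stdio.h>

    int main(void) {
        /* all figures are rough assumptions, not measurements */
        int cpu_w   = 95;   /* Pentium D under full BOINC load              */
        int board_w = 40;   /* motherboard, RAM, case fans                  */
        int disk_w  = 20;   /* hard drive plus optical drive                */
        int gpu_w   = 85;   /* 9600GSO/9800GT under GPUGRID (not a 3D game) */

        int total = cpu_w + board_w + disk_w + gpu_w;
        int psu   = 375;

        printf("Estimated draw: %d W of a %d W supply (%.0f%%)\n",
               total, psu, 100.0 * total / psu);
        printf("Headroom      : %d W\n", psu - total);
        return 0;
    }

The recommended-PSU figures printed on the box assume a gaming load plus a healthy safety margin, which is why running somewhat below them for crunching can work out in practice.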

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 6890 - Posted: 22 Feb 2009 | 17:55:03 UTC - in response to Message 6871.

Regarding PS, the 9800GT may still be a possibility for you.


Scott - I feel a little pain as my arm is forced up between my shoulder blades...

So here's the Full Monty: I run CPU BOINC 24/7 on my Dell Dimension 9100 (Pentium D, 4 GB) with a 375 watt PS. The 9600 GSO requires a 400 watt PS and so, I just established, does the 9800GT! So the 9800GT, with a Glaciator, is a candidate.

All I have to do now is find a deal, and finding a deal in France is a big challenge!

Tom

Scott Brown
Joined: 21 Oct 08
Posts: 144
Credit: 2,973,555
RAC: 0
Level: Ala
Message 6892 - Posted: 22 Feb 2009 | 18:54:41 UTC - in response to Message 6890.

All I have to do now is find a deal, and finding a deal in France is a big challenge!

Tom


That is the difficult task...:)

If you are patient and don't mind some warranty limitations, I have seen the 9800GT TOP at NEWEGG.com in their "open box" section a few times over the last month for $90 US (see http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=2000380048%204809%201305520548&name=NVIDIA&SpeTabStoreType=99).

Of course, bought new it is only $130 US:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121268&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-Video+Cards-_-ASUS-_-14121268


I can find the ASUS 9600GSO at Price.com in the $100-$110 US range, but as these have been replaced with the newer (and slower) 48-shader version, they are getting harder to find...


tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 6924 - Posted: 23 Feb 2009 | 11:38:28 UTC - in response to Message 6892.

When it was to be a 9600GSO, which needs one 6-pin supplementary power connector, I checked inside my tower. I found a spare 2x3 plug, marked P12, with three black and three white wires. I assumed that would do the job.

Now I see that the 9800GT needs two 6-pin connectors, so back inside the case. The only possibility I see is two unused 4-pin plugs marked P11, one near the graphics card and one at the other end of the case. How can I use them to provide the second connector for the 9800GT?

Thanks for the where-to-buy links, Scott. I'm still looking...

Tom

Dieter Matuschek
Joined: 28 Dec 08
Posts: 58
Credit: 231,884,297
RAC: 0
Level: Leu
Message 6925 - Posted: 23 Feb 2009 | 12:32:09 UTC - in response to Message 6924.

@ tomba

You only need to buy two PCIe power adapters, which turn 4-pin inline plugs (called 5.25'' connectors because 5.25'' devices like DVD drives use them) into a 6-pin PCIe connector.

Scott Brown
Joined: 21 Oct 08
Posts: 144
Credit: 2,973,555
RAC: 0
Level: Ala
Message 6933 - Posted: 23 Feb 2009 | 13:21:43 UTC

If bought new (rather than "open box"), the ASUS cards that need the six-pin connector usually ship with a Molex (4-pin) to 6-pin adapter, though buying one is fairly cheap anyway (for example, http://www.nanosys1.com/cbl-pwr-lp4pcie.html). The adapter requires two 4-pin Molex power connections to feed the 6-pin connector. Also, avoid chaining other devices on the same power leads as the GPU (which might mean rearranging and re-chaining the power for other internal devices).



tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 7098 - Posted: 2 Mar 2009 | 8:03:34 UTC - in response to Message 6933.

Update by tomba...

It turned out that she who controls the household budget (and must be obeyed) put the 9800GT out of range, so I bought an Nvidia 9600GSO (384 MB) locally.

Noise is not a problem running GPUGRID, though seti@home WUs do up the decibels somewhat. Strange.

I wonder, though, if I have a GPU heat problem. When it's ticking over - no WU - it runs at 72-73C. With a GPUGRID WU running, the GPU temperature is a constant 102C. Is that OK???

Tom

ms113
Joined: 6 Feb 09
Posts: 19
Credit: 1,281,738
RAC: 0
Level: Ala
Message 7101 - Posted: 2 Mar 2009 | 10:35:58 UTC - in response to Message 7098.
Last modified: 2 Mar 2009 | 10:37:28 UTC

Update by tomba...

It turned out that she who controls the household budget (and must be obeyed) put the 9800GT out of range so I bought locally a Nvidia 9600GSO, 384 megs.


Congratulations on your new GPU.
I see you got one of the earlier (= better!) 9600 GSOs with 96 SPs.
In terms of speed, this card is nearly comparable to the 9800 GT.
A good one, and maybe the best price-performance ratio at present.

Noise is not a problem running GPUGRID, though seti@home WUs do up the decibels somewhat. Strange.


Crunching on the GPU loads the GPU's processors, which raises the temperature not only of the GPU itself but also inside your PC case; that can speed up the various fans in your system and therefore increase the noise.

I wonder, though, if I have a GPU heat problem. When it's ticking over - no WU - it runs at 72-73C. With a GPUGRID WU running, the GPU temperature is a constant 102C. Is that OK???


Temperature generally depends on everything your GPU is running (2D, 3D, BOINC, etc.).
However:
Nvidia specifies a maximum GPU temperature of 105C, so IMHO 102C really seems too high.
(I don't have a 9600GSO to compare with, but my 9800GT runs at 55C when crunching GPUGRID and at 35C when "idle".)
What are the temperatures of the other devices in your system?

Maybe the overall temperature inside your PC case is too high.

First, you can try running the PC with the case open and check how the temperatures behave.

Beyond that, you could think about changing some of the fans.

Best regards,
Martin

Scott Brown
Joined: 21 Oct 08
Posts: 144
Credit: 2,973,555
RAC: 0
Level: Ala
Message 7104 - Posted: 2 Mar 2009 | 13:11:02 UTC - in response to Message 7098.


I wonder, though, if I have a GPU heat problem. When it's ticking over - no WU - it runs at 72-73C. With a GPUGRID WU running, the GPU temperature is a constant 102C. Is that OK???

Tom


Good choice with the 9600GSO...I really love mine.

Your heat, however, is too high. Mine runs in the 70-75C range under GPUGRID load 24/7 (I can drop it much lower with just desktop apps). Open up your case, as suggested by ms113, to lower temps a bit. Perhaps more importantly, while it is open check 1) that the GPU fan is spinning consistently and 2) that you have cleared out the dust in the system.

Depending on the brand of card you purchased (ASUS, EVGA, XFX, etc.), the accompanying software should include a monitoring program that can tell you at what speed the GPU fan is operating (e.g., with ASUS this program is SmartDoctor). Typically, factory settings run the fan at 40-60%. If the monitoring software has the feature (and the card has a BIOS that allows it) you may be able to raise the fan speed as needed (I'd say to 100% in your case). If not, you should be able to find third-party software that can do it for you (I think RivaTuner does this???).
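
If the vendor tools are missing, temperature and fan speed can also be polled programmatically. The sketch below uses NVML, which only exists on much newer drivers than these 2009-era cards shipped with, so treat it as an optional alternative to SmartDoctor or RivaTuner rather than the period-appropriate route; link with -lnvidia-ml:

    /* gpu_monitor.c -- poll GPU 0 temperature and fan speed via NVML */
    #include <stdio.h>
    #include <nvml.h>

    int main(void) {
        if (nvmlInit() != NVML_SUCCESS) {
            fprintf(stderr, "NVML init failed\n");
            return 1;
        }
        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
            unsigned int temp = 0, fan = 0;
            if (nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp) == NVML_SUCCESS)
                printf("GPU temperature: %u C\n", temp);
            if (nvmlDeviceGetFanSpeed(dev, &fan) == NVML_SUCCESS)
                printf("Fan speed      : %u %%\n", fan);
        }
        nvmlShutdown();
        return 0;
    }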

A last solution may be to add an aftermarket cooling system to the card, but this kinda defeats the economic restrictions imposed by the "boss" at home. :)

If your system temps seem okay (say in the 60-70C range for CPU and 40C for system), then you may want to exchange the card to see if it is just faulty.

Zydor
Joined: 8 Feb 09
Posts: 252
Credit: 1,309,451
RAC: 0
Level: Ala
Message 7114 - Posted: 2 Mar 2009 | 21:21:15 UTC - in response to Message 7104.
Last modified: 2 Mar 2009 | 21:30:21 UTC

The temp sounds like the fans are running at their defaults. I have an ASUS 9800GTX, and use ASUS SmartDoctor and/or RivaTuner. If you are not used to RivaTuner, watch for its "advanced user" warnings - it's a powerful tool, and there are areas where issues can occur if it is not used correctly. If you don't see the warnings, all is well. If you do see them, make sure you really know what you are doing with the section of the tool you just entered - if in doubt hit "escape" - or else leave that particular section of the tool alone.

I have a mid-tower case, and running the GPU fan on the 9800GTX at 80% keeps the GPU at around 60C on my beastie. I would sort out the heating/cooling issue before running any crunching, to keep the load off the card until cooling is sorted. 102C is, as pointed out, very high - too high, and borderline - so don't load the GPU with heavy tasks until you resolve the cooling.

Edit:
A further thought - on some case designs the construction of the front intake fan filter can be misleading. There is often a foam membrane inside the front grill. That can quickly clog up, as it's not readily visible and easily forgotten. The outside holes in the front grill look clear, but behind them the membrane is full of dust, preventing any air intake - or at best degrading it severely. Worth double-checking yours if you have a front intake fan/grill. That fan not working properly will have a big impact.

Regards
Zy

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 7211 - Posted: 5 Mar 2009 | 7:28:10 UTC - in response to Message 7114.

Update by tomba...

Thank you all for your help with my XFX 9600GSO. Here's where I am:

I downloaded the Nvidia performance application. It told me the fan was running at 30%. I tried a number of settings and found that 60% gave me 82 degrees on the GPU when CUDA-ing, with no perceptible increase in fan noise. Anything more than 60% and the noise was noticeable. At 100% it was a whirling dervish!

Yesterday I saw that the three clock settings were significantly lower than those on the Nvidia site so I upped them to those settings: 680/800/1800. I did a full WU and got a good result. The temperature is now a consistent 85 degrees.

I invested a fiver on one of these. We'll see what "Super quiet" means.

Tom

fractal
Joined: 16 Aug 08
Posts: 87
Credit: 1,248,879,715
RAC: 0
Level: Met
Message 7352 - Posted: 11 Mar 2009 | 18:12:47 UTC - in response to Message 6833.

I see you already made your choice, but I also see that nobody answered your question:

In other words, is BOINC CUDA processing happier with RAM, or cycle speed?


Clock speed x number of shaders is more important than more memory.
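
Putting tomba's original two choices side by side makes the point concrete. Both XFX variants have the same 96 shaders; the shader clocks below are assumed for illustration rather than taken from a spec sheet, using the same rough ~3 FLOPs per shader per clock estimate as earlier in the thread:

    /* ram_vs_clock.c -- why the overclocked 384 MB card wins over the standard 768 MB one for CUDA */
    #include <stdio.h>

    static double est_gflops(int shaders, double shader_clock_mhz) {
        return shaders * (shader_clock_mhz / 1000.0) * 3.0;
    }

    int main(void) {
        double std_768 = est_gflops(96, 1375.0);  /* standard-clocked 768 MB card (assumed clock) */
        double oc_384  = est_gflops(96, 1500.0);  /* overclocked 384 MB card (assumed clock)      */

        printf("768 MB, standard clock: ~%.0f GFLOPS\n", std_768);
        printf("384 MB, overclocked   : ~%.0f GFLOPS\n", oc_384);
        return 0;
    }

The extra memory on the 768 MB card adds nothing to crunching speed as long as the work unit fits in 384 MB; the higher clock does.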

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 7927 - Posted: 28 Mar 2009 | 6:52:32 UTC - in response to Message 7211.

I invested a fiver on one of these. We'll see what "Super quiet" means.


Well, I can't hear it, but it made not a ha'porth of difference to the graphics card temperature!

The other day I noticed "Find Optimal" in the Nvidia control panel device settings. I ran it. Now my 9600GSO clocks are running at:

Graphics: 660
Processor: 1566
Memory: 999

That's faster on all counts than a 9800GT!

Tom

j2satx
Joined: 16 Dec 08
Posts: 20
Credit: 36,912,780
RAC: 0
Level: Val
Message 7948 - Posted: 29 Mar 2009 | 0:15:39 UTC - in response to Message 7927.


Is the Nvidia control panel you mention part of the video card software or motherboard software for Nvidia chips?

Thanks.

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 7952 - Posted: 29 Mar 2009 | 4:56:49 UTC - in response to Message 7948.

Is the Nvidia control panel you mention part of the video card software or motherboard software for Nvidia chips?


It's video card software. Cheers, Tom

j2satx
Joined: 16 Dec 08
Posts: 20
Credit: 36,912,780
RAC: 0
Level: Val
Message 7970 - Posted: 29 Mar 2009 | 19:51:53 UTC - in response to Message 7927.


I'm going blind looking for "find optimal" in the Nvidia Control Panel (182.08).

What version are you using?

Thank you.

Bender10
Joined: 3 Dec 07
Posts: 167
Credit: 8,368,897
RAC: 0
Level: Ser
Message 7972 - Posted: 29 Mar 2009 | 20:41:07 UTC - in response to Message 7970.

After some surfing... it looks like you have to install nTune...?
____________


Consciousness: That annoying time between naps......

Experience is a wonderful thing: it enables you to recognize a mistake every time you repeat it.

j2satx
Joined: 16 Dec 08
Posts: 20
Credit: 36,912,780
RAC: 0
Level: Val
Message 7973 - Posted: 29 Mar 2009 | 20:52:59 UTC - in response to Message 7972.


Trying to figure nTune out now. However, this note bothers me.

Note: nTune does not support overclocking, underclocking or performance auto-tuning on multi-processor systems.

Bender10
Joined: 3 Dec 07
Posts: 167
Credit: 8,368,897
RAC: 0
Level: Ser
Message 7974 - Posted: 29 Mar 2009 | 21:15:37 UTC - in response to Message 7973.
Last modified: 29 Mar 2009 | 21:16:10 UTC

Try this...YMMV as I am running XP 64 pro

1. Go to your computer's Control Panel, double-click on Accounts, then click on "Turn User Account Control on or off", turn off User Account Control, apply, and restart the computer.

(Not sure about this step. I turned my Guest account on and off...)

2. Download and install nTune from Nvidia and restart your computer.
I downloaded 5.05.54.00.

3. Once it's all installed and you have restarted your computer there will be two new programs under Start/Programs/NVIDIA Corporation: the Nvidia Control Panel and Nvidia Monitor.

Open nTune, and you "should" see a menu listing for "Performance". Expand this, and agree to the "user agreement". This will open up the "Performance" options for you.

"Nvidia Monitor" monitors temps and fan speed, and the Nvidia Control Panel ("nTune") allows extra control of clock speeds.
____________


Consciousness: That annoying time between naps......

Experience is a wonderful thing: it enables you to recognize a mistake every time you repeat it.

j2satx
Joined: 16 Dec 08
Posts: 20
Credit: 36,912,780
RAC: 0
Level: Val
Message 7979 - Posted: 30 Mar 2009 | 1:24:17 UTC - in response to Message 7974.

Running W7 32-bit, but didn't have any issue installing.

There does not seem to be any "automagic optimizer" for the GPU, though there is one for the motherboard and CPU.

It will let me manually modify the GPU.......... not what I want to do. I wanted it to be automatic.

Anyway, just have a piddly 8600 GT on this puter I play some games on. Puter has AMD 4000+ single-core, so nTune works there.

I tried on a quad-core AMD 9850BE with a 9800 GTX+ and an AMD 9850BE with an 8800 GT and neither seemed to want to work. I'll take another run at them tomorrow.

Bender10
Joined: 3 Dec 07
Posts: 167
Credit: 8,368,897
RAC: 0
Level: Ser
Message 7980 - Posted: 30 Mar 2009 | 1:37:18 UTC - in response to Message 7979.
Last modified: 30 Mar 2009 | 1:42:54 UTC

OK, I don't use nTune for anything myself. I just saw you were having a problem getting it working...
____________


Consciousness: That annoying time between naps......

Experience is a wonderful thing: it enables you to recognize a mistake every time you repeat it.

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 7991 - Posted: 30 Mar 2009 | 9:57:01 UTC - in response to Message 7970.

I'm going blind looking for "find optimal" in the Nvidia Control Panel (182.08). What version are you using?

2.2.390.00.

Cheers, Tom

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 8013 - Posted: 31 Mar 2009 | 14:19:33 UTC - in response to Message 7211.

The temperature is now a consistent 85 degrees.

I've done nothing, but for the last two days the GPU temperature has been a constant 66 degrees!!

Tom

tomba
Joined: 21 Feb 09
Posts: 497
Credit: 700,690,702
RAC: 0
Level: Lys
Message 8018 - Posted: 31 Mar 2009 | 18:14:17 UTC - in response to Message 8013.

The temperature is now a consistent 85 degrees.

I've done nothing, but for the last two days the GPU temperature has been a constant 66 degrees!!

Oops! Just discovered that GPU processing stopped without any notice. I took down BOINC, restarted him, and processing resumed. And the resumed WU got credit. And temperature is back up to 85.

Is this a feecher of BOINC 6.5.0? Is there a better BOINC?

Tom

dyeman
Joined: 21 Mar 09
Posts: 35
Credit: 591,434,551
RAC: 0
Level: Lys
Message 8020 - Posted: 31 Mar 2009 | 20:25:56 UTC - in response to Message 7991.

I'm going blind looking for "find optimal" in the Nvidia Control Panel (182.08). What version are you using?

2.2.390.00.

Cheers, Tom


To get "Find Optimal", download and install NVIDIA System Tools.

Paul D. Buck
Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Level: Val
Message 8028 - Posted: 1 Apr 2009 | 6:05:14 UTC - in response to Message 8018.

Is this a feecher of BOINC 6.5.0? Is there a better BOINC?

To *MY* mind, no, not at this time.

I am starting to look at 6.6.20, though not yet on a GPU capable system. Not for a little while yet ...

Not sure why you stopped processing on the GPU. But if the resource share is low, 6.5.0, like all non-6.6.x versions, thinks the GPU and CPU are the same and does not distinguish between them. I run my GPU projects with a resource share of 500 so that they get anywhere up to 48% of system use, to try to "force" them to run all the time. So far, GPUGRID cooperates but MW does not ... sigh ...

That COULD be one reason it stopped.

I do not recommend that you try the 6.6.x versions YET ... but if you are brave, ... well ...

Or wait a week or two; there are already a couple of brave souls trying 6.6.18 and 6.6.20 ... early reports are, well, no show-stoppers ... though I do think that the issue I raised about the scheduler request should be fixed (though it is not likely, sadly) ...

Michael Goetz
Joined: 2 Mar 09
Posts: 124
Credit: 60,073,744
RAC: 0
Level: Thr
Message 8031 - Posted: 1 Apr 2009 | 10:58:58 UTC - in response to Message 8028.

I've been running 6.6.20 for several days now. Was running 6.6.18 prior to that.

Nothing horrible has happened, and it does keep the CPUs and GPU busy. I'm not positive I understand the work scheduling or work fetch algorithms anymore, but nothing has missed deadlines and nothing has run out of work, so we'll see.

One of the things I didn't like about the pre-6.6.x clients was the problem described above -- they would stop working the GPU because they scheduled the GPU and CPU together, so you had to raise the priority on the GPU projects. I haven't had to do that with 6.6.x.

FWIW, I'm running SETI and GPUGRID on the GPU and a gaggle of wildly different projects on the CPU.

Mike

____________
Want to find one of the largest known primes? Try PrimeGrid. Or help cure disease at WCG.

Paul D. Buck
Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Level: Val
Message 8039 - Posted: 1 Apr 2009 | 18:01:08 UTC - in response to Message 8031.

Nothing horrible has happened, and it does keep the CPUs and GPU busy. I'm not positive I understand the work scheduling or wotk fetch algorithms anymore, but nothing has missed deadlines and nothing has run out of work, so we'll see.

I am not sure anyone does.

One of the flaws in the way BOINC is developed is that they code first and only document what they coded later, when they have moved on to other things. Setting your design goals down on paper BEFORE you code is always better, because then you can see whether you have hit the target you were aiming at.

I put 6.6.20 on the Mac Pro and it seems to be reasonably well behaved so after lunch maybe I will install it on my i7 and see what happens there ... after that, it is wait and pray ... :)

Martin Chartrand
Joined: 4 Apr 09
Posts: 13
Credit: 17,030,367
RAC: 0
Level: Pro
Message 8291 - Posted: 8 Apr 2009 | 0:08:48 UTC - in response to Message 7098.

I had a similar problem with my "old" E6850 C2.
At idle it was in the 60s, and under load 92-95C, with a Thermaltake power fan.

I decided that this was not normal and sent the chip back to Intel (through my local store); within 7 business days Intel sent a brand-new E6850 but, as usual, never said why.

I decided to exchange it for a Q9550 and got $175 back for the E6850.
Now, even under stress, no core goes past 43C, and that's at 100% load.

So there was a problem with it although Intel said that their processor can handle 100C.

But that's way too high as far as I am concerned.

Martin Chartrand
