
Message boards : Graphics cards (GPUs) : GPU Performance tests

Krunchin-Keith [USA]
Joined: 17 May 07
Posts: 512
Credit: 111,288,061
RAC: 0
Message 3836 - Posted: 15 Nov 2008 | 16:08:15 UTC

I've run an experiment all week long on 3 computers. Basically it was usage as normal, with no tinkering with BOINC except for the test itself (adding the new application versions), no driver changes, etc.

6.48 to 6.52 shows no noticeable difference. It is hard to tell on Windows XP systems due to the high CPU usage: the more CPU used, the lower the time per step, but it is hard to control CPU usage with other applications and programs running.

I let it run with full CPUs, so all three hosts were running two other CPU tasks along with the CUDA/CPU=0.90 task. CPU usage for acemd is about 20% average, with some spikes into the 30s. (On a hyperthreaded system, the max for one CPU shows as 50%.)

I had 6.48 running this way before 6.52 came out.

I let 6.52 run 3 days this way.

Then I changed preferences to use 50% of the CPUs, so I would have 1 CPU application running and 1 CUDA/CPU=0.90 task running, but with its own CPU. CPU usage for acemd is now around 40% average. There is some CPU waste here, but there is a large gain in GPU elapsed times.
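For anyone who wants to make that change locally rather than through the website preferences, the same 50% setting can go in a global_prefs_override.xml in the BOINC data directory. This is a minimal sketch from memory of the override format (the max_ncpus_pct element name is my recollection), so check the BOINC documentation before relying on it:

    <global_preferences>
        <!-- "On multiprocessors, use at most X% of the processors" -->
        <max_ncpus_pct>50.0</max_ncpus_pct>
    </global_preferences>

The client picks the override up on startup, or via Advanced -> Read local prefs file in the manager, if I remember right.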

GPU #0: 8800GT at 640 MHz, factory-overclocked model
GPU #1 and #2: 8800GT at 600 MHz, stock speeds and settings

Time per step for each host.

Running at 100% CPU:
#0 75.61 to 95.91, avg 81.08 over last 9 with 6.48 and 81.97 for last 3 with 6.52
#1 77.49 to 87.29, avg 82.32 over last 9 with 6.48 and 80.97 for last 4 with 6.52
#2 69.61 to 84.84, avg 77.71 over last 6 with 6.48 and 76.68 for last 4 with 6.52

Running at CPU=50%. Notice the min-max range is much less varied:
#0 66.16 to 66.75, avg 66.48 for last 3 with 6.52
#1 71.79 to 72.98, avg 72.48 for last 3 with 6.52
#2 69.70 to 71.30, avg 70.23 for last 3 with 6.52

Running under 50% CPU, so the GPU has its own CPU, the times are more in line with when the project ran at CPU=1; both back then and now these are about the best times I've gotten.
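To put the 100% vs 50% comparison in relative terms, here is a quick sketch in plain Python (my own illustration, using the 6.52 averages posted above):

    # Per-step improvement from giving the GPU task its own CPU,
    # comparing the 6.52 averages at 100% CPU vs 50% CPU.
    avg_100 = {"#0": 81.97, "#1": 80.97, "#2": 76.68}
    avg_50 = {"#0": 66.48, "#1": 72.48, "#2": 70.23}

    for host in avg_100:
        gain = (avg_100[host] - avg_50[host]) / avg_100[host] * 100
        print(f"{host}: {avg_100[host]} -> {avg_50[host]} per step, {gain:.1f}% faster")

That works out to roughly 19% on #0, 10% on #1 and 8% on #2, with the factory-overclocked card gaining the most.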

Since 6.45: best, worst, overall average
#0 66.14, 95.91, 71.48 for 72 tasks
#1 71.42, 108.72, 81.11 for 67 tasks
#2 69.48, 136.66, 74.98 for 68 tasks

It seems Windows XP users would do better to run one less CPU task to get the best performance from the GPU. I leave it up to each user to run their own tests and see which is more efficient for them.

Now some will say this is a waste because you are not earning maximum credits. For me it is not so much. Using another project I'm running, I took the last 20 results: the CPU earns 0.18 cs per minute, or about 261 cs per day, for one CPU.

And only about 10% or less of the total CPU goes unused, saving a few cents of electricity.

The decrease in GPU time on the same host (#0), let's say from 81.50 to 66.50, means an increase from 2.80 cs/minute to 3.43 cs/minute earned by the GPU; since credits are fixed, the rate scales inversely with time per step. That is a difference of 0.63 cs/minute, or a gain of 909 cs per day, for a net gain of 647 cs per day after subtracting the lost CPU's 261. I think the loss of 1 CPU is worth it in this case. Not to mention more results get back to the project quicker.
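If you want to redo that arithmetic with your own numbers, the calculation is just this (a plain Python sketch of the reasoning above, nothing BOINC-specific):

    # Credits per task are fixed, so the GPU's credit rate scales
    # inversely with time per step.
    old_step, new_step = 81.50, 66.50   # time per step, before and after
    gpu_rate_old = 2.80                 # cs/minute at the old step time
    gpu_rate_new = gpu_rate_old * old_step / new_step    # ~3.43 cs/minute

    gpu_gain_per_day = (gpu_rate_new - gpu_rate_old) * 60 * 24   # ~909 cs
    cpu_lost_per_day = 0.18 * 60 * 24                            # ~261 cs
    print(gpu_gain_per_day - cpu_lost_per_day)   # ~650 cs net, matching the ~647 above modulo rounding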

It would be nice to have less CPU use by the GPU app, so the CPU could be used for other applications, but this is a shortcoming of Windows, since it cannot be as efficient as Linux here.
____________
Alpha Tester ~~ BOINCin since 10-Apr-2004 (2.28) ~~~ Join team USA

[boinc.at] Nowi
Joined: 4 Sep 08
Posts: 44
Credit: 3,685,033
RAC: 0
Message 3837 - Posted: 15 Nov 2008 | 17:55:28 UTC

Thank you, Keith.

It is a nice overview of the influence of CPU utilization under XP.

Any idea when the Windows GPU application will run with less CPU usage?
