
Message boards : Number crunching : Total processing power?

Paronfesken
Message 17485 - Posted: 1 Jun 2010 | 5:31:47 UTC

So what's GPUGRID's total processing power?
And can you compare it to a supercomputer?

Keep on crunching! :D

Betting Slip
Message 17488 - Posted: 1 Jun 2010 | 8:30:32 UTC - in response to Message 17485.
Last modified: 1 Jun 2010 | 8:33:03 UTC

So what's GPUGRID's total processing power?
And can you compare it to a supercomputer?

Keep on crunching! :D


Well, here's what BOINCstats says, and it's a bit disappointing:
241,109.1 GigaFLOPS / 241.109 TeraFLOPS
(Rosetta is achieving nearly half this on CPUs alone.)

There are only 2418 active users and 3289 active hosts.

These figures are at the time of posting.

I would've thought we would be into the petaFLOPS range, but c'est la vie.
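For scale, the figures above can be checked in a few lines. Note that BOINC's credit-derived FLOPS are not directly comparable to a Linpack benchmark, and the supercomputer figure below is an illustrative Top500 number (Jaguar, June 2010), so treat this as an order-of-magnitude sketch only:

```python
# Rough scale check of the quoted BOINCstats throughput against a
# contemporary supercomputer. Both comparisons are order-of-magnitude
# only; the Jaguar Rmax figure is illustrative.

gpugrid_gflops = 241_109.1            # BOINCstats figure quoted above
gpugrid_tflops = gpugrid_gflops / 1_000
jaguar_tflops = 1_759.0               # illustrative Top500 #1 Rmax, June 2010

print(f"GPUGRID: {gpugrid_tflops:,.1f} TFLOPS")
print(f"Fraction of an illustrative top supercomputer: "
      f"{gpugrid_tflops / jaguar_tflops:.1%}")
```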
____________
Radio Caroline, the world's most famous offshore pirate radio station.
Great music since April 1964. Support Radio Caroline Team -
Radio Caroline

Paul D. Buck
Message 17490 - Posted: 1 Jun 2010 | 17:30:28 UTC
Last modified: 1 Jun 2010 | 17:31:49 UTC

A few reasons why the project has such low numbers ...

1) No ATI application... I have been moving to ATI because of the cost and power draw of Nvidia cards, as well as reliability and availability ... so, instead of having 7 cards working on GPUGrid I am down to one, and it shares with other CUDA projects (as before)... as soon as the ATI application comes out, I am sure that the numbers will spike up...

2) No CPU application, for many practical reasons including the time it would take to do a model...

3) The time it takes to do a model anyway ... :)

This is a resource-intense project and some don't like the "cost" in time to run the models ... so they run elsewhere...

4) GPU computing is still in its infancy ... there are really not that many people doing BOINC in the first place (only about 350,000) and there are not that many people with GPUs ... still ... GPUs are pretty expensive as far as most folks are concerned ... so, few can afford the mid- to higher-end cards that are needed here ...

{edit}
Yes you can compare it to a super-computer ... but the comparison is more apt when you compare the totality of what is being done on BOINC to that theoretical Super-Computer... All projects, all users, all hosts ...

liveonc
Message 17491 - Posted: 1 Jun 2010 | 18:32:12 UTC - in response to Message 17488.
Last modified: 1 Jun 2010 | 19:06:06 UTC

241.109 TeraFLOPS is still quite OK for a small "niche" project. It's about the same as a 2005-2006 supercomputer, without the big price tag. As membership hopefully grows and people upgrade their PCs and GPUs, the processing power of GPUGRID will grow likewise. That it's NVIDIA-friendly and ATI-unfriendly is maybe not that bad. It's better to be really good at something than not really good at anything. If GPUGRID spent too much time trying to get their project to run well on ATI GPUs and CPUs, it might not be as good with NVIDIA GPUs.

But still, I saw something interesting that WCG was doing. They have an app that they claim uses CPU time while people who run it on Facebook are logged on to Facebook. It might be inefficient, but it's easy, uses very little resources, and requires no knowledge or maintenance. Even if a single drop in the ocean is nothing, a million drops (or even a hundred million drops) might be noticeable.

Come to think of it, Twitter is looking for ways of making money, and so is Facebook. "Some" people have "accepted" having banners and ads on free services, with the "option" to pay for premium services without the ads and banners. What about apps running that are paid for by "sponsors"???

I'm not talking about malware, botnets, keyloggers, or any other privacy/security infringement, but rather a sponsor-paid app that only uses the CPU/GPU while people are on Facebook/Twitter, to work on micro-WUs so Facebook/Twitter can remain free for all.

skgiven (volunteer moderator and tester)
Message 17494 - Posted: 1 Jun 2010 | 21:21:14 UTC - in response to Message 17490.
Last modified: 1 Jun 2010 | 21:25:14 UTC

A few reasons why the project has such low numbers ...

1) No ATI application... I have been moving to ATI because of the cost and power draw of Nvidia cards, as well as reliability and availability ... so, instead of having 7 cards working on GPUGrid I am down to one, and it shares with other CUDA projects (as before)... as soon as the ATI application comes out, I am sure that the numbers will spike up...

2) No CPU application, for many practical reasons including the time it would take to do a model...

3) The time it takes to do a model anyway ... :)

This is a resource-intense project and some don't like the "cost" in time to run the models ... so they run elsewhere...

4) GPU computing is still in its infancy ... there are really not that many people doing BOINC in the first place (only about 350,000) and there are not that many people with GPUs ... still ... GPUs are pretty expensive as far as most folks are concerned ... so, few can afford the mid- to higher-end cards that are needed here ...

{edit}
Yes you can compare it to a super-computer ... but the comparison is more apt when you compare the totality of what is being done on BOINC to that theoretical Super-Computer... All projects, all users, all hosts ...


Another few reasons why GPUGrid has such small numbers,
Paul runs his GTX260 on Win 7.
Paul does not drive his GTX295 carefully.
Paul went out and bought ATI cards!
So basically it’s all Paul's fault :P

Seriously, I would add,
At first it is difficult to get started, and the information is vague (usable cards, drivers, failures). At least these issues have improved over the last year, as has the project.
I do think that people can afford a GT240 (£50 and up, and a good card for a multimedia system).
People also see high power usage for top cards, but in terms of power a GPU does far more work per watt and is far less expensive than a CPU.
The best Fermi or a GTX295 costs under £500, but an i7-980X costs £850 and needs an expensive system to go into.
A £50 GT240 will do more work than an i7-980X.
Which suggests people are not well informed, not that bothered, or just stupid, stubborn or lazy.
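The performance-per-watt and performance-per-pound argument above can be sketched with a few lines of Python. All figures below (peak GFLOPS, power draw, prices) are rough illustrative assumptions, not measured values for these parts:

```python
# Back-of-envelope GPU-vs-CPU comparison behind the argument above.
# The GFLOPS, wattage, and price figures are illustrative assumptions.
parts = {
    "GT240 (GPU)":   {"gflops": 385.0, "watts": 65.0,  "price_gbp": 50.0},
    "i7-980X (CPU)": {"gflops": 107.0, "watts": 130.0, "price_gbp": 850.0},
}

for name, p in parts.items():
    per_watt = p["gflops"] / p["watts"]          # GFLOPS per watt
    per_pound = p["gflops"] / p["price_gbp"]     # GFLOPS per pound spent
    print(f"{name}: {per_watt:.2f} GFLOPS/W, {per_pound:.2f} GFLOPS/GBP")
```

Even with generous numbers for the CPU, the cheap GPU comes out ahead on both metrics, which is the point being made.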
As for getting ATI cards to run here: people might start to go elsewhere with their ATI cards, and at the same time people might stop buying or using their Nvidia cards, which are inherently better for this project (shaders).

Opening a door does not mean people will only walk in!

- It's better to be really good at something than not really good at anything!

Paul D. Buck
Message 17497 - Posted: 2 Jun 2010 | 13:34:21 UTC - in response to Message 17494.

Another few reasons why GPUGrid has such small numbers,
Paul runs his GTX260 on Win 7.
Paul does not drive his GTX295 carefully.
Paul went out and bought ATI cards!
So basically it’s all Paul's fault :P

- The GTX260 card has not had an error in quite a few tasks.
- I stopped using the GTX295 here because it was throwing errors/driver halts while running a couple of projects
- Guilty
- Guilty ...

Then again, one of the reasons I got the first card was that the magicians behind the scenes suggested an ATI application was on the horizon ... note that I am still holding my breath ...

Oh, as to the migration to Win 7: I decided to try it because I had a copy from a sale of Vista, and I wanted to see if the 64-bit version was worth the bother (XP-64 certainly was not when I tried it) ... it was, and everything worked right out of the box, so I converted over ... now I have more usable memory on all my systems (almost 3 GB on three of them) and the advantage of running 64-bit science applications on those projects that have them (ABC, PG, etc.) to gain a slight bit of speed on the CPU side...

Still, I used to be #13 here ... so I guess it truly is all my fault ... :)

skgiven (volunteer moderator and tester)
Message 17500 - Posted: 2 Jun 2010 | 15:58:13 UTC - in response to Message 17497.

Ticking over with a GTX260 isn't bad!

Perhaps a GTX460 will tempt you in the future, should it turn up with a 150W TDP and at a fair price?

I also thought GPUGrid would be adding ATI support, so I picked up a 4850 (a long time ago). I then saw that they would not support anything less than a 5000-series card, and then that even those cards were too slow. With some bad AMD app-ware it looked like it was going nowhere, so after running the card at Folding for a while, I pulled it.

The thing about getting ATI going here is that many people will just stay where they are unless a very well optimised app is developed here. Even then many people won't be interested, and others will go with the credits.
So unless they get something going that is quite effective, efficient and reasonably optimised, it would be a non-starter. There would be no point in developing a bad reputation.

Paul D. Buck
Message 17515 - Posted: 3 Jun 2010 | 13:45:32 UTC - in response to Message 17500.

Ticking over with a GTX260 isn't bad!

Perhaps a GTX460 will tempt you in the future, should it turn up with a 150W TDP and at a fair price?

I don't pay that much attention to the power draw; it is more of a side note with me ... but when I compare performance across platforms it is one factor to consider. My upgrade path of choice would be an i7 980(+) on a motherboard with at least 3 PCI-e slots, more than 6 GB of memory, and an SSD ... I was looking at a TigerDirect barebones system that actually came with a pair of Nvidia 4xx cards ... my preferred choice would be to put a pair of 5870s in the new system and then perhaps one of the 4xx cards ... that would make my "smallest" system one with an i7 920 and a pair of 4870s ...

skgiven (volunteer moderator and tester)
Message 17519 - Posted: 3 Jun 2010 | 14:30:39 UTC - in response to Message 17515.

Your slowest system would match my best system (i7-920 + GTX470)!
Though arguably my system with 4 GT240s is better than my Fermi system.
My weakness in terms of power is having 4 systems, each with a single GT240.

I don't even like the idea of having one system with a Fermi and one with a GTX260. Unfortunately I can't put the GTX260 into the same system as the Fermi, or into a system with a GT240. At some stage I may move 3 GT240s into one system and save two or three hundred watts!
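The saving from that consolidation can be estimated roughly. The ~100 W of base-system overhead per host below (board, CPU at idle, PSU losses) is an assumed figure, not a measurement:

```python
# Rough estimate of the saving from moving three single-GT240 hosts into
# one system: two whole base systems stop drawing power.
base_system_watts = 100.0   # assumed overhead per host (assumption)
hosts_removed = 2           # 3 hosts consolidated into 1

watts_saved = base_system_watts * hosts_removed
kwh_per_year = watts_saved * 24 * 365 / 1000

print(f"~{watts_saved:.0f} W continuous, ~{kwh_per_year:.0f} kWh/year saved")
```

That lands in the "two or three hundred watts" range mentioned above, running continuously.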

robertmiles
Message 17558 - Posted: 10 Jun 2010 | 14:37:41 UTC
Last modified: 10 Jun 2010 | 14:41:06 UTC

I need to pay attention to the power draw, for a few reasons:

1. My computer room often overheats as it is.

2. I'm restricted on getting any more desktop computers without removing an equal number first. I intend to keep one 32-bit machine so I can still run a few programs that won't run on 64-bit machines.

3. I've determined that my two desktop computers each have only one slot that can take a graphics board, and neither has a power supply that can handle a graphics board that takes much more power.

Also, a few more problems less specific to power draw:

1. I'm having problems identifying which type of slot each computer has, so I can make sure any board I order should plug in.

2. For the last two months, all attempts to do a backup on any of my three computers have failed.

3. No more space to put any more monitors or laptops.

Therefore, can any of you suggest an Nvidia-based graphics board to replace a 9800 GT that provides significantly more GPU computing power with no increase in either slots used or power draw?

skgiven (volunteer moderator and tester)
Message 17563 - Posted: 11 Jun 2010 | 0:55:15 UTC - in response to Message 17558.
Last modified: 11 Jun 2010 | 1:09:22 UTC

Although you could get a cheap GT240, which would get more points and use less power (about 65 W), it is not a fantastic improvement (somewhere from 25% to 100% faster). It does not need a separate 6-pin power cable, and they do run cool, but I would suggest you wait a month or so, until the GF104-based Fermis are released (mid-July is the ETA). A few versions are expected, so one should do the trick.

ExtraTerrestrial Apes (volunteer moderator and tester)
Message 17596 - Posted: 13 Jun 2010 | 16:07:02 UTC - in response to Message 17497.

an ATI application was on the horizon ... note that I am still holding my breath ...


I suppose it still is. It's just difficult to touch that horizon to grab it ;)

MrS
____________
Scanning for our furry friends since Jan 2002

robertmiles
Message 17718 - Posted: 27 Jun 2010 | 21:35:14 UTC - in response to Message 17563.

Although you could get a cheap GT240, which would get more points and use less power (about 65 W), it is not a fantastic improvement (somewhere from 25% to 100% faster). It does not need a separate 6-pin power cable, and they do run cool, but I would suggest you wait a month or so, until the GF104-based Fermis are released (mid-July is the ETA). A few versions are expected, so one should do the trick.


Looks like a good plan. Thanks.

On a related subject: I've noticed that both my desktop computers have a few unused slots, just not the same type as the one the 9800 GT card plugs into. Could you suggest any way to determine whether any of them are a type some other Nvidia card can use, without first buying a card that MIGHT fit? If so, I'll start checking whether any of those other cards fit the power restrictions.

I have determined that they are not memory slots - all of the memory slots are already in use.
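One way to identify the slots without buying anything is to read the motherboard's own slot table. On Linux (for example from a live CD), `sudo dmidecode -t slot` lists every physical slot with its type and whether it is in use; the motherboard manual or a tool like CPU-Z gives the same information on Windows. A small sketch parsing dmidecode-style output (the sample text below is illustrative, not from a real board):

```python
import re

# Illustrative sample in the style of `sudo dmidecode -t slot` output.
sample = """\
Handle 0x0024, DMI type 9
        Designation: PCIEX16_1
        Type: x16 PCI Express
        Current Usage: In Use
Handle 0x0025, DMI type 9
        Designation: PCIEX1_1
        Type: x1 PCI Express
        Current Usage: Available
Handle 0x0026, DMI type 9
        Designation: PCI1
        Type: 32-bit PCI
        Current Usage: Available
"""

def free_slots(dmi_text):
    """Return (type, designation) for every slot marked Available."""
    slots = []
    for block in dmi_text.split("Handle ")[1:]:
        desig = re.search(r"Designation: (.+)", block).group(1)
        stype = re.search(r"Type: (.+)", block).group(1)
        usage = re.search(r"Current Usage: (.+)", block).group(1)
        if usage == "Available":
            slots.append((stype, desig))
    return slots

print(free_slots(sample))
```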

liveonc
Message 17720 - Posted: 27 Jun 2010 | 22:09:27 UTC - in response to Message 17718.
Last modified: 27 Jun 2010 | 22:11:44 UTC

Although you could get a cheap GT240, which would get more points and use less power (about 65 W), it is not a fantastic improvement (somewhere from 25% to 100% faster). It does not need a separate 6-pin power cable, and they do run cool, but I would suggest you wait a month or so, until the GF104-based Fermis are released (mid-July is the ETA). A few versions are expected, so one should do the trick.


Looks like a good plan. Thanks.

On a related subject: I've noticed that both my desktop computers have a few unused slots, just not the same type as the one the 9800 GT card plugs into. Could you suggest any way to determine whether any of them are a type some other Nvidia card can use, without first buying a card that MIGHT fit? If so, I'll start checking whether any of those other cards fit the power restrictions.

I have determined that they are not memory slots - all of the memory slots are already in use.




What do they look like? If the computer uses PCI-e then there's no AGP (not that I've ever heard of such a combination). PCI-X, PCI, and PCI-E x1/x4/x8 are what they might be. I haven't heard of any Nvidia PCI-X GPU; a normal card won't fit in PCI-E x1 or x4 slots; and a PCI-E x8 slot looks the same as a PCI-E x16. If you do have an AGP slot you could use an ATI HD3850 AGP, if that works??? But since GPUGRID doesn't support ATI, not here... PCI was a long time ago; Nvidia stopped using it even before they stopped using AGP.
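The slot taxonomy above can be summarised as a rough compatibility table. This is a sketch with known exceptions (rare low-profile PCIe x1 cards exist, and some x4 slots are open-ended), not a definitive rule:

```python
# Which common 2010-era expansion slots could physically take a
# mainstream Nvidia GPU, per the taxonomy described above.
slot_fits_gpu = {
    "PCIe x16": True,
    "PCIe x8":  True,    # often the same physical connector as x16
    "PCIe x4":  False,   # too short for a full x16 card
    "PCIe x1":  False,   # too short, barring rare x1 cards
    "AGP":      False,   # no current Nvidia cards (ATI HD3850 AGP existed)
    "PCI":      False,   # abandoned for GPUs long ago
    "PCI-X":    False,   # server slot; no Nvidia GPUs made for it
}

print([slot for slot, fits in slot_fits_gpu.items() if fits])
```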

skgiven (volunteer moderator and tester)
Message 17721 - Posted: 27 Jun 2010 | 23:33:51 UTC - in response to Message 17720.

This is a GT220 by SPARKLE (due out soon) that should work here. It uses PCIe x1,


There is also a GT240 version. This looks like it could easily be altered into a low-profile card. So these could be useful for small PCs, PCs with small PSUs, and for people with one GT240 and a spare PCIe x1 slot (or two)!

link


