
Message boards : Number crunching : Will an Athlon 64 X2, 3800+ 2000 MHz drive 2 x GTX 670's?

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25313 - Posted: 28 May 2012 | 16:30:42 UTC

I have an AMD Athlon 64 X2 3800+ (Manchester) running Win7 x32.

Is that CPU sufficient to drive 2 x GTX 670's?

It will only run GPUGrid tasks.

How much GPU performance, if any, is lost if this CPU can't drive them at their max?

I'm considering an 850 watt power supply and not planning on OC'ing.

Thanks

Ken Florian

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25314 - Posted: 28 May 2012 | 17:09:54 UTC

Need to know the motherboard to be specific.

BUT I can say this: you will lose A LOT. Not only will the PCIe lanes be 1.x; the CPU @ 2.0 GHz and the slow RAM will also hurt.
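
For scale, the published per-lane PCIe rates put rough numbers on the bus difference; a back-of-envelope sketch in Python (theoretical peaks only; the real-world hit depends on how much the app actually uses the bus):

    # Theoretical one-way PCIe bandwidth: 250 MB/s per lane for PCIe 1.x,
    # 500 MB/s per lane for PCIe 2.0 (published spec figures).
    def pcie_gb_per_s(mb_per_lane, lanes):
        return mb_per_lane * lanes / 1000.0

    print(pcie_gb_per_s(250, 16))  # PCIe 1.x x16 -> 4.0 GB/s
    print(pcie_gb_per_s(250, 8))   # PCIe 1.x x8 (nForce4 SLI mode) -> 2.0 GB/s
    print(pcie_gb_per_s(500, 16))  # PCIe 2.0 x16 -> 8.0 GB/s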

850W is also overkill.

Post full system specs and you may get a better answer.

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25315 - Posted: 28 May 2012 | 18:17:04 UTC - in response to Message 25314.
Last modified: 28 May 2012 | 18:32:24 UTC

Hi 5pot,

I have access to two each of the following. I'd been hoping I could coerce them into service with an upgraded PSU and GPU...but it sounds like both are too far gone to be of much use...certainly not worth the $500 to $600 per box to bring the GPU and PSU up to current standards.

Curious to know your further thoughts.


• AMD Phenom Quad Core 9500
• XFX GeForce 8200, 1066
• WinXP x64
• 8 GB RAM

• AMD Athlon 64 X2 3800+
• Gigabyte GA-K8N Pro-SLI
• Win7 x32
• 2 GB RAM

[edit - Win7 x64 to WinXP x64]

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25316 - Posted: 28 May 2012 | 21:03:59 UTC - in response to Message 25315.

Both systems could support a GPU, but putting two GTX 670's into a 2 GHz Athlon dual-core system would be very unbalanced. The cards' performance would suffer from lack of PCIe bandwidth, CPU power and memory frequency. I expect you would lose at least 20% performance per card, quite possibly twice that.
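
Putting that estimate into simple arithmetic (only the loss figures above go in):

    # Combined throughput of two cards, each losing 20-40% in a weak host,
    # relative to a single unconstrained card.
    for loss in (0.20, 0.40):
        print(f"{loss:.0%} loss per card -> {2 * (1 - loss):.1f}x one full-speed card")

Even at the pessimistic end the pair still out-produces one full-speed card, but falls well short of doubling it.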

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25324 - Posted: 29 May 2012 | 1:34:00 UTC - in response to Message 25316.
Last modified: 29 May 2012 | 1:57:35 UTC

How about 1 GPU per box...still not worth it because of the diminished performance?

The desire to put in two GPUs comes only from wanting to contribute as much as possible.

Maybe better to build a real cruncher over time.

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25327 - Posted: 29 May 2012 | 7:41:36 UTC - in response to Message 25324.

Maybe get one card, test it in both systems, and compare run times against other performances here.

Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25330 - Posted: 29 May 2012 | 12:07:01 UTC - in response to Message 25324.

Overclock your CPUs. You should get 40% more out of them (2.0 GHz to roughly 2.8 GHz) without much effort. The increased performance will help keep your GPUs busy.

Even with stock CPU coolers, the AMDs should have plenty of headroom.

Let us know how it goes.


____________
Thx - Paul

Note: Please don't use driver version 295 or 296! Recommended versions are 266 - 285.

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25335 - Posted: 29 May 2012 | 18:06:03 UTC - in response to Message 25330.

Hi Paul,

I will update this post as I learn more.

One thing I do not know how to interpret...the Athlon is running one 550 Ti on GPUGrid. Nothing else, except default Windows services, is running.

swan_sync = 0
boinc 7.0.25

The GPUGrid WU shows 0.451 CPUs + 1 NVIDIA GPU.

What is the correct interpretation of "0.451"? Does it mean that the core is working as hard as the GPU needs it to, or would that number be higher or lower with a more capable CPU?

Thank you for taking the time to help.

Ken Florian

TheFiend
Joined: 26 Aug 11
Posts: 99
Credit: 2,500,112,138
RAC: 0
Message 25346 - Posted: 30 May 2012 | 7:32:29 UTC - in response to Message 25335.


The GA-K8N Pro-SLI is based on the standard nForce4 SLI chipset, which means it has two 8x PCI-E slots, although one slot can be configured to run with the full 16 lanes if you don't have SLI. Unfortunately, the two high-speed PCI-E slots are only separated by one of the two 1x PCI-E slots, and you also have to manually flip an SO-DIMM card when you want to enable or disable SLI - most motherboards now allow you to do this in the BIOS.


Not a good board for running dual GFX cards... but you could configure it to run one slot @ x16.

Instead of 2 x 670, why not run 1 x 680?

Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25409 - Posted: 1 Jun 2012 | 1:20:46 UTC - in response to Message 25335.

Ken:

That is a great question. I usually see something like 0.51. I don't know what this really means.

Does anyone else have a clue about this information? What does that 0.41 or 0.51 mean?

thanks
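
For reference: the figure appears to be BOINC's avg_ncpus value for the app version, the scheduler's estimate of the average fraction of one CPU core the GPU task consumes; it is used for scheduling decisions, not enforced as a cap. A minimal sketch for reading it out of client_state.xml, assuming a default Windows install path:

    # Sketch: list the average CPU usage BOINC records per app version.
    # Path and tag names follow the standard BOINC client layout (worth
    # verifying on your own install).
    import xml.etree.ElementTree as ET

    tree = ET.parse(r"C:\ProgramData\BOINC\client_state.xml")
    for av in tree.getroot().iter("app_version"):
        ncpus = av.findtext("avg_ncpus")
        if ncpus:
            print(av.findtext("app_name"), ncpus)  # e.g. acemd2 0.451

Since it is a declared estimate, the displayed fraction need not match what Task Manager shows on a given host.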

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25410 - Posted: 1 Jun 2012 | 1:30:39 UTC - in response to Message 25346.

After several days of refreshing my memory on things like actual electricity costs and the cost of building a temporarily modern crunching machine, I am leaning toward saying thanks but no thanks to inheriting this set of 5-to-7-year-old machines. The emerging plan is to build one machine each year for the next couple of years to dedicate to grid-computing-based research.

I'm going to keep my one GPUGrid machine running for now...might as well...and I might be able to bump my contribution in a significant way in the coming weeks.

Ken

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25412 - Posted: 1 Jun 2012 | 2:06:13 UTC - in response to Message 25409.

I continue to refine my thinking...hence the question:

Assuming RAM, PSU, and mobo are up to the task...what CPU is needed to get the most out of 2 GTX 670's or 680's in one physical machine?

For instance, does a 3930K enable the GTX cards to deliver more than a 3770 would?

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25439 - Posted: 2 Jun 2012 | 10:28:44 UTC - in response to Message 25412.
Last modified: 2 Jun 2012 | 11:56:46 UTC

Assuming RAM, PSU, and mobo are up to the task...what CPU is needed to get the most out of 2 GTX 670's or 680's in one physical machine?
For instance, does a 3930K enable the GTX cards to deliver more than a 3770 would?

In terms of CPU performance (which supports the GPU), there is little out-and-out difference between any 32 nm or 22 nm CPU, clock for clock, so one isn't going to inherently outperform the other. Take the 3770K, for example: it only outperforms an i7-2700K by 3.7% on average, and the 3960X can't match an i7-990X for crunching most CPU projects.

The more recent CPUs do offer some energy efficiency, but to some extent this is a gimmick and obfuscates the reality. The IB CPUs are less dense core-wise (4 cores compared to the 6 of an i7-980); add the energy requirements of all the other system components, compare the work done per watt, and IB doesn't add up for a big crunching rig. While IB does bring the PCIe bridge onto the CPU, which should improve performance for some GPU projects, it is really only suited to a 1-GPU rig.

If you want to add a second GPU (and to answer your question), you would be better off with a socket 2011 board and a CPU such as a 3820 or 3930K (depending on your finances). That way you can increase GPU density without the PCIe performance loss, which should prove more power-efficient overall (performance per watt).

Fermi cards are PCIe 2, so a CPU/board combo supporting PCIe 3 offers them no PCIe advantage over a PCIe 2 system. Some CPU/motherboard combinations are also very restrictive with PCIe lanes, which makes socket 1366 a better option for multi-GPU PCIe 2 setups; IB and SB don't have enough PCIe lanes, so the more cards you add, the greater the loss per card.
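
As a concrete illustration of the lane budgets involved, a sketch using the published CPU lane counts (the x8/x8 split is the usual two-card arrangement on SB/IB boards):

    # Two cards sharing the CPU's PCIe lanes: mainstream SB/IB exposes 16,
    # LGA2011 (SB-E) exposes 40, so only the latter keeps both cards at x16.
    for cpu, lanes in [("i7-3770, LGA1155", 16), ("i7-3820/3930K, LGA2011", 40)]:
        per_card = min(16, lanes // 2)
        print(f"{cpu}: two GPUs at x{per_card} each")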

Anyway, this is uncharted territory; we don't know for sure whether IB actually improves app/task performance, we are not using a Kepler app yet, and we haven't compared a full task over PCIe 3 vs PCIe 2 for Kepler cards here...

Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25443 - Posted: 2 Jun 2012 | 11:25:26 UTC - in response to Message 25439.

One of my systems is a Q6600 @ 3 GHz that keeps a GTX 590 at 98% utilization on Windows XP. On a second system, a Q6600 @ 3.2 GHz is feeding 2 GTX 570s and they stay busy.

Your question is a good one in that some amount of CPU is required to "feed" the GPU data, but you will likely find the requirements are far less than you expect.

An i7-920 @ 3.7 GHz keeps 2 GTX 580s working 24x7. You need not go crazy on the CPU. Just get the 670 running with some CPU and get it attached to the project. We need the help!

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25460 - Posted: 2 Jun 2012 | 21:24:12 UTC - in response to Message 25443.

skgiven and Paul,

Thanks. You are being a tremendous help as I think about how to get involved in a bigger way.

Out of curiosity, I moved the 550 Ti from the Athlon / Gigabyte (which had been set to non-SLI mode via the SO-DIMM) to an Intel DP35DP / Core 2 Duo E8200. The GPUGrid task now reads 0.364 CPUs + 1 NVIDIA GPU.

The Athlon typically showed a GPU load of about 90%...the E8200 looks to be running it at about 94%. These are very informal observations, not recorded.
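
Those eyeball readings can be pinned down by logging the load; a quick sketch, assuming the installed driver's nvidia-smi supports the query flags (GPU-Z shows the same figure interactively):

    # Sample GPU utilization once a second for a minute via nvidia-smi.
    import subprocess, time

    for _ in range(60):
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"])
        print("GPU load:", out.decode().strip() + "%")
        time.sleep(1)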

I just found skgiven's post about Win7 running about 11% slower than XP...very interesting and relevant to my project.

Ken

Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25461 - Posted: 2 Jun 2012 | 21:47:38 UTC - in response to Message 25460.

My Windows XP system typically keeps the GPUs at 97% utilization. My Windows 7 systems are usually at 89%. This is with the same processor and same type of GPU.

Windows XP and Linux clearly provide the greatest utilization while Windows 7 appears to be the most common OS.


wiyosaya
Joined: 22 Nov 09
Posts: 114
Credit: 589,114,683
RAC: 0
Message 25463 - Posted: 3 Jun 2012 | 3:04:06 UTC - in response to Message 25461.
Last modified: 3 Jun 2012 | 3:10:05 UTC

My Windows XP system typically keeps the GPUs at 97% utilization. My Windows 7 systems are usually at 89%. This is with the same processor and same type of GPU.

Windows XP and Linux clearly provide the greatest utilization while Windows 7 appears to be the most common OS.

Looks like you've got one Q6600 system running, a much older CPU that handles much newer GPUs with little difficulty.

My $0.02 follows:

While gaming performance would probably be hurt by running the original poster's GPUs in the suggested machines, I wonder whether GPU performance for applications like GPUGrid would be hurt much, if at all. The one question I pose is: exactly where do all the GPUGrid calculations and memory accesses take place?

If those calculations and memory accesses happen on the GPU itself, then the proposed GPUs would most likely run fine in the systems the original poster suggested, especially since they would be dedicated to GPUGrid. However, if there were a significant amount of transfers between main system memory and the GPU card, that, as I see it, would tend to hamper performance, even in an application like GPUGrid.

I don't see GPUGrid applications reaching a level of main-memory transfers anywhere near as intensive as modern games, though, and since these systems would be dedicated entirely to GPUGrid, they should work out just fine (assuming the motherboards will actually run the GPUs); the difference in performance would likely be minimal.
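
A toy model of that argument, with made-up but plausible inputs (every number here is an assumption, not a measured GPUGrid figure):

    # If per-step host<->GPU traffic is small relative to on-GPU compute
    # time, bus generation barely moves the needle.
    traffic_mb = 2.0   # assumed data moved over the bus per simulation step
    compute_ms = 50.0  # assumed on-GPU compute time per step

    for label, gb_s in [("PCIe 1.x x16", 4.0), ("PCIe 2.0 x16", 8.0)]:
        transfer_ms = traffic_mb / gb_s  # 1 GB/s moves 1 MB per ms
        share = transfer_ms / (transfer_ms + compute_ms)
        print(f"{label}: {transfer_ms:.2f} ms on transfers, {share:.1%} of step time")

Under those assumptions the bus accounts for about 1% of step time even on PCIe 1.x, which is consistent with the first-hand results below.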

Here is my first-hand experience running a modern GPU in a much older system and in a latest-and-greatest system:

I have one system running an Opteron 1220 (the same generation as the Athlon 64 X2 from the original poster) on a Gigabyte GA-MA770-DS3 motherboard, and it was quite capable of running the GTX 580 I just bought. There seems to be little, if any, difference in GPUGrid performance between the 580 in that system and the 580 in its new home, an i7-3820 on an ASRock X79 Extreme6 motherboard.

The difference in main-memory bandwidth between these two motherboards is extreme, to say the least, yet it seems to make no difference for GPUGrid: WUs on the 3820 system finish in about the same time as they did when the 580 was in the Opteron system.

All the best!

Rayzor
Joined: 19 Jan 11
Posts: 13
Credit: 294,225,579
RAC: 0
Message 25528 - Posted: 6 Jun 2012 | 14:44:40 UTC

Ken, for an older technology, why not use older GPU's?

I have a similar CPU: Computer ID 124381, an Intel Core 2 Duo E6320 @ 1.86 GHz backed by 1 GB RAM, with a GTX 275 stuffed inside. This PC runs Windows XP Professional x86 Edition, Service Pack 3 (05.01.2600.00). Current RAC is about 110,000 as of 6/6/2012. One core is dedicated to the GPU (Swan_Sync, Variable Value: 0). When its second core is in use, busy running SETI, the performance of the GTX 275 on GPUGRID does not suffer: it performs the same whether CPU usage is at 50% or at 100%. When this card was used in a more modern PC (Core i3 530, Windows 7, 4 GB RAM), its RAC was a good 10,000 to 12,000 lower than with this PC's E6320, showing once again that for GPUGRID it is the operating system, XP, more than the CPU, that beats Windows 7.

To answer your question, I think it could handle 2 of these ancient GTXs at the same time by utilizing both of its cores, though I can only speculate about the newer Keplers. The PCIe 1.0 architecture of my motherboard will hold back the performance of newer GPUs, but it seems compatible with the ancient 200-series GTXs. From searching through the RAC of the top 200 computers on GPUGRID, I think it could handle a Fermi as well.

BTW, this PC was built from leftover parts; the cost was nil. Rather than keep it retired in the closet, I figured, why not put it to use for scientific projects? It might have half the RAC of a modern PC, but it is contributing to this project.

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25540 - Posted: 7 Jun 2012 | 1:48:47 UTC - in response to Message 25528.

Folks,

This is all great thinking and helpful input. The analysis I'm meandering through is best performance per dollar spent, all-inclusive over the usable lifetime of the equipment. "All-inclusive" means electricity costs for 6 years plus purchases of new equipment, if any. I want to support GPUGrid and probably WCG.

I am an old hand at building my own PCs, but I am very new to understanding GPU/CPU crunching optimized for research. Between this thread and http://www.gpugrid.net/forum_thread.php?id=2998, it is looking like building a single box for hybrid GPU/CPU crunching may not be the most cost-effective solution.

For instance, reserving 1 core of a 3930 for 1 GTX 6x0 seems 'wasteful' if a GTX 6x0 could be driven to its limit by, say, a "modern" i3 CPU.

One angle I have been considering is a 3930 CPU with 2 GTX 6x0's in it for GPUGrid and a couple of CPU cores for WCG.

I am all for using old gear in whatever way possible, but when I look at the annual electricity cost of, say, the Phenom at 24x7 compared to the 3820, and the amount of 'research' each could deliver, the cost to keep the Phenom running compared to buying new seems silly.

No conclusions...just musings and thanks to all.

Ken

Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25541 - Posted: 7 Jun 2012 | 2:09:56 UTC - in response to Message 25540.

You make some great points, and the cost of electricity can be a factor in some parts of the world. In Tennessee, we pay about $0.08 per kWh. If the entire computer draws 1 kW, that is 168 kWh a week or 732 kWh a month, which comes to about $58 per month per computer. It will take a while to save enough for a new rig at that rate. In addition, Intel and AMD will continue to innovate and find more efficient ways to complete more instructions per second, so slap something together and get started.
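
The same arithmetic as a reusable snippet (the rate and wattage are the figures quoted above; substitute your own):

    # Monthly electricity cost of a rig crunching 24x7.
    def monthly_cost(watts, price_per_kwh, hours=732):
        return watts / 1000.0 * hours * price_per_kwh

    print(f"${monthly_cost(1000, 0.08):.2f}")  # 1 kW at $0.08/kWh -> $58.56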

You will learn a lot about your builds in the process. I never knew how many hotspots my systems had until I started overclocking GTX 570s and 580s. These things can generate some heat. Learn on the older stuff so you can apply your new skills on your new purchases. Don't learn with a $1,000 GTX 690. Ouch!

You need not dedicate a processor to each GPU. Just get a rig running and play around with the settings. I use Swan_Sync, but I continue to let BOINC have 100% of the processors, which means Rosetta@home continues to run one process per core. My GPUs are seldom starved for data.

We look forward to whatever you decide to do to contribute to the project!


Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25543 - Posted: 7 Jun 2012 | 2:45:39 UTC - in response to Message 25541.

Thanks Paul!

My one 550 Ti is running now on an Intel E8200...no plan to turn it off...just deciding where to go next. "Negotiations" with my Minister of Finance went astonishingly well. We were due to upgrade our contributions to the common good...here is a fine place to do so.

Ken

Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25552 - Posted: 7 Jun 2012 | 12:19:40 UTC - in response to Message 25543.

Could you provide some tips on your negotiation techniques? One of the issues I have is the fact that all "parts" purchases need to be approved. Something about how we spend too much money on my hobby, bla bla bla.

I try to buy almost everything on craigslist or ebay to keep the costs down and I still have a few thousand invested in hardware. My electric bill, well, it is always a topic of discussion.

Glad to hear your negotiations went well and we look forward to a multi GTX 690 rig in your future!

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25565 - Posted: 8 Jun 2012 | 0:08:13 UTC - in response to Message 25552.

Could you provide some tips on your negotiation techniques?


Paul,

I'd tell you but I'd have to kill myself...the intrigue...the temptation to fudge the numbers...the forlorn eyes...and 27 years of pitches that sound dreadfully similar...skills are developed after so many years.

But seriously, as the Finance Minister quickly realized, GPUGrid (and others) are a perfect intersection of my computer hobby and meaningful philanthropy. I'd been an early contributor to SETI but, somewhere in the midst of many dozen OS reinstalls, it got lost in the shuffle.

When I "re"-discovered GPUGrid, I approached the, ahem, Minister of Finance with a proposal to re-examine the Commonwealth's charitable contributions. To fund GPUGrid I proposed to shift a few coins from some institutions we'd long funded and increase our overall monthly contribution. The "pitch" was in the total monthly contribution to the arts, sciences, and humanitarian causes. I believe the key was presenting this as amortized over the probable usable lifetime of the equipment. Much like a car salesperson (okay, exactly like a car salesperson), I prostrated myself on the "it's really the monthly payment you should care about. What if I could get your monthly payment down to $X? Would that make it affordable for you?"

Very nearly shameless or shameful depending on one's point of view. If you'd like the fairly decent spreadsheet I built to make the case, PM me and I'll email it to you.

Ken

p.s. It probably wouldn't help to point out that the probable useful life of a GTX 690 is not 15 years...details...details.

Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25566 - Posted: 8 Jun 2012 | 2:36:27 UTC - in response to Message 25565.

Ken:

Do you have a blog? I would love to follow it. You are a funny guy.

I have one open slot in one of my computers and I will likely save up some cash to buy a GTX 590.

My first order of business is to get my existing 590 to run for more than 24 hours.

Keep crunching!

TheFiend
Joined: 26 Aug 11
Posts: 99
Credit: 2,500,112,138
RAC: 0
Message 25568 - Posted: 8 Jun 2012 | 6:36:26 UTC

I use a sneaky approach to get round my other half: what she doesn't see she doesn't know about. I schedule parts deliveries to arrive whilst she is out at work; I work a rotating shift pattern, so orders are timed to arrive when I'm off and she is at work. My cases stay the same, but what's inside changes.

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 25569 - Posted: 8 Jun 2012 | 8:37:52 UTC - in response to Message 25568.

Thoroughly fiendish.


TheFiend
Joined: 26 Aug 11
Posts: 99
Credit: 2,500,112,138
RAC: 0
Message 25570 - Posted: 8 Jun 2012 | 8:50:15 UTC

Fiendish? Of course... but then so is she... her growing collection of shoes is testament to that... she is a budding Imelda Marcos!!!

Going to be sticking with my old Phenom IIs for a while; switching to Bulldozer would lose performance for my systems. So my next upgrade will probably be a GPU, most probably a 570 or 670 for my second hex-core rig.

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25579 - Posted: 8 Jun 2012 | 19:36:04 UTC - in response to Message 25566.

Do you have a blog? I would love to follow it. You are a funny guy.


Paul,

Thanks for the kind words. I enjoy writing, and occasionally the words come out just so. I've never earned a dime for writing. I had a blog for a while but lost interest because of a lack of interest. Maybe it's time to revive it.

Let it be known that IF the Minister of Finance read my post, a resigned "Why didn't you tell them about the water-torture-like nature of your incessant 'pitches'?" would have been muttered.

Bad news on the 590? Do you suspect malfeasance by the ebay seller or have you tried to push it too hard?

Ken

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25580 - Posted: 8 Jun 2012 | 19:41:19 UTC - in response to Message 25568.

I schedule parts deliveries to arrive whilst she is out at work.



Hi TheFiend,

I'm glad you mentioned this method. I was too self-conscious to state it in my other post, since I was trying to take the moral high road. However, truth be told, my so-called moral high road may be more lamentable, because the end is the same...more gear...while it hides behind altruistic language.

In the pursuit of science, I remain, your co-conspirator.

Ken

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25583 - Posted: 8 Jun 2012 | 21:18:20 UTC
Last modified: 8 Jun 2012 | 21:22:06 UTC

Great posts, everyone, regarding the Minister of Finance. Found them very funny and true...

kflorian:

In case you're looking for a good case as well: mine is a Corsair Graphite 600T, and I have found it excellent for GPU crunching, since you can add intake fans that pull air in from outside and send it directly to the GPUs. There are a total of 4 slots on the side, each of which can take a 120mm fan. I prefer radiator-type fans with curved blades for higher static pressure, which lets them pull more air through those holes. I'm currently using this method to cool a 680 and a 670. There is also one 200mm fan on top exhausting hot air, plus another 200mm fan at the front of the case drawing air in. The knob at the top of the case is a fan controller, which can come in handy if noise is an issue.

For what it's worth, currently my system:

3820 @ 4.3 GHz
Gigabyte 670 stock speeds
EVGA 680 stock
ASUS Sabertooth
120 GB SSD
H70 CORE w/ a push/pull config
Seasonic X Gold? 750W PSU

This setup currently uses 550W at the wall. I do have one regret: not getting the H100. Temps are not an issue AT ALL, but I still wish I had forked over the extra $30. I am also going to switch the 670 to another rig when my 2nd 680 arrives. Those things are hard to find...

Ken Florian
Joined: 4 May 12
Posts: 56
Credit: 1,832,989,878
RAC: 0
Message 25586 - Posted: 9 Jun 2012 | 0:58:07 UTC - in response to Message 25583.

5pot,

Thanks for the config specs. I like that case, too. I've been eyeing the Antec P280. What are your typical ambient temperatures in the room where the machine is located?

Ken

Paul Raney
Joined: 26 Dec 10
Posts: 115
Credit: 416,576,946
RAC: 0
Message 25589 - Posted: 9 Jun 2012 | 3:21:30 UTC - in response to Message 25586.

Ken:

The GTX 590 is fine. The entire rig needs to be adjusted. A few changes tonight and it is running again at 662 MHz. We will see how many hours we get this time. This XP system has been around for a while. Keeping 2 GPUs running is almost twice the work of keeping 1 running.

GTX 580s are down to $200 on ebay. I am starting to think my 570s need to be upgraded :-)

5pot
Joined: 8 Mar 12
Posts: 411
Credit: 2,083,882,218
RAC: 0
Message 25613 - Posted: 10 Jun 2012 | 20:26:05 UTC

@kflorian:

Currently my ambient temps are 24C. I recently got another 680, so both are now in this rig crunching away for Einstein@Home while waiting for GPUGrid to open up. The top GPU is at 60C w/ 67% fan and the bottom one is at 57C w/ 64% fan; both hover around 87% utilization. I must have misread my watts before, because my APC UPS (uninterruptible power supply) is currently showing roughly 450W. HIGHLY SUGGEST getting one to protect your investment. If you do decide to go with 3 670/680s, I would say get an 850W PSU for optimum efficiency.

[Photos: the three fans in an L shape attached to the case, and the two cards and the system inside.]
