
Message boards : Multicore CPUs : Threadripper Review: AMD’s 16-core 1950X

Author Message
Profile Beyond
Avatar
Send message
Joined: 23 Nov 08
Posts: 1112
Credit: 6,162,416,256
RAC: 0
Level
Tyr
Message 47784 - Posted: 17 Aug 2017 | 20:34:58 UTC

"AMD’s Threadripper Rips Intel a New One

If Intel had any doubts about whether AMD could compete at the top of the HEDT space, they’re undoubtedly gone by now. Threadripper doesn’t just compete, it often leaves Intel eating dust. Across all of our application benchmarks, Threadripper wins 11 tests, loses five, and ties two. That’s a very solid set of performances, particularly for a company whose top-end CPU was sucking wind six months ago. Intel has already announced that it intends to launch 12, 14, 16, and 18-core processors by the middle of September, but with a top-end price tag of up to $2,000 even those chips will struggle to match Threadripper’s price/performance ratio.

There are, meanwhile, real questions about what to expect from Intel’s upcoming 18-core processors. We’ve asked the company if it will solder its high-end CPUs, but have yet to hear back from them. Given that Skylake-X is already pushing the limits of what paste can handle, the CPU giant would seem to have little option, but they’re playing mum on this point.

There will still be plenty of people who opt to stick with Intel and play a wait-and-see game with AMD. This even makes sense, depending on your business and market — if software costs are the bulk of your business expenses, the difference between Intel and AMD’s hardware prices aren’t going to have a huge impact on your bottom line. But if hardware costs have an impact on your bottom line — and for most of us, they do — AMD’s Threadripper is in a class of its own. From Ryzen 3 to Threadripper, AMD has redefined performance at every price point, to the benefit of consumers, businesses, and pretty much everybody — except, of course, Intel."

https://www.extremetech.com/computing/253955-threadripper-reviewed-amds-16-core-1950x-rips-intels-core-i9

PappaLitto
Send message
Joined: 21 Mar 16
Posts: 511
Credit: 4,672,242,755
RAC: 0
Level
Arg
Message 47789 - Posted: 18 Aug 2017 | 18:50:03 UTC

Any victory for AMD benefits us all

[VENETO] boboviz
Send message
Joined: 10 Sep 10
Posts: 158
Credit: 388,132
RAC: 0
Level

Message 47838 - Posted: 30 Aug 2017 | 19:17:18 UTC - in response to Message 47789.

Any victory for AMD benefits us all


Why? GPUGrid does NOT support any AMD products (GPU or CPU). So?

PappaLitto
Send message
Joined: 21 Mar 16
Posts: 511
Credit: 4,672,242,755
RAC: 0
Level
Arg
Message 47839 - Posted: 30 Aug 2017 | 22:42:12 UTC

AMD represents competition for Nvidia and Intel, forcing both to lower prices and increase R&D budgets, speeding up the progress toward faster, cheaper processors.

Toni
Volunteer moderator
Project administrator
Project developer
Project tester
Project scientist
Send message
Joined: 9 Dec 08
Posts: 1006
Credit: 5,068,599
RAC: 0
Level
Ser
Message 48522 - Posted: 28 Dec 2017 | 9:28:55 UTC - in response to Message 47839.

CPU tasks make no distinction between AMD and Intel. Both will work.

mmonnin
Send message
Joined: 2 Jul 16
Posts: 337
Credit: 7,621,228,413
RAC: 10,749,388
Level
Tyr
Message 48523 - Posted: 28 Dec 2017 | 12:52:46 UTC - in response to Message 48522.

I would use my 1950x to test these, but GPUGrid won't add a second, separate computer to my profile to allow for separate project preferences. I keep GPU work and CPU work on separate clients. Other projects have no issues with this.

Josh Weaver
Send message
Joined: 16 Dec 17
Posts: 1
Credit: 1,739,050
RAC: 0
Level
Ala
Message 48529 - Posted: 28 Dec 2017 | 20:40:57 UTC - in response to Message 48523.

I have a 1920x and only half of the cpu is being used by the various projects on here. While that half seems to do very well, I would love to use the other half! Granted I am new to this but having a science background, I was very excited to be able to go from mining other coins to this. I planned on moving some more graphics cards over here but I have not learned how to transfer the gridcoins yet.


PappaLitto
Send message
Joined: 21 Mar 16
Posts: 511
Credit: 4,672,242,755
RAC: 0
Level
Arg
Message 48530 - Posted: 28 Dec 2017 | 20:46:32 UTC - in response to Message 48529.

I have a 1920x and only half of the cpu is being used by the various projects on here. While that half seems to do very well, I would love to use the other half! Granted I am new to this but having a science background, I was very excited to be able to go from mining other coins to this. I planned on moving some more graphics cards over here but I have not learned how to transfer the gridcoins yet.

What do you mean, transfer the gridcoins? All BOINC projects earn Gridcoin; there is no need to move anything. Every contribution from every project goes into your wallet.

mmonnin
Send message
Joined: 2 Jul 16
Posts: 337
Credit: 7,621,228,413
RAC: 10,749,388
Level
Tyr
Message 48541 - Posted: 29 Dec 2017 | 13:27:40 UTC - in response to Message 48529.

I have a 1920x and only half of the cpu is being used by the various projects on here. While that half seems to do very well, I would love to use the other half! Granted I am new to this but having a science background, I was very excited to be able to go from mining other coins to this. I planned on moving some more graphics cards over here but I have not learned how to transfer the gridcoins yet.

Do you mean BOINC has a task assigned to every thread, but CPU usage is at only half? If so, use an app_config so each task uses 4 threads; more tasks can then run at once and utilize the CPU more efficiently.
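To illustrate the app_config idea, here is a minimal sketch of an app_config.xml. It goes in the project's directory under the BOINC data folder; "example_app" is a placeholder, since the real app name has to be copied from client_state.xml:

```xml
<app_config>
  <app_version>
    <!-- app_name must match the name in client_state.xml; "example_app" is a placeholder -->
    <app_name>example_app</app_name>
    <plan_class>mt</plan_class>
    <!-- Reserve 4 threads per task so several tasks can run side by side -->
    <avg_ncpus>4</avg_ncpus>
    <cmdline>--nthreads 4</cmdline>
  </app_version>
</app_config>
```

After saving the file, use "Options > Read config files" in the BOINC Manager (or restart the client) for it to take effect.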

3de64piB5uZAS6SUNt1GFDU9d...
Avatar
Send message
Joined: 20 Apr 15
Posts: 285
Credit: 1,102,216,607
RAC: 0
Level
Met
Message 48542 - Posted: 29 Dec 2017 | 14:18:56 UTC - in response to Message 48529.
Last modified: 29 Dec 2017 | 14:19:52 UTC

I planned on moving some more graphics cards over here but I have not learned how to transfer the gridcoins yet.


My apologies for posting this one off topic...

Use this cryptocurrency exchange for changing GRC to BTC or USD.
https://c-cex.com/
____________
I would love to see HCF1 protein folding and interaction simulations to help my little boy... someday.

klepel
Send message
Joined: 23 Dec 09
Posts: 189
Credit: 4,719,291,465
RAC: 2,158,474
Level
Arg
Message 48543 - Posted: 29 Dec 2017 | 15:43:58 UTC - in response to Message 48542.
Last modified: 29 Dec 2017 | 15:46:02 UTC

Use this cryptocurrency exchange for changing GRC to BTC or USD.
https://c-cex.com/

This exchange never worked for me! I use:
https://poloniex.com/ or https://bittrex.com/

You might as well post your questions on the GRIDCOIN thread. There you can get help.

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Message 49146 - Posted: 9 Mar 2018 | 2:00:39 UTC

This is an old thread but I was wondering how others are doing with their Threadrippers? I have a 1920x and my 1080's take longer than my 1070 on my intel rig.

I'm thinking it must be a BIOS setting, but there's a plethora of CPU and bus settings. Anybody got any ideas or experiences?

mmonnin
Send message
Joined: 2 Jul 16
Posts: 337
Credit: 7,621,228,413
RAC: 10,749,388
Level
Tyr
Message 49147 - Posted: 9 Mar 2018 | 15:25:26 UTC

I run GPUGrid on my 1070Ti and 1070 in my 1950x system. The 1070Ti is def faster than my 1070.

1070Ti at 14.3k seconds at 2x tasks.
http://www.gpugrid.net/result.php?resultid=17240094

1070 at 15.3k seconds at 2x tasks.
http://www.gpugrid.net/result.php?resultid=17240112

This is a 111.3k task of mine that took 23.7k seconds at 2x.
http://www.gpugrid.net/result.php?resultid=17237151

This is a 111.3k task of yours that took 16k seconds
http://www.gpugrid.net/result.php?resultid=17237258

This is a 111.3k task of yours on the 1080 that took 21.6k seconds. So it is indeed slower, assuming it's running the same number of tasks at once.
http://www.gpugrid.net/result.php?resultid=17232057

Granted, yours is in Win10 and mine in Ubuntu. Are you running 1x by chance? I am at about 1.95 GHz with a slight OC beyond boost clocks. The 1070Ti is around 1.9 GHz boost clock.

What slot is it in and is it getting 16x lanes? There are 64 available but some can be used for M.2/U.2 SSDs if you have any. Some boards start dropping cards to 8x after 3 cards. I'm not sure what kind of performance hit that incurs at GPUGrid.

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Message 49149 - Posted: 10 Mar 2018 | 0:05:47 UTC

I think I figured it out: I was running 20 climate models at the same time. When I cut back to 8, my compute load jumped from 52% to over 90% on each card. I have two 980 Ti and two 1080 FTW2 cards.

My times seem more normal now. I guess CPDN doesn't like those virtual cores, which is a real shame; it would have been nice to use them. They have some real screwy rules over there, too: they took away 3 years of credits from me.

That's why I got all pissy when they started messing with my badges the other day. They have some law in the UK where they can't keep personal data over 3 years, so they dumped all my credits and forum posts. They've done it to quite a few people.

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Message 49150 - Posted: 10 Mar 2018 | 2:36:34 UTC

Hey mmonnin, thank you for doing all that research for me. It really is appreciated. I'm using an Asus ROG Zenith Extreme motherboard and I saw a setting in the BIOS for gaming, I think I might turn that off for better multithreaded performance.


mmonnin
Send message
Joined: 2 Jul 16
Posts: 337
Credit: 7,621,228,413
RAC: 10,749,388
Level
Tyr
Message 49151 - Posted: 10 Mar 2018 | 14:23:55 UTC - in response to Message 49150.

Hey mmonnin, thank you for doing all that research for me. It really is appreciated. I'm using an Asus ROG Zenith Extreme motherboard and I saw a setting in the BIOS for gaming, I think I might turn that off for better multithreaded performance.

No problem. A high GPU utilization is a quick way to judge whether the cards are being fed enough. Since you're on Windows, you can install Process Lasso to keep the CPU affinity of the CPDN tasks on one set of threads and the GPUGrid exe on a separate set. I do this at all times on my Win7 computer to separate my GPU tasks that need CPU from other CPU tasks. It really helps to keep the CPU loaded and the GPUs utilized. Affinity settings persist with each new task and past reboots.
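Process Lasso is Windows-only; for anyone running a Linux cruncher, the same pinning idea can be scripted with Python's os.sched_setaffinity (a Linux-specific call; the PID and CPU choices below are just illustrative):

```python
import os

def pin_to_cpus(pid, cpus):
    """Restrict a process (pid 0 = the calling process) to the given set of CPUs."""
    os.sched_setaffinity(pid, cpus)   # Linux-only; raises OSError on other platforms
    return os.sched_getaffinity(pid)  # read back the effective affinity mask

# Example: pin this process to its first available CPU, then restore the original mask.
original = os.sched_getaffinity(0)
pinned = pin_to_cpus(0, {sorted(original)[0]})
os.sched_setaffinity(0, original)
```

The same function could pin a BOINC science app by passing its PID instead of 0, though unlike Process Lasso this would have to be reapplied for each new task.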

flashawk
Send message
Joined: 18 Jun 12
Posts: 297
Credit: 3,572,627,986
RAC: 0
Level
Arg
Message 49152 - Posted: 10 Mar 2018 | 21:09:16 UTC

I'll check that program out, thanks. I found one called AMD Ryzen Master that has some interesting settings in it; it's for both socket AM4 and sTR4 motherboards and CPUs.

I just wish there was a way to use more of the virtual cores without degrading the performance of my GPUs.
