ExtremeHW Community Folding Project



I'm jealous. I kind of want to build a dedicated rack-mount folding PC now, just a Ryzen 3 and a GTX 1070 or something, dedicated to folding away.

 

Don't be jealous, just send all your GPUs my way :p

 

I am toying with the GTX 1070 or a similar AMD GPU; I need to see which AMD GPU is comparable to the GTX 1070 in folding, just to compare costs.


I would definitely go with NVIDIA GPUs. As it stands, F@H is quicker on NVIDIA GPUs, and if we see CUDA-capable core_22 WUs in the future, the gap is only going to get bigger. The upstream project (OpenMM) has CUDA builds, and the performance is much better than OpenCL. I don't have any inside knowledge of whether the devs at F@H are going to push out a new core with CUDA support, but if maximum performance is the goal, they should.

 

The x1 link will be a bottleneck, but I don't have any recent testing to compare against. In Linux, running a GTX 1070 on a PCIe 3.0 x4 link reduces performance by 1-2% vs PCIe 3.0 x16. I can post up some numbers later today for performance on an x1 riser.
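For context on why a narrow link hurts less than the raw numbers might suggest: the GPU cores keep most of the work on the card and mainly move data around frame boundaries and checkpoints, so even a riser's single lane is usually enough. A rough sketch of the per-direction link bandwidths in play (standard PCIe line rates and encodings; nothing here is measured, and the gen 2 riser entry is an assumption about what such risers typically negotiate):

```python
# Rough per-direction PCIe bandwidth for the link widths discussed above.
# PCIe 3.0 runs at 8 GT/s with 128b/130b encoding; PCIe 2.0 at 5 GT/s with 8b/10b.
links = {
    "PCIe 3.0 x16": (8.0, 128 / 130, 16),
    "PCIe 3.0 x4":  (8.0, 128 / 130, 4),   # the ~1-2% loss case mentioned above
    "PCIe 3.0 x1":  (8.0, 128 / 130, 1),
    "PCIe 2.0 x1":  (5.0, 8 / 10, 1),      # what a typical powered mining riser ends up at
}

for name, (line_rate_gt, encoding, lanes) in links.items():
    gbytes_per_s = line_rate_gt * encoding * lanes / 8   # GT/s -> GB/s per direction
    print(f"{name:12} ~{gbytes_per_s:5.2f} GB/s")
```

So an x1 riser has roughly 1/16th (or 1/32nd at gen 2) of the bandwidth of a full slot; the results posted further down show how little of that F@H actually needs.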


 

Well, I did do some Google searches, and from what I could find out, the x1 bandwidth does bottleneck F@H, but only by a few percent. However, if you do come up with your own hard numbers, that would certainly be very useful! As for the NVIDIA cards, that is very likely the route I will go, just for raw performance and the fact that I am more familiar with NVIDIA folding than AMD (granted, not a huge difference, but there are nuances).

 

 



 

The first WU that I am testing is a core_21 WU. I'm booting off this GPU, so there might be a little more performance to be had. The difference should be pretty minimal, since I'm not running X and the GPU is only rendering the console while testing.

Silly NVIDIA and their P2 clocks. I forgot to bump the memory up to 8008 MHz, so I left it the same for the x1 and x16 tests.

 

OS: Arch Linux

Kernel: 5.6.8

Driver: 440.82

GPU: GTX 1070 (1974 MHz core | 7604 MHz mem)

 

p16906

PCIe 3.0 x16: TPF - 02:13 | ppd - 686,675

PCIe 2.0 x1: TPF - 02:23 | ppd - 615,920

 

I'll compare a few more WUs as they come in. Hopefully I can grab a few of the more demanding and higher ppd core_22 WUs.

Edited by tictoc

 

Thanks for setting this up; it's hugely helpful and gives great insight into this. So it looks like a 10-second increase in time per frame is the penalty when going to x1... at least for this WU type.

 

It will be interesting to see what other data you come up with. I can see this being very useful for anyone considering a mining-to-folding rig conversion. I will likely make a table of this info, as it will be helpful to others.

 

Thanks bud!


First core_22 WU.

 

This is a good ppd WU, so due to the QRB, the ppd penalty is larger when dropping down to PCIe x1. The way the WU queues and transfers data is also different: the TPF on every fourth frame increases by 11 seconds, so I averaged all the frame times for the WU.

 

OS: Arch Linux

Kernel: 5.6.8

Driver: 440.82

GPU: GTX 1070 (1974 MHz core | 7604 MHz mem)

 

p14253

PCIe 3.0 x16: TPF - 02:30 | ppd - 837,797

PCIe 2.0 x1: TPF - 02:53 | ppd - 676,404
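If anyone wants to sanity-check why a ~15% TPF increase turns into a ~19% ppd drop here: under the quick return bonus, credit grows with the square root of how quickly the WU is returned, so ppd scales roughly as TPF^(-3/2). A minimal sketch using only the numbers posted in this thread (the exponent is the usual QRB approximation, not anything official):

```python
# Under the quick return bonus (QRB), credit ~ sqrt(1 / runtime), and
# ppd = credit / runtime, so ppd scales roughly as TPF^(-3/2).
# Check that against the x16 vs x1 numbers posted above.

def tpf_seconds(mm_ss: str) -> int:
    """Convert a 'MM:SS' time-per-frame string to seconds."""
    minutes, seconds = mm_ss.split(":")
    return int(minutes) * 60 + int(seconds)

def predicted_x1_ppd(ppd_x16: float, tpf_x16: str, tpf_x1: str) -> float:
    """Scale the x16 ppd by the TPF ratio to the -3/2 power (QRB approximation)."""
    ratio = tpf_seconds(tpf_x1) / tpf_seconds(tpf_x16)
    return ppd_x16 * ratio ** -1.5

# (project, x16 ppd, x16 TPF, x1 TPF, measured x1 ppd) from the posts above
for project, ppd16, t16, t1, measured in [
    ("p16906", 686_675, "02:13", "02:23", 615_920),
    ("p14253", 837_797, "02:30", "02:53", 676_404),
]:
    predicted = predicted_x1_ppd(ppd16, t16, t1)
    print(f"{project}: predicted x1 ppd ~{predicted:,.0f} (measured {measured:,})")
```

Both predictions land within rounding of the measured numbers; the -3/2 scaling is also why any given TPF increase costs proportionally more ppd than it adds in frame time.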

Edited by tictoc

 

Well, it is clear that the loss incurred will vary greatly from WU to WU. That will be the trade-off with this type of build. Nonetheless, the contribution the rig makes will far outweigh the losses due to bandwidth.

 

On another note, the CPU cooler, fan filters and GPU PSU (part 1, the ATNG 1300 W mining PSU) have arrived. This will enable the rig to be set up and the main components tested to ensure all is good. Once I receive the rest of the components for InverseTundra, I can complete the base build fully.


 

 


The ppd hit is not too bad, all things considered. The cost to go with a different platform probably far outweighs the higher performance. I'll keep posting results as I see different WUs.

 

Not sure if I saw this anywhere, but you are definitely going to want to run Linux on that box. Not only is it more performant than Windows, but it will handle the gear with much less brain damage. Hit me up if you need any help getting the OS up and running.
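On the dedicated headless Linux box point, one nice side effect is that you can keep an eye on it from another machine without any desktop at all, by talking to FAHClient's command socket. A rough sketch, assuming the v7 client's default command port (36330), its queue-info command, and that the client has been configured to allow connections from your LAN; the IP address below is just a placeholder:

```python
# Peek at a headless FAHClient (v7) over its command/telnet interface.
# Assumes port 36330 (the default) and that remote access has been allowed
# in the client's configuration; HOST below is a hypothetical LAN address.
import socket

HOST = "192.168.1.50"   # placeholder address of the folding rig
PORT = 36330            # FAHClient's default command port

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    sock.recv(4096)                   # discard the welcome banner
    sock.sendall(b"queue-info\r\n")   # ask for the current work unit status
    reply = b""
    try:
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            reply += chunk
    except socket.timeout:
        pass                          # no more data within the timeout; good enough for a peek
    print(reply.decode(errors="replace"))
```

FAHControl and most third-party monitors use this same interface, so a quick script like this is mainly handy for ad-hoc checks or rolling your own logging.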


 

Linux will be the way I go for this machine: Windows for testing initially, but the finished build will go the Linux route. I may indeed need some pointers on that one; I have done it before, but it was a while ago now. Sure, keep posting those stats. I will make a table of them for the OP as well, as it is useful information.

Shipped out my goodies (PSU, Risers, and a spare M.2 Drive) to E today. Let us all pray the box survives the trek over the pond in one piece.

 

Huge thanks again for the generous donation. I will keep an eye out for it and will get it into the build ASAP!


Two more WUs.

 

p16905 (core_21)

PCIe 3.0 x16: TPF - 02:15 | ppd - 686,311

PCIe 2.0 x1: TPF - 02:26 | ppd - 610,229

 

p16445 (core_22)

PCIe 3.0 x16: TPF - 01:20 | ppd - 892,930

PCIe 2.0 x1: TPF - 01:31 | ppd - 736,022

Edited by tictoc

 

Nice! Thanks for that, I will add it to the table. I have started the build today, and what a pain, lol. With the type of case I am using, you have to make up your own cable management as you go; it will not look pretty, but it will certainly be a workhorse!
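On the table idea, here is a small sketch that turns the four WUs posted so far into percentage penalties (all figures are taken straight from the posts above, nothing new):

```python
# Summarise the x16 vs x1 riser results posted in this thread as percentages.
# Tuples are (core, x16 ppd, x1 ppd, x16 TPF in seconds, x1 TPF in seconds).
results = {
    "p16906": ("core_21", 686_675, 615_920, 133, 143),
    "p14253": ("core_22", 837_797, 676_404, 150, 173),
    "p16905": ("core_21", 686_311, 610_229, 135, 146),
    "p16445": ("core_22", 892_930, 736_022,  80,  91),
}

print(f"{'project':<8} {'core':<8} {'TPF +%':>7} {'ppd -%':>7}")
for project, (core, ppd16, ppd1, tpf16, tpf1) in results.items():
    tpf_penalty = (tpf1 - tpf16) / tpf16 * 100
    ppd_penalty = (ppd16 - ppd1) / ppd16 * 100
    print(f"{project:<8} {core:<8} {tpf_penalty:6.1f}% {ppd_penalty:6.1f}%")
```

That works out to roughly a 10-11% ppd cost on the two core_21 projects and 17-19% on the two core_22 projects, so the table will probably want a per-core-type row rather than a single number.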


 

If you choose Ubuntu, stay away from 20.04 LTS for now. I tried it on one of my new rigs and it was nothing but headaches; stick with 18.04. I have set up two rigs on Ubuntu now, and I can pass along all my notes for those if you go that route.

 

EDIT: I am seeing a decent PPD boost from the 1070 in my Ubuntu rig vs the 1070 (clocked 50 MHz faster) in my Win10 rig, anywhere from 30-100k depending on the project, so for a dedicated rig Linux is still the way to go.

 

 

Edited by franz

Hey @franz, do you think Ubuntu on a VM will give you the same boost or would it be less?

I have been thinking about installing Ubuntu on my second rig, since it's a dedicated system.

After seeing this, I should have gone that route... :p


 

 

Thanks for the heads-up. I used Ubuntu on my last dedicated rig as it happens, though I do not remember a huge amount, haha, but good to know I can depend on you too for some assistance :)


So we had a little setback today. It seems the board shipped to me with an older BIOS revision, meaning it will not boot with the CPU (i7-9700) we have. Oddly, it does appear to power up fully, and the only diagnostic light that is active is the one that refers to a lack of boot media, not the CPU. All I know is I get no video out from the onboard graphics on either HDMI or DVI. I also tried multiple screens and a re-seat of the CPU, but nothing, so I will assume the board cannot utilize the CPU properly. So I will go ahead and get a cheapo CPU that was compatible with the motherboard at launch to ensure compatibility, flash to the latest BIOS, and pop the i7-9700 back in.

Edited by ENTERPRISE