It finally happened – here is the long feared hardware episode you’ve not been waiting for! As Mo recently started freelancing, he decided he’d need a new computer. Also he decided to build it himself. Enjoy the resulting chaos.
A word of warning – if you decide to build a PC yourself you do so at your own risk.
So I’m freelance again. And get asked to bring my own workstation to clients more frequently. However I’m a bit hesitant to schlepp my dear Entagma-computer out to clients. So instead I opted to build a second, newer machine. These were the main things I expected from it:
– should be able to fit 4 GPUs
– at least 64GB RAM
– decent CPU speed for sims / general Houdini madness
– M.2 SSD for quick caching / fast boot
– power supply with ample headroom for power hungry hardware
– no liquid cooling shenanigans
External DVD: lg.com/de/brenner-laufwerke/lg-GP57EB40
OS: Win 10 Pro 64Bit
I went mostly with components I’d built machines out of previously. Also I was trying to hit something like an optimal price/performance ratio. While during the past years Intel seemed to have an edge over AMD when it came to multithreading performance, AMD seems to finally have caught up with their new Threadrippers (19xx and second gen 29xx) at a very attractive price point. So I opted for an AMD CPU and mainboard here. I went with the 1920X as it was only 90€ more expensive than the slower 1900X, yet cheap enough not to cause bad headaches should I decide to upgrade to a 2950X in the future.
The mainboard I chose mainly based on its mechanical layout. Four PCIe x16 slots spaced two slots apart can accommodate four full-grown GPUs – just what I wanted. It’s got 8 RAM slots which, when fitted with 16GB memory modules, can be filled to 128GB of RAM. For now I filled half of them, totalling 64GB of RAM. (Threadrippers can be picky when it comes to RAM modules. I went with modules I knew worked.) The only annoying thing is the very colorful (and bright!) LEDs scattered all over the board. But that’s not much of an issue if you’ve got a case without windows.
The case is a Fractal Design Define XL R2 – nothing too fancy, no windows, no excessive LEDs, no glaring colors. Just about the most subtle case you can get. I’ve worked with this case before too. It fits four GPUs, lots of cables and some hybrid coolers (if you’re into them) pretty nicely. It also comes with enough case fans pre-installed and fits a big CPU cooler well.
Keeping the CPU cool is a job for the 140mm Noctua cooler. I dislike liquid cooling in most cases. It adds complexity, weight and cost, and a decently sized air cooler can keep your components cool without blowing out your eardrums. I do like hybrid-cooled GPUs though – when you stack four graphics cards directly on top of each other it makes sense to remove the heat through some sort of tubing.
Hard-disk-wise I opted for a 1TB M.2 SSD, which in theory should be a bit quicker than your plain vanilla SATA SSD – nice for caching and quick booting.
For the power supply I decided to get something hefty – just to be sure that I’ll be able to upgrade to increasingly powerful (and power hungry) GPUs. I found the Enermax Platimax 1700W to fit my specs nicely. Not sure though if it’s available outside of Europe.
Add an external DVD drive and a copy of Win10 Pro and you’ve got all parts needed to start building…
The three things I love the most, together: hardware, Houdini & Entagma. Am I dreaming?
awake, it’s time to go to work
loved this video! so informative!
I too sorely need to build myself a workstation but I’ve been putting it off because I dread running into precisely the sort of snafus you encountered during this build, so thank you for sharing your experience and paving the way so that others like me have the benefit of not going in blind!
What, may I ask, was the final price for this build? Given you were attempting to optimize price to performance, this would be an interesting statistic.
Best of luck to you on your new venture as a freelancer!
Hi Michael and thanks for the encouraging words! I ended up paying 2480€ but prepare to spend some more on 2x RTX2080s… 🙂
Just for the sake of chiming in on build pricing, my build was several months prior to this. I opted for the top-line Ryzen Threadripper CPU (AMD Ryzen Threadripper 1950X 16-Core Processor, 3400 MHz, 16 core(s)) and two GTX 1080 Tis, which unfortunately for me were at their peak price at the time, at around $1100 US each. I also have 64GB RAM, a 500GB solid state boot drive, and a 1 terabyte storage drive, in a Phantex (I think) case. All air cooled, with some extra fans packed in. All of this cost approximately $5k US, which is probably less than half of the new iMac Pro, for those of you sad fools who still cling to Apple stations, for considerably more power, modularity and affordability.
As a side note, I might suggest that the release of the new RTX 2000-series Nvidia GPUs will further drive down the cost of the already cost/performance-superior GTX 1080 Ti – and that you could do my build, with all four 1080 Tis the motherboard can accommodate, for around the same amount of money, give or take $250–500. I do recommend that option; I love my build.
Thanks Entagma! you guys are awesome and much appreciated!
And yes Moritz, best of luck freelancing, you’ll kill it!
Thx man! 🙂 We’ll see how it goes…
Hey Brandon! Maybe you could help me? I have a Threadripper 1950X on an MSI X399 SLI PLUS with DDR4 G.Skill Trident Z F4-3200C16D-32GTZKW (2×16GB) and F4-3200C16D-32GTZSK (2×16GB), 64GB total, a Raidmax RX-1000GH 1000W PSU and an MSI Radeon Vega 56 Air Boost. I have so many troubles. Sometimes the PC seems to run slowly in Houdini FX, especially when rendering. The PC can also suddenly reboot, or the display just disappears. There is a code AA on the motherboard; occasionally I get a 00 error. What’s wrong with my PC? I’d appreciate any help!
Maybe it’s not relevant, but you should investigate.
I had pretty much the same troubles with my computer, and it appears that not all MSI motherboards are totally stable in that type of config.
I had the MSI 299 SLI PLUS – too many crashes, reboots,…
So I changed it for another one, still from MSI (can’t remember the exact model, but not SLI). But I still had the same 00 error codes and reboots. And then one day my motherboard burnt out. 🙂
Just in case 😉
That Mac Pro still has built-in advantages. I’m about to upgrade to a 16‑core Intel Xeon W processor (Turbo Boost up to 4.4GHz),
96GB of DDR4 ECC memory, two Radeon Pro Vega II with 32GB of HBM2 memory each, and the powerful Apple Afterburner. All with the normal plug-in, set-up and create ability that I have always enjoyed with Mac.
*all you really need to put together a build is a screwdriver! 😉 *protip
*also: you can install Win from a USB stick! 😉
This is great tech advice overall. But quite honestly, I would rather purchase a workstation from a large company like Dell, MSI, or HP. If you call and talk to them, they give you amazing business level workstations for an amazing price.
Their machines have been tested for compatibility and performance for the work we do, and they offer tech support when something goes wonky at 2am when the client wants their work done by 10am. There is way too much risk with a hand-built machine.
It’s easy, and there may be some level of tech support, but you pay more, and you get less.
I see a lot of folks going with Threadrippers – I haven’t kept up with CPUs lately; is this just a cost/performance ratio decision, or does Houdini perform better with that processor line?
In this case my decision was based on price/performance ratio. Haven’t seen any detailed comparisons between current Intel and AMD CPUs regarding Houdini’s performance. Cheers, Mo 🙂
yep, interesting – even though I get mine built for me, it’s nice to be able to see how to upgrade without blowing yourself up. I have the same case, a 16-core Ryzen, the same memory and two 1070 Tis, but I bought mine a while ago.
looooove this so much!!!
I’m wondering how well the 1920X handles complex setups and effects on super detailed geometry. Can I benefit from multicore performance in Houdini if I render through GPUs?
Thank you for your videos:)
“The fans are turning… it’s doing something” 😀
Always that dread of “Did I assemble it right?” the first time you turn it on 😉
Informative as usual, and maybe you can make a short one about your experiences with the AMD CPU after some use.
What kind of heat does the assembled unit kick out into the room?
as most of the electrical power a computer consumes is dissipated as heat, we could estimate the heating using a wattmeter – which I sadly don’t have right now 🙂
So instead let’s look at the components’ TDP (Thermal Design Power) as given in the spec sheets:
Threadripper 1920x: 140w
2x GTX 1070: 2x 150w
2x RTX 2070: 2x 175w
That makes for around 790W under full load, plus a few watts from SSDs and mainboard. So around half a hairdryer. Again – a rough estimate; a wattmeter would help, also in determining how much power it draws when not under full load.
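Those numbers are easy to sanity-check in a few lines of Python – just a back-of-the-envelope sum over the spec-sheet TDP values quoted above:

```python
# Rough full-load power estimate from component TDPs (spec-sheet values).
tdp_watts = {
    "Threadripper 1920X": 140,
    "GTX 1070": 150,
    "RTX 2070": 175,
}
counts = {"Threadripper 1920X": 1, "GTX 1070": 2, "RTX 2070": 2}

total = sum(tdp_watts[part] * n for part, n in counts.items())
print(f"Estimated draw under full load: ~{total} W")  # ~790 W
```

Keep in mind TDP is a thermal design figure, not a measured wall-socket draw, so treat the result as a ballpark only.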
I am planning an upgrade and have a question.
I presently have a 1950X with 128GB of RAM, but my GPU is a rather old GTX 680.
I am speculating that forgoing a GPU upgrade AGAIN and moving to a 2990WX would be better bang for the buck than keeping the 1950X and adding an RTX 2000-series card. The way I see it, CPU power helps me at all times and I always have access to my RAM. With a GPU I expect my simulation and rendering capabilities to be limited by GPU RAM size.
I tend to do more simulation and Volume effects than anything.
Totally depends on your typical use case. If you’re doing mainly sims and/or using CPU render engines, there’s little sense in investing in hefty GPUs.
In my case I’m typically hired to deliver finished projects, so I need to render. Fast. And I’m not yet willing to invest in a farm. So GPU rendering it is. (Currently this workstation is running 2xRTX2070s and 2xGTX1070) But again – it all comes down to use cases.
Awesome video, and it came just in time; I was just thinking of upgrading my hardware. I wanted to ask which RTX 2070s you got, and what you think about them so far. I am a bit new to the world of custom computers and am wondering if 4x 2.5-slot-wide GPUs can fit in the X399 given how the PCIe slots are spaced! Thank you!
Awesome Video Mo!
Why did you go for RTX 2070s instead of 2080s? I am building my own setup for Christmas and was wondering what brand/cooling of RTX cards you’re buying! I have read dual-fan GPUs are not the most thermally efficient when it comes to stacking GPUs…
thx for the kind words. I went for RTX 2070s instead of 2080s because usually the price/performance ratio is better with the XX70 series. As I’m not participating in the race of how many OctaneBench points I can get out of a single workstation, in case I’d need more GPU power I’d rather build a second workstation with additional cards instead.
If however your goal is to max out your single workstation – go for 2080s or 2080Tis.
I’ve not seen a mainboard yet that’d support PCIe cards more than two slots high by default. That’s why I went with these cards here – radial fan and a tiny bit of space to draw in air when stacked on top of each other:
Thanks so much for the quick response Mo! Super helpful. I had assumed you went with the blower style for cooling! It makes more sense for multi-GPU builds (or at least that’s what I seem to have learned in the past few hours).
Last quick question, for your 1070s you went with two of those Gigabyte Minis?
from top to bottom, I’ve got:
Very helpful video you put together. Thoughtful, clear, well shot and edited. Do you do anything special with your rendering pipeline to use Mantra for multiple GPUs? Or do you use a different renderer?
Thanks for all your work!
thanks a lot 🙂 In fact I mainly use Redshift for rendering, which is a GPU only engine and supports multi GPU rendering 🙂
I wouldn’t take my beloved workstation to a client. There are professional solutions for this kind of thing, you know.
A Teradici card in your workstation that allows you 60fps interactivity anywhere in the world with a zero client (a macbook pro 12″ for example with the Teradici software loaded) and a decent internet connection will cost you around $375.00 US more or less depending on the model PCIExpress card you buy.
Much cheaper than building a new workstation and lugging around to a new site then leaving it there overnight for weeks on end where it could be stolen or vandalized.
Sorry, I meant an iPad Pro 12″ with the Teradici zero-client software loaded.
I believe this is the kind of technology that Catia v6 uses for its cloud computing solution for global engineering firms with hundreds of engineers modeling and testing stuff on worldwide projects.
Also it solves the problems of having to buy additional copies of software you already own.
You can turn your home workstation on and off from anywhere in the world with a good internet connection and just use it.
Thanks for the link, Francis!
I think there are also decent remote solutions that do not rely on specific hardware, e.g. https://anydesk.com/remote-desktop
I’m buying a 1950X for Houdini, of course. Do you have any comments about your experience? So far so good?
so far I’m super happy with this build. Now running two RTX 2070s, one 1070Ti and one 1070, all neatly chugging along.
I am getting into Houdini/Maya for VFX, so I have a build/upgrade going on. Getting the Threadripper 2950X. Until this weekend I really did not know you could run different GPUs together on the same board.
Q: so Houdini makes use of all the CUDA cores of various GPUs?
I have one 1070 Ti at the moment; should I simply add more of them as money becomes available?
Would you recommend the Windows 10 Pro for Workstations OS, or is basic Win10 Pro fine?
Looking to get educated in Houdini. Would you recommend piecing it together with tutorials, e.g. Rebelway and Applied Houdini, or is the Gnomon 2-year program a better way to go?
Thanks for suggestion.
Houdini per se doesn’t use multiple GPUs. Redshift does though – and as that’s our primary render engine, we’re advocating multi-GPU workstations. So if you’re using Redshift, adding another GPU will indeed increase your render performance (up to a certain point – at least up to 4 GPUs per system work nicely; beyond that, PCIe lane counts might factor in…).
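The lane-count caveat can be illustrated with a quick back-of-the-envelope check. This sketch assumes the publicly documented first/second-gen Threadripper figures – 64 PCIe 3.0 lanes with 4 reserved for the chipset link – so treat the numbers as an illustration, not board-specific facts:

```python
# PCIe lane budgeting for a 4-GPU X399 build (assumed Threadripper figures).
TOTAL_LANES = 64
CHIPSET_LINK = 4
usable = TOTAL_LANES - CHIPSET_LINK  # lanes left for slots, M.2, etc.

# Four GPUs at full x16 would need 64 lanes - more than what's left,
# which is why boards typically run a mixed x16/x8/x16/x8 layout:
full_x16 = 4 * 16
mixed = 16 + 8 + 16 + 8
print(full_x16 > usable)   # True - not enough lanes for 4x x16
print(mixed <= usable)     # True - the mixed layout fits
```

For GPU rendering the practical impact of x8 vs. x16 links is usually small, since most of the time is spent computing on the card rather than transferring data.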
I’d personally use Win 10 Pro as I like Bitlocker and occasionally use Remote Desktop. (Which both don’t ship with Win 10 Home.)
Regarding starting learning Houdini: I cannot comment on the Gnomon program, have no idea about it. I personally pieced it together but there might be quicker ways…
thanks for this video! I built a perfectly working Houdini workstation! But I have a question: is there any problem with filling this system with 4x RTX 2070 (you mentioned it earlier)? If yes, please send me some references or describe it yourself – I can’t find anything on it.
And if yes, maybe I’ll change your build (2x 2070, 1070 and 1070 Ti) to 2x 2070 + 2x 2060, with Redshift 3.0 in mind (I guess the 2060s weren’t available when you were building it)?
Thanks, and keep going!
although I cannot guarantee that this system will run four RTX 2070s (I’m still running my 2070/1070/1070 Ti config), it seems very likely to me, and nowadays I’d buy four 2070s if I wanted the computational power. The most crucial thing to remember is physical layout: pay attention to buying GPUs that are no more than two slots high. Last time I looked up price/performance charts, the 2070s were the best bang for the buck. Not too sure about the 2060s – if you find relevant benchmarks, it’d be great if you shared them 🙂
Thanks for the post.. really helpful information and a great comment thread..
I’m currently looking at a build with 4x 2080 Ti cards stacked alongside a liquid-cooled CPU and a bunch of additional case fans. I noticed you mentioned above your tendency to like hybrid-cooled cards. I wondered if you thought it would be OK to stack these four GPUs (blower, not hybrid) and still not run at excessive temperatures? The liquid cooling or hybrid card options seem a little too pricey…
think about it this way: if your GPU reaches its maximum allowed temperature, it’s going to start throttling down by reducing its core frequency. Thus GPUs with thermal issues will run slower than properly cooled ones.
If you’re already thinking of going with highest-end GPUs and not necessarily price/performance-optimized ones, I’d highly recommend going hybrid cooled, as this investment wouldn’t make sense to me if your 2080 Tis end up running throttled. If your budget doesn’t allow for this, start out with two or three hybrid GPUs and upgrade at a later point.
If you want to save some money, go for three hybrid-cooled GPUs and use an air-cooled GPU in the bottom slot, where airflow isn’t obstructed.
That being said, if you still want to go all air-cooled, use blower GPUs like the Asus GeForce RTX 2080 Ti Turbo (which also has at least a bit more space to draw in air).
I guess my main reason to recommend hybrid cooling in this case is this: the 2080 Ti already isn’t cheap for what it delivers, so why reduce its value by risking it throttling down? A concern I wouldn’t have when going with cheaper 2070s.
I have a somewhat related question. I want to get into GPU rendering, so I will be building a new PC. While I am waiting to see if a new Threadripper gets announced, I’ve gone ahead and bought a 2080 Ti blower-style card, so I can take advantage of it with my current PC.
However, this card is very loud. I knew a single fan would be much louder, but now I don’t know if I want to buy more of these, because the decibels will add up.
You seem to suggest that 2070s don’t run as hot. Does this mean they are also less noisy?
The reason I bought a 2080 Ti is that I probably want to go for two cards (initially), and I think it’s beneficial to have the best single card for other programs like Nuke. Other reasons include more VRAM, lower power draw overall and less hassle.
But if 2070s are quieter, I think I will go for those. I’d rather stay away from water or hybrid cooling. I’d like to hear your thoughts.
Thanks for this guide! Insightful.
hm… If you’re going for air-cooled cards, blowers will always end up being noisier. I personally cannot compare the RTX 2070s to the 2080 Tis as I only have 2070s and a 2080 – those are about the same in noise. (I personally got used to it over the years.) If you want something quieter, you might still want to consider going hybrid/water-cooled.
Another option, if you’re only installing two GPUs, would be to go for those two- or three-fan gamer-style cards. I’ve had kinda mixed experiences with these (witnessed three cards where the fans broke after two years of heavy use), but if they’ve got space between each other, they might end up quieter than single-fan blower cards…
Hi Moritz, thanks for your videos! Please tell me, is 64GB of RAM still enough for you, or have you already upgraded to 128? 🙂
For me it still works 🙂
this build was really helpful and gave me lots of great starting points. Some time has passed and I’m looking hard at the newer sTRX4 Threadrippers – the 3960X, to be precise.
I really liked the Noctua cooler you had in this setup. What mainboard – CPU – cooler combination would you suggest for these new sTRX4 chipsets?
It looks like the 3960X gets really hot during Cinebench or the Blender monkey benchmark, running at 80 degrees. I really don’t want to toast a brand-new workstation while rendering over the weekend, only to find the machine either turned itself off or plain turned to (very expensive) dust on a Monday morning.
I would be very thankful for some input!
Btw, we met briefly at FOAM in February, so hi again! 🙂
according to Noctua, the NH-U14s should be able to handle a stock 3960x: https://noctua.at/en/cpu/AMD_Ryzen_Threadripper_3960X
Maybe install a second fan on that cooler. As far as I know, there has yet to be a confirmed report of a stock CPU catching fire due to excessive computational load – typically CPUs just reduce their clock speed when they pass a certain temperature. Temperature-wise I’ve heard unconfirmed rumours of the 3960X throttling at 85°C – but please confirm this for yourself. (The be quiet! Dark Rock Pro TR4 looks like an interesting Noctua alternative too.)
That being said there are all in one water coolers out there for the 3960x – Just from a gut feeling I’d go for a 360 radiator here. Water coolers are well suited to “catch” heat spikes as water has a high thermal capacity. However when we’re talking long sustained loads, the water in the loop will heat up and has to dissipate that heat through a radiator too, with most radiators being of a more conventional design than their purely air counterparts. But again the 360-rad is just a gut feeling, please check for yourself and make up your mind.
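To illustrate why a loop “catches” spikes but still heats up under sustained load, here’s a back-of-the-envelope sketch – the 250W load and 0.5l loop volume are assumed values for illustration, not measurements of any real cooler:

```python
# How fast would the coolant in a closed loop heat up under sustained
# load if the radiator dissipated nothing at all (worst case)?
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

def loop_temp_rise_per_minute(load_watts, loop_litres):
    """Coolant temperature rise in K per minute, ignoring all dissipation."""
    mass_kg = loop_litres  # 1 litre of water is roughly 1 kg
    return load_watts / (mass_kg * SPECIFIC_HEAT_WATER) * 60

# Assumed: 250 W sustained CPU load into a 0.5 l loop.
print(f"{loop_temp_rise_per_minute(250, 0.5):.1f} K/min")  # ~7.2 K/min
```

So the water buys you a couple of minutes of thermal buffering on a spike, but over a weekend-long render everything still has to leave through the radiator.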
Personally though, I’m not a huge fan of water cooling a CPU, as most of the time an air cooler can handle the thermal load while being easier to maintain and easier to diagnose if it fails. Admittedly the 3rd-gen Threadrippers are the first instances where we’re seeing TDPs in excess of 250W, so water cooling might actually be a technically necessary solution in some cases.
Summing up: Noctua says the cooler will be able to cope with a 3960x. A water loop might be a good idea nevertheless. No hard data yet as I’ve not built/worked with a 3rd gen Threadripper.
Air coolers are generally cheaper, safe from leakage, perform great and last forever. Look up the cooler benchmarks; air coolers do just FINE, or even better, in many cases.
I really love Noctua, but since the NH-D15 does not support the TR4 socket, here are my recommendations:
– be quiet DARK ROCK Pro TR4
– Thermalright SILVER ARROW TR4
– use one of these two, and get an NH-D15 TR4 if Noctua ever releases one.
I was wondering about taking advantage of the RTX tech in the GPUs for rendering with Redshift, or nodes in Houdini. I saw that in Maya, Redshift has a “turn on OptiX RT” toggle… is it there for Houdini, and is that all I should do to take advantage of RTX tech? Thanks!
thanks for your fantastic walkthrough of your workstation build.
I am looking into rebuilding your workstation (although with fewer GPUs for now). I am wondering if now, two years later, you would make different choices for your components, or if your workstation still lives up to your expectations?
Also, is it possible that some components have become more expensive by now (due to less demand)? I can’t seem to find a supplier (here in the UK) where the parts add up to your initial cost of 2,400 Euros. They seem to be much pricier now. :-/
my workstation is still running fine. I upgraded to a Threadripper 2950X in the meantime and installed a second M2 SSD for dual boot.
If I’d build myself a workstation today, I’d most likely be looking into the Threadripper 39xx CPUs, which are admittedly a good bit more expensive but also a good bit faster. Or, if I were on a budget, the Ryzen 9 39xx CPUs look like a good deal.
It’s not often that prices for the exact same components go up. In this case it might be due to the fact that some of the hardware I used might not be produced anymore and thus is scarce – but as it doesn’t make sense to build a workstation with legacy parts, I’d search for alternatives instead.