DIY GPU server: Build your own PC for deep learning

Building your own GPU server isn't hard, and it can easily beat the cost of training deep learning models in the cloud


There comes a time in the life of many deep learning practitioners when they get the urge to build their own deep learning machine and escape the clutches of the cloud. The cloud is ideal for getting started with deep learning, and it is often the best answer for training large-scale deep learning models. But there is a vast area in between where having your own deep learning box can be significantly more cost-effective.

Not that it’s cheap. You will spend from $1,500 to $2,000 or more on a computer and high-end GPU capable of chewing through deep learning models. But if you’re doing extensive model training for days at a time, then having your own dedicated machine could pay for itself in three or four months—especially when you factor in cloud storage and ingress costs alongside compute time.
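To see how that payback period works out, here is a rough break-even sketch. The $2,000 hardware cost comes from the estimate above; the $0.90-per-hour cloud GPU rate is an illustrative assumption (actual cloud pricing varies by provider and instance type, and the calculation ignores electricity and storage costs).

```python
# Rough break-even estimate: one-time server purchase vs. hourly cloud GPU rental.
# NOTE: the cloud rate below is an illustrative assumption, not a real quote.
server_cost = 2000.0   # one-time hardware cost (USD), high end of the estimate
cloud_rate = 0.90      # assumed cloud GPU price per hour (USD)
hours_per_day = 24     # training models around the clock

break_even_hours = server_cost / cloud_rate
break_even_days = break_even_hours / hours_per_day

print(f"Break-even after {break_even_hours:.0f} GPU-hours "
      f"(~{break_even_days:.0f} days of continuous training)")
```

Under these assumptions the machine pays for itself after roughly 93 days of continuous training, which is consistent with the three-to-four-month figure above.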

In this article, I’ll walk you through the deep learning machine I built earlier this year, describing some of the choices you’ll encounter when building such a machine and the costs you’ll likely incur. Prices are direct quotes from Amazon as of December 2017.

If you want to get deeper into deep learning—whether that means research on larger datasets or entering Kaggle competitions, or both—building your own deep learning box makes a lot of sense. Running models on your own machine will likely be the best approach until you start doing work on huge datasets and require tens of GPUs for your training.
