Setting up a cloud environment for Deep Learning

Piotr Janusz
7 min read · Jun 30, 2019

This article describes how to configure a cloud GPU environment for deep learning. I’m using Vast.ai + Anaconda or Miniconda + Python 3.6 with a tensorflow-gpu and Keras set-up.

There are plenty of GPU cloud service providers out there where you can take your deep learning model and train it. Some of them are cheaper than others and some of them are easier to set up.

Today I’d like to share with you a quick tutorial on how to set up conda/tensorflow/keras environment on vast.ai.

Miniconda is basically the Anaconda environment; the only difference is that Miniconda does not pre-install Python packages, meaning you install only what you want and choose to work with.

Note: once you rent a GPU on vast.ai it comes with TensorFlow pre-installed. This article is for those like myself who want a separate conda environment for deep learning, with full control over Python and packages.

Vast.ai is one of the cheapest ‘rent a GPU’ sites, where in something like 10 minutes you are ready to train a deep learning model on Nvidia GPUs. GTX 1080 Ti, RTX 2080 Ti or Tesla V100, you name it.

At the time I’m writing this article, you can rent a GPU for as little as $0.25/h for a GTX 1080, $0.16/h for a GTX 1070 or $0.65/h for a Tesla V100. Those are of course sample GPU rental prices which will most certainly be different by the time you check them out. You can find one, two or even four GTX 1080 Ti cards on a single machine. The same goes for RTX cards.

First things first. After you log in to the website you need credits to rent a GPU. One credit is one US dollar.

The minimum credit you can add is 5.

Vast.ai billing page

Hit Billing on the left-hand side and add your credit card. (Please use your credit card with caution, as always! Not because vast.ai is risky or anything, but this is how you should behave on any website. There are prepaid cards which you can load with money when you need it and empty when you don’t, meaning there is nothing to be stolen!)

Note that just by adding a credit card you get 2 credits, which will allow you to play with a virtual machine of your choice for up to a few hours (depending on the hardware you choose) without actually paying for anything. This lets you decide if vast.ai is for you.

Once this is done you are ready to choose your own machine. Hit ‘Create’ on the left-hand side, which will take you to a list of the hardware available at the moment along with its price.

Vast.ai has one of the best dashboards, with full info on the hardware. This (alongside price) is something outstanding about vast.ai compared to other ‘rent a GPU’ cloud service providers. You can’t overestimate this.

Cloud Deep Learning machine details

One quick look tells you what sort of GPU and memory you will pay for, what sort of CPU, how much system RAM etc. Really nice and handy dashboard.

Note that you are essentially paying for three things:

  1. Uptime, which is the price per hour
  2. Storage
  3. Internet traffic

Hover over the price per hour to see all of this.

Alright, knowing all this you are ready to choose your hardware for your deep learning model and rent your GPU.

The instance that I’m renting is:

An RTX 2080 with 8GB of memory, 64GB of RAM and a Xeon E5 for just $0.18 an hour. Can you imagine that?

When you rent the machine it will need a few minutes to set up the environment. When it’s done you should see a blue ‘Connect’ button.

Hit Connect and a Jupyter notebook should open.

Next you need to open a terminal to log in to your machine. To do that hit ‘New’ on the right-hand side and choose Terminal.

A quick check that what you were promised is what you get:

df -m # allocated disk size in MB
free -m # same for system memory
nvidia-smi # and your GPU memory

So far so good. We have what we are paying for. It’s always the case, but since you are following this tutorial I’m assuming you’re doing this for the first time, and it is really good to feel all this power yourself. :)

Next step is to install Miniconda.

I’m creating a separate folder for my work, ‘DL_conda’, in /home:

cd /home
mkdir DL_conda
cd DL_conda

Go to https://repo.continuum.io/miniconda/ and check which version is the latest for you. Once you figure it out, come back to the terminal and type:

curl -O https://repo.continuum.io/miniconda/Miniconda2-4.6.14-Linux-x86_64.sh

(replace Miniconda2-4.6.14-Linux-x86_64.sh with whatever version is the latest)
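Optionally, it’s a good habit to check that the download isn’t corrupted by comparing its checksum against the hash listed on the download page. A minimal sketch, assuming the page lists MD5 hashes (if it shows SHA256 instead, use sha256sum):

md5sum Miniconda2-4.6.14-Linux-x86_64.sh # compare the output with the hash shown next to the file on the download page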

Install miniconda by typing

bash Miniconda2-4.6.14-Linux-x86_64.sh

(Hint: type bash M and hit the Tab key. It should fill in the rest of the file name, since it’s the only file starting with M in the folder and in fact the only file.) You will then be asked to hit Enter, agree to the user license and hit Enter again.

Once Miniconda is installed you need to restart the terminal to start using conda. Just type exit and close the browser tab, then on the Jupyter notebook home page hit New and Terminal.
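As a side note: if you let the installer add conda to your .bashrc during installation, you can also just reload your shell configuration in the same terminal instead of closing the tab. A minimal sketch, assuming you accepted that option:

source ~/.bashrc # picks up the conda initialisation the installer appended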

A quick check on conda; type:

conda -V #to see the version

To see your conda environment type:

conda info --envs

You should see one environment called ‘base’. That is all fine, since you just installed conda.

Let’s create an environment for our TensorFlow/Keras GPU deep learning.

conda create -n gpu python=3.6

You’ll be asked to confirm, by typing ‘y’, all the packages that need to be installed to create your environment. Please do so.

Once this is done you can activate your environment by typing

source activate gpu
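As a quick optional sanity check, you can confirm that the new environment really runs the Python version you asked for:

python --version # should report Python 3.6.x inside the 'gpu' environment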

To confirm you are now working within your dedicated environment you can do two things:

  1. You should see ‘(gpu)’ before your user name
  2. You can type
conda info --envs 

to list all available environments, with a ‘*’ next to the active one.

Once you have done this, the environment is ready for you to install all the packages you need to start coding and working with your deep learning model.

I’m installing my own set by typing:

conda install pandas numpy tensorflow-gpu keras opencv ipykernel
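Once the install finishes it’s worth checking that TensorFlow actually sees the GPU before you start training. A minimal sketch, assuming the conda package gives you a TensorFlow 1.x build (as it did at the time of writing):

python -c "import tensorflow as tf; print(tf.test.is_gpu_available())" # should print True on a working GPU set-up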

Once this is done, there is only one thing left to do, which is to add your environment to Jupyter notebook. This is why we installed ipykernel. Type:

python -m ipykernel install --user --name=gpu 
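You can verify that the kernel was registered (this is optional), assuming the jupyter command is available in the environment, which it normally is once ipykernel is installed:

jupyter kernelspec list # 'gpu' should appear alongside the default python kernel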

Restart Jupyter by closing the browser tab with the Jupyter home page and hitting ‘Connect’ again on your machine in vast.ai. Once you have restarted Jupyter you should see the option to start a Jupyter notebook with your ‘gpu’ environment packages, like this:

Anaconda jupyter notebook environment.

That is it. You are now ready to work with Jupyter notebook and your favourite Python packages.

Whenever you need more packages just come back to the terminal. Don’t forget to activate your environment:

source activate gpu

and by typing

conda install package_name

install whatever you are missing for your project.
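If conda cannot find a package in its default channels, two common fallbacks (with your environment activated) are the conda-forge channel and pip. The package_name below is just a placeholder:

conda install -c conda-forge package_name # try the community conda-forge channel
pip install package_name # or fall back to pip inside the environment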

Whatever you do with your machine, please, please do not forget to hit ‘STOP’ under Instances in your vast.ai client to shut the machine down and avoid being charged for uptime while you’re not actually working with it.

BONUS:

It’s really good to know how to download data if you want to work with Kaggle and participate in their competitions.

For this you need to do a couple of quick things:

  1. Activate your environment by typing:
source activate your_environment_name
  2. Type: pip install kaggle
  3. Create your Kaggle API key. This is done on your Kaggle profile page; just hit ‘Create API Key’.
  4. Come back to your machine’s terminal
  5. Create a kaggle.json file in /root/.kaggle
  6. Copy and paste your API key into the kaggle.json file

Regarding point 5, do the following:

mkdir /root/.kaggle
cd /root/.kaggle

Install an editor:

apt-get install vim

Once this is done, type

vim kaggle.json

Press ‘i’ to enter insert mode and paste your key, which should look like this:

{"username":"piotrjanusz","key":"numbersandlettersofyourkey"}

Hit ESC

And then

:w

To write changes and

:q

To exit
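Two optional extras. If you’d rather skip vim, you can write the file in one go with echo (the username and key below are placeholders, use your own), and in either case it’s worth tightening the file permissions, since the Kaggle CLI may warn if the key is readable by other users:

echo '{"username":"your_username","key":"your_key"}' > /root/.kaggle/kaggle.json # alternative to editing with vim
chmod 600 /root/.kaggle/kaggle.json # keep your API key readable only by you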

Download the data for the dog breeds competition:

mkdir /home/dogbreeds
cd /home/dogbreeds

Check what files are available for the contest

kaggle competitions files dog-breed-identification

And download them:

kaggle competitions download dog-breed-identification

After you unzip it you are ready to work with the data.
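A minimal sketch of that unzipping step, assuming the data arrives as zip archives and that unzip isn’t already installed on the instance:

apt-get install unzip # the bare image may not ship with it
unzip '*.zip' # extract every downloaded archive in the current folder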

Hope this will help you with your journey.

Thanks for reading.

Feel free to connect with me on Linkedin
