Welcome to Paperspace’s ML-in-a-Box (MLIAB)
Paperspace's ML-in-a-Box is a fully featured desktop in the cloud, built for getting up and running with machine learning, data science, and much more. The template provides a complete machine learning environment for interactive development; it includes a desktop environment but can also be accessed via our web-based terminal or SSH.
The template is based on Ubuntu and includes NVIDIA's libraries for running machine learning programs on the GPU, as well as a variety of libraries for ML development in Python and Lua. It also comes pre-installed with the Chromium browser and the Atom text editor for your convenience.
What's on the template?
Your Machine Learning Paperspace Linux machine comes pre-installed with the following software:
- NVIDIA Driver: 410.48
- CUDA 10.0.130-1
- cuDNN 184.108.40.206-1+cuda10.0
- TensorFlow 1.11.0-rc1
- PyTorch 1.0.0a0+39bd73a
- Keras 2.2.2
- Theano 1.0.3
- NVIDIA-docker 2.0.3
- Docker 18.06.1-ce
- Atom 1.31.2
- jupyter-notebook 5.7.0
- python 3.6.6
- Chromium Browser 69.0.3497.81
You can install additional software as you would on any computer.
Where is everything installed?
Most of the software installed on this template was installed as root in the standard system locations. It can be installed and updated using the standard apt-get utility included with Ubuntu (e.g.
sudo apt-get update; sudo apt-get -y upgrade), or, for the included Python libraries, via Python's pip and pip3 package managers (e.g.
sudo pip3 install --upgrade tensorflow-gpu).
Some software was installed manually. The build and source trees for this software can be found in /home/paperspace/src (e.g.
cd ~paperspace/src/torch; git status). Other versions of cuDNN can be found in /home/paperspace/cudnn-packages.
The /usr/local/bin/ml-versions.sh script outputs the versions of selected software installed on the system, including cuDNN, TensorFlow, Torch, and others.
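If you prefer checking from Python, a small helper along these lines reports which packages are importable and at what version (a sketch, not part of the template; `package_status` is a hypothetical name and the package list is illustrative):

```python
# Sketch: report which ML packages are importable and their installed versions.
# Requires Python 3.8+ for importlib.metadata.
from importlib import metadata, util

def package_status(names):
    """Return {name: version string, "unknown", or None if not importable}."""
    status = {}
    for name in names:
        if util.find_spec(name) is None:
            status[name] = None           # not installed / not importable
        else:
            try:
                status[name] = metadata.version(name)
            except metadata.PackageNotFoundError:
                status[name] = "unknown"  # importable, but no package metadata
    return status

if __name__ == "__main__":
    for pkg, ver in package_status(["tensorflow", "torch", "keras", "theano"]).items():
        print(f"{pkg}: {ver or 'not installed'}")
```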
NOTE: by default the MLIAB template uses the UFW firewall package to help protect your system if it has a public IP. The SSH port is open, but we recommend using the ufw tool to limit the addresses that can SSH to systems with a public IP. Most other ports are blocked. You can use the ufw tool to open ports for other tools you might install (e.g.
sudo ufw allow from 203.0.113.5 to any port 8888)
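A few more UFW commands you may find useful (a reference listing, not commands the template runs for you; the subnet and ports are example values to adjust for your setup):

```shell
# Illustrative UFW usage; addresses and ports below are examples only.
sudo ufw status verbose                                        # show current rules
sudo ufw limit ssh                                             # rate-limit SSH connection attempts
sudo ufw allow from 203.0.113.0/24 to any port 22 proto tcp    # allow SSH from a trusted subnet
sudo ufw allow 8888/tcp                                        # open a port, e.g. for Jupyter
sudo ufw delete allow 8888/tcp                                 # close it again
```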
How do I run Docker?
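With Docker 18.06 and nvidia-docker 2 installed, GPU containers are typically started by selecting the nvidia runtime. A minimal sketch (the image tag is illustrative; any CUDA 10.0-compatible image should work):

```shell
# Run nvidia-smi inside a CUDA base image to confirm the GPU is visible to containers.
sudo docker run --runtime=nvidia --rm nvidia/cuda:10.0-base nvidia-smi

# Ordinary (non-GPU) containers work as usual:
sudo docker run --rm hello-world
```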
How do I make a template?
Follow these instructions to prepare your machine before creating a Template through the Console.
- Atom 1.18.0-1~webupd8~0
- CUDA 8.0.61-1
- cuDNN 5.1.10-1+cuda8.0
- TensorFlow 1.2.1 (Python 2 & 3)
- Anaconda 4.4.0 (python 3.6, 2.7)
- Nvidia-docker 1.0.1
- Docker CE
- PyTorch 0.2.0
- Theano 0.9.0 (Python 2 & 3)
- python2 package torch (0.1.11.post5)
- python3 package torch: none (not available for Python 3.4.3)
- Keras 2.0.5 (Python 2 & 3)
- Chrome 59.0.3071.115-1
- NVIDIA Driver 375.66-0ubuntu0.14.04.1
- Torch commit 5961f52a65fe33efa675f71e5c19ad8de56e8dad
- Caffe commit 4efdf7ee49cffefdd7ea099c00dc5ea327640f04
- OpenBLAS commit 482015f8d6840da9617b422e758162cf7358c8b2