- Connect to a Git repository
- Choose a Docker image
- Add a valohai.yaml configuration
- Use Valohai data inputs/outputs
- Define parameters in valohai.yaml
- Collect key metrics in JSON
This guide assumes that your organization has the environments and data stores configured already as a part of the Valohai installation.
- How to add a cloud data store
Connect your Git repository to a Valohai project
Start by going to the Valohai application (app.valohai.com) and creating a new project.
The owner of the project should be your organization or team so that you can access your organization's cloud resources like virtual machines, data stores, and private Docker registries.
After you’ve created the project, connect it to a Git repository through the Repository tab in your project’s settings.
- Connect a GitHub repository to your project
- Connect a GitLab repository to your project
- Connect a Bitbucket repository to your project
- Connect an Azure DevOps repository to your project
Install Valohai tools
Valohai offers command-line tools to make it easier to interact with the platform.
- The valohai-cli command-line tool lets you run jobs, view logs, and create new projects directly from the terminal.
Start by installing the Valohai tools, logging in, and linking your local directory to a Valohai project.
pip3 install valohai-cli
vh login
# Navigate to your project
cd myproject
# Links your current working directory to an existing Valohai project
vh project link
Choose a Docker image
Each Valohai execution is run inside a Docker container. This makes it possible to execute any code from C to Python as long as it can run inside the chosen container.
You can use any Docker image for your Valohai jobs, from either a public or a private Docker registry.
Here are the most common Docker images currently used on the platform:
tensorflow/tensorflow:&lt;version&gt;-gpu  # e.g. 2.6.1-gpu, for GPU support
tensorflow/tensorflow:&lt;version&gt;      # e.g. 2.6.1, for CPU only
pytorchlightning/pytorch_lightning:py&lt;version&gt;-torch&lt;version&gt;  # e.g. py3.6-torch1.6
pytorch/pytorch:&lt;version&gt;            # e.g. 1.3-cuda10.1-cudnn7-runtime
python:&lt;version&gt;                     # e.g. 3.8.0
r-base:&lt;version&gt;                     # e.g. 3.6.1
julia:&lt;version&gt;                      # e.g. 1.3.0
valohai/prophet                      # Valohai-hosted image with Prophet
valohai/sklearn:1.0                  # Valohai-hosted image with scikit-learn
valohai/xgboost:1.4.2                # Valohai-hosted image with XGBoost
You can also install additional libraries inside the execution, for example with `apt-get install` or `pip install -r requirements.txt`.
Valohai will download and cache the Docker images on the worker virtual machine, so they don’t have to be redownloaded for every job.
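The image you choose is referenced in your project's valohai.yaml. As a minimal sketch, assuming a TensorFlow image and a `train.py` script in your repository (the step name, script, and image tag here are illustrative, not from this guide):

```yaml
# valohai.yaml — a single step that runs train.py inside the chosen image
- step:
    name: train-model
    image: tensorflow/tensorflow:2.6.1
    command: python train.py
```

Each step pairs a Docker image with the command to run inside it; Valohai pulls the image onto the worker machine before executing the command.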
Next
To get the most out of Valohai, integrate your code with its input/output data, parameter, and metadata systems. Follow the guides below for details:
Topic | Guide
---|---
Configuration file | Add a valohai.yaml configuration
Files | Use Valohai data inputs/outputs
Parameters | Define parameters in valohai.yaml
Metrics | Collect key metrics in JSON
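Metric collection needs no SDK: Valohai parses JSON objects printed to standard output, one object per line, as execution metadata. A minimal sketch (the metric names below are illustrative, not prescribed):

```python
import json


def log_metadata(metrics: dict) -> None:
    # Print one JSON object per line; Valohai picks up each such
    # stdout line as execution metadata for charts and comparisons.
    print(json.dumps(metrics))


# Illustrative metric names -- use whatever your training loop tracks.
log_metadata({"epoch": 1, "train_loss": 0.231, "accuracy": 0.93})
```

Call this once per epoch (or logging step) and Valohai will chart the values over time in the execution's Metadata tab.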