Environment Setup
Configure your API Key and Gretel CLI!
This walkthrough will teach you how to set up your environment so you can use the Gretel Console and CLI to create and run Gretel Models.
We highly recommend using Ubuntu 18.04 for creating your own local environment.
If you need to run Gretel Workers in your own environment, you must install Docker on your host(s). By default, Gretel Workers will be provisioned and run in Gretel Cloud.
We recommend installing Docker with its official convenience script. The two commands are:
curl -fsSL https://get.docker.com -o get-docker.sh
sh get-docker.sh
After Docker is installed, you should be able to access it via this command: sudo docker
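If you want a quick sanity check that the Docker daemon is working, you can run Docker's standard hello-world test image:
sudo docker run --rm hello-world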
Next, add your user to the docker group so you can run Docker without sudo:
sudo groupadd docker
sudo usermod -aG docker $USER
From here, you may need to restart your shell session. You should then be able to run Docker without sudo: docker ps

Console Sign up and Project Creation

Sign up for Gretel with your Google or GitHub account in the Gretel Console.
After signing up, select New Project from the sidebar. This will create an empty project.
On the top right, select Connect and then Generate API Key.
At this point, you should note your Project name and API key for use when configuring the CLI in the next section.

CLI Installation

Next, install the Gretel CLI and Python client. Both the CLI and Python SDK are bundled in a single Python package.
The Gretel CLI requires Python 3.7 or greater and Docker (for local Gretel Workers).
If you are using a system such as Ubuntu 18.04 that ships with a Python version older than 3.7, you should upgrade to a newer version. We’ve provided a script, tested on Ubuntu 18.04, that installs Python 3.8 and pip as the new system defaults.
Be sure to run source ~/.bashrc from your terminal once the script completes.
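After reloading your shell, you can confirm that the upgraded interpreter and pip are now the system defaults, for example:
python3 --version
pip --version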
Gretel Beta2 will require gretel-client version 0.8.0 or greater.
Once appropriate versions of Python and PIP are installed, you can install Gretel’s Python tools.
pip install -U gretel-client
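If you want to confirm that the installed package meets the 0.8.0 minimum mentioned above, you can check the version reported by pip:
pip show gretel-client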

CLI Configuration

Next, configure the CLI by running:
gretel configure
Accept the default for the Endpoint.
The Default Runner is set to cloud; if you need workers to run on your own machine(s), you may change this value to local.
When prompted for your Gretel API Key, paste the key you created in the above steps.
When prompted for your Default Project, paste the Project Name you retrieved in the above steps.
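If you prefer to script the configuration rather than answer the interactive prompts, the client can also read credentials from environment variables. The variable names below are assumptions for illustration only; verify them against the gretel-client documentation for your installed version:
export GRETEL_API_KEY="<your API key>"               # assumed variable name
export GRETEL_DEFAULT_PROJECT="<your project name>"  # assumed variable name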
At this point your configuration should have been written to your home directory. To validate the connection to Gretel Cloud try running:
gretel projects search --limit 10
This should return JSON output that shows existing projects you have in your account.
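Because the command emits JSON, you can optionally pipe it through a tool such as jq to pretty-print or filter the results (this assumes the JSON is written to stdout):
gretel projects search --limit 10 | jq '.'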
At this point you should be ready to run Gretel jobs on your local system.
The first time you run the gretel models create command, the CLI will trigger a download of Gretel's containers to your system. This may take a few minutes. As we update our service and software, running jobs may also trigger downloads of updated containers.
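For reference, a first training run often looks something like the sketch below. The config shortcut, file name, and flags are illustrative assumptions; run gretel models create --help to see the exact options supported by your CLI version:
gretel models create \
  --config synthetics/default \
  --in-data my-training-data.csv \
  --output model-artifacts \
  --runner local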

GPU + Docker Configuration (Synthetic Workers Only)

If you will be running synthetic workers in your own environment, we highly encourage you to use a system with a GPU. Additionally, Docker and the NVIDIA Container Toolkit will need to be installed.
We have created a script that will configure an Ubuntu 18.04 machine with NVIDIA GPUs to work properly with Gretel’s CLI. Provided you have created a VM or set up a machine with Ubuntu and a GPU, you can run:
curl https://raw.githubusercontent.com/gretelai/gretel-blueprints/main/utils/gpu_docker_setup.sh | bash
When this script completes you should see output similar to the following:
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 465.19.01    Driver Version: 465.19.01    CUDA Version: 11.3     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA Tesla T4      On  | 00000000:00:04.0 Off |                    0 |
| N/A   59C    P0    28W /  70W |      0MiB / 15109MiB |      6%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
Finally, you will need to configure Docker to allow your current user to run commands without sudo. This can be done by running:
sudo groupadd docker
sudo usermod -aG docker $USER
Now log out and back in.
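To verify that containers can now see the GPU, you can run nvidia-smi from inside a CUDA base image (the image tag here is just an example; any recent CUDA base image should work):
docker run --rm --gpus all nvidia/cuda:11.3.1-base-ubuntu18.04 nvidia-smi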
At this point, your machine is now configured to launch Gretel Workers with GPU support.