Environment Setup

Configure your API Key and Gretel CLI!

This walkthrough will teach you how to set up your environment so you can use the Gretel Console and CLI to create and run Gretel Models.

If you need to run Gretel Workers in your own environment, you must install Docker on your host(s). By default, Gretel Workers will be provisioned and run in Gretel Cloud.

Console Sign up and Project Creation

Sign up for Gretel with your Google or GitHub account in the Gretel Console.

After signing up, select New Project from the sidebar. This will create an empty project.

Create a new project

On the top right, select Connect and then Generate API Key.

Generate an API key

At this point, you should note your Project name and API key for use when configuring the CLI in the next section.

CLI Installation

Next, install the Gretel CLI and Python client. Both the CLI and Python SDK are bundled in a single Python package.

The Gretel CLI requires Python 3.7 or greater and Docker (for local Gretel Workers).
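You can confirm that your interpreter meets this requirement with a quick standard-library check; this is a minimal sketch:

```python
import sys

# Gretel's CLI and SDK require Python 3.7 or newer.
meets_requirement = sys.version_info >= (3, 7)
version = ".".join(map(str, sys.version_info[:3]))
print(f"Python {version}: {'OK' if meets_requirement else 'too old, please upgrade'}")
```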

If you are using a system such as Ubuntu 18.04 with a version of Python older than 3.7 installed, you should upgrade to a newer version. We’ve provided a script, tested with Ubuntu 18.04, that will install Python 3.8 and pip as the new defaults for the system.

Be sure to run source ~/.bashrc from your terminal once the script completes.

Gretel Beta2 will require gretel-client version 0.8.0 or greater.

Once appropriate versions of Python and PIP are installed, you can install Gretel’s Python tools.

pip install gretel-client
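Since Gretel Beta2 requires gretel-client 0.8.0 or greater, you can also pin the minimum version at install time with pip install "gretel-client>=0.8.0". To check an already-installed copy programmatically, here is a small sketch; the comparison helper is illustrative only (it assumes plain X.Y.Z version strings, no pre-releases), and importlib.metadata requires Python 3.8+:

```python
from importlib import metadata

def meets_minimum(installed: str, minimum: str = "0.8.0") -> bool:
    """Compare plain X.Y.Z version strings (illustrative; no pre-release handling)."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)

try:
    version = metadata.version("gretel-client")
    print("gretel-client", version, "OK" if meets_minimum(version) else "is too old")
except metadata.PackageNotFoundError:
    print("gretel-client is not installed; run: pip install gretel-client")
```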

CLI Configuration

Next, configure the CLI by running:

gretel configure

Accept the default for the Endpoint.

The Default Runner is set to cloud. If you need workers to run on your own machine(s), you may change this value to local.

When prompted for your Gretel API Key, paste the key you created in the above steps.

When prompted for your Default Project, paste the Project Name you retrieved in the above steps.

At this point, your configuration should have been written to your home directory. To validate the connection to Gretel Cloud, try running:

gretel projects search --limit 10

This should return JSON output that shows existing projects you have in your account.
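If you want to post-process that output in a script, the standard json module works well. The record shape below is a hypothetical illustration, not the exact CLI schema; inspect your own output for the real field names:

```python
import json

# Hypothetical sample of the CLI's JSON output -- check your actual
# `gretel projects search` results for the real schema.
raw = """
[
  {"name": "my-first-project", "display_name": "My First Project"},
  {"name": "synthetic-demo", "display_name": "Synthetic Demo"}
]
"""

projects = json.loads(raw)
names = [project["name"] for project in projects]
print(names)  # ['my-first-project', 'synthetic-demo']
```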

GPU + Docker Configuration (Synthetic Workers Only)

If you will be running synthetic workers in your own environment, we highly encourage you to configure a system with a GPU. Additionally, Docker and the Docker NVIDIA toolkit will need to be installed.
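Before installing anything, you can check whether the required tooling is already on your PATH. This stdlib sketch only detects the executables; it does not validate driver or toolkit versions:

```python
import shutil

# Look for the executables Gretel's local GPU workers depend on.
tools = {tool: shutil.which(tool) for tool in ("docker", "nvidia-smi")}
for tool, path in tools.items():
    print(f"{tool}: {path or 'NOT FOUND - install before continuing'}")
```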

We have created a script that will configure an Ubuntu 18.04 machine with NVIDIA GPUs to work properly with Gretel’s CLI. Provided you have created a VM or set up a machine with Ubuntu 18.04 and a GPU, you can run:

curl https://raw.githubusercontent.com/gretelai/gretel-blueprints/main/utils/gpu_docker_setup.sh | bash

When this script completes you should see output similar to the following:

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 465.19.01    Driver Version: 465.19.01    CUDA Version: 11.3     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA Tesla T4     On   | 00000000:00:04.0 Off |                    0 |
| N/A   59C    P0    28W /  70W |      0MiB / 15109MiB |      6%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|                                                                             |
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                        Usage |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+

Finally, you will need to configure Docker to allow your current user to run commands. This can be done by running:

sudo groupadd docker
sudo usermod -aG docker $USER

Now log out and back in.
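To confirm the group change took effect after you log back in (or after running newgrp docker in the current shell), you can check your membership with this Unix-only sketch using the standard-library grp module:

```python
import getpass
import grp
import os

# Check whether the current user can reach the Docker socket via the
# docker group (Unix-only; relies on the standard-library grp module).
user = getpass.getuser()
try:
    docker_group = grp.getgrnam("docker")
    in_docker_group = user in docker_group.gr_mem or os.getgid() == docker_group.gr_gid
except KeyError:
    in_docker_group = False  # the docker group does not exist yet

print("member of docker group:", in_docker_group)
```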

Your machine is now configured to launch Gretel Workers with GPU support.