Documentation Index
Fetch the complete documentation index at: https://docs.cloud.vessl.ai/llms.txt
Use this file to discover all available pages before exploring further.
Overview
A job runs a single command in a container and exits when the command finishes. Use it for training, inference, and batch processing workloads. Submission is CLI-only via `vesslctl job create` — the console shows status, logs, and metrics. New to the CLI? Start with the CLI Quickstart.
Prerequisites
- Authenticated `vesslctl`: Run `vesslctl auth login` to complete the browser OAuth flow. See the CLI Quickstart for the full setup.
- Available credit balance: Job creation is blocked when the balance is zero or negative. Add a payment method and top up from Billing.
- (Optional) Persistent volume for outputs: Create an Object storage or Cluster storage volume ahead of time if you need to keep model checkpoints or other artifacts.
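As a quick sketch, the authentication step above from a terminal (`vesslctl auth login` is documented on this page; the comment about Billing reflects the credit-balance prerequisite, not a CLI command):

```shell
# Complete the browser OAuth flow to authenticate the CLI.
vesslctl auth login

# Before submitting, confirm in the Billing page that your credit
# balance is positive — job creation is blocked at zero or below.
```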
Submit
Provide a cluster, resource spec, container image, command to run, and any volumes or environment variables you need.
Persist job output
Jobs run in ephemeral containers — anything written outside a mounted volume disappears when the job ends. Attach at least one persistent volume so your outputs survive.
- Object storage (`--object-volume`): Shared across clusters, ideal for final artifacts like trained models and evaluation metrics. Mount at a dedicated path such as `/output`.
- Cluster storage (`--cluster-volume`): Fast in-cluster storage, ideal for intermediate checkpoints during long training. Mount at `/workspace` or similar.
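A submission might look like the sketch below. Only `vesslctl job create`, `--object-volume`, and `--cluster-volume` appear on this page; every other flag name and the `name:path` mount syntax are illustrative assumptions — check `vesslctl job create --help` for the real interface:

```shell
# Sketch only: flag names other than --object-volume / --cluster-volume
# are assumptions, as is the volume mount syntax.
vesslctl job create \
  --cluster my-cluster \
  --resource gpu-small \
  --image pytorch/pytorch:2.3.0-cuda12.1 \
  --object-volume artifacts:/output \
  --cluster-volume scratch:/workspace \
  --command "python train.py --checkpoint-dir /workspace --out /output"
```

Writing checkpoints to the cluster volume and final models to the object volume follows the guidance above: fast in-cluster storage for intermediates, cross-cluster storage for artifacts you keep.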
Reuse a job configuration
Export an existing job's configuration as JSON and resubmit it later:
Submit a job from inside a workspace
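The export-and-resubmit round trip could look like the sketch below. The page only states that a job's configuration can be exported as JSON and resubmitted; the subcommand and flag names here are entirely hypothetical:

```shell
# Hypothetical subcommands — the page documents the workflow,
# not the exact CLI surface; verify with `vesslctl job --help`.
vesslctl job describe my-job --format json > job.json
vesslctl job create --from-file job.json
```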
Workspaces ship with `vesslctl` pre-authenticated via a workload token, so you can iterate on a script in JupyterLab and submit the same code as a batch job from the same shell. See `vesslctl workspace` for details.
