Job submission is CLI-only. Use vesslctl job to submit, monitor, terminate, and tag batch jobs from your terminal: bring your own image, mount volumes, and set environment variables in a single command. See the CLI cheat sheet for a one-page command reference, or the vesslctl job reference for the full list of flags.
A Job runs a command to completion on a specified GPU or CPU resource. Unlike workspaces, jobs are non-interactive: submit a fine-tuning script or an evaluation pipeline and let it run unattended. Jobs are ideal for:
  • Model training and fine-tuning
  • Batch inference and evaluation
  • Data preprocessing pipelines
  • Hyperparameter sweeps (submit multiple jobs in parallel)
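For the sweep case, a small loop can generate one submission per hyperparameter value. This is a hedged sketch only: the flag names (--image, --env, --command) and the image tag are illustrative assumptions, not confirmed vesslctl flags, so check the CLI cheat sheet for the real syntax.

```shell
# Hypothetical sketch: build one "vesslctl job" submission per learning rate.
# Flag names below are assumptions for illustration, not verified vesslctl flags.
cmds=""
for lr in 1e-4 3e-4 1e-3; do
  cmds="${cmds}vesslctl job submit --image pytorch/pytorch:latest --env LR=${lr} --command 'python train.py --lr ${lr}'
"
done
printf '%s' "$cmds"   # print the generated commands for review before running them
```

Because each job is independent, all three submissions queue and run in parallel as capacity allows.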
[Figure: VESSL Cloud Jobs list page showing job name, status, requested resources, duration, and creator]

Jobs vs Workspaces

|             | Job                                       | Workspace                                                   |
|-------------|-------------------------------------------|-------------------------------------------------------------|
| Interaction | Non-interactive (runs a command)          | Interactive (SSH, JupyterLab)                               |
| Lifecycle   | Starts → runs → completes automatically   | Stays running until you pause or terminate                  |
| Billing     | Only while running                        | While running (GPU + storage); while paused (storage only)  |
| Best for    | Training, batch processing, sweeps        | Development, debugging, exploration                         |

Job statuses

| Status      | Meaning |
|-------------|---------|
| scheduling  | Waiting for resources to become available. The job shows a reason such as "Waiting for GPU capacity" while it queues. |
| running     | Your command is actively executing on the allocated resources. |
| succeeded   | The command exited successfully (exit code 0). Output in mounted volumes is preserved. |
| failed      | The command exited with a non-zero code, or the container crashed (for example, OOMKilled). Check the logs to debug. |
| terminated  | You manually cancelled the job before it finished. |
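The succeeded/failed distinction comes down to the command's exit code, which you can verify locally before submitting. A minimal sketch (the `report` helper is hypothetical, purely to make the rule visible):

```shell
# Illustrates the rule from the table above: exit code 0 -> succeeded,
# any non-zero exit code -> failed. "report" is a local helper, not a VESSL command.
report() { "$@"; code=$?; if [ "$code" -eq 0 ]; then echo "succeeded"; else echo "failed (exit $code)"; fi; }
report true    # exits 0 -> prints "succeeded"
report false   # exits 1 -> prints "failed (exit 1)"
```

Running your entrypoint this way locally is a quick sanity check that a script which "finishes" actually exits 0 and will be recorded as succeeded rather than failed.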

Next steps