
vesslctl: Manage VESSL Cloud from Your Terminal

VESSL AI
5 min read
vesslctl launch — VESSL Cloud CLI with native MCP integration for Claude, Codex, and Gemini
[Image: vesslctl CLI main help output listing available commands]

Why manage GPU instances only through a web browser?

Creating a workspace means opening a browser, logging into the dashboard, clicking through settings... It works, but it's slow.

vesslctl is the official CLI for VESSL Cloud. Create workspaces, run batch jobs, upload data — all from your terminal. And one more thing: you can plug AI coding tools straight into an MCP server that serves the CLI docs.

30-second setup

# Install
curl -fsSL https://api.cloud.vessl.ai/cli/install.sh | bash

# Log in (browser OAuth)
vesslctl auth login

# Verify auth status
vesslctl auth status

That's it.

Check your credits
After logging in, run vesslctl billing show to check your organization's credit balance. If the balance is at zero, workspace create and job create are blocked before they run, so top up from the VESSL Cloud dashboard first.

Core workflows

Workspace management

# Create a workspace — PyTorch image
vesslctl workspace create \
  --cluster <cluster-name> \
  --resource-spec <spec-name> \
  --image "pytorch/pytorch:2.3.0-cuda12.1-cudnn8-devel" \
  --name "my-workspace"

# Check status
vesslctl workspace show my-workspace

# SSH in
vesslctl workspace ssh my-workspace

# Pause (stops billing, preserves data)
vesslctl workspace pause my-workspace

# Resume
vesslctl workspace start my-workspace

Use vesslctl cluster list to see available clusters and vesslctl resource-spec list for resource specs.

Data upload

# Create an Object storage volume
vesslctl volume create \
  --name my-dataset \
  --storage <storage-name> \
  --teams <team>

# Get S3 credentials
vesslctl volume token my-dataset

# Upload data with AWS CLI
aws s3 cp ./data s3://my-dataset/ --recursive --endpoint-url <endpoint>

# Verify upload
vesslctl volume ls my-dataset

Use vesslctl storage list to see available storage options.
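If you'd rather script the upload than shell out to the AWS CLI, the same S3-compatible endpoint works with boto3. Here's a minimal sketch, assuming the endpoint URL and key pair come from the output of vesslctl volume token my-dataset (the parameter names below are placeholders, not vesslctl output fields):

```python
# Sketch: mirror a local directory into a VESSL Object Storage volume
# through its S3-compatible endpoint. endpoint_url, access_key, and
# secret_key are placeholders -- take the real values from
# `vesslctl volume token my-dataset`.
import os


def s3_keys(local_root):
    """Yield (local_path, object_key) pairs mirroring the directory layout."""
    for dirpath, _, filenames in os.walk(local_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Object keys always use forward slashes, regardless of OS
            key = os.path.relpath(path, local_root).replace(os.sep, "/")
            yield path, key


def upload_dir(local_root, bucket, endpoint_url, access_key, secret_key):
    """Upload every file under local_root into the given bucket/volume."""
    import boto3  # pip install boto3

    s3 = boto3.client(
        "s3",
        endpoint_url=endpoint_url,
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
    )
    for path, key in s3_keys(local_root):
        s3.upload_file(path, bucket, key)
```

After calling upload_dir("./data", "my-dataset", ...), verify with vesslctl volume ls my-dataset as above.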

Batch job execution

# Create a job
vesslctl job create \
  --name my-training \
  --resource-spec <spec-name> \
  --image "pytorch/pytorch:2.3.0-cuda12.1-cudnn8-devel" \
  --cmd "python train.py --epochs 100"

# Follow logs in real time
vesslctl job logs <job-id> --follow

For an end-to-end walkthrough — a real Gemma 4 fine-tuning run across five jobs for $1.72 total — see the post below.

Let Your Laptop Sleep: Automate GPU Training with VESSL Batch Jobs
Develop in workspaces, train with batch jobs. Submit GPU training with a single vesslctl command.

AI tools integration (MCP)

VESSL Cloud documentation is served as an MCP server, so AI coding assistants can reference accurate vesslctl documentation in real time.

MCP (Model Context Protocol) lets AI coding tools manage your GPU infrastructure using natural language — no need to memorize CLI flags.

MCP server URL

https://docs.cloud.vessl.ai/mcp

Setup

Claude Code

claude mcp add --transport http vessl https://docs.cloud.vessl.ai/mcp

Cursor

Go to Settings > MCP Servers and add the URL:

https://docs.cloud.vessl.ai/mcp

Windsurf, VS Code (Copilot)

Add the following to your MCP configuration file:

{
  "mcpServers": {
    "vessl": {
      "url": "https://docs.cloud.vessl.ai/mcp"
    }
  }
}

What this looks like in practice

Ask your AI coding tool in plain language, and it generates the right vesslctl commands.

| Prompt example | Generated action |
| --- | --- |
| "Create a workspace with an A100" | Generates the exact vesslctl workspace create command |
| "Show me the logs for my running job" | Generates the exact vesslctl job logs --follow command |
| "List my workspaces" | Runs vesslctl workspace list |
| "Pause the workspace I'm done with" | Runs vesslctl workspace pause <name> |
| "Submit a training job on H100" | Generates the exact vesslctl job create command |

When you need a GPU while writing code, you can provision infrastructure without leaving your editor.

Install the Claude skill

If you use Claude Code, install the bundled Claude skill that ships with vesslctl. Once installed, Claude Code learns vesslctl command usage, safety rules, and argument formats — so it suggests more accurate commands.

vesslctl skill install --target claude-code

After install, ask "Create an A100 workspace on VESSL Cloud" in natural language and Claude runs the right command. Combined with the MCP server, you get richer context.

Command overview

| Area | Command | Description |
| --- | --- | --- |
| Auth | vesslctl auth login | Log in (browser OAuth) |
|  | vesslctl auth logout | Log out |
|  | vesslctl auth status | Check auth status |
| Workspace | vesslctl workspace create | Create a workspace |
|  | vesslctl workspace list | List workspaces |
|  | vesslctl workspace show | Show workspace details |
|  | vesslctl workspace ssh | SSH into a workspace |
|  | vesslctl workspace pause | Pause (stops billing, preserves data) |
|  | vesslctl workspace start | Resume |
|  | vesslctl workspace terminate | Terminate |
| Job | vesslctl job create | Create a batch job |
|  | vesslctl job list | List jobs |
|  | vesslctl job show | Show job details |
|  | vesslctl job logs | View logs (--follow for real-time streaming) |
| Storage | vesslctl volume create | Create a volume |
|  | vesslctl volume list | List volumes |
|  | vesslctl volume ls | List files in a volume |
|  | vesslctl volume token | Get S3-compatible access token |
| Org/Team | vesslctl org list | List organizations |
|  | vesslctl org switch | Switch default organization |
|  | vesslctl team list | List teams |
|  | vesslctl team switch | Switch default team |
| Config | vesslctl config show | Show CLI config |
|  | vesslctl config set | Update CLI config |
| Billing | vesslctl billing show | Check credit balance and burn rate |
| Utility | vesslctl completion install | Install shell autocompletion |
|  | vesslctl update | Update CLI to latest version |
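These commands also compose well from scripts. As a hypothetical helper (illustrative only, not shipped with the CLI), you could assemble argv lists for subprocess following the --flag value style used throughout this post:

```python
# Hypothetical helper for scripting vesslctl (illustrative, not part of
# the CLI): build an argv list that subprocess.run can execute.
import subprocess


def vesslctl_argv(*subcommands, **flags):
    """Turn keyword arguments into --flag value pairs after the subcommands."""
    argv = ["vesslctl", *subcommands]
    for flag, value in flags.items():
        argv.append("--" + flag.replace("_", "-"))
        if value is not True:  # True means a bare flag like --follow
            argv.append(str(value))
    return argv


def run_vesslctl(*subcommands, **flags):
    """Run the command and raise if it exits nonzero."""
    return subprocess.run(vesslctl_argv(*subcommands, **flags), check=True)
```

For example, vesslctl_argv("job", "logs", "abc123", follow=True) produces ["vesslctl", "job", "logs", "abc123", "--follow"], ready to hand to subprocess.run.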

FAQ

Can I use the CLI instead of the web console?

For most tasks, yes. Workspace creation, job execution, storage management, and org/team switching all work from the CLI. Some administrative tasks — like viewing resource usage dashboards or inviting team members — are easier in the web console.

Which AI tools support MCP?

Any tool that implements the MCP protocol. Currently confirmed: Claude Code, Cursor, Windsurf, and VS Code (GitHub Copilot). If your tool supports MCP, it can connect to the VESSL MCP server.

Can my team use this together?

Yes. vesslctl fully supports VESSL Cloud's org/team multi-tenancy. Use vesslctl org switch and vesslctl team switch to switch between organizations and teams. Each team member logs in with their own account and shares access to the same organization's resources.

Keep reading

Let Your Laptop Sleep: Automate GPU Training with VESSL Batch Jobs
Your laptop sleeps — your GPU keeps training. Submit remote GPU jobs with one vesslctl command, and Smart Pausing stops idle spend automatically.
How to Fine-Tune Gemma 4 in 15 Minutes
Fine-tune Google Gemma 4 E4B on a single A100 with Unsloth in 15 minutes — from Object Storage setup to evaluation, end-to-end.