# CLI reference (`scripts/sora.py`)
This file contains the command catalog for the bundled video generation CLI. Keep `SKILL.md` overview-first; put verbose CLI details here.
## What this CLI does
- `create`: create a new video job (async)
- `create-and-poll`: create a job, poll until complete, optionally download
- `poll`: wait for an existing job to finish
- `status`: retrieve job status/details
- `download`: download video/thumbnail/spritesheet
- `list`: list recent jobs
- `delete`: delete a job
- `remix`: remix a completed video
- `create-batch`: create multiple jobs from a JSONL file
Real API calls require **network access** and `OPENAI_API_KEY`; `--dry-run` requires neither.
## Quick start (works from any repo)
Set a stable path to the skill CLI (default `CODEX_HOME` is `~/.codex`):
```
export CODEX_HOME="${CODEX_HOME:-$HOME/.codex}"
export SORA_CLI="$CODEX_HOME/skills/sora/scripts/sora.py"
```
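A quick local sanity check that the path resolves before you rely on it (no API call; the `found`/`missing` messages are just illustrative):
```
export CODEX_HOME="${CODEX_HOME:-$HOME/.codex}"
export SORA_CLI="$CODEX_HOME/skills/sora/scripts/sora.py"
# Verify the CLI actually exists at that path before using it
if [ -f "$SORA_CLI" ]; then echo "found: $SORA_CLI"; else echo "missing: $SORA_CLI"; fi
```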
If you're in this repo, you can set the path directly:
```
# Use repo root (tests run from output/* so $PWD is not the root)
export SORA_CLI="$(git rev-parse --show-toplevel)/scripts/sora.py"
```
If `git` isn't available, set `SORA_CLI` to the absolute path of `scripts/sora.py` in this repo.
If uv cache fails with permission errors, set a writable cache:
```
export UV_CACHE_DIR="/tmp/uv-cache"
```
Dry-run (no API call; no network required; does not require the `openai` package):
```
python "$SORA_CLI" create --prompt "Test" --dry-run
```
Preflight a full prompt without running the API:
```
python "$SORA_CLI" create --prompt-file prompt.txt --dry-run --json-out out/request.json
```
Create a job (requires `OPENAI_API_KEY` + network):
```
uv run --with openai python "$SORA_CLI" create \
  --model sora-2 \
  --prompt "Wide tracking shot of a teal coupe on a desert highway" \
  --size 1280x720 \
  --seconds 8
```
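For multi-line prompts, writing the text to a file and passing `--prompt-file` (shown in the preflight example above) avoids shell-escaping issues. A minimal sketch; the prompt text is illustrative:
```
# Write a multi-line prompt without shell-quoting headaches
cat > prompt.txt << 'EOF'
Wide tracking shot of a teal coupe on a desert highway.
Golden hour, 35mm lens, gentle handheld camera motion.
EOF
wc -l < prompt.txt   # quick check: 2 lines written

# Then create the job from the file (requires OPENAI_API_KEY + network):
#   uv run --with openai python "$SORA_CLI" create --prompt-file prompt.txt --model sora-2 --seconds 8
```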
Create multiple jobs from a JSONL file (one JSON object per line):
```
mkdir -p tmp/sora
cat > tmp/sora/prompts.jsonl << 'EOB'
{"prompt":"A neon-lit rainy alley, slow dolly-in","seconds":"4"}
{"prompt":"A warm sunrise over a misty lake, gentle pan","seconds":"8","fields":{"camera":"35mm, slow pan","lighting":"soft dawn light"}}
EOB
uv run --with openai python "$SORA_CLI" create-batch --input tmp/sora/prompts.jsonl --out-dir out --concurrency 3
# Cleanup (recommended)
rm -f tmp/sora/prompts.jsonl
```
Notes:
- `create-batch` writes one JSON response per job under `--out-dir`.
- Output names default to `NNN-.json`.
- Use `--concurrency` to control parallelism (default `3`). Higher concurrency can hit rate limits.
- Treat the JSONL file as temporary: write it under `tmp/` and delete it after the run (do not commit it). If `rm` is blocked in your sandbox, skip cleanup or truncate the file.
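A malformed line in the JSONL fails that job, so it is worth validating the file before submitting. A quick pre-check, assuming `python3` is on `PATH` (it writes a sample file if none exists, so the snippet is self-contained):
```
mkdir -p tmp/sora
[ -f tmp/sora/prompts.jsonl ] || echo '{"prompt":"test","seconds":"4"}' > tmp/sora/prompts.jsonl
python3 - << 'EOF'
import json, sys
# One JSON object per line; blank lines are tolerated
for n, line in enumerate(open("tmp/sora/prompts.jsonl"), 1):
    if line.strip():
        try:
            json.loads(line)
        except ValueError:
            sys.exit(f"line {n}: invalid JSON")
print("ok")
EOF
```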
## CLI notes
- Supported sizes depend on model (see `references/video-api.md`).
- Seconds are limited to 4, 8, or 12.
- Download URLs expire after about 1 hour; copy assets to your own storage.
- In CI/sandboxes where long-running commands time out, prefer `create` + `poll` (or add `--timeout`).
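In a timeout-constrained environment, the `create` + `poll` pattern can be scripted by capturing the job id from the saved create response. A sketch assuming the response JSON contains an `id` field; the stand-in response below and the commented invocations are illustrative, not confirmed CLI syntax:
```
mkdir -p out
# Stand-in for the file written by `create ... --json-out out/request.json`
echo '{"id":"video_123","status":"queued"}' > out/request.json
JOB_ID=$(python3 -c 'import json; print(json.load(open("out/request.json"))["id"])')
echo "job id: $JOB_ID"
# Then wait for it and fetch assets (hypothetical flags; check `--help`):
#   uv run --with openai python "$SORA_CLI" poll "$JOB_ID" --timeout 600
#   uv run --with openai python "$SORA_CLI" download "$JOB_ID"
```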
## See also
- API parameter quick reference: `references/video-api.md`
- Prompt structure and examples: `references/prompting.md`
- Sample prompts: `references/sample-prompts.md`
- Troubleshooting: `references/troubleshooting.md`
Source: claude-code-templates (MIT). See About Us for full credits.