Command-Line Interface

Use the cde CLI when the workflow belongs in a terminal, batch job, cluster, or CI pipeline. It gives operators direct access to datasets, runs, claims, and artifacts without sending the work through a browser.

Full command surface
Every platform operation — projects, datasets, campaigns, runs, claims, ledger, artifacts — maps to a focused subcommand with consistent flags and structured output.
Human-readable tables for exploration, JSON for jq pipelines, CSV for spreadsheets, NDJSON for streaming ingestion. Field names are stable across CLI versions.
Meaningful exit codes, idempotent operations, and API key auth make the CLI safe to embed in Makefiles, SLURM scripts, Airflow DAGs, and GitHub Actions.
Governance policies apply to CLI sessions the same way they apply to API and MCP calls. Promotions require rationale. The ledger records everything.
The cde binary is available via pip, Homebrew, or standalone download. It has no runtime dependencies — a single binary covers macOS, Linux, and Windows.
# pip (includes Python SDK + CLI entry point)
pip install cde-sdk

# Homebrew
brew install vareon/tap/cde

# Standalone binary
curl -fsSL https://get.cde.vareon.com | sh

# Verify installation
cde version
$ cde version
cde 1.4.0
API version: 2026-03-01
Base URL: https://api.cde.vareon.com

Two authentication modes — interactive login for human operators and API keys for automation. Both produce the same authorization context and are subject to the same governance policies.
Opens a browser-based auth flow. Tokens are stored in the system keychain (macOS/Linux) or credential manager (Windows) and refreshed automatically. You authenticate once; the CLI handles renewal.
Set CDE_API_KEY in your environment. Keys can be scoped to a project and rotated without downtime by configuring both old and new keys during the rotation window.
# Interactive login (opens browser)
$ cde auth login
✓ Authenticated as faruk@vareon.com (Vareon Research)

# Check current session
$ cde auth status
User:     faruk@vareon.com
Tenant:   Vareon Research
Token:    valid (expires in 23h 41m)
Keychain: macOS Keychain

# Rotate API key
$ cde auth rotate-key
Old key: cde_key_proj01_a8f2... (valid for 24h)
New key: cde_key_proj01_k9m1... (active now)

# Logout and clear stored credentials
$ cde auth logout
✓ Session cleared from keychain
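During the 24-hour rotation window both keys are accepted, so automation can hold both and prefer the new one. A minimal Python sketch of that fallback, assuming an `is_valid` callback that stands in for any cheap authenticated call (such as `cde auth status`); the key strings and function names here are illustrative, not part of the CLI:

```python
def pick_key(candidate_keys, is_valid):
    """Return the first key the API accepts, trying the new key first."""
    for key in candidate_keys:
        if is_valid(key):
            return key
    raise RuntimeError("no valid CDE_API_KEY among the candidates")

# Usage sketch: the new key has not propagated yet, so the old key is used.
active = pick_key(
    ["cde_key_proj01_new", "cde_key_proj01_old"],
    is_valid=lambda k: k == "cde_key_proj01_old",
)
```

Once the window closes, drop the old key from the candidate list.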
Commands are organized into families that mirror the platform's resource hierarchy. Each family groups related operations under a common noun.
Manage authentication — login, logout, check status, and rotate API keys.
cde auth login
cde auth logout
cde auth status
cde auth rotate-key [--project <id>]
Manage projects: create, list, inspect, archive. Projects are the top-level container for datasets, campaigns, and governance policies.
cde projects list [--status active|archived] [--format table|json]
cde projects create --name "Project Name" [--description "..."]
cde projects inspect <project-id>
cde projects archive <project-id>
$ cde projects list
ID         NAME               STATUS    DATASETS  CAMPAIGNS  CREATED
proj_a1b2  Pendulum Dynamics  active    3         2          2026-02-15
proj_c3d4  Fluid Turbulence   active    7         4          2026-01-08
proj_e5f6  Climate Feedbacks  archived  12        6          2025-11-20
Upload, profile, list, and inspect datasets. Supports CSV, Parquet, and HDF5 formats. The profile command runs statistical analysis and recommends discovery modes.
cde datasets upload --project <id> --file data.csv --name "Dataset Name"
cde datasets profile <dataset-id>
cde datasets list --project <id>
cde datasets inspect <dataset-id>
$ cde datasets profile ds_29xK4m
Dataset: pendulum_timeseries (ds_29xK4m)
Rows: 15,000   Columns: 8

COLUMN  TYPE     NULLS  MIN     MAX     MEAN    STD
time    float64  0      0.000   30.000  15.000  8.660
theta   float64  0      -1.571  1.571   0.003   0.782
omega   float64  0      -5.124  5.089   -0.012  2.341
...

Quality score: 0.94
Recommended modes: symbolic, neuro_symbolic
Campaigns group related discovery runs under shared governance and budget constraints. Create with budget allocations, list with status filtering, and close when research is complete.
cde campaigns create --project <id> --name "Campaign Name" [--budget-hours 100]
cde campaigns list --project <id> [--status open|closed]
cde campaigns inspect <campaign-id>
cde campaigns close <campaign-id>
$ cde campaigns inspect camp_xyz
Campaign: Initial Sweep (camp_xyz)
Project:  proj_a1b2
Status:   open
Budget:   40/100 GPU-hours used
Runs:     12 completed, 2 running
Claims:   47 explore, 8 validate, 1 publish
Created:  2026-03-01T09:00:00Z
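A submission script can gate on remaining budget by parsing `cde campaigns inspect --format json` before queueing more runs. A sketch under one assumption: the JSON field names (`budget.total_hours`, `budget.used_hours`) are illustrative and may differ from the actual payload:

```python
def remaining_hours(campaign: dict) -> float:
    """GPU-hours still available in the campaign budget.
    Field names are an assumption about the JSON shape."""
    budget = campaign["budget"]
    return budget["total_hours"] - budget["used_hours"]

# Simulated inspect payload, matching the 40/100 GPU-hours example:
campaign = {"id": "camp_xyz", "budget": {"total_hours": 100, "used_hours": 40}}

if remaining_hours(campaign) < 10:
    raise SystemExit("budget nearly exhausted; not submitting")
```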
Submit discovery runs across all four modes, monitor pipeline stages, wait for completion, and cancel running jobs. Each mode accepts mode-specific parameters.
cde runs submit --campaign <id> --dataset <id> --mode symbolic [--max-complexity 8]
cde runs submit --campaign <id> --dataset <id> --mode neural [--config params.json]
cde runs submit --campaign <id> --dataset <id> --mode neuro_symbolic [--config params.json]
cde runs submit --campaign <id> --dataset <id> --mode causal [--config params.json]
cde runs status <run-id>
cde runs wait <run-id> [--timeout 3600]
cde runs cancel <run-id>
cde runs list --campaign <id> [--status running|completed|failed]
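`cde runs wait` blocks until completion, but a scheduler that needs its own control loop can poll `cde runs status --format json` instead. A minimal polling sketch; the `get_status` callback stands in for the status command, and the terminal state names are taken from the run lifecycle shown in this section:

```python
import time

def wait_for_run(get_status, timeout=3600, poll=15):
    """Poll until the run reaches a terminal state, or raise TimeoutError.
    `get_status` stands in for `cde runs status <run-id> --format json`."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("completed", "failed", "cancelled"):
            return status
        time.sleep(poll)
    raise TimeoutError("run did not finish before the timeout")

# Usage sketch with a canned status sequence instead of a live API:
states = iter(["running", "running", "completed"])
result = wait_for_run(lambda: next(states), timeout=5, poll=0)
```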
$ cde runs submit --campaign camp_xyz --dataset ds_29xK4m \
--mode symbolic --max-complexity 8
✓ Run submitted: run_7fGh2p (symbolic)
$ cde runs status run_7fGh2p
Run: run_7fGh2p
Mode: symbolic
Status: running
STAGE STATUS PROGRESS DURATION
preprocessing ✓ completed 100% 4s
analysis ✓ completed 100% 12s
modeling ◉ running 62% 2m 18s
validation ○ pending — —
extraction ○ pending — —
ETA: ~2m 25s remaining
$ cde runs wait run_7fGh2p
Waiting for run_7fGh2p... ████████████████████ 100%
✓ Completed in 4m 53s. 12 claims produced.

List, inspect, promote, and export claims. Filter by type (equation, causal_graph, conservation_law, invariant), tier, and producing run. Promotions require a rationale recorded in the Evidence Ledger.
cde claims list --project <id> [--type equation] [--tier explore] [--run <id>]
cde claims inspect <claim-id>
cde claims promote <claim-id> --tier validate --rationale "..."
cde claims export --project <id> [--format json|csv] [--tier validate]
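The JSON output can also be ranked locally before deciding what to promote, for example picking the highest-fitness claim under a complexity cap. A sketch with illustrative records; the `fitness` and `complexity` field names mirror the table columns but are assumptions about the exact JSON keys:

```python
def best_claim(claims, max_complexity=8):
    """Highest-fitness claim within the complexity budget, or None."""
    eligible = [c for c in claims if c["complexity"] <= max_complexity]
    return max(eligible, key=lambda c: c["fitness"], default=None)

# Simulated `cde claims list --format json` records (illustrative values):
claims = [
    {"id": "clm_Ax92", "fitness": 0.9987, "complexity": 5},
    {"id": "clm_Bx41", "fitness": 0.9812, "complexity": 3},
    {"id": "clm_Cx73", "fitness": 0.9801, "complexity": 7},
]
top = best_claim(claims)
```

The selected claim ID can then be fed to `cde claims promote` together with a rationale.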
$ cde claims list --project proj_a1b2 --type equation --tier explore
ID TYPE EXPRESSION FITNESS COMPLEXITY TIER
clm_Ax92 equation d²θ/dt² = -(g/L)·sin(θ) 0.9987 5 explore
clm_Bx41 equation θ̈ = -9.81·θ 0.9812 3 explore
clm_Cx73 equation θ̈ = -g·sin(θ)/L + β·θ̇ 0.9801 7 explore
$ cde claims promote clm_Ax92 --tier validate \
--rationale "Exact pendulum equation recovered, physically interpretable"
✓ clm_Ax92 promoted: explore → validate
Ledger entry: le_881f

Query the Evidence Ledger for governance events, verify entry integrity with content hashes, and export audit trails. Every promotion, negative control, and policy event is recorded immutably.
cde ledger query --project <id> [--event-type promotion] [--since 2026-01-01]
cde ledger verify <entry-id>
cde ledger export --project <id> --format json > audit.json
$ cde ledger query --project proj_a1b2 --event-type promotion
ENTRY    EVENT      RESOURCE  ACTOR             TIMESTAMP             HASH
le_881f  promotion  clm_Ax92  faruk@vareon.com  2026-03-28T10:12:00Z  sha256:9f86d0...
le_772e  promotion  clm_Dx55  agent:mcp-04a     2026-03-27T14:30:00Z  sha256:a3f1c2...
le_663d  promotion  clm_Ex88  faruk@vareon.com  2026-03-26T09:45:00Z  sha256:b7e2d4...

$ cde ledger verify le_881f
✓ Entry le_881f integrity verified (sha256:9f86d0...)
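`cde ledger verify` checks an entry against its recorded content hash. The same idea can be sketched offline, assuming one canonicalization scheme: sha256 over the entry serialized as canonical JSON (sorted keys, compact separators). That scheme is an assumption for illustration — the platform's actual canonicalization may differ:

```python
import hashlib
import json

def entry_hash(entry: dict) -> str:
    """sha256 over canonical JSON: sorted keys, compact separators.
    The canonicalization scheme here is an assumption, not the platform's spec."""
    canonical = json.dumps(entry, sort_keys=True, separators=(",", ":"))
    return "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

entry = {
    "event": "promotion",
    "resource": "clm_Ax92",
    "actor": "faruk@vareon.com",
    "timestamp": "2026-03-28T10:12:00Z",
}
digest = entry_hash(entry)
# Key order must not affect the digest:
assert digest == entry_hash(dict(reversed(list(entry.items()))))
```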
List, download, and stream artifacts produced by discovery runs — model checkpoints, optimization summaries, export bundles, and intermediate results.
cde artifacts list --run <run-id> [--type checkpoint|visualization|export]
cde artifacts download <artifact-id> [--output ./local-path]
cde artifacts stream <artifact-id>
$ cde artifacts list --run run_7fGh2p
ID        TYPE           NAME                  SIZE     HASH
art_kL3m  visualization  optimization_summary  180 KB   sha256:3c7a2b...
art_mN4p  checkpoint     model_checkpoint      12.4 MB  sha256:8d9e1f...
art_oP5q  export         results_bundle.zip    2.1 MB   sha256:f4a5b6...

$ cde artifacts download art_kL3m --output ./figures/
✓ Downloaded optimization_summary → ./figures/optimization_summary
Every command supports the --format flag. Field names in structured formats are stable across CLI versions; when a field is renamed, the old name remains available for at least one major version.
table: Default. Human-readable, column-aligned. Streams rows as they arrive.
json: Single JSON object containing the full result. Ideal for jq processing.
csv: RFC 4180 CSV with header row. Import directly into spreadsheets or pandas.
ndjson: One JSON object per line. Streamable — downstream tools start processing before results finish.
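The NDJSON format is what makes incremental consumption possible: each line is a complete JSON document, so a consumer can process records as they arrive rather than buffering the whole response. A minimal reader sketch (the sample records are illustrative, not real CLI output):

```python
import io
import json

def iter_ndjson(stream):
    """Yield one parsed object per non-empty line of an NDJSON stream."""
    for line in stream:
        if line.strip():
            yield json.loads(line)

# Simulated `cde claims list --format ndjson` output:
sample = io.StringIO(
    '{"id": "clm_Ax92", "tier": "explore"}\n'
    '{"id": "clm_Bx41", "tier": "explore"}\n'
)
ids = [record["id"] for record in iter_ndjson(sample)]
```

In a pipeline, `stream` would be the stdout of the CLI process rather than an in-memory buffer.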

The CLI is designed for embedding in automated pipelines. Below are production-ready patterns for common environments.
#!/bin/bash
set -euo pipefail

PROJECT_ID="proj_a1b2"
CAMPAIGN_ID="camp_xyz"

# Upload fresh dataset
DS_ID=$(cde datasets upload \
  --project "$PROJECT_ID" \
  --file ./data/experiment_batch_42.csv \
  --name "Batch 42" \
  --format json | jq -r '.dataset_id')

# Profile and verify quality
QUALITY=$(cde datasets profile "$DS_ID" --format json | jq -r '.quality.score')
if (( $(echo "$QUALITY < 0.8" | bc -l) )); then
  echo "Quality score $QUALITY below threshold" >&2
  exit 1
fi

# Submit symbolic discovery
RUN_ID=$(cde runs submit \
  --campaign "$CAMPAIGN_ID" \
  --dataset "$DS_ID" \
  --mode symbolic \
  --max-complexity 8 \
  --format json | jq -r '.run_id')

# Wait for completion (non-zero exit if run fails)
cde runs wait "$RUN_ID" --timeout 3600

# Export claims
cde claims list --project "$PROJECT_ID" --run "$RUN_ID" \
  --format json > "claims-$RUN_ID.json"

# Export audit trail
cde ledger export --project "$PROJECT_ID" \
  --format json > "ledger-$PROJECT_ID.json"

echo "Pipeline complete. Claims: claims-$RUN_ID.json"
#!/bin/bash
#SBATCH --job-name=cde-neural
#SBATCH --partition=gpu
#SBATCH --gres=gpu:1
#SBATCH --time=04:00:00
#SBATCH --output=cde-%j.log

module load python/3.11
export CDE_API_KEY="$CDE_API_KEY"

RUN_ID=$(cde runs submit \
  --campaign camp_hpc_01 \
  --dataset ds_large_sim \
  --mode neural \
  --config params.json \
  --format json | jq -r '.run_id')

cde runs wait "$RUN_ID" --timeout 14400

cde claims list --run "$RUN_ID" --format csv > "$SLURM_SUBMIT_DIR/claims.csv"

cde artifacts download "$(cde artifacts list --run "$RUN_ID" \
  --type checkpoint --format json | jq -r '.artifacts[0].artifact_id')" \
  --output "$SLURM_SUBMIT_DIR/model_checkpoint"
PROJECT  := proj_a1b2
CAMPAIGN := camp_xyz
DATASET  := ds_29xK4m
MODES    := symbolic neural neuro_symbolic causal

.PHONY: profile discover-all claims audit

profile:
	cde datasets profile $(DATASET) --format json > profile.json

discover-all: profile
	@for mode in $(MODES); do \
		echo "Submitting $$mode..."; \
		cde runs submit --campaign $(CAMPAIGN) \
			--dataset $(DATASET) --mode $$mode \
			--format json | jq -r '.run_id' >> run_ids.txt; \
	done
	@while read -r rid; do \
		cde runs wait "$$rid"; \
	done < run_ids.txt

claims: discover-all
	cde claims list --project $(PROJECT) \
		--format csv > claims.csv

audit: claims
	cde ledger export --project $(PROJECT) \
		--format json > audit.json
from airflow.decorators import dag
from airflow.operators.bash import BashOperator
from datetime import datetime


@dag(schedule="@weekly", start_date=datetime(2026, 1, 1),
     catchup=False, tags=["cde"])
def cde_weekly_discovery():
    upload = BashOperator(
        task_id="upload_dataset",
        bash_command="""
            cde datasets upload --project {{ var.value.cde_project }} \
              --file /data/weekly/{{ ds }}.csv \
              --name "Weekly {{ ds }}" \
              --format json | jq -r '.dataset_id'
        """,
        env={"CDE_API_KEY": "{{ var.value.cde_api_key }}"},
    )
    submit = BashOperator(
        task_id="submit_run",
        bash_command="""
            cde runs submit \
              --campaign {{ var.value.cde_campaign }} \
              --dataset {{ ti.xcom_pull(task_ids='upload_dataset') }} \
              --mode symbolic --max-complexity 10 \
              --format json | jq -r '.run_id'
        """,
        env={"CDE_API_KEY": "{{ var.value.cde_api_key }}"},
    )
    wait = BashOperator(
        task_id="wait_for_run",
        bash_command="""
            cde runs wait {{ ti.xcom_pull(task_ids='submit_run') }} \
              --timeout 7200
        """,
        env={"CDE_API_KEY": "{{ var.value.cde_api_key }}"},
    )
    upload >> submit >> wait


cde_weekly_discovery()

Configuration is read from three sources in order of precedence: command-line flags → environment variables → config file. Set defaults in the file, override in CI with env vars, and further override with flags for one-off commands.
Default location: ~/.cde/config.yaml. Override with --config flag.
base_url: https://api.cde.vareon.com
default_project: proj_a1b2
default_format: table
timeout: 300
verbose: false
All config values map to CDE_-prefixed env vars.
CDE_BASE_URL         API base URL
CDE_API_KEY          Authentication key
CDE_DEFAULT_PROJECT  Default project ID
CDE_DEFAULT_FORMAT   Output format
CDE_TIMEOUT          Request timeout (seconds)
CDE_VERBOSE          Enable verbose output
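The precedence rule (flag over env var over config file) amounts to a first-non-None lookup. A sketch of that resolution order in Python; the function name and `file_config` dict are illustrative, not part of the SDK:

```python
import os

def resolve(name, flag_value=None, env_prefix="CDE_", file_config=None):
    """Return the first defined value among: CLI flag, env var, config file."""
    if flag_value is not None:
        return flag_value
    env_value = os.environ.get(env_prefix + name.upper())
    if env_value is not None:
        return env_value
    return (file_config or {}).get(name)

# Defaults from ~/.cde/config.yaml, parsed into a dict:
file_config = {"default_format": "table", "timeout": 300}

os.environ["CDE_DEFAULT_FORMAT"] = "json"  # CI-style override
fmt = resolve("default_format", file_config=file_config)      # env beats file
fmt_flag = resolve("default_format", flag_value="csv",
                   file_config=file_config)                   # flag beats both
```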
Exit codes are stable and documented. Scripts can branch on exit codes without parsing error messages.
| Code | Meaning | Description |
|---|---|---|
| 0 | Success | Command completed successfully. |
| 1 | General error | Unhandled error — check stderr for details. |
| 2 | Auth failure | Missing, expired, or invalid credentials. |
| 3 | Policy violation | Governance policy blocked the operation (tier ceiling, budget, allowlist). |
| 4 | Not found | Referenced resource (project, dataset, run, claim) does not exist. |
| 5 | Timeout | Operation exceeded the configured timeout (see --timeout flag). |
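A caller can branch on these codes directly from the process return code. A Python sketch using `subprocess`; the `python -c` invocation stands in for a real `cde` call so the example runs anywhere, and the code-to-meaning table mirrors the one above:

```python
import subprocess
import sys

EXIT_MEANINGS = {
    0: "success",
    1: "general error",
    2: "auth failure",
    3: "policy violation",
    4: "not found",
    5: "timeout",
}

def classify(returncode: int) -> str:
    """Map a documented cde exit code to its meaning."""
    return EXIT_MEANINGS.get(returncode, "unknown")

# Stand-in for e.g. `cde claims promote ...` exiting with code 3:
proc = subprocess.run([sys.executable, "-c", "raise SystemExit(3)"])
meaning = classify(proc.returncode)
```

In a real script, a "policy violation" result might trigger retrying the promotion at a lower tier instead of failing the whole job.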
Tab completion for commands, flags, and resource IDs. Completions are generated from the CLI binary — no external files to keep in sync.
# Add to ~/.bashrc
eval "$(cde completion bash)"

# Add to ~/.zshrc
eval "$(cde completion zsh)"

# Save to completions dir
cde completion fish > \
  ~/.config/fish/completions/cde.fish
Common issues and their fixes. Run any command with --verbose for detailed request/response logging.
No valid credentials found. Run cde auth login or set CDE_API_KEY. Check cde auth status to verify token validity.
Your session's autonomy policy does not allow promotion to the requested tier. Contact a project admin to adjust the policy, or promote to a lower tier.
Cannot reach the API. Verify CDE_BASE_URL or the base_url in ~/.cde/config.yaml. Run cde version --verbose to see the configured URL.
The CLI accepts CSV, Parquet, and HDF5. Ensure the file extension matches the actual format. For headerless CSVs, add --no-header to the upload command.
The cde runs wait command exceeded its timeout. Increase with --timeout flag or set CDE_TIMEOUT. The run continues server-side — check with cde runs status <run-id>.
Shell completions fetch resource IDs from the API. If the API is unreachable or slow, completions time out. Set CDE_COMPLETION_TIMEOUT=2 to reduce wait time or disable dynamic completions with CDE_COMPLETION_STATIC=1.