⚡ Do Command

Execute commands on your infrastructure through an LLM, using natural language.

Overview

The cicerone do command enables natural language interaction with your infrastructure. It parses your query, routes it to the appropriate admin host, and returns actionable responses.

Query Flow:
==============

User Query: "darth-ai: what's the disk usage?"
     │
     ▼
[1] Parse admin name ────→ "darth-ai"
     │
     ▼
[2] Load admin config ───→ host, llmUrl, library
     │
     ▼
[3] Check permissions ───→ User in allowed groups?
     │
     ▼
[4] Search library ──────→ Find relevant documents (if connected)
     │
     ▼
[5] Build context ───────→ System info + library content
     │
     ▼
[6] Send to LLM ─────────→ Ollama / OpenAI / Custom
     │
     ▼
Response: "The disk usage is 45% (90 GB free of 200 GB)..."
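Steps [2] and [3] above can be sketched as follows. This is illustrative only: the `ADMINS` table, its field names (`host`, `llmUrl`, `groups`), and the group-check rule are assumptions, not cicerone's actual config format or authorization logic.

```python
# Hypothetical admin registry; an empty "groups" set means no restriction.
ADMINS = {
    "darth-ai": {"host": "darth.example", "llmUrl": "http://darth.example:11434",
                 "groups": {"devops", "admins"}},
    "local":    {"host": "localhost", "llmUrl": "http://localhost:11434",
                 "groups": set()},
}

def load_admin(name, user_groups):
    """Load the named admin's config ([2]) and enforce group access ([3])."""
    cfg = ADMINS.get(name)
    if cfg is None:
        raise KeyError(f"unknown admin: {name}")
    # If the admin restricts access, the user must share at least one group.
    if cfg["groups"] and not (cfg["groups"] & user_groups):
        raise PermissionError(f"user not in allowed groups for {name}")
    return cfg
```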

Usage Examples

Basic Query

Query the default admin.

$ cicerone do "check system status"

Querying default admin (darth-ai)...

System Status for darth-ai:
• CPU: 15% usage
• Memory: 3.2 GB / 16 GB (20%)
• Disk: 45 GB / 200 GB (22%)
• Uptime: 21 days

Query Specific Admin

Target a specific admin host.

$ cicerone do "local: what processes are using the most memory?"

Querying local...

Top memory consumers:
1. java (PID 1234) - 2.1 GB
2. postgres (PID 5678) - 1.8 GB
3. nginx (PID 9012) - 512 MB

With Library Context

When an admin has a library connected, responses include context from your documents.

$ cicerone do "prod-ai: how do I restart the API?"

Based on your runbooks:

1. SSH to production:
   $ ssh deploy@prod-server

2. Navigate to API directory:
   $ cd /opt/api

3. Restart service:
   $ systemctl restart api-service

Reference: runbooks/api-restart.md
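The library lookup in step [4] can be sketched as a simple keyword-overlap search. cicerone's real retrieval may work differently; the `LIBRARY` contents and scoring rule below are illustrative assumptions.

```python
import re

# Hypothetical library: path -> document text.
LIBRARY = {
    "runbooks/api-restart.md": "ssh to production, cd /opt/api, systemctl restart api-service",
    "runbooks/db-backup.md": "pg_dump nightly backup of the postgres database",
}

def tokens(s):
    """Lowercase alphanumeric tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9]+", s.lower()))

def search_library(query, top_k=1):
    """Rank documents by shared-keyword count; drop zero-score matches."""
    q = tokens(query)
    scored = sorted(((len(q & tokens(text)), path)
                     for path, text in LIBRARY.items()), reverse=True)
    return [path for score, path in scored[:top_k] if score > 0]
```

With the sample library, "how do I restart the API?" matches `runbooks/api-restart.md` (shared tokens: "restart", "api"), and the matched document's content is then added to the LLM context in step [5].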

Multi-Admin Query

Query all available admins for comparison.

$ cicerone do "compare memory usage across all servers"

Querying: darth-ai, local, prod-ai

Results:
• darth-ai: 32 GB total, 12 GB used (37%)
• local: 16 GB total, 8 GB used (50%)
• prod-ai: 64 GB total, 48 GB used (75%) ⚠️ HIGH

Recommendation: prod-ai is using 75% memory.
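The aggregation behind a comparison like this can be sketched as below: compute each host's used-memory percentage (truncated, matching the sample output) and flag anything above a threshold. The 70% threshold and the stats-gathering shape are assumptions for illustration.

```python
def memory_report(stats, threshold=70):
    """Format per-host memory lines from {host: (total_gb, used_gb)}."""
    lines = []
    for host, (total_gb, used_gb) in stats.items():
        pct = int(100 * used_gb / total_gb)     # truncate, e.g. 37.5% -> 37%
        flag = " ⚠️ HIGH" if pct >= threshold else ""
        lines.append(f"{host}: {total_gb} GB total, {used_gb} GB used ({pct}%){flag}")
    return lines

report = memory_report({"darth-ai": (32, 12), "local": (16, 8), "prod-ai": (64, 48)})
print("\n".join(report))
```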

Query Syntax

Syntax                    Description              Example
<query>                   Query default admin      cicerone do "check disk"
<admin>: <query>          Query specific admin     cicerone do "local: check disk"
ask <admin> to <query>    Natural language format  cicerone do "ask local to check disk"
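The three forms above could be parsed as sketched below. This is a hypothetical parser, not cicerone's actual grammar; in particular, how it handles colons inside the query text is an assumption.

```python
import re

def parse(raw, default_admin="darth-ai"):
    """Return (admin, query) for the three documented syntax forms."""
    # "ask <admin> to <query>"
    m = re.match(r"ask\s+(\S+)\s+to\s+(.+)", raw, re.IGNORECASE)
    if m:
        return m.group(1), m.group(2)
    # "<admin>: <query>"
    admin, sep, rest = raw.partition(":")
    if sep:
        return admin.strip(), rest.strip()
    # bare "<query>" falls through to the default admin
    return default_admin, raw.strip()
```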

Setting Up Admins

Before using cicerone do, you need to configure admin hosts:

Quick Setup

# Create local admin (Ollama running locally)
$ cicerone admin new local --llm-url http://localhost:11434

# Create remote admin
$ cicerone admin new prod-ai --host 192.168.1.100 --groups devops

# View all admins
$ cicerone admin show

# Test query
$ cicerone do "local: what's the system status?"

See Admin Commands for more details on configuring admins.