LLM Council

Query multiple AI models in parallel with a live web dashboard.
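The parallel fan-out this skill describes can be sketched in a few lines. This is illustrative only: `query_model` and the model names are placeholders, not the skill's actual API.

```python
from concurrent.futures import ThreadPoolExecutor

def query_model(model: str, prompt: str) -> str:
    # Placeholder for a real gateway call; here it just echoes the input.
    return f"{model}: echo {prompt}"

def query_council(models: list[str], prompt: str) -> dict[str, str]:
    # Send the same prompt to every model concurrently and collect replies.
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        futures = {m: pool.submit(query_model, m, prompt) for m in models}
        return {m: f.result() for m, f in futures.items()}
```

A thread pool is a reasonable fit here because the work is I/O-bound: each model call spends its time waiting on the network, so threads overlap cleanly.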

Launch

cd ~/.claude/skills/llm-council/scripts
# Kill any existing server on the port
fuser -k 8787/tcp 2>/dev/null
# Start server
nohup python3 server.py > /tmp/council-server.log 2>&1 &
# Wait for startup, verify health
sleep 2 && curl -s --max-time 5 http://localhost:8787/health
# Export port for browser access
/app/export-port.sh 8787
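If the health check fails, the log file started above is the first place to look. A troubleshooting sketch building on the launch commands (the port and log path come from the commands above; the printed messages are illustrative):

```shell
PORT="${COUNCIL_PORT:-8787}"
LOG="/tmp/council-server.log"

# Probe the health endpoint; on failure, surface the last log lines.
if curl -s --max-time 5 "http://localhost:${PORT}/health" >/dev/null; then
  echo "council server healthy on port ${PORT}"
else
  echo "council server not responding on port ${PORT}"
  if [ -f "$LOG" ]; then
    tail -n 20 "$LOG"
  fi
fi
```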

Environment variables: COUNCIL_PORT (server port, default 8787) and AI_GATEWAY_API_KEY (required; detected automatically from the environment).
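A minimal sketch of how server.py might resolve this configuration, assuming only the two documented variables; the function name and error message are hypothetical.

```python
import os

def load_config() -> dict:
    # COUNCIL_PORT is optional and defaults to 8787, per the docs above.
    port = int(os.environ.get("COUNCIL_PORT", "8787"))
    # AI_GATEWAY_API_KEY is required; fail fast with a clear error if absent.
    api_key = os.environ.get("AI_GATEWAY_API_KEY")
    if not api_key:
        raise RuntimeError("AI_GATEWAY_API_KEY is required")  # hypothetical message
    return {"port": port, "api_key": api_key}
```

Failing fast at startup keeps a missing key from surfacing later as an opaque gateway error mid-request.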
