# LM Studio Models
Offload tasks to local models when quality suffices. Base URL: `http://127.0.0.1:1234`. Auth header: `Authorization: Bearer lmstudio`. `instance_id` = `loaded_instances[].id` (the same model can have multiple instances, e.g. `key` and `key:2`).
## Key Terms
- `model`: the key from `GET /api/v1/models`; use it in chat and in the optional load call.
- `lm_studio_api_url`: defaults to `http://127.0.0.1:1234` (paths under `/api/v1/...`).
- `response_id` / `previous_response_id`: chat returns a `response_id`; pass it back as `previous_response_id` for stateful conversations.
- `instance_id`: for unload, use only the `loaded_instances[].id` values that `GET /api/v1/models` reports for that model. Do not assume it equals the model key; with multiple instances, ids can look like `key:2`. LM Studio docs: List (`loaded_instances[].id`), Unload (`instance_id`).
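The `instance_id` lookup above can be sketched as a small parser over the models list. This is a sketch under assumptions: only `loaded_instances[].id` and the model key are stated in the source; the `"data"` envelope and `"key"` field names are illustrative guesses about the JSON shape:

```python
def loaded_instance_ids(models_response: dict, model_key: str) -> list[str]:
    """Return the loaded_instances[].id values for one model key.

    `models_response` is the parsed body of GET /api/v1/models; the
    envelope (a "data" list of entries with "key" and "loaded_instances")
    is assumed for illustration.
    """
    for entry in models_response.get("data", []):
        if entry.get("key") == model_key:
            return [inst["id"] for inst in entry.get("loaded_instances", [])]
    return []


# With two instances loaded, the ids are not all equal to the model key:
sample = {
    "data": [
        {
            "key": "qwen2.5-7b",
            "loaded_instances": [{"id": "qwen2.5-7b"}, {"id": "qwen2.5-7b:2"}],
        }
    ]
}
print(loaded_instance_ids(sample, "qwen2.5-7b"))
# → ['qwen2.5-7b', 'qwen2.5-7b:2']
```

Unloading then targets each returned id, never the bare model key.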
The skill's trigger lives in the frontmatter; everything below is the implementation.
## Prerequisites
LM Studio 0.4+ with the server listening on `:1234` and the models already on disk; load/unload happens via the API (JIT loading optional); Node is needed for the script (`curl` also works).