# LM Studio Subagents

## Overview
Offload LLM tasks to local models running in LM Studio to save API costs and maintain privacy. LM Studio provides an OpenAI-compatible API for local models, making it a drop-in replacement for cloud LLM calls. Use local models for high-volume, lower-complexity tasks like summarization, extraction, classification, and reformatting while reserving cloud APIs for complex reasoning.
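As a sketch of the drop-in replacement idea, the snippet below builds a standard OpenAI-style `/chat/completions` payload aimed at LM Studio's default local endpoint (`http://localhost:1234/v1`). The model name, prompt text, and `build_chat_request` helper are illustrative placeholders for a hypothetical summarization subtask, not part of LM Studio's API itself.

```python
import json

# LM Studio serves an OpenAI-compatible API on localhost:1234 by default.
LMSTUDIO_BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, system: str, user: str,
                       temperature: float = 0.2) -> dict:
    """Build an OpenAI-style /chat/completions payload for a local model."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
        "temperature": temperature,
    }

# Example: a low-complexity summarization task routed to a local model.
payload = build_chat_request(
    model="lmstudio-community/Llama-3.1-8B-Instruct-GGUF",
    system="Summarize the user's text in one sentence.",
    user="LM Studio exposes local models over an OpenAI-compatible API.",
)
print(json.dumps(payload, indent=2))

# To actually send it (requires a running LM Studio server):
#   import urllib.request
#   req = urllib.request.Request(
#       f"{LMSTUDIO_BASE_URL}/chat/completions",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

Because the request shape matches the cloud API, the same payload can be redirected to a cloud endpoint for complex-reasoning tasks by changing only the base URL and model name.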
## Instructions
When a user wants to use local models via LM Studio, determine the task:
### Task A: Set up LM Studio as a local API server
- Download and install LM Studio from https://lmstudio.ai/
- Download a model through the LM Studio UI (recommended starting models):
  - `lmstudio-community/Llama-3.1-8B-Instruct-GGUF` (general purpose)
  - `lmstudio-community/Mistral-7B-Instruct-v0.3-GGUF` (fast inference)
  - `lmstudio-community/Qwen2.5-7B-Instruct-GGUF` (multilingual)
- Start the local server:
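A minimal sketch of the server-start step, assuming LM Studio's `lms` command-line tool is installed (it ships with recent LM Studio versions); the server can equally be started from the Developer tab in the LM Studio UI.

```shell
# Start the OpenAI-compatible server (defaults to http://localhost:1234)
lms server start

# Verify the server is up by listing available models
curl http://localhost:1234/v1/models
```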
## Related skills