modelslab-chat-generation

ModelsLab Chat/LLM Generation

Send chat completions to more than 60 LLMs through a single OpenAI-compatible endpoint.

When to Use This Skill

  • Chat with AI models (DeepSeek, Llama, Gemini, Qwen, Mistral)
  • Build conversational AI applications
  • Generate text completions with system prompts
  • Use function/tool calling with LLMs
  • Stream responses for real-time output
  • Get structured JSON responses

API Endpoint

Chat Completions: POST https://modelslab.com/api/v7/llm/chat/completions

The endpoint follows the OpenAI chat completions format, making it easy to switch from OpenAI or use the OpenAI SDK.
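A minimal sketch of building a request against this endpoint, using only the Python standard library. The model identifier and the `Bearer` authorization scheme are assumptions for illustration; check the ModelsLab documentation for actual model names and auth details.

```python
# Sketch: build an OpenAI-style chat completion request for the
# ModelsLab endpoint. The request is constructed but not sent here.
import json
import urllib.request

ENDPOINT = "https://modelslab.com/api/v7/llm/chat/completions"

def build_request(api_key: str, model: str, messages: list[dict]) -> urllib.request.Request:
    """Assemble a POST request in the OpenAI chat completions format."""
    payload = {
        "model": model,
        "messages": messages,
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Bearer auth is an assumption; verify against ModelsLab docs.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_request(
    api_key="YOUR_API_KEY",
    model="deepseek-ai/DeepSeek-V3",  # illustrative model id, not verified
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
# To send: urllib.request.urlopen(req)  — requires a valid API key.
```

Because the format is OpenAI-compatible, the official OpenAI SDK should also work by pointing its `base_url` at the ModelsLab API root (presumably `https://modelslab.com/api/v7/llm`, since the SDK appends `/chat/completions` itself); treat that base URL as an assumption to verify.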

Installs: 20 · GitHub Stars: 8 · First Seen: Feb 21, 2026