addon-direct-llm-sdk

Add-on: Direct LLM SDK

Use this skill when a project needs explicit provider SDK control for chat, completions, embeddings, or tool calls without an additional orchestration framework.

Compatibility

  • Works with architect-python-uv-fastapi-sqlalchemy, architect-python-uv-batch, architect-nextjs-bun-app, and architect-next-prisma-bun-vector.
  • Prefer this over addon-langchain-llm when the framework's abstraction overhead is unwanted.
  • If paired with addon-llm-judge-evals, do not assume automatic backend resolution; the current judge contract must be extended before the direct SDK becomes a supported judge backend.

Inputs

Collect:

  • SDK_PROVIDER: openai | anthropic | google | openrouter.
  • DEFAULT_MODEL: provider model id.
  • ENABLE_STREAMING: yes | no (default yes).
  • REQUEST_TIMEOUT_SECONDS: default 60.
  • MAX_RETRIES: default 2.
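
The inputs above can be captured in a small config object. The sketch below is hypothetical (the class name, field names, and `call_with_retries` helper are not part of this skill); it only mirrors the listed defaults and the `MAX_RETRIES` semantics, using the standard library:

```python
# Hypothetical model of the collected inputs; defaults match the list above.
from dataclasses import dataclass

_PROVIDERS = {"openai", "anthropic", "google", "openrouter"}  # SDK_PROVIDER values


@dataclass(frozen=True)
class DirectSDKConfig:
    sdk_provider: str                       # SDK_PROVIDER
    default_model: str                      # DEFAULT_MODEL (provider model id)
    enable_streaming: bool = True           # ENABLE_STREAMING (default yes)
    request_timeout_seconds: float = 60.0   # REQUEST_TIMEOUT_SECONDS
    max_retries: int = 2                    # MAX_RETRIES

    def __post_init__(self) -> None:
        if self.sdk_provider not in _PROVIDERS:
            raise ValueError(f"unknown provider: {self.sdk_provider!r}")


def call_with_retries(fn, max_retries: int):
    """Invoke fn(); retry up to max_retries additional times on failure."""
    last_exc = None
    for _ in range(max_retries + 1):
        try:
            return fn()
        except Exception as exc:  # real code would narrow to transient errors
            last_exc = exc
    raise last_exc
```

The validated config would then be threaded into whichever provider SDK client the project instantiates, with `request_timeout_seconds` and `max_retries` passed to (or wrapped around) each call.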