Add-on: LangChain LLM

Use this skill when an existing project needs LangChain primitives for chat, retrieval, or summarization.

Compatibility

  • Works with architect-python-uv-fastapi-sqlalchemy, architect-python-uv-batch, and architect-nextjs-bun-app.
  • Can be combined with addon-rag-ingestion-pipeline.
  • Can be combined with addon-langgraph-agent when graph orchestration is required.
  • Can be combined with addon-llm-judge-evals; when used together, declare langchain in config/skill_manifest.json so the judge runner can resolve the backend without guessing.
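When pairing with addon-llm-judge-evals, the manifest declaration might look like the sketch below. The exact schema of config/skill_manifest.json is not defined in this skill, so the key names here are illustrative assumptions, not a fixed contract:

```json
{
  "skills": ["addon-langchain-llm", "addon-llm-judge-evals"],
  "llm_backend": "langchain"
}
```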

Inputs

Collect:

  • LLM_PROVIDER: openai | anthropic | ollama.
  • DEFAULT_MODEL: provider model id.
  • ENABLE_STREAMING: yes | no (default yes).
  • USE_RAG: yes | no.
  • MAX_INPUT_TOKENS: integer (default 8000).
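Collected inputs are easiest to handle as one validated settings object before any LangChain wiring happens. A minimal sketch in plain Python, assuming no particular config framework; `LLMSettings` and its field names simply mirror the inputs above and are not part of LangChain's API:

```python
from dataclasses import dataclass

VALID_PROVIDERS = {"openai", "anthropic", "ollama"}


@dataclass(frozen=True)
class LLMSettings:
    """Hypothetical holder for the skill inputs; field names mirror
    the inputs above, defaults match the defaults stated there."""
    llm_provider: str                # LLM_PROVIDER
    default_model: str               # DEFAULT_MODEL (provider model id)
    enable_streaming: bool = True    # ENABLE_STREAMING, default yes
    use_rag: bool = False            # USE_RAG
    max_input_tokens: int = 8000     # MAX_INPUT_TOKENS, default 8000

    def __post_init__(self) -> None:
        # Fail fast on a provider this add-on does not support.
        if self.llm_provider not in VALID_PROVIDERS:
            raise ValueError(f"unsupported LLM_PROVIDER: {self.llm_provider!r}")


# Example usage; the model id is a placeholder, not a recommendation.
settings = LLMSettings(llm_provider="anthropic", default_model="claude-sonnet-4")
```

Validating once at startup keeps provider typos out of the LangChain layer, where they would otherwise surface as opaque runtime errors.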

First seen: Mar 2, 2026