ai-orchestration-llamaindex


LlamaIndex.TS Patterns

Quick Guide: LlamaIndex.TS is a data framework for building context-aware LLM applications in TypeScript. Use the Settings singleton to configure the LLM and embedding model globally. Load documents with SimpleDirectoryReader, chunk them with SentenceSplitter, index them with VectorStoreIndex.fromDocuments(), and query with index.asQueryEngine(). For agents, use agent() from @llamaindex/workflow with tool() definitions backed by Zod schemas. All core operations are async -- every one returns a Promise. The llamaindex package re-exports most core APIs, but LLM providers ship as separate packages such as @llamaindex/openai or @llamaindex/ollama.
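The load-index-query flow above can be sketched as follows. The model names and the ./data path are illustrative, and exact import paths and option names may differ between LlamaIndex.TS versions -- treat this as a sketch, not a definitive implementation:

```typescript
import { Settings, SimpleDirectoryReader, VectorStoreIndex } from "llamaindex";
import { OpenAI, OpenAIEmbedding } from "@llamaindex/openai";

// Configure models globally before any indexing or querying.
Settings.llm = new OpenAI({ model: "gpt-4o-mini" });
Settings.embedModel = new OpenAIEmbedding({ model: "text-embedding-3-small" });

// Load -> index -> query; every step returns a Promise.
const documents = await new SimpleDirectoryReader().loadData({
  directoryPath: "./data", // illustrative path
});
const index = await VectorStoreIndex.fromDocuments(documents);
const queryEngine = index.asQueryEngine();
const response = await queryEngine.query({ query: "Summarize the documents." });
console.log(response.toString());
```

Note that the whole pipeline is awaited end to end; forgetting an await here typically surfaces as a confusing "Promise has no such method" type error rather than a runtime failure at the call site.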


<critical_requirements>

CRITICAL: Before Using This Skill

All code must follow project conventions in CLAUDE.md (kebab-case, named exports, import ordering, import type, named constants)

You MUST configure Settings.llm and Settings.embedModel before any indexing or querying -- the Settings singleton is lazily initialized and defaults to OpenAI, which will fail without an API key.
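A minimal configuration sketch, here using a local Ollama server instead of the OpenAI default. The model names are illustrative and the @llamaindex/ollama export names are assumed from its documented surface:

```typescript
import { Settings } from "llamaindex";
import { Ollama, OllamaEmbedding } from "@llamaindex/ollama";

// Run this before fromDocuments() / asQueryEngine(); otherwise the lazy
// default (OpenAI) is initialized and throws without OPENAI_API_KEY.
Settings.llm = new Ollama({ model: "llama3.1" });
Settings.embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });
```

Because Settings is a process-wide singleton, set it once at application startup rather than per request.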

You MUST await all LlamaIndex operations -- fromDocuments(), asQueryEngine(), query(), chat(), and loadData() are ALL async.
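The same rule applies to the agent API mentioned in the quick guide: agent runs are awaited too. A hedged sketch of agent() with a Zod-typed tool() -- the agent() options shape and the run() result shape are assumptions and may differ by version:

```typescript
import { tool } from "llamaindex";
import { agent } from "@llamaindex/workflow";
import { z } from "zod";

// A tool whose parameters are validated by a Zod schema.
const addTool = tool({
  name: "add",
  description: "Add two numbers and return the sum.",
  parameters: z.object({ a: z.number(), b: z.number() }),
  execute: ({ a, b }) => String(a + b),
});

const calculator = agent({ tools: [addTool] });

// run() is async like everything else in LlamaIndex.TS.
const result = await calculator.run("What is 2 + 3?");
console.log(result);
```

The tool's execute return value is sent back to the LLM as a string, so non-string results should be serialized explicitly.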

You MUST install provider packages separately -- @llamaindex/openai, @llamaindex/ollama, and @llamaindex/anthropic are NOT included in the base llamaindex package.
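A typical install, assuming npm; pick only the provider packages your project actually uses:

```shell
# Base framework
npm install llamaindex

# Providers are separate packages -- install what you need
npm install @llamaindex/openai      # OpenAI models
npm install @llamaindex/ollama      # local models via Ollama
npm install @llamaindex/anthropic   # Claude models
```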

You MUST use storageContextFromDefaults({ persistDir }) to persist indexes -- without persistence, indexes are rebuilt from scratch on every restart.
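A persistence sketch under those assumptions -- the ./storage directory name and sample document are illustrative, and VectorStoreIndex.init() as the reload entry point is assumed from the documented LlamaIndex.TS storage API:

```typescript
import {
  Document,
  VectorStoreIndex,
  storageContextFromDefaults,
} from "llamaindex";

const storageContext = await storageContextFromDefaults({
  persistDir: "./storage", // illustrative directory
});

// First run: build the index and persist embeddings under ./storage.
const index = await VectorStoreIndex.fromDocuments(
  [new Document({ text: "hello world" })],
  { storageContext },
);

// Later runs: reload the persisted index instead of re-embedding.
const reloaded = await VectorStoreIndex.init({ storageContext });
```

Skipping the storageContext means every process restart pays the full embedding cost again, which is both slow and, for hosted embedding APIs, billable.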

First Seen: Apr 7, 2026