ai-llm-inference


LLMOps - Inference & Optimization - Production Skill Hub

Modern Best Practices (January 2026):

This skill provides production-ready operational patterns for optimizing LLM inference performance, cost, and reliability. It centralizes decision rules, optimization strategies, configuration templates, and operational checklists for inference workloads.
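To make the idea of a "decision rule" concrete, here is a minimal, hypothetical sketch of the kind of mapping such a skill might catalog: workload characteristics in, serving knobs out. All field names and thresholds are illustrative assumptions, not part of this skill or any specific serving framework.

```python
from dataclasses import dataclass


@dataclass
class Workload:
    # Hypothetical workload descriptor; field names are illustrative.
    p95_latency_slo_ms: int   # latency budget at the 95th percentile
    requests_per_sec: float   # expected sustained traffic
    avg_output_tokens: int    # typical completion length


def choose_serving_config(w: Workload) -> dict:
    """Toy decision rule for inference serving knobs.

    Shows the *shape* of such rules only; thresholds are made up.
    """
    cfg = {"quantization": "none", "batching": "dynamic", "max_batch_size": 8}
    if w.requests_per_sec > 50:
        # High throughput favors larger batches (better GPU utilization).
        cfg["max_batch_size"] = 32
    if w.p95_latency_slo_ms < 500:
        # Tight latency budget: cap batch size to limit queueing delay.
        cfg["max_batch_size"] = min(cfg["max_batch_size"], 4)
    if w.avg_output_tokens > 512:
        # Long generations: weight-only quantization lowers memory
        # pressure, leaving more room for the KV cache.
        cfg["quantization"] = "int8-weight-only"
    return cfg


print(choose_serving_config(Workload(p95_latency_slo_ms=300,
                                     requests_per_sec=100,
                                     avg_output_tokens=1024)))
# → {'quantization': 'int8-weight-only', 'batching': 'dynamic', 'max_batch_size': 4}
```

In practice the real rules would key off measured metrics (queue depth, GPU memory headroom) rather than static thresholds; the point is that each rule is an executable mapping, not prose.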

No theory. No narrative. Only what Codex can execute.


When to Use This Skill

Codex should activate this skill whenever the user asks for:

Installs: 102 · GitHub Stars: 60 · First seen: Jan 23, 2026