ai-gateway

Purpose

This skill manages an AI gateway that routes, secures, and monitors AI service requests in ML operations. It handles traffic efficiently, enforces API security, and tracks performance within the aimlops cluster.

When to Use

Use this skill when building ML pipelines that need centralized routing of AI requests, such as production environments serving multiple AI models where you must enforce security policies, monitor traffic, or scale API endpoints. It also fits microservice-based AI inference and aimlops workflows that integrate with tools like Kubernetes.

Key Capabilities

  • Routing: Dynamically route requests to AI services based on rules, using path-based or header-based matching.
  • Security: Enforce authentication, rate limiting, and encryption via JWT or API keys.
  • Monitoring: Track metrics like request latency and error rates through integrated logging and Prometheus exporters.
  • Configuration: Support YAML-based configs for defining routes, e.g., specifying source and destination endpoints.
  • Scalability: Handle load balancing across multiple AI backends with automatic failover.
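
The capabilities above could be expressed in a YAML route configuration along these lines. This is an illustrative sketch only; the field names (routes, match, path_prefix, destination, rate_limit) are hypothetical and not taken from the gateway's actual schema:

```yaml
# Hypothetical route config sketch; field names are illustrative
routes:
  - name: chat-route
    match:
      path_prefix: /v1/chat     # path-based matching
      headers:
        x-model: gpt            # header-based matching
    destination: http://llm-backend:8080
    rate_limit: 100             # requests per minute
security:
  auth: jwt                     # or api_key
```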

Usage Patterns

To use this skill, first set up the AI gateway via CLI or API, then define routes and security rules:

  • Authenticate: every request must carry the key from the $AI_GATEWAY_API_KEY environment variable.
  • Initialize: run ai-gateway-cli init --config path/to/config.yaml to load a configuration.
  • Apply: run ai-gateway-cli apply to push route and security changes.
  • Program: in code, import the SDK and call methods like createRoute() for programmatic setup.
  • Monitor: query the metrics endpoints periodically to track usage.
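
To make the routing behavior concrete, here is a minimal, self-contained sketch of path- and header-based route matching of the kind such a gateway performs. The Route class and match_route function are illustrative stand-ins, not the gateway's SDK:

```python
# Sketch of path- and header-based route matching (hypothetical,
# not the ai-gateway SDK): first matching route wins.
from dataclasses import dataclass, field

@dataclass
class Route:
    prefix: str                 # path prefix to match
    destination: str            # backend endpoint to forward to
    headers: dict = field(default_factory=dict)  # required headers

def match_route(routes, path, headers):
    """Return the destination of the first route whose path prefix
    and required headers both match, or None if nothing matches."""
    for r in routes:
        if path.startswith(r.prefix) and all(
            headers.get(k) == v for k, v in r.headers.items()
        ):
            return r.destination
    return None

routes = [
    Route("/v1/chat", "http://llm-backend:8080", {"x-model": "gpt"}),
    Route("/v1/embed", "http://embed-backend:8080"),
]
print(match_route(routes, "/v1/embed/texts", {}))  # http://embed-backend:8080
```

A real gateway would layer load balancing and failover over this lookup (e.g., a list of destinations per route), but the match step itself follows this shape.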

Installs: 23 · GitHub Stars: 5 · First Seen: Mar 5, 2026