# Mistral AI Deploy Integration

## Overview
Deploy Mistral AI-powered applications to production with secure API key management. This skill covers Vercel (Edge and Serverless), Docker, Cloud Run, and self-hosted vLLM deployments; all of them connect to api.mistral.ai or your own inference endpoint.
## Prerequisites
- Mistral AI production API key
- Platform CLI installed (`vercel`, `docker`, or `gcloud`)
- Application using the `@mistralai/mistralai` SDK
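Before deploying, it can help to verify the prerequisites locally. A minimal sketch, assuming the CLI names from the list above and the `MISTRAL_API_KEY` variable configured in Step 1:

```shell
# Prerequisite sanity check (a sketch; CLI names are taken from the
# prerequisites list, MISTRAL_API_KEY from the secret configuration step).
check_prereqs() {
  for cmd in vercel docker gcloud; do
    # command -v is POSIX and reports whether the CLI is on PATH
    if command -v "$cmd" >/dev/null 2>&1; then
      echo "$cmd: found"
    else
      echo "$cmd: missing"
    fi
  done
  if [ -n "${MISTRAL_API_KEY:-}" ]; then
    echo "MISTRAL_API_KEY: set"
  else
    echo "MISTRAL_API_KEY: not set"
  fi
}

check_prereqs
```

Only the CLI for your target platform needs to be present; the loop simply reports the status of each.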
## Instructions

### Step 1: Platform Secret Configuration
```bash
set -euo pipefail

# Vercel
vercel env add MISTRAL_API_KEY production
vercel env add MISTRAL_MODEL production  # optional: default model
```
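At runtime, the application reads these variables from the environment rather than hardcoding the key. A minimal TypeScript sketch, assuming a hypothetical `getMistralConfig` helper and `mistral-small-latest` as the fallback default model (both are illustrative choices, not part of the SDK):

```typescript
// Resolve Mistral settings from platform environment variables.
// MISTRAL_API_KEY is required; MISTRAL_MODEL is optional and falls back
// to an assumed default ("mistral-small-latest").
interface MistralConfig {
  apiKey: string;
  model: string;
}

function getMistralConfig(
  env: Record<string, string | undefined>
): MistralConfig {
  const apiKey = env.MISTRAL_API_KEY;
  if (!apiKey) {
    // Fail fast at startup rather than on the first API request.
    throw new Error("MISTRAL_API_KEY is not set");
  }
  return { apiKey, model: env.MISTRAL_MODEL ?? "mistral-small-latest" };
}

// In a deployed function the config would typically be built from
// process.env and passed to the SDK client, e.g.:
//   const config = getMistralConfig(process.env);
//   const client = new Mistral({ apiKey: config.apiKey });
```

Throwing when the key is absent surfaces a misconfigured deployment immediately instead of producing authentication errors on live traffic.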
## Related skills