hf-model-inference

HuggingFace Model Inference Service

Overview

This skill provides procedural guidance for setting up HuggingFace model inference services. It covers model downloading, caching strategies, Flask API creation, and service deployment patterns.
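One caching strategy the overview alludes to is pinning the Hugging Face cache root to a persistent directory before any model downloads happen. A minimal sketch, assuming the standard `HF_HOME` environment variable (which Hugging Face libraries use as their cache root); the directory path itself is a hypothetical example:

```python
import os

# Pin the Hugging Face cache root to a persistent directory.
# HF_HOME is the real environment variable the Hugging Face
# libraries read; the path below is a hypothetical example.
CACHE_DIR = "/srv/hf-cache"

# Must be set BEFORE importing transformers/huggingface_hub,
# otherwise downloads land in the default ~/.cache/huggingface.
os.environ["HF_HOME"] = CACHE_DIR

print(os.environ["HF_HOME"])  # → /srv/hf-cache
```

Pinning the cache this way lets repeated service restarts reuse already-downloaded model weights instead of re-fetching them.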

Workflow

Phase 1: Environment Setup

  1. Verify package manager availability

    • Check for uv, pip, or conda before installing dependencies
    • Prefer uv for faster dependency resolution when available
  2. Install required packages

    • Core: transformers, torch (or tensorflow)
    • API: flask for REST endpoints
    • Set appropriate timeouts for large package installations (300+ seconds)
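The Phase 1 steps above might be combined into a small setup script. This is a sketch assuming a POSIX shell; pip's `--timeout` flag and uv's `UV_HTTP_TIMEOUT` environment variable are real options, but verify them against your installed versions:

```shell
# Prefer uv when it is on PATH (faster resolution), else fall back to pip.
if command -v uv >/dev/null 2>&1; then
    # uv reads its network timeout from an environment variable.
    export UV_HTTP_TIMEOUT=300
    INSTALLER="uv pip install"
else
    # pip takes a per-read timeout flag instead; torch wheels are
    # large, so 300 seconds avoids aborted downloads on slow links.
    INSTALLER="pip install --timeout 300"
fi
echo "using: $INSTALLER"

# Actual install (commented out so the sketch is side-effect free):
# $INSTALLER transformers torch flask
```

Detecting the installer first keeps the install line identical in both environments, which simplifies reuse of the same script across hosts.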
Repository: letta-ai/skills