valohai-migrate-parameters
Valohai Parameter Migration
Migrate hardcoded values in ML code to Valohai-managed parameters. Parameters cover both hyperparameters (learning rate, epochs, batch size) and configuration values (model name, output format, feature flags, thresholds, file paths). This makes jobs configurable without code changes and enables experiment tracking, hyperparameter sweeps, and reproducibility.
Philosophy
Valohai follows a YAML-over-SDK approach. Your ML code should not be entangled with the platform. Parameters are declared in valohai.yaml and injected as command-line arguments. Your code uses standard Python argparse to receive them.
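As a sketch, a step declaration in valohai.yaml might look like the following; the step name, Docker image, and parameter defaults are illustrative, not prescribed:

```yaml
- step:
    name: train
    image: python:3.10
    command:
      - python train.py {parameters}
    parameters:
      - name: learning_rate
        type: float
        default: 0.001
      - name: epochs
        type: integer
        default: 10
      - name: model_name
        type: string
        default: resnet50
```

At run time, Valohai replaces the `{parameters}` placeholder with the declared (or user-overridden) values as command-line flags, which your argparse code then receives.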
Step-by-Step Instructions
1. Identify Hardcoded Values
Scan the user's ML code for hardcoded values that should be configurable. Parameters are not limited to hyperparameters; any value that a user might want to change between runs belongs here: learning rates, epochs, batch sizes, model names, output formats, feature flags, thresholds, and file paths.
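On the code side, each hardcoded value becomes an argparse argument, with the old value kept as the default so behavior is unchanged when no flags are passed. A minimal sketch, with illustrative parameter names and defaults:

```python
import argparse

# Before migration, these values were hardcoded in the script:
#   learning_rate = 0.001; epochs = 10; model_name = "resnet50"
# After migration, they arrive as command-line flags injected by
# Valohai from the parameters declared in valohai.yaml.

def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Training step")
    parser.add_argument("--learning_rate", type=float, default=0.001)
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--model_name", type=str, default="resnet50")
    return parser.parse_args(argv)
```

Because the old hardcoded values are preserved as defaults, the script still runs identically outside Valohai, while on the platform each run can override any of them without touching the code.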