Runtime Management

llm.port lets teams manage local runtimes and remote providers from one platform.

Supported operating models

  • Local runtime hosting for data-sensitive workloads
  • Remote provider usage for elasticity and model breadth
  • Hybrid operation with routing control by model alias
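The hybrid model above amounts to a routing table keyed by model alias. A minimal sketch, assuming a dictionary-based configuration; the alias names, endpoints, and schema here are illustrative and not llm.port's actual configuration format:

```python
# Hypothetical alias routing table (not llm.port's real schema): each model
# alias maps to a runtime target and endpoint. Data-sensitive aliases route
# to a local runtime; elastic workloads route to a remote provider.
ALIAS_ROUTES = {
    "chat-default": {"target": "remote", "endpoint": "https://api.example.com/v1"},
    "chat-pii":     {"target": "local",  "endpoint": "http://localhost:8080/v1"},
    "embed-bulk":   {"target": "remote", "endpoint": "https://api.example.com/v1"},
}

def resolve(alias: str) -> dict:
    """Return the runtime target for a model alias, failing loudly on unknowns."""
    try:
        return ALIAS_ROUTES[alias]
    except KeyError:
        raise ValueError(f"No route configured for alias {alias!r}")

print(resolve("chat-pii")["target"])  # prints: local
```

Keeping the alias as the single point of indirection means callers never hard-code an endpoint, so moving a workload between local and remote runtimes is a one-line routing change.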

Operator benefits

  • Unified configuration and monitoring workflows
  • Simpler runtime comparisons and better-informed migration decisions
  • Better control over performance and cost posture

Rollout recommendations

  1. Define approved runtime/provider options
  2. Standardize alias naming per environment
  3. Validate model behavior before production cutover
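Steps 2 and 3 can be backed by lightweight tooling. The naming pattern and smoke-test helper below are hypothetical sketches, not part of llm.port; they assume an alias convention of the form `<workload>-<environment>`:

```python
import re

# Hypothetical convention: aliases end in an environment suffix,
# e.g. "chat-prod" or "summarize-v2-staging".
ALIAS_PATTERN = re.compile(r"^[a-z0-9]+(?:-[a-z0-9]+)*-(dev|staging|prod)$")

def valid_alias(alias: str) -> bool:
    """Check an alias against the per-environment naming convention."""
    return ALIAS_PATTERN.match(alias) is not None

def smoke_test(generate, prompts_and_checks):
    """Run prompts through a model callable before production cutover.

    `generate` is any prompt -> response callable; `prompts_and_checks`
    pairs each prompt with a predicate on the response. Returns the
    prompts whose responses failed their check.
    """
    return [prompt for prompt, check in prompts_and_checks
            if not check(generate(prompt))]
```

A non-empty return from `smoke_test` would block promotion of the model alias to production; for example, `valid_alias("chat-prod")` passes while `valid_alias("ChatProd")` does not.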

Screenshots

The following views are shown: LLM Providers, Provider Details, Local Runtime, Remote Runtime, and Models.

This documentation is generated with AI assistance and may contain inaccuracies. Please validate critical details before production use.