Inferact

Our mission is to grow vLLM into the world's AI inference engine and to accelerate AI progress by making inference cheaper and faster.

Popular repositories

  1. vllm-frontend-rs (Public)

    Early-stage Rust drop-in alternative frontend for vLLM

    Rust · 4 stars

Repositories

Showing 1 of 1 repositories
