Inferact
Our mission is to grow vLLM as the world's AI inference engine and accelerate AI progress by making inference cheaper and faster.
- 129 followers
- United States of America
- https://inferact.ai/
- contact@inferact.ai
Popular repositories
- vllm-frontend-rs (Public) — Early-stage Rust drop-in alternative frontend for vLLM (Rust, 4 stars)
Showing 1 of 1 repositories