STACKQUADRANT

vLLM

Inference Engines

vLLM: a leading open-source LLM inference and serving engine in the AI/LLM ecosystem.

Overall Score: 8.7
GitHub Metrics
Stars
71.5k
Forks
13.8k
Open Issues
3.5k
Watchers
493
Contributors
2.3k
Weekly Commits
318
Language
Python
License
Apache-2.0
Last Commit
Feb 28, 2026
Created
Feb 9, 2023
Latest Release
v0.16.0
Release Date
Feb 25, 2026
Synced: Feb 28, 2026
Quality Scores
Documentation Quality (weight: 20%)
8.2

Has docs site (https://vllm.ai). Description: 61 chars. Stars signal: 71,512. Contributors: 2264. Score: 8.2/10

Community Health (weight: 20%)
9.3

Stars: 71,512. Contributors: 2264. Watchers: 493. Forks: 13,792. Issue ratio: 4.9%. Score: 9.3/10
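The issue ratio cited above appears to be open issues as a share of stars; a minimal sketch, assuming the 3.5k open-issues and 71,512-stars figures from the metrics table:

```python
# Derived community-health metric (assumption: issue ratio = open issues / stars).
open_issues = 3_500
stars = 71_512

issue_ratio = open_issues / stars * 100
print(f"Issue ratio: {issue_ratio:.1f}%")  # Issue ratio: 4.9%
```

A low ratio suggests issues are resolved or triaged quickly relative to the project's popularity.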

Maintenance Velocity (weight: 15%)
9.8

Last commit: 0d ago. Weekly commits: 318. Latest release: v0.16.0. Maturity bonus: 3.1y old. Score: 9.8/10

API Design & DX (weight: 20%)
7.8

Stars/issues ratio: 21. Dynamic language: Python. Has documentation site. Permissive license: Apache-2.0. Popularity signal: 71,512 stars. Score: 7.8/10

Production Readiness (weight: 15%)
8.6

Battle-tested: 71,512 stars. Peer review: 2264 contributors. Versioned: v0.16.0. Licensed: Apache-2.0. Age: 3.1 years. Maintenance: last commit 0d ago. Score: 8.6/10

Ecosystem Integration (weight: 10%)
8.9

Fork interest: 13,792. Major ecosystem: Python. Integration-friendly: Apache-2.0. Adoption: 71,512 stars. Has web presence. Score: 8.9/10
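The six category weights sum to 100%, which suggests the overall figure is a weighted average; a minimal sketch, assuming the overall score is the weighted sum of the category scores rounded to one decimal:

```python
# Quality scores and weights as shown above (weights sum to 1.0).
# Assumption: overall score = weighted average, rounded to one decimal place.
scores = {
    "Documentation Quality": (8.2, 0.20),
    "Community Health":      (9.3, 0.20),
    "Maintenance Velocity":  (9.8, 0.15),
    "API Design & DX":       (7.8, 0.20),
    "Production Readiness":  (8.6, 0.15),
    "Ecosystem Integration": (8.9, 0.10),
}

overall = round(sum(score * weight for score, weight in scores.values()), 1)
print(overall)  # 8.7
```

The weighted sum (8.71) matches the 8.7 headline score, which supports the weighted-average reading.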

Tags
python, inference, high-throughput
Radar chart of the six quality scores: Documentation Quality, Community Health, Maintenance Velocity, API Design & DX, Production Readiness, Ecosystem Integration.