STACKQUADRANT

vLLM

Inference Engines

vLLM — a leading open-source project in the AI/LLM ecosystem.

Overall Score
8.6
GitHub Metrics
Stars
76.6k
Forks
15.6k
Open Issues
4.2k
Watchers
524
Contributors
2.5k
Weekly Commits
112
Language
Python
License
Apache-2.0
Last Commit
Apr 15, 2026
Created
Feb 9, 2023
Latest Release
v0.19.0
Release Date
Apr 3, 2026
Synced: Apr 15, 2026
Quality Scores
Documentation Quality (weight: 20%)
8.2

Has docs site (https://vllm.ai). Description: 61 chars. Stars signal: 76,598. Contributors: 2514. Score: 8.2/10

Community Health (weight: 20%)
9.3

Stars: 76,598. Contributors: 2514. Watchers: 524. Forks: 15,578. Issue ratio: 5.5%. Score: 9.3/10
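The issue ratio above is consistent with open issues divided by stars; a minimal sketch of that calculation, using the metrics listed on this page (the exact formula used by the site is an assumption):

```python
# Metrics as listed above: 4.2k open issues, 76,598 stars.
open_issues = 4_200
stars = 76_598

# Open issues as a fraction of stars.
issue_ratio = open_issues / stars
print(f"{issue_ratio:.1%}")  # 5.5%
```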

Maintenance Velocity (weight: 15%)
9.8

Last commit: 0d ago. Weekly commits: 112. Latest release: v0.19.0. Maturity bonus: 3.2y old. Score: 9.8/10

API Design & DX (weight: 20%)
7.3

Stars/issues ratio: 18. Dynamic language: Python. Has documentation site. Permissive license: Apache-2.0. Popularity signal: 76,598 stars. Score: 7.3/10

Production Readiness (weight: 15%)
8.6

Battle-tested: 76,598 stars. Peer review: 2514 contributors. Versioned: v0.19.0. Licensed: Apache-2.0. Age: 3.2 years. Maintenance: last commit 0d ago. Score: 8.6/10

Ecosystem Integration (weight: 10%)
8.9

Fork interest: 15,578. Major ecosystem: Python. Integration-friendly: Apache-2.0. Adoption: 76,598 stars. Has web presence. Score: 8.9/10
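The headline 8.6 matches the weighted average of the six dimension scores using the weights shown above. A minimal sketch of that composite (the one-decimal rounding is an assumption):

```python
# Dimension scores and weights as listed on this page.
dimensions = {
    "Documentation Quality": (8.2, 0.20),
    "Community Health": (9.3, 0.20),
    "Maintenance Velocity": (9.8, 0.15),
    "API Design & DX": (7.3, 0.20),
    "Production Readiness": (8.6, 0.15),
    "Ecosystem Integration": (8.9, 0.10),
}

# Weighted average across the six dimensions (weights sum to 1.0).
overall = sum(score * weight for score, weight in dimensions.values())
print(round(overall, 1))  # 8.6
```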

Tags
python, inference, high-throughput
Radar chart over the six quality dimensions: Documentation Quality, Community Health, Maintenance Velocity, API Design & DX, Production Readiness, Ecosystem Integration.