Tebmer/Awesome-Knowledge-Distillation-of-LLMs
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". It breaks KD down into knowledge elicitation and distillation algorithms, and explores the skill and vertical distillation of LLMs.
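The knowledge distillation the survey covers is classically formulated as training a student to match a teacher's temperature-softened output distribution. A minimal self-contained sketch in plain Python (no ML framework; the function names and the temperature value are illustrative, following Hinton et al.'s standard formulation):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# A student that matches the teacher exactly incurs zero loss;
# a mismatched student incurs a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
print(distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)  # → True
```

In practice this term is combined with the ordinary cross-entropy on hard labels; the papers collected here extend the idea to LLM-specific settings such as distilling from black-box teacher outputs.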
GitHub Metrics
Stars: 1.3k
Forks: 69
Open Issues: 2
Watchers: 17
Contributors: 2
Weekly Commits: 0
Language: —
License: —
Last Commit: Mar 9, 2025
Created: Feb 8, 2024
Latest Release: —
Release Date: —
Synced: Mar 3, 2026
Quality Scores
Documentation Quality (weight 20%): 0.0
Community Health (weight 20%): 0.0
Maintenance Velocity (weight 15%): 0.0
API Design & DX (weight 20%): 0.0
Production Readiness (weight 15%): 0.0
Ecosystem Integration (weight 10%): 0.0
Tags: alignment, compression, data-augmentation, data-synthesis, feedback, instruction-following, kd, knowledge-distillation, large-language-model, llm