Awesome Open Source
Search results for "knowledge distillation cifar100" (filters: cifar100, knowledge-distillation): 3 results found.
Torchdistill (⭐ 1,171)
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 22 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
CTKD (⭐ 109)
[AAAI 2023] Code for "Curriculum Temperature for Knowledge Distillation".
DM-KD (⭐ 38)
Code for "Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" (https://arxiv.org/abs/2305.12954).
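All three repositories implement variants of knowledge distillation, where a small student network is trained to match the temperature-softened output distribution of a larger teacher. As background for the listings above, here is a minimal numpy sketch of the standard Hinton-style distillation loss with temperature T; it is an illustration of the general technique, not code from any of these repositories, and the function names are our own.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Distillation loss: T^2 * KL(teacher_soft || student_soft).

    The T^2 factor keeps gradient magnitudes comparable across
    temperatures, as in the original distillation formulation.
    """
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student soft predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()
```

In practice this term is combined with the ordinary cross-entropy on hard labels, e.g. `loss = alpha * ce + (1 - alpha) * kd_loss(...)`; CTKD's contribution is to schedule T over training as a curriculum rather than fixing it.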
Copyright 2018-2024 Awesome Open Source. All rights reserved.