Search results for: cifar, knowledge-distillation
9 search results found
Torchdistill ⭐ 1,171
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 22 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Attention Transfer ⭐ 1,120
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Knowledge Distillation Zoo ⭐ 804
PyTorch implementation of various Knowledge Distillation (KD) methods. (A minimal sketch of the baseline soft-target loss these methods extend appears after this list.)
Mdistiller ⭐ 634
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/pap
Pytorch Be Your Own Teacher ⭐ 61
A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', https://arxiv.org/abs/1905.08094
Okddip Aaai2020 ⭐ 50
A PyTorch 1.0 implementation of the AAAI-2020 paper (Online Knowledge Distillation with Diverse Peers).
Data Free Adversarial Distillation ⭐ 44
Code and pretrained models for the paper: Data-Free Adversarial Distillation
Semckd ⭐ 43
The official implementation of the AAAI-2021 paper (Cross-Layer Distillation with Semantic Calibration).
Luminet ⭐ 6
The official implementation of LumiNet: The Bright Side of Perceptual Knowledge Distillation https://arxiv.org/abs/2310.03669
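The repositories above all build on the classic soft-target knowledge distillation objective (Hinton et al., 2015). The sketch below is for orientation only: it assumes a generic PyTorch setup, and the function name, `temperature`, and `alpha` values are illustrative rather than taken from any listed repository.

```python
# Minimal soft-target KD loss sketch (Hinton et al., 2015); illustrative only,
# not the exact objective of any repository listed above.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.9):
    """Weighted sum of a softened teacher-matching term and hard-label cross-entropy."""
    # Soften both distributions with the temperature and match them via KL divergence.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_loss = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels (e.g., CIFAR-10/100 classes).
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In a typical CIFAR training loop the teacher logits would be computed under `torch.no_grad()` so that only the student receives gradients; the listed methods differ mainly in what additional signal (attention maps, intermediate features, decoupled targets, peer ensembles) is matched on top of, or instead of, this term.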
Related Searches
Python Cifar (1,478)
Pytorch Cifar (462)
Jupyter Notebook Cifar (412)
Dataset Cifar (373)
Resnet Cifar (354)
Mnist Cifar (321)
Cifar Cifar10 (315)
Cifar Imagenet (308)
Tensorflow Cifar (305)
Convolutional Neural Networks Cifar (273)