
torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation


torchdistill (formerly kdkit) offers various state-of-the-art knowledge distillation methods and enables you to design (new) experiments simply by editing a declarative yaml config file instead of Python code. Even when you need to extract intermediate representations in teacher/student models, you will NOT need to reimplement the models, which often requires changing the interface of their forward functions; instead, you specify the module path(s) in the yaml file. Refer to this paper for more details.

In addition to knowledge distillation, this framework helps you design and perform general deep learning experiments (WITHOUT coding) for reproducible deep learning studies. For instance, it enables you to train models without teachers simply by excluding teacher entries from a declarative yaml config file. You can find such examples below and in configs/sample/.

When you refer to torchdistill in your paper, please cite this paper instead of this GitHub repository.
Your citation is appreciated and motivates me to maintain and upgrade this framework!

Forward hook manager

Using ForwardHookManager, you can extract intermediate representations in a model without modifying the interface of its forward function.
This example notebook (Open In Colab) will give you a better idea of the usage, such as knowledge distillation and analysis of intermediate representations.
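
Below is a minimal sketch of typical ForwardHookManager usage, assuming the add_hook/pop_io_dict interface demonstrated in the example notebook; the module paths and input shape are illustrative:

import torch
from torchvision import models
from torchdistill.core.forward_hook import ForwardHookManager

# Register hooks by module path instead of rewriting the model's forward function
device = torch.device('cpu')
model = models.resnet18(num_classes=1000)
forward_hook_manager = ForwardHookManager(device)
forward_hook_manager.add_hook(model, 'layer1.0.conv1', requires_input=True, requires_output=False)
forward_hook_manager.add_hook(model, 'fc', requires_input=False, requires_output=True)

# Run the model as usual; the manager stores the hooked inputs/outputs
x = torch.rand(4, 3, 224, 224)
y = model(x)

# Extract the intermediate representations collected during the forward pass
io_dict = forward_hook_manager.pop_io_dict()
layer1_conv1_input = io_dict['layer1.0.conv1']['input']
fc_output = io_dict['fc']['output']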

1 experiment, 1 declarative PyYAML config file

In torchdistill, many components and PyTorch modules are abstracted, e.g., models, datasets, optimizers, losses, and more! You can define them in a declarative PyYAML config file so that it can be seen as a summary of your experiment, and in many cases, you will NOT need to write Python code at all. Take a look at some of the configurations available in configs/. You'll see what modules are abstracted and how they are defined in a declarative PyYAML config file to design an experiment.
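
As a rough illustration of the workflow, a config file can be loaded and inspected in Python; this is a sketch assuming torchdistill's yaml_util helper and a hypothetical config path:

from torchdistill.common import yaml_util

# Load a declarative PyYAML config that summarizes the whole experiment
# (datasets, models, training stages, losses, etc.).
config = yaml_util.load_yaml_file('configs/sample/your_experiment.yaml')  # hypothetical path

# The top-level entries mirror the building blocks of the experiment
print(list(config.keys()))      # e.g., 'datasets', 'models', 'train', 'test'
print(config['models'].keys())  # e.g., 'teacher_model', 'student_model'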

Top-1 validation accuracy for ILSVRC 2012 (ImageNet)

| T: ResNet-34*  | Pretrained | KD    | AT    | FT      | CRD   | Tf-KD | SSKD  | L2    | PAD-L2 | KR    |
|----------------|------------|-------|-------|---------|-------|-------|-------|-------|--------|-------|
| S: ResNet-18   | 69.76*     | 71.37 | 70.90 | 71.56   | 70.93 | 70.52 | 70.09 | 71.08 | 71.71  | 71.64 |
| Original work  | N/A        | N/A   | 70.70 | 71.43** | 71.17 | 70.42 | 71.62 | 70.90 | 71.71  | 71.61 |

* The pretrained ResNet-34 and ResNet-18 are provided by torchvision.
** FT is assessed with ILSVRC 2015 in the original work.
Most of the results in the 2nd row (S: ResNet-18) are reported in this paper, and their checkpoints (trained weights), configuration, and log files are available. The configurations reuse the hyperparameters (e.g., number of epochs) used in the original work, except for KD.

Examples

Executable code can be found in examples/ for tasks such as image classification, object detection, semantic segmentation, and text classification.

For CIFAR-10 and CIFAR-100, some models are reimplemented and available as pretrained models in torchdistill. More details can be found here.

Some Transformer models fine-tuned by torchdistill for GLUE tasks are available at Hugging Face Model Hub. Sample GLUE benchmark results and details can be found here.

Google Colab Examples

The following examples are available in demo/. Note that these examples are intended for Google Colab users; if you have your own GPU(s), examples/ would usually be a better reference.

CIFAR-10 and CIFAR-100

  • Training without teacher models Open In Colab
  • Knowledge distillation Open In Colab

GLUE

  • Fine-tuning without teacher models Open In Colab
  • Knowledge distillation Open In Colab

These examples write out test prediction files so that you can check the test performance via the GLUE leaderboard system.

PyTorch Hub

If you find models on PyTorch Hub or GitHub repositories supporting PyTorch Hub, you can import them as teacher/student models simply by editing a declarative yaml config file.

For example, if you use a pretrained ResNeSt-50 available in rwightman/pytorch-image-models (aka timm) as a teacher model for the ImageNet dataset, you can import the model via PyTorch Hub with the following entry in your declarative yaml config file.

models:
  teacher_model:
    name: 'resnest50d'
    repo_or_dir: 'rwightman/pytorch-image-models'
    params:
      num_classes: 1000
      pretrained: True
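
For reference, the entry above roughly corresponds to the following PyTorch Hub call (an illustrative sketch; torchdistill resolves the config entry for you, so you normally do not write this code yourself):

import torch

# Load 'resnest50d' from the timm repository via PyTorch Hub,
# forwarding the params entries as keyword arguments.
teacher_model = torch.hub.load('rwightman/pytorch-image-models', 'resnest50d',
                               num_classes=1000, pretrained=True)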

How to set up

  • Python >= 3.6
  • pipenv (optional)

Install by pip/pipenv

pip3 install torchdistill
# or use pipenv
pipenv install torchdistill

Install from this repository

git clone https://github.com/yoshitomo-matsubara/torchdistill.git
cd torchdistill/
pip3 install -e .
# or use pipenv
pipenv install "-e ."
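
Once installed, you can run a quick sanity check from Python (a minimal sketch; the __version__ attribute is assumed here and may not be present in every release):

# Verify that torchdistill is importable
import torchdistill
print(torchdistill.__version__)  # assumed attribute; a bare import also suffices as a check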

Issues / Questions / Requests

The documentation is a work in progress. In the meantime, feel free to create an issue if you find a bug.
If you have a question or feature request, start a new discussion here.

Citation

If you use torchdistill in your research, please cite the following paper.
[Paper] [Preprint]

@inproceedings{matsubara2021torchdistill,
  title={torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation},
  author={Matsubara, Yoshitomo},
  booktitle={International Workshop on Reproducible Research in Pattern Recognition},
  pages={24--44},
  year={2021},
  organization={Springer}
}
