Awesome Open Source
Search results for knowledge distillation model compression
29 search results found
Awesome Knowledge Distillation (⭐ 3,222)
Pretrained Language Model (⭐ 2,912): Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Knowledge Distillation Pytorch (⭐ 1,780): A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility (see the minimal KD loss sketch after this list).
Neuronblocks (⭐ 1,441): NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego.
Efficient Computing (⭐ 1,110): Efficient computing methods developed by Huawei Noah's Ark Lab.
Efficient Deep Learning (⭐ 885): Collection of recent methods on (deep) neural network compression and acceleration.
Knowledge Distillation Zoo (⭐ 804): PyTorch implementation of various knowledge distillation (KD) methods.
Knowledge Distillation Papers (⭐ 638): A collection of knowledge distillation papers.
Kd_lib (⭐ 476): A PyTorch knowledge distillation library for benchmarking and extending work in knowledge distillation, pruning, and quantization (see the pruning-and-quantization sketch after this list).
Collaborative Distillation (⭐ 185): [CVPR'20] Collaborative Distillation for Ultra-Resolution Universal Style Transfer (PyTorch).
Slimsam (⭐ 183): SlimSAM: 0.1% Data Makes Segment Anything Slim.
Awesome Ai Infrastructures (⭐ 171): Infrastructures™ for Machine Learning Training/Inference in Production.
Knowledgedistillation (⭐ 150): Knowledge distillation for text classification with PyTorch: Chinese text classification with teacher models BERT and XLNet and a BiLSTM student model.
Microexpnet (⭐ 118): MicroExpNet: An Extremely Small and Fast Model For Expression Recognition From Frontal Face Images.
Aquvitae (⭐ 88): Knowledge Distillation Toolkit.
Compress (⭐ 67): Compressing Representations for Self-Supervised Learning.
Cmi (⭐ 65): [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation.
Model Compression And Acceleration Progress (⭐ 50): Repository to track the progress in model compression and acceleration.
Data Free Adversarial Distillation (⭐ 44): Code and pretrained models for the paper "Data-Free Adversarial Distillation".
Cv_dl_gather (⭐ 42): Research papers, corresponding code (where available), reading notes, and other related materials on hot 🔥 fields in deep-learning-based computer vision.
Efficient Bert (⭐ 31): Code for the Findings of EMNLP 2021 paper "EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation".
Squeezer (⭐ 18): Lightweight knowledge distillation pipeline.
Research Paper Summaries (⭐ 17): A directory of research paper summaries in the field of deep learning.
Good Da In Kd (⭐ 14): [NeurIPS'22] What Makes a "Good" Data Augmentation in Knowledge Distillation -- A Statistical Perspective.
Rosita (⭐ 11): [AAAI 2021] "ROSITA: Refined BERT cOmpreSsion with InTegrAted techniques", Yuanxin Liu, Zheng Lin, Fengcheng Yuan.
Task Aware Distillation (⭐ 10): Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML 2023).
Lgtm (⭐ 10): [ACL 2023] Code for the paper "Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation" (https://arxiv.org/abs/2305.09651).
Da2lite (⭐ 6): DA2Lite is an automated model compression toolkit for PyTorch.
Luminet (⭐ 6): The official implementation of "LumiNet: The Bright Side of Perceptual Knowledge Distillation" (https://arxiv.org/abs/2310.03669).
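Most of the PyTorch repositories above (e.g. Knowledge Distillation Pytorch, Knowledge Distillation Zoo) build on the same classic recipe: train the student against a temperature-softened teacher distribution blended with the ordinary hard-label loss. As a rough orientation, here is a minimal sketch of that vanilla Hinton-style KD loss; the function name and the temperature/alpha defaults are illustrative assumptions, not the API of any listed library.

# Minimal sketch of vanilla (Hinton-style) knowledge distillation in PyTorch.
# Illustrative only: distillation_loss, temperature, and alpha are assumed
# names/defaults, not the API of any repository listed above.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # Soft targets: the teacher's distribution softened by the temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    # KL divergence between student and teacher soft distributions; the
    # temperature**2 factor keeps the soft-target gradients on the same
    # scale as the hard-label term (Hinton et al., 2015).
    kd = F.kl_div(F.log_softmax(student_logits / temperature, dim=-1),
                  soft_targets, reduction="batchmean") * temperature ** 2
    # Hard-label cross-entropy on the ground-truth classes.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Usage inside a training step: the teacher runs frozen, in eval mode.
# teacher.eval()
# with torch.no_grad():
#     teacher_logits = teacher(inputs)
# loss = distillation_loss(student(inputs), teacher_logits, labels)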
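Kd_lib above pairs distillation with pruning and quantization. For context, a hedged sketch of those two steps using only stock PyTorch utilities (torch.nn.utils.prune and dynamic quantization); the toy model and the 30% sparsity level are arbitrary illustration choices, not Kd_lib's API.

# Hedged sketch: magnitude pruning + post-training dynamic quantization
# with stock PyTorch utilities. The toy model and 30% sparsity are
# arbitrary choices for illustration, not the API of any repo above.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Zero the 30% smallest-magnitude weights in each Linear layer, then
# make the masks permanent so the weights are plain (sparse) tensors.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")

# Dynamic quantization: weights stored as int8, activations quantized
# on the fly at inference; a common cheap win for Linear-heavy models.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)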