Awesome Open Source
Search results for pytorch knowledge distillation
Active filters: knowledge-distillation, pytorch
74 search results found
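Most repositories below build on the response-based formulation from Hinton et al.'s "Distilling the Knowledge in a Neural Network": the student is trained against both the ground-truth labels and the teacher's temperature-softened logits. A minimal PyTorch sketch of that loss (the temperature `T` and weight `alpha` are illustrative defaults, not taken from any listed repo):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-scaled
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy on the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

Individual projects in the list vary what is matched (logits, features, attention, relations) and how the two terms are weighted, but this is the common baseline they extend.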
Easynlp (⭐ 1,871) - EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
Knowledge Distillation Pytorch (⭐ 1,780) - A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
Neuronblocks (⭐ 1,441) - NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Mmrazor (⭐ 1,231) - OpenMMLab Model Compression Toolbox and Benchmark
Torchdistill (⭐ 1,171) - A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 22 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Attention Transfer (⭐ 1,120) - Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Pytorch_classification (⭐ 1,055) - A complete PyTorch codebase for image classification: training, prediction, TTA, model ensembling, model deployment, and CNN feature extraction
Mdistiller (⭐ 634) - The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/pap
Teacher Free Knowledge Distillation (⭐ 490) - Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
Kd_lib (⭐ 476) - A PyTorch knowledge distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization
Mammoth (⭐ 424) - An Extendible (General) Continual Learning Framework based on PyTorch - official codebase of Dark Experience for General Continual Learning
Ld (⭐ 332) - Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)
2dpass (⭐ 282) - 2DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds (ECCV 2022) 🔥
Rkd (⭐ 270) - Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019
Fast_human_pose_estimation_pytorch (⭐ 246) - PyTorch code for the CVPR 2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
Fgd (⭐ 228) - Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
Fasterai (⭐ 226) - FasterAI: Prune and Distill your models with FastAI and PyTorch
Matchmaker (⭐ 223) - Training & evaluation library for text-based neural re-ranking and dense retrieval models built with PyTorch
Slak (⭐ 220) - [ICLR 2023] "More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity"; [ICML 2023] "Are Large Kernels Better Teachers than Transformers for ConvNets?"
Object Detection Knowledge Distillation (⭐ 211) - An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5
Ifrnet (⭐ 191) - IFRNet: Intermediate Feature Refine Network for Efficient Frame Interpolation (CVPR 2022)
Overhaul Distillation (⭐ 181) - Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Fkd (⭐ 166) - Official code for the ECCV 2022 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
Knowledgedistillation (⭐ 150) - Knowledge distillation for text classification with PyTorch: Chinese text classification with BERT and XLNet teacher models and a BiLSTM student model
Cls_kd (⭐ 147) - 'NKD and USKD' (ICCV 2023) and 'ViTKD'
Efficientat (⭐ 128) - This repository provides efficient CNNs for audio tagging, with AudioSet pre-trained models ready for downstream training and extraction of audio embeddings
Mgd (⭐ 124) - Masked Generative Distillation (ECCV 2022)
Mutualguide (⭐ 111) - Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Aquvitae (⭐ 88) - Knowledge Distillation Toolkit
Sur Adapter (⭐ 83) - SUR-adapter lets pre-trained diffusion models acquire the semantic understanding and reasoning capabilities of large language models, building a high-quality textual semantic representation for text-to-image generation
Searle (⭐ 76) - [ICCV 2023] Zero-shot Composed Image Retrieval with Textual Inversion
Knowledge Distillation Toolkit (⭐ 71) - A knowledge distillation toolkit based on PyTorch and PyTorch Lightning
Dhm (⭐ 67) - [CVPR 2020] Dynamic Hierarchical Mimicking Towards Consistent Optimization Objectives
Knowledge_evolution (⭐ 67) - (CVPR Oral 2021) PyTorch implementation of the Knowledge Evolution approach and Split-Nets
Multilangstructurekd (⭐ 64) - [ACL 2020] Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
Pytorch Be Your Own Teacher (⭐ 61) - A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", https://arxiv.org/abs/1905.08094
Stagewise Knowledge Distillation (⭐ 60) - Code implementation of the Data Efficient Stagewise Knowledge Distillation paper
Ctc Optimizedloss (⭐ 50) - Computes the MWER (minimum WER) loss with CTC beam search; knowledge distillation for CTC loss
Model Compression And Acceleration Progress (⭐ 50) - Repository to track progress in model compression and acceleration
Knowledge Distillation Paper (⭐ 47) - This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation)
Accv_tinygan (⭐ 44) - BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16x compression
Icd (⭐ 42) - The official implementation of the paper "Instance-conditional Knowledge Distillation for Object Detection", based on MegEngine and PyTorch
Pocketnet (⭐ 42) - Official repository for PocketNet: Extreme Lightweight Face Recognition Network using Neural Architecture Search and Multi-Step Knowledge Distillation
Mgd (⭐ 40) - Matching Guided Distillation (ECCV 2020)
Head Network Distillation (⭐ 27) - [IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
Labelrelaxation Cvpr21 (⭐ 27) - Official PyTorch implementation of Embedding Transfer with Label Relaxation for Improved Metric Learning, CVPR 2021
Difficulty Aware Simulator (⭐ 26) - Official PyTorch repository of "Difficulty-Aware Simulator for Open Set Recognition" (ECCV 2022 paper)
Hnd Ghnd Object Detectors (⭐ 23) - [ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [ACM MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challenges and Preliminary Results"
Realtime Semantic Segmentation Pytorch (⭐ 22) - PyTorch implementation of over 30 realtime semantic segmentation models, e.g. BiSeNetv1, BiSeNetv2, CGNet, ContextNet, DABNet, DDRNet, EDANet, ENet, ERFNet, ESPNet, ESPNetv2, FastSCNN, ICNet, LEDNet, LinkNet, PP-LiteSeg, SegNet, ShelfNet, STDC, and SwiftNet, with support for knowledge distillation, distributed training, etc.
Mdvit (⭐ 22) - [MICCAI 2023] MDViT: Multi-domain Vision Transformer for Small Medical Image Segmentation Datasets (an official implementation)
Supervised Compression (⭐ 21) - [WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"
Sc2 Benchmark (⭐ 18) - [TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
Squeezer (⭐ 18) - Lightweight knowledge distillation pipeline
Ssan (⭐ 17) - [ACMMM 2020] Code release for "Simultaneous Semantic Alignment Network for Heterogeneous Domain Adaptation" https://arxiv.org/abs/2008.01677
Ai_book (⭐ 17) - AI book for everyone
Distilkobilstm (⭐ 17) - Distilling task-specific knowledge from a teacher model into a BiLSTM
Boosting Light Weight Depth Estimation (⭐ 16) - Official implementation of "Boosting Light-Weight Depth Estimation via Knowledge Distillation"
San (⭐ 15) - [ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Zero Shot_knowledge_distillation_pytorch (⭐ 14) - Zero-Shot Knowledge Distillation (ZSKD) with PyTorch
Bert Aad (⭐ 13) - Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Cool Papers In Pytorch (⭐ 12) - Reimplementing cool papers in PyTorch
Dm Vton (⭐ 12) - 👗 DM-VTON: Distilled Mobile Real-time Virtual Try-On
Rosita (⭐ 11) - [AAAI 2021] "ROSITA: Refined BERT cOmpreSsion with InTegrAted techniques", Yuanxin Liu, Zheng Lin, Fengcheng Yuan
Knowledge Distillation By Replacing Cheap Conv (⭐ 11) - In search of an effective and efficient pipeline for distilling knowledge in convolutional neural networks
Face Recognition Pipeline (⭐ 11) - Pipeline for training face recognition models (based on PyTorch 1.1)
Knowledge_distillation (⭐ 11) - PyTorch implementation of "Distilling the Knowledge in a Neural Network"
Distillation In Dg (⭐ 10) - Implementation of "Weight Averaging Improves Knowledge Distillation under Domain Shift" (ICCV 2023 OOD-CV Workshop)
Knowledge Distillation Via Nd (⭐ 9) - The official implementation of the paper "Improving Knowledge Distillation via Regularizing Feature Norm and Direction"
Kd_data (⭐ 8) - IJCAI 2021, "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation"
Deepguide (⭐ 7) - Deep Multimodal Guidance for Medical Image Classification: https://arxiv.org/pdf/2203.05683.pdf
Da2lite (⭐ 6) - DA2Lite is an automated model compression toolkit for PyTorch
Luminet (⭐ 6) - The official implementation of LumiNet: The Bright Side of Perceptual Knowledge Distillation https://arxiv.org/abs/2310.03669
Dino_mnist Pytorch (⭐ 5) - PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers" (a.k.a. DINO)
Knowledge Distillation For Image Captioning (⭐ 5) - Compressing an image captioning network using knowledge distillation
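Several of the detector- and feature-level projects above (e.g. Attention Transfer, Fgd, Mgd, Overhaul Distillation) match intermediate activations rather than output logits. A minimal sketch of the attention-transfer idea, assuming student and teacher feature maps share spatial size (channel counts may differ; function names here are illustrative, not from any listed repo):

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    # Collapse channels into a spatial attention map (mean of squared
    # activations), flatten, and L2-normalize per sample.
    a = feat.pow(2).mean(dim=1).flatten(1)
    return F.normalize(a, dim=1)

def at_loss(student_feat, teacher_feat):
    # Penalize the squared difference between normalized attention maps,
    # following the spirit of Attention Transfer (ICLR 2017).
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```

Because the maps are channel-pooled and normalized, the student and teacher backbones need not have matching widths, which is what makes this family of losses convenient for compressing detectors and segmenters.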
Copyright 2018-2024 Awesome Open Source. All rights reserved.