Awesome Open Source
Search results for python knowledge distillation
149 search results found
Paddleclas (⭐ 5,161): A treasure chest for visual classification and recognition powered by PaddlePaddle
Pretrained Language Model (⭐ 2,912): Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Easynlp (⭐ 1,871): EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
Neural Compressor (⭐ 1,773): SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
Dwpose (⭐ 1,500): "Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
Cream (⭐ 1,446): This is a collection of our NAS and Vision Transformer work.
Mmrazor (⭐ 1,231): OpenMMLab Model Compression Toolbox and Benchmark.
Torchdistill (⭐ 1,171): A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 22 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Knowledge Distillation Zoo (⭐ 804): PyTorch implementation of various Knowledge Distillation (KD) methods.
Easytransfer (⭐ 795): EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Mdistiller (⭐ 634): The official implementation of [CVPR 2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV 2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/pap
Awesome Efficient Llm (⭐ 521): A curated list for Efficient Large Language Models
Teacher Free Knowledge Distillation (⭐ 490): Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
Efficient Gnns (⭐ 490): Code and resources on scalable and efficient Graph Neural Networks
Kd_lib (⭐ 476): A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Mammoth (⭐ 424): An Extendible (General) Continual Learning Framework based on PyTorch; the official codebase of Dark Experience for General Continual Learning
Suod (⭐ 371): (MLSys '21) An Acceleration System for Large-scale Unsupervised Heterogeneous Outlier Detection (Anomaly Detection)
Distill Sd (⭐ 346): Segmind Distilled diffusion
Ld (⭐ 332): Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)
Mi Gan (⭐ 297): [ICCV 2023] MI-GAN: A Simple Baseline for Image Inpainting on Mobile Devices
2dpass (⭐ 282): 2DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds (ECCV 2022) 🔥
Rkd (⭐ 270): Official PyTorch Implementation of Relational Knowledge Distillation, CVPR 2019
Fast_human_pose_estimation_pytorch (⭐ 246): PyTorch Code for the CVPR 2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
Fgd (⭐ 228): Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
Ifrnet (⭐ 191): IFRNet: Intermediate Feature Refine Network for Efficient Frame Interpolation (CVPR 2022)
Slimsam (⭐ 183): SlimSAM: 0.1% Data Makes Segment Anything Slim
Overhaul Distillation (⭐ 181): Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Lion (⭐ 172): Code for "Lion: Adversarial Distillation of Proprietary Large Language Models" (EMNLP 2023)
Fkd (⭐ 166): Official code for our ECCV '22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
Teacher Assistant Knowledge Distillation (⭐ 157): Using Teacher Assistants to Improve Knowledge Distillation
Knowledgedistillation (⭐ 150): Knowledge distillation in text classification with PyTorch. Knowledge distillation for Chinese text classification; teacher models are BERT and XLNet, the student model is a BiLSTM.
Cls_kd (⭐ 147): 'NKD and USKD' (ICCV 2023) and 'ViTKD'
Efficientat (⭐ 128): This repository aims at providing efficient CNNs for Audio Tagging. We provide AudioSet pre-trained models ready for downstream training and extraction of audio embeddings.
Mgd (⭐ 124): Masked Generative Distillation (ECCV 2022)
Microexpnet (⭐ 118): MicroExpNet: An Extremely Small and Fast Model For Expression Recognition From Frontal Face Images
Distiller (⭐ 112): A large-scale study of Knowledge Distillation.
Ctkd (⭐ 109): [AAAI 2023] Code for "Curriculum Temperature for Knowledge Distillation"
Simxns (⭐ 92): SimXNS: a research project on information retrieval by the MSRA NLC team, containing official implementations.
Kdtf (⭐ 91): Knowledge Distillation using TensorFlow
Knowledge_distillation_via_tf2.0 (⭐ 91): Code for recent knowledge distillation algorithms and benchmark results via the TF 2.0 low-level API
Url (⭐ 90): Universal Representation Learning from Multiple Domains for Few-shot Classification (ICCV 2021); Cross-domain Few-shot Learning with Task-specific Adapters (CVPR 2022)
Distill Bert Textgen (⭐ 90): Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation".
Coperception (⭐ 86): An SDK for multi-agent collaborative perception.
Sur Adapter (⭐ 83): SUR-adapter for pre-trained diffusion models can acquire the powerful semantic understanding and reasoning capabilities of large language models to build a high-quality textual semantic representation for text-to-image generation.
Ab_distillation (⭐ 78): Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Ssda Yolo (⭐ 77): Code for my paper "SSDA-YOLO: Semi-supervised Domain Adaptive YOLO for Cross-Domain Object Detection"
Searle (⭐ 76): [ICCV 2023] Zero-shot Composed Image Retrieval with Textual Inversion
Ml Doctor (⭐ 74): Code for ML Doctor
Knowledge Distillation Toolkit (⭐ 71): A knowledge distillation toolkit based on PyTorch and PyTorch Lightning.
Vkd (⭐ 69): PyTorch code for the ECCV 2020 paper "Robust Re-Identification by Multiple Views Knowledge Distillation"
Dhm (⭐ 67): [CVPR 2020] Dynamic Hierarchical Mimicking Towards Consistent Optimization Objectives
Knowledge_evolution (⭐ 67): (CVPR Oral 2021) PyTorch implementation of the Knowledge Evolution approach and Split-Nets
Revisiting Reverse Distillation (⭐ 66): (CVPR 2023) Revisiting Reverse Distillation for Anomaly Detection
Two Stage Knowledge For Multiple Adverse Weather Removal (⭐ 65): [CVPR 2022] Learning Multiple Adverse Weather Removal via Two-stage Knowledge Learning and Multi-contrastive Regularization: Toward a Unified Model
Multilangstructurekd (⭐ 64): [ACL 2020] Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
Bert In Production (⭐ 64): A collection of resources on using BERT (https://arxiv.org/abs/1810.04805) and related Language Models in production environments.
Pytorch Be Your Own Teacher (⭐ 61): A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", https://arxiv.org/abs/1905.08094
Disco (⭐ 61): The public repository of the EMNLP 2023 paper "DisCo: Co-training Distilled Student Models for Semi-supervised Text Mining"
Kd4mtl (⭐ 51): Knowledge Distillation for Multi-task Learning (ECCV 2020 Workshops)
Metadistil (⭐ 50): Code for the ACL 2022 paper "BERT Learns to Teach: Knowledge Distillation with Meta Learning".
Ctc Optimizedloss (⭐ 50): Computes the MWER (minimum WER) loss with CTC beam search. Knowledge distillation for CTC loss.
Pgd (⭐ 48): [ECCV 2022] Prediction-Guided Distillation for Dense Object Detection
Distill Bev (⭐ 46): DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (ICCV 2023)
Bss_distillation (⭐ 45): Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)
Distill And Select (⭐ 44): Authors' official PyTorch implementation of "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" [IJCV 2022]
Data Free Adversarial Distillation (⭐ 44): Code and pretrained models for the paper "Data-Free Adversarial Distillation"
Icd (⭐ 42): The official implementation of the paper "Instance-conditional Knowledge Distillation for Object Detection", based on MegEngine and PyTorch.
Pocketnet (⭐ 42): Official repository for PocketNet: Extreme Lightweight Face Recognition Network using Neural Architecture Search and Multi-Step Knowledge Distillation
Rotated Ld (⭐ 41): Rotated Localization Distillation (CVPR 2022, TPAMI 2023)
Zero Shot_knowledge_distillation (⭐ 41): Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019)
Bake (⭐ 41): Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Mgd (⭐ 40): Matching Guided Distillation (ECCV 2020)
Model_optimizer_tf (⭐ 39): Model optimizer used in Adlik.
Il Semsegm (⭐ 38): Code for the paper "Incremental Learning Techniques for Semantic Segmentation", Michieli U. and Zanuttigh P., ICCVW 2019
Dm Kd (⭐ 38): Code for "Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" (https://arxiv.org/abs/2305.12954)
Pointdistiller (⭐ 38): [CVPR 2023] PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection
Okdhp (⭐ 37): [ICCV 2021] Code for "Online Knowledge Distillation for Efficient Pose Estimation"
Sskd_svd (⭐ 35)
Msdn (⭐ 33): Official PyTorch Implementation of MSDN (CVPR '22)
Efficient Bert (⭐ 31): This repository contains the code for the paper in Findings of EMNLP 2021: "EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation".
Dynamickd (⭐ 30): Code for the EMNLP 2021 main conference paper "Dynamic Knowledge Distillation for Pre-trained Language Models"
Labelrelaxation Cvpr21 (⭐ 27): Official PyTorch Implementation of Embedding Transfer with Label Relaxation for Improved Metric Learning, CVPR 2021
Difficulty Aware Simulator (⭐ 26): Official PyTorch Repository of "Difficulty-Aware Simulator for Open Set Recognition" (ECCV 2022 Paper)
Efficient_graph_similarity_computation (⭐ 22): [NeurIPS 2021] Slow Learning and Fast Inference: Efficient Graph Similarity Computation via Knowledge Distillation
Mdvit (⭐ 22): [MICCAI 2023] MDViT: Multi-domain Vision Transformer for Small Medical Image Segmentation Datasets (an official implementation)
Realtime Semantic Segmentation Pytorch (⭐ 22): PyTorch implementation of over 30 realtime semantic segmentation models, e.g. BiSeNetV1, BiSeNetV2, CGNet, ContextNet, DABNet, DDRNet, EDANet, ENet, ERFNet, ESPNet, ESPNetV2, FastSCNN, ICNet, LEDNet, LinkNet, PP-LiteSeg, SegNet, ShelfNet, STDC, and SwiftNet, with support for knowledge distillation, distributed training, etc.
Nasty Teacher (⭐ 22): [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Chenyu You, Xiaohui Xie, Zhangyang Wang
Cpf (⭐ 22): The official code of the WWW 2021 paper "Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework"
Model_compression_for_yolov3 V4 (⭐ 22): This repository uses sparse training, group channel pruning, and knowledge distillation for YOLOv4.
Eccv22 Foster (⭐ 21): The official implementation, in PyTorch, of the ECCV 2022 paper "FOSTER: Feature Boosting and Compression for Class-Incremental Learning".
Lamda Zhijian (⭐ 21): ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse
Supervised Compression (⭐ 21): [WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"
Zipfls (⭐ 19): The official MegEngine implementation of the ECCV 2022 paper "Efficient One Pass Self-distillation with Zipf's Label Smoothing".
Sakdn (⭐ 19): [IEEE T-IP 2021] Semantics-aware Adaptive Knowledge Distillation for Cross-modal Action Recognition
Led (⭐ 18): Source code of the paper "LED: Lexicon-Enlightened Dense Retriever for Large-Scale Retrieval" (WWW 2023)
Spkd (⭐ 18): Official PyTorch implementation of "Lightweight Deep CNN for Natural Image Matting via Similarity-Preserving Knowledge Distillation" (IEEE Signal Processing Letters 2020)
Sc2 Benchmark (⭐ 18): [TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
Distilkobilstm (⭐ 17): Distilling Task-Specific Knowledge from a Teacher Model into a BiLSTM
Ssan (⭐ 17): [ACM MM 2020] Code release for "Simultaneous Semantic Alignment Network for Heterogeneous Domain Adaptation" https://arxiv.org/abs/2008.01677
Kd_srrl (⭐ 17)
1-100 of 149 search results
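Most of the PyTorch repositories listed above build on, or extend, the classic soft-target distillation loss of Hinton et al. (2015): a compact student is trained to match the temperature-softened output distribution of a large teacher, blended with the usual cross-entropy on the hard labels. The sketch below is a generic illustration, not code from any listed repository; the function name and the `temperature`/`alpha` hyperparameters are illustrative choices.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend the soft-target KL term with standard cross-entropy."""
    # Soften both distributions with the temperature, then match them.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)  # rescale gradients back to the usual magnitude
    # Ordinary supervised loss on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy usage with random logits for a 10-class problem.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

Individual frameworks above (e.g. feature-based, relational, or localization distillation) replace or augment the KL term, but the temperature/weighting structure is the common starting point.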