Awesome Open Source
Search results for "deep learning knowledge distillation"
57 search results found
- Awesome Knowledge Distillation (⭐ 3,222)
- Easynlp (⭐ 1,871): EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
- Neuronblocks (⭐ 1,441): NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
- Vlm_survey (⭐ 1,405): Vision-Language Models for Vision Tasks: A Survey
- Attention Transfer (⭐ 1,120): Improving Convolutional Networks via Attention Transfer (ICLR 2017)
- Efficient Deep Learning (⭐ 885): A collection of recent methods for (deep) neural network compression and acceleration.
- Knowledge Distillation Papers (⭐ 638): A collection of knowledge distillation papers.
- Mdistiller (⭐ 634): The official implementation of [CVPR 2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV 2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/pap
- Mammoth (⭐ 424): An extendible (general) continual learning framework based on PyTorch; the official codebase of Dark Experience for General Continual Learning.
- Ld (⭐ 332): Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)
- Fast Human Pose Estimation.pytorch (⭐ 325): Official PyTorch code for the CVPR 2019 paper "Fast Human Pose Estimation" (https://arxiv.org/abs/1811.05419)
- 2dpass (⭐ 282): 2DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds (ECCV 2022)
- Rkd (⭐ 270): Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019)
- Fgd (⭐ 228): Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
- Slak (⭐ 220): [ICLR 2023] "More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity" and [ICML 2023] "Are Large Kernels Better Teachers than Transformers for ConvNets?"
- Object Detection Knowledge Distillation (⭐ 211): An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
- Ifrnet (⭐ 191): IFRNet: Intermediate Feature Refine Network for Efficient Frame Interpolation (CVPR 2022)
- Awesome Ai Infrastructures (⭐ 171): Infrastructures for machine learning training/inference in production.
- Fkd (⭐ 166): Official code for the ECCV 2022 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
- Teacher Assistant Knowledge Distillation (⭐ 157): Using teacher assistants to improve knowledge distillation.
- Microexpnet (⭐ 118): MicroExpNet: An Extremely Small and Fast Model for Expression Recognition from Frontal Face Images
- Disconet (⭐ 115): [NeurIPS 2021] Learning Distilled Collaboration Graph for Multi-Agent Perception
- Mutualguide (⭐ 111): Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
- Kdtf (⭐ 91): Knowledge distillation using TensorFlow.
- Aquvitae (⭐ 88): A knowledge distillation toolkit.
- Coperception (⭐ 86): An SDK for multi-agent collaborative perception.
- Vkd (⭐ 69): PyTorch code for the ECCV 2020 paper "Robust Re-Identification by Multiple Views Knowledge Distillation"
- Knowledge_evolution (⭐ 67): (CVPR 2021 oral) PyTorch implementation of the Knowledge Evolution approach and Split-Nets.
- Revisiting Reverse Distillation (⭐ 66): (CVPR 2023) Revisiting Reverse Distillation for Anomaly Detection
- Stagewise Knowledge Distillation (⭐ 60): Code implementation of the "Data Efficient Stagewise Knowledge Distillation" paper.
- Awesome Knowledge Distillation For Object Detection (⭐ 51): A curated list of awesome knowledge distillation papers and code for object detection.
- Model Compression And Acceleration Progress (⭐ 50): A repository tracking progress in model compression and acceleration.
- Okddip Aaai2020 (⭐ 50): A PyTorch 1.0 implementation of the AAAI 2020 paper "Online Knowledge Distillation with Diverse Peers".
- Knowledge Distillation Paper (⭐ 47): This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation).
- Semckd (⭐ 43): The official implementation of the AAAI 2021 paper "Cross-Layer Distillation with Semantic Calibration".
- Cv_dl_gather (⭐ 42): Gathers research papers, corresponding code (where available), reading notes, and other related materials on hot fields in deep-learning-based computer vision.
- Easy Bert (⭐ 38): easy-bert is a Chinese NLP toolkit that provides many BERT variants and tuning methods for quick adoption, with a clean design and code…
- Mlic Kd Wsd (⭐ 37): Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)
- Sskd_svd (⭐ 35)
- Awesome 3d Anomaly Detection (⭐ 30): Awesome-3D/Multimodal-Anomaly-Detection-and-Locali
- Head Network Distillation (⭐ 27): [IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
- Difficulty Aware Simulator (⭐ 26): Official PyTorch repository of "Difficulty-Aware Simulator for Open Set Recognition" (ECCV 2022)
- Awesome Nlp References (⭐ 25): A curated list of resources dedicated to knowledge distillation, recommendation systems, and especially natural language processing (NLP).
- Mdvit (⭐ 22): [MICCAI 2023] MDViT: Multi-domain Vision Transformer for Small Medical Image Segmentation Datasets (official implementation)
- Research Paper Summaries (⭐ 17): A directory of research paper summaries in the field of deep learning.
- Ai_book (⭐ 17): An AI book for everyone.
- Dicod (⭐ 16): Official PyTorch implementation of "Distilling Image Classifiers in Object Detection" (NeurIPS 2021)
- Dynamic Cdfsl (⭐ 15): PyTorch code for the NeurIPS 2021 paper "Dynamic Distillation Network for Cross-Domain Few-Shot Recognition with Unlabeled Data".
- Pytorch Minilm (⭐ 15): Unofficial PyTorch implementation of MiniLM and MiniLMv2.
- Fedntd (⭐ 14): (NeurIPS 2022) Official implementation of "Preservation of the Global Knowledge by Not-True Distillation in Federated Learning"
- Mousika (⭐ 12): Mousika: Enable General In-Network Intelligence in Programmable Switches by Knowledge Distillation (INFOCOM 2022 & ToN 2023)
- Distillation In Dg (⭐ 10): Implementation of "Weight Averaging Improves Knowledge Distillation under Domain Shift" (ICCV 2023 OOD-CV Workshop)
- Lgtm (⭐ 10): [ACL 2023] Code for the paper "Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation" (https://arxiv.org/abs/2305.09651)
- Dcm (⭐ 8): Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020)
- Deepguide (⭐ 7): Deep Multimodal Guidance for Medical Image Classification (https://arxiv.org/pdf/2203.05683.pdf)
- Linkless Link Prediction (⭐ 7): [ICML 2023] Linkless Link Prediction via Relational Distillation
- Distilling Knowledge Via Intermediate Classifiers (⭐ 6)
- Da2lite (⭐ 6): DA2Lite is an automated model compression toolkit for PyTorch.
- Luminet (⭐ 6): The official implementation of "LumiNet: The Bright Side of Perceptual Knowledge Distillation" (https://arxiv.org/abs/2310.03669)
- Kd Loss (⭐ 5): Facial Landmark Detection Using Knowledge Distillation-Based Neural Networks
- Semi Super (⭐ 5): Semi/Self-Supervised Learning on a Pediatric Pneumonia Dataset
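Most of the repositories above build on the classic soft-target distillation objective (Hinton et al., 2015), in which a small student network is trained to match the temperature-softened output distribution of a larger teacher. A minimal pure-Python sketch of that loss follows; the function names are illustrative and not taken from any listed project:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target distillation loss: T^2 * KL(teacher || student).

    Both distributions are softened by temperature T; the T^2 factor
    keeps gradient magnitudes comparable across temperatures.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student))
    return T * T * kl
```

In practice this term is usually combined with an ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient alpha; many of the frameworks listed (feature-based, relational, and localization distillation variants) replace or augment this output-matching term with losses on intermediate representations.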
Copyright 2018-2024 Awesome Open Source. All rights reserved.