Awesome Open Source
Search results for machine learning knowledge distillation
Active filters: knowledge-distillation, machine-learning
19 search results found
Easynlp (⭐ 1,871)
EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
Kd_lib (⭐ 476)
A PyTorch knowledge distillation library for benchmarking and extending work in the domains of knowledge distillation, pruning, and quantization (a minimal sketch of the canonical distillation loss follows the results list).
Suod (⭐ 371)
(MLSys '21) An Acceleration System for Large-Scale Unsupervised Heterogeneous Outlier Detection (Anomaly Detection)
Awesome Ai Infrastructures (⭐ 171)
Infrastructures™ for Machine Learning Training/Inference in Production.
Microexpnet (⭐ 118)
MicroExpNet: An Extremely Small and Fast Model for Expression Recognition from Frontal Face Images
Aquvitae (⭐ 88)
Knowledge Distillation Toolkit
Ml Doctor (⭐ 74)
Code for ML Doctor
Knowledge_evolution (⭐ 67)
(CVPR 2021 Oral) PyTorch implementation of the Knowledge Evolution approach and Split-Nets
Stagewise Knowledge Distillation (⭐ 60)
Implementation of the paper "Data Efficient Stagewise Knowledge Distillation".
Okddip Aaai2020 (⭐ 50)
PyTorch 1.0 implementation of the AAAI 2020 paper "Online Knowledge Distillation with Diverse Peers".
Refilled (⭐ 47)
Code for the CVPR 2020 paper "Distilling Cross-Task Knowledge via Relationship Matching".
Semckd (⭐ 43)
Official implementation of the AAAI 2021 paper "Cross-Layer Distillation with Semantic Calibration".
Cv_dl_gather (⭐ 42)
Gathers research papers, corresponding code (where available), reading notes, and other related materials on hot 🔥 topics in deep-learning-based computer vision.
Awesome Nlp References (⭐ 25)
A curated list of resources dedicated to knowledge distillation and recommendation systems, with an emphasis on Natural Language Processing (NLP).
Ai_book (⭐ 17)
AI book for everyone
Fedntd (⭐ 14)
(NeurIPS 2022) Official implementation of "Preservation of the Global Knowledge by Not-True Distillation in Federated Learning".
Dcm (⭐ 8)
Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020)
Kd_data (⭐ 8)
Code for the IJCAI 2021 paper "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation".
St (⭐ 6)
Code for the NeurIPS 2021 paper "Towards Enabling Meta-Learning from Target Models".
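Several of the repositories above (Kd_lib, Aquvitae, Kd_data, among others) build on the canonical soft-target distillation loss of Hinton et al. (2015). Below is a minimal PyTorch sketch for orientation only: the function name and the temperature/weight defaults are illustrative assumptions, not the API or settings of any listed library.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Hypothetical helper for illustration; not taken from any repository above.
    # Soft term: KL divergence between temperature-softened teacher and student
    # distributions. Multiplying by T**2 keeps its gradient scale comparable to
    # the hard-label term, as suggested by Hinton et al.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

For context, the Kd_data entry above compares this KL term against a mean-squared-error loss on the raw logits; replacing soft with F.mse_loss(student_logits, teacher_logits) roughly corresponds to the variant studied there.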
Related Searches
Python Machine Learning (14,099)
Jupyter Notebook Machine Learning (12,247)
Machine Learning Neural Network (4,397)
Machine Learning Tensorflow (4,050)
Machine Learning Natural Language Processing (3,891)
Machine Learning Artificial Intelligence (3,877)
Machine Learning Data Science (3,802)
Machine Learning Pytorch (2,910)
Machine Learning Dataset (2,298)
Machine Learning Computer Vision (1,966)