Awesome Open Source
Search results for distillation
70 search results found
Distiller ⭐ 4,252
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Awesome Knowledge Distillation ⭐ 3,222
Awesome Knowledge Distillation
Awesome Knowledge Distillation ⭐ 2,182
Awesome Knowledge-Distillation: knowledge distillation papers (2014-2021), organized by category.
Paddleslim ⭐ 1,486
PaddleSlim is an open-source library for deep model compression and architecture search.
Textbrewer ⭐ 1,387
A PyTorch-based knowledge distillation toolkit for natural language processing
Continual Learning ⭐ 1,190
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
Vitpose ⭐ 950
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [arXiv'22] "ViTPose+: Vision Transformer Foundation Model for Generic Body Pose Estimation"
Knowledge Distillation Zoo ⭐ 804
PyTorch implementations of various knowledge distillation (KD) methods (the classic KD loss is sketched after this list).
Recsyspapers ⭐ 801
A collection of industry-classic and cutting-edge papers in recommendation, advertising, and search.
Meal V2 ⭐ 603
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks
Tanuki.py ⭐ 579
Easily build LLM-powered apps that get cheaper and faster over time.
Cluepretrainedmodels ⭐ 536
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models.
Ares ⭐ 413
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
Llm Vm ⭐ 401
irresponsible innovation. Try now at https://chat.dev/
Distill Sd ⭐ 346
Segmind Distilled diffusion
Hawq ⭐ 324
Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.
Mobile Yolov5 Pruning Distillation ⭐ 320
Pruning and distillation for mobilev2-yolov5s; supports ncnn and TensorRT deployment. Ultra-light, but with better performance!
Optimum Intel ⭐ 268
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
Lav ⭐ 267
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.
Agi Papers ⭐ 218
Papers and books to look at when starting AGI 📚
Keras_insightface ⭐ 208
Insightface Keras implementation
Brain Inspired Replay ⭐ 169
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
Fkd ⭐ 166
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
Distilkobert ⭐ 165
Distillation of KoBERT from SKTBrain (Lightweight KoBERT)
R2l ⭐ 162
[ECCV 2022] R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
Knowledgedistillation ⭐ 150
Knowledge distillation for text classification with PyTorch: Chinese text classification with BERT and XLNet teacher models and a BiLSTM student model.
Biosteam ⭐ 150
The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty
Bk Sdm ⭐ 124
A Compressed Stable Diffusion for Efficient Text-to-Image Generation [ICCV'23 Demo] [ICML'23 Workshop]
Filter Grafting ⭐ 110
Filter Grafting for Deep Neural Networks (CVPR 2020)
Unif ⭐ 109
A TensorFlow-based deep learning NLP framework with a Scikit-Learn-style API. Supports 40+ model classes covering language modeling, text classification, NER, MRC, knowledge distillation, and more.
Worldonrails ⭐ 101
(ICCV 2021, Oral) RL and distillation in CARLA using a factorized world model
Disco Pytorch ⭐ 80
Code for DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning
Bert Squeeze ⭐ 77
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Incremental Learning ⭐ 76
PyTorch implementation of the ACCV 2018 paper "Revisiting Distillation and Incremental Classifier Learning".
Supermix ⭐ 72
PyTorch implementation of the CVPR 2021 paper "SuperMix: Supervising the Mixing Data Augmentation".
Roberta Wwm Base Distill ⭐ 64
A RoBERTa-wwm-base model distilled from RoBERTa-wwm-large.
Adaptive Wavelets ⭐ 60
Adaptive, interpretable wavelets across domains (NeurIPS 2021)
Awesome Efficient Aigc ⭐ 57
A list of papers, docs, and code about efficient AIGC, covering both language and vision. The repo aims to provide information for efficient AIGC research and is continuously improving; PRs adding works (papers, repositories) it has missed are welcome.
Mkb ⭐ 54
Knowledge Base Embedding By Cooperative Knowledge Distillation
Distill Bev ⭐ 46
DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (ICCV 2023)
Bert Distillation ⭐ 44
Distillation of the BERT model with the Catalyst framework
Zaq Code ⭐ 40
CVPR 2021: Zero-shot Adversarial Quantization (ZAQ)
Augmented Interpretable Models ⭐ 35
Interpretable and efficient predictors using pre-trained language models. Scikit-learn compatible.
Knowledgefactor ⭐ 31
[ECCV 2022] Factorizing Knowledge in Neural Networks
Gnosis ⭐ 29
Code to reproduce experiments from 'Does Knowledge Distillation Really Work?', a paper that appeared in the NeurIPS 2021 proceedings.
Kominilm ⭐ 29
Korean lightweight language model
Datasetculling ⭐ 27
✂️ Dataset Culling: Faster training of domain specific models with distillation ✂️ (IEEE ICIP 2019)
Ccl ⭐ 25
PyTorch implementation of the CVPR 2021 paper "Distilling Audio-Visual Knowledge by Compositional Contrastive Learning".
Yolov5 Distillation Train Inference ⭐ 24
YOLOv5 knowledge distillation training, with support for training on your own data.
Single Img Extrapolating ⭐ 23
Repo for the paper "Extrapolating from a Single Image to a Thousand Classes using Distillation"
Armhubert ⭐ 23
(INTERSPEECH 2023) Official implementation of ARMHuBERT
Vqgraph ⭐ 21
[ICLR 2024] VQGraph: Rethinking Graph Representation Space for Bridging GNNs and MLPs
Deit Tf ⭐ 21
Includes PyTorch -> Keras model porting code for DeiT models with fine-tuning and inference notebooks.
Simdistill ⭐ 20
The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection"
Squeezer ⭐ 18
Lightweight knowledge distillation pipeline
Distilkobilstm ⭐ 17
Distilling Task-Specific Knowledge from Teacher Model into BiLSTM
Dine ⭐ 17
Code for our CVPR 2022 paper "DINE: Domain Adaptation from Single and Multiple Black-box Predictors"
Reachnnstar ⭐ 14
Reachability Analysis Tool of Neural Network Controlled Systems (NNCSs)
Distill_visual_priors ⭐ 13
2nd-place solution for the ECCV 2020 workshop VIPriors Image Classification Challenge; https://arxiv.org/abs/2008.00261
Consistency Models ⭐ 13
An implementation of diffusion-adjacent Consistency Models (Song et al., 2023) in JAX.
Lightweight Low Resource Nmt ⭐ 12
Official code for "Too Brittle To Touch: Comparing the Stability of Quantization and Distillation Towards Developing Lightweight Low-Resource MT Models" to appear in WMT 2022.
Idn Tensorflow ⭐ 12
TensorFlow Implementation of "Fast and Accurate Single Image Super-Resolution via Information Distillation Network" (CVPR 2018)
Lgtm ⭐ 10
[ACL 2023] Code for the paper "Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation" (https://arxiv.org/abs/2305.09651)
Skt ⭐ 9
Efficient Crowd Counting via Structured Knowledge Transfer (SKT, ACM MM 2020)
Self_distillation ⭐ 9
Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression
Knowledge Distillation Semantic Search ⭐ 8
KDSS is a framework for knowledge distillation from LLMs.
Linkless Link Prediction ⭐ 7
[ICML 2023] Linkless Link Prediction via Relational Distillation
Papers ⭐ 7
Reading Papers
Overhaul ⭐ 6
[ICCV 2019] A Comprehensive Overhaul of Feature Distillation
Difussion_continual_learning ⭐ 5
PyTorch implementation of various distillation approaches for continual learning of Diffusion Models.
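Most of the PyTorch repositories above (Distiller, Textbrewer, Knowledge Distillation Zoo, and others) build on the classic logit-matching objective of Hinton et al. (2015): the student is trained on a temperature-softened KL term against the teacher's outputs, blended with the usual cross-entropy against ground-truth labels. Below is a minimal sketch of that loss; `kd_loss`, the temperature `T`, the mixing weight `alpha`, and the `teacher`/`student` models are illustrative assumptions, not code from any repository listed here.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic knowledge-distillation loss (Hinton et al., 2015) -- a sketch."""
    # Soft term: KL divergence between temperature-scaled teacher and student
    # distributions; the T*T factor rescales gradients to match the hard term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage sketch with hypothetical teacher/student classifiers:
# with torch.no_grad():
#     teacher_logits = teacher(x)   # teacher stays frozen
# loss = kd_loss(student(x), teacher_logits, y)
```

Individual repos vary the recipe (feature-map matching, contrastive objectives, self-distillation), but this soft/hard blend is the common baseline they compare against.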