Awesome Open Source
Search results for pytorch distillation
21 search results found
Distiller · ⭐ 4,252 · Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
TextBrewer · ⭐ 1,387 · A PyTorch-based knowledge distillation toolkit for natural language processing
ViTPose · ⭐ 950 · The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [arXiv'22] "ViTPose+: Vision Transformer Foundation Model for Generic Body Pose Estimation"
MEAL V2 · ⭐ 603 · MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks
HAWQ · ⭐ 324 · Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.
Brain-Inspired Replay · ⭐ 169 · A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
FKD · ⭐ 166 · Official code for the ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
DistilKoBERT · ⭐ 165 · Distillation of KoBERT from SKTBrain (lightweight KoBERT)
KnowledgeDistillation · ⭐ 150 · Knowledge distillation for text classification with PyTorch. Knowledge distillation for Chinese text classification: teacher models BERT and XLNet, student model BiLSTM.
BK-SDM · ⭐ 124 · A Compressed Stable Diffusion for Efficient Text-to-Image Generation [ICCV'23 Demo] [ICML'23 Workshop]
Filter Grafting · ⭐ 110 · Filter Grafting for Deep Neural Networks (CVPR 2020)
Incremental Learning · ⭐ 76 · PyTorch implementation of the ACCV'18 paper "Revisiting Distillation and Incremental Classifier Learning"
SuperMix · ⭐ 72 · PyTorch implementation of the CVPR 2021 paper "SuperMix: Supervising the Mixing Data Augmentation"
Adaptive Wavelets · ⭐ 60 · Adaptive, interpretable wavelets across domains (NeurIPS 2021)
MKB · ⭐ 54 · Knowledge Base Embedding by Cooperative Knowledge Distillation
KnowledgeFactor · ⭐ 31 · [ECCV 2022] Factorizing Knowledge in Neural Networks
DatasetCulling · ⭐ 27 · ✂️ Dataset Culling: Faster training of domain-specific models with distillation ✂️ (IEEE ICIP 2019)
Squeezer · ⭐ 18 · Lightweight knowledge distillation pipeline
DistilKoBiLSTM · ⭐ 17 · Distilling Task-Specific Knowledge from Teacher Model into BiLSTM
SKT · ⭐ 9 · Efficient Crowd Counting via Structured Knowledge Transfer (SKT, ACM MM 2020)
Overhaul · ⭐ 6 · [ICCV 2019] A Comprehensive Overhaul of Feature Distillation
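Most of the toolkits listed above build on the classic logit-matching objective of knowledge distillation: a KL-divergence term between temperature-softened teacher and student distributions, combined with the usual cross-entropy on the true labels. As a minimal, framework-agnostic sketch (plain NumPy rather than PyTorch; the temperature `T=4.0` and mixing weight `alpha=0.9` are illustrative defaults, not values taken from any repo above):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis, numerically stabilized."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation loss:
    alpha * T^2 * KL(teacher || student)  +  (1 - alpha) * cross-entropy(student, labels).
    The T^2 factor keeps the soft-target gradient scale comparable across temperatures.
    """
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * T**2 * kl + (1 - alpha) * ce))
```

When student and teacher logits agree, the KL term vanishes and only the weighted cross-entropy remains; libraries such as Distiller and TextBrewer wrap this same idea with schedulers, intermediate-layer losses, and training loops.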
Copyright 2018-2024 Awesome Open Source. All rights reserved.