Awesome Open Source
Search results for natural language processing knowledge distillation
Active filters: knowledge-distillation, natural-language-processing
20 search results found
Easynlp ⭐ 1,871
EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
Neuronblocks ⭐ 1,441
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Torchdistill ⭐ 1,171
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 22 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
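torchdistill itself is configuration-driven, but the objective at the core of most of the methods it implements is Hinton-style soft-target distillation. A minimal PyTorch sketch of that loss (illustrative only, not torchdistill's API; the temperature T and blend weight alpha are assumed hyperparameters):

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL between temperature-smoothed distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across T.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```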
Easytransfer ⭐ 795
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
What I Have Read ⭐ 149
Paper lists, notes, and slides, with a focus on NLP. For summarization, see https://github.com/xcfcode/Summarization-Papers.
Simxns ⭐ 92
SimXNS, a research project on information retrieval by the MSRA NLC team, containing official implementations.
Distill Bert Textgen ⭐ 90
Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation".
Multilangstructurekd ⭐ 64
[ACL 2020] Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
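Structure-level KD generalizes the familiar token-level recipe, which distills each position's label distribution independently. A minimal sketch of that token-level objective for sequence labeling (assumed shapes and temperature; the structure-level methods in the paper additionally distill over whole label sequences, e.g. CRF distributions, which this sketch omits):

```python
import torch
import torch.nn.functional as F

def token_kd_loss(student_logits, teacher_logits, mask, T=2.0):
    # student_logits, teacher_logits: (batch, seq_len, num_labels)
    # mask: (batch, seq_len) bool, True at real (non-padding) tokens.
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    # Per-token KL(teacher || student), summed over the label dimension.
    kl = F.kl_div(log_p_student, p_teacher, reduction="none").sum(dim=-1)
    # Average over real tokens only; scale by T^2 as in logit distillation.
    return (kl * mask).sum() / mask.sum() * (T * T)
```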
Disco ⭐ 61
The public repository for the EMNLP 2023 paper "DisCo: Co-training Distilled Student Models for Semi-supervised Text Mining".
Easy Bert ⭐ 38
easy-bert is a Chinese NLP toolkit that provides many BERT variants and tuning methods for getting started quickly, with a clean design and code.
Efficient Bert ⭐ 31
This repository contains the code for the paper in Findings of EMNLP 2021: "EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation".
Dynamickd ⭐ 30
Code for EMNLP 2021 main conference paper "Dynamic Knowledge Distillation for Pre-trained Language Models"
Neurips Micronet ⭐ 29
[JMLR 2020] NeurIPS 2019 MicroNet Challenge Efficient Language Modeling, Champion
Awesome Nlp References ⭐ 25
A curated list of resources on knowledge distillation and recommendation systems, with a particular focus on natural language processing (NLP).
Ai_book ⭐ 17
AI book for everyone
Distilkobilstm ⭐ 17
Distilling Task-Specific Knowledge from Teacher Model into BiLSTM
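Distilling into a BiLSTM typically follows the Tang et al. (2019) recipe: train a small recurrent student to regress onto the teacher's logits. A minimal sketch under assumed dimensions (the class name and pooling choice are illustrative, not this repository's code):

```python
import torch
import torch.nn as nn

class BiLSTMStudent(nn.Module):
    # Small BiLSTM classifier meant to absorb a BERT teacher's task knowledge.
    def __init__(self, vocab_size, num_labels, emb_dim=128, hidden=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, num_labels)

    def forward(self, input_ids):
        hidden_states, _ = self.lstm(self.emb(input_ids))
        pooled = hidden_states.max(dim=1).values  # max-pool over time
        return self.fc(pooled)

def distill_loss(student_logits, teacher_logits):
    # Regress student logits onto the frozen teacher's logits (MSE).
    return nn.functional.mse_loss(student_logits, teacher_logits.detach())
```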
Pytorch Minilm ⭐ 15
Unofficial PyTorch implementation of MiniLM and MiniLMv2.
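Instead of matching logits, MiniLM transfers the teacher's last-layer self-attention distributions to the student. A minimal sketch of that attention-matching term (the full method also distills value relations, and MiniLMv2 generalizes to query/key/value relations; both are omitted here):

```python
import torch

def attention_kd_loss(student_attn, teacher_attn, eps=1e-9):
    # student_attn, teacher_attn: (batch, heads, seq_len, seq_len),
    # rows already softmax-normalized attention distributions.
    # KL(teacher || student) per attention row, averaged over all rows.
    kl = teacher_attn * ((teacher_attn + eps).log() - (student_attn + eps).log())
    return kl.sum(dim=-1).mean()
```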
Sglkt Visdial ⭐ 10
🌈 PyTorch Implementation for EMNLP'21 Findings "Reasoning Visual Dialog with Sparse Graph Learning and Knowledge Transfer"
Lgtm ⭐ 10
[ACL 2023] Code for the paper “Tailoring Instructions to Student’s Learning Levels Boosts Knowledge Distillation” (https://arxiv.org/abs/2305.09651).
Knowledge Distillation Experiments ⭐ 5
Related Searches
Python Natural Language Processing (7,915)
Jupyter Notebook Natural Language Processing (4,405)
Machine Learning Natural Language Processing (3,939)
Deep Learning Natural Language Processing (2,414)
Pytorch Natural Language Processing (1,212)
Artificial Intelligence Natural Language Processing (1,010)
Dataset Natural Language Processing (1,010)
Tensorflow Natural Language Processing (909)
Javascript Natural Language Processing (843)
Natural Language Processing Chatbot (726)
1-20 of 20 search results