Awesome Open Source
Search results for: bert, knowledge-distillation
11 search results found
Easynlp (⭐ 1,871) - EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
Easytransfer (⭐ 795) - EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Knowledgedistillation (⭐ 150) - Knowledge distillation for text classification in PyTorch (Chinese text classification; teacher models: BERT and XLNet; student model: biLSTM).
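Several of the repositories in this list (this one and Distilkobilstm below) distill a large BERT/XLNet teacher into a small biLSTM student. A minimal sketch of the standard soft-target distillation objective they build on, a temperature-scaled KL term blended with hard-label cross-entropy; the NumPy implementation and all names here are illustrative assumptions, not code from these repositories:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """alpha-weighted sum of a soft-target KL term (teacher -> student,
    softened by `temperature`) and hard-label cross-entropy."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    kl = np.sum(t * (np.log(t + 1e-12) - np.log(s + 1e-12)), axis=-1)
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    # T^2 rescaling keeps gradient magnitudes comparable across temperatures
    return alpha * temperature ** 2 * kl.mean() + (1 - alpha) * ce.mean()
```

When the student's logits already match the teacher's, the KL term vanishes and only the hard-label term remains, so a mismatched student incurs a strictly larger loss.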
Multilangstructurekd (⭐ 64) - [ACL 2020] Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
Bert In Production (⭐ 64) - A collection of resources on using BERT (https://arxiv.org/abs/1810.04805) and related language models in production environments.
Easy Bert (⭐ 38) - easy-bert is a Chinese NLP toolkit offering many BERT variants plus calling and tuning utilities for a fast start, with a clean design and code.
Distilkobilstm (⭐ 17) - Distilling Task-Specific Knowledge from Teacher Model into BiLSTM
Bert Aad (⭐ 13) - Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Rosita (⭐ 11) - [AAAI 2021] "ROSITA: Refined BERT cOmpreSsion with InTegrAted techniques", Yuanxin Liu, Zheng Lin, Fengcheng Yuan
Distilledneuralresponseranker (⭐ 9) - Implementation of "Distilling Knowledge for Fast Retrieval-based Chat-bots" (SIGIR 2020): deep matching transformer networks and knowledge distillation for response retrieval in information-seeking conversational systems.
Simpleclassification (⭐ 6) - Simple Text Classification [WIP]
Copyright 2018-2024 Awesome Open Source. All rights reserved.