Awesome Open Source
Search results for bert model compression
11 search results found
Paddleslim (⭐ 1,486): PaddleSlim is an open-source library for deep model compression and architecture search.
Bert Of Theseus (⭐ 186): ⛵️ The official PyTorch implementation of "BERT-of-Theseus: Compressing BERT by Progressive Module Replacing" (EMNLP 2020).
Cofipruning (⭐ 151): [ACL 2022] Structured Pruning Learns Compact and Accurate Models (https://arxiv.org/abs/2204.00408)
Knowledgedistillation (⭐ 150): Knowledge distillation for text classification with PyTorch. Knowledge distillation for Chinese text classification: teacher models BERT and XLNet, student model biLSTM.
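The teacher-student setup these distillation repos describe centers on one loss: a weighted sum of cross-entropy against the true label and a KL-divergence term between temperature-softened teacher and student distributions. A minimal, framework-free sketch (function names and default hyperparameters are illustrative, not taken from any of the repos above):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    """Soft-target distillation loss: a weighted sum of cross-entropy
    with the hard label and KL(teacher || student) computed on
    temperature-softened distributions, scaled by T^2."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    soft = sum(p * math.log(p / q)
               for p, q in zip(p_teacher, p_student)) * temperature ** 2
    # Standard cross-entropy against the ground-truth class
    hard = -math.log(softmax(student_logits)[true_label])
    return alpha * hard + (1 - alpha) * soft
```

When the student matches the teacher exactly, the KL term vanishes and only the hard-label cross-entropy remains, which is why `alpha` trades off imitation against direct supervision.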
Pkd For Bert Model Compression (⭐ 82): PyTorch implementation of "Patient Knowledge Distillation for BERT Model Compression"
Dialog Nlu (⭐ 78): TensorFlow and Keras implementations of state-of-the-art research in dialog system NLU
Ltp (⭐ 59): [KDD'22] Learned Token Pruning for Transformers
I Bert (⭐ 39): [ICML'21] I-BERT: Integer-only BERT Quantization
Xcompression (⭐ 17): [ICLR 2022] Code for the paper "Exploring Extreme Parameter Compression for Pre-trained Language Models" (https://arxiv.org/abs/2205.10036)
Rosita (⭐ 11): [AAAI 2021] "ROSITA: Refined BERT cOmpreSsion with InTegrAted techniques", Yuanxin Liu, Zheng Lin, Fengcheng Yuan
Lm Vocab Trimmer (⭐ 9): Vocabulary Trimming (VT) is a model compression technique that reduces a multilingual LM's vocabulary to a target language by deleting irrelevant tokens. This repository contains a Python library, vocabtrimmer, that removes irrelevant tokens from a multilingual LM vocabulary for a target language.
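The trimming idea described above can be sketched as filtering an embedding table down to the tokens observed in a target-language corpus and remapping the ids. This is an illustrative sketch only; the function name, arguments, and special-token list below are assumptions, not the vocabtrimmer API:

```python
def trim_vocab(vocab, embeddings, target_corpus_tokens,
               always_keep=("[PAD]", "[UNK]")):
    """Keep only embedding rows for tokens seen in the target-language
    corpus (plus special tokens), remapping ids to a dense range.

    vocab: dict mapping token -> old id
    embeddings: list of embedding vectors indexed by old id
    Returns (new_vocab, new_embeddings).
    """
    keep = set(target_corpus_tokens) | set(always_keep)
    new_vocab, new_embeddings = {}, []
    # Iterate in old-id order so relative ordering is preserved
    for token, old_id in sorted(vocab.items(), key=lambda kv: kv[1]):
        if token in keep:
            new_vocab[token] = len(new_embeddings)
            new_embeddings.append(embeddings[old_id])
    return new_vocab, new_embeddings
```

Since the embedding table dominates the parameter count of small multilingual models, dropping unused rows this way shrinks the model without touching the transformer layers.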
Copyright 2018-2024 Awesome Open Source. All rights reserved.