Project Name | Stars | Most Recent Commit | Open Issues | License | Language | Description |
---|---|---|---|---|---|---|
Pretrained Language Model | 2,912 | 4 months ago | 108 | | Python | Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab. |
Ld Net | 145 | 4 years ago | 3 | apache-2.0 | Python | Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling |
Causal Distill | 12 | 2 years ago | 1 | mit | Python | The codebase for Causal Distillation for Language Models |
Task Aware Distillation | 10 | 9 months ago | 1 | | Python | Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML 2023) |
Lm Vocab Trimmer | 9 | 7 months ago | | mit | Python | Vocabulary Trimming (VT) is a model compression technique that reduces a multilingual LM's vocabulary to a target language by deleting irrelevant tokens. The repository contains a Python library, vocabtrimmer, that removes tokens irrelevant to the target language from a multilingual LM vocabulary. |
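The Vocabulary Trimming idea described above can be sketched in a few lines. This is a conceptual illustration only, not the vocabtrimmer library's actual API: the `trim_vocab` function, its parameters, and the toy vocabulary are all assumptions for demonstration. The core steps are the same, though: keep only tokens observed in a target-language corpus (plus special tokens), reassign contiguous ids, and slice the embedding matrix accordingly.

```python
import numpy as np

def trim_vocab(vocab, embeddings, corpus_tokens, keep=("[PAD]", "[UNK]")):
    # Hypothetical helper (not the vocabtrimmer API): keep only tokens that
    # occur in the target-language corpus, plus required special tokens.
    used = set(corpus_tokens) | set(keep)
    # Preserve the original id ordering among surviving tokens.
    kept = [tok for tok in sorted(vocab, key=vocab.get) if tok in used]
    # Reassign contiguous ids and slice the embedding rows to match.
    new_vocab = {tok: i for i, tok in enumerate(kept)}
    old_ids = [vocab[tok] for tok in kept]
    return new_vocab, embeddings[old_ids]

# Toy "multilingual" vocabulary with one 4-dim embedding row per token.
vocab = {"[PAD]": 0, "[UNK]": 1, "hola": 2, "bonjour": 3, "hello": 4}
emb = np.arange(20, dtype=float).reshape(5, 4)

# Trimming to an English-only corpus drops the Spanish and French tokens,
# shrinking the embedding matrix from 5 rows to 3.
new_vocab, new_emb = trim_vocab(vocab, emb, ["hello", "hello"])
```

Because embedding tables dominate the parameter count of multilingual models, dropping unused rows this way shrinks the model without touching the transformer layers, which is the compression effect VT relies on.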