| Project Name | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|
| DeBERTa | 1,673 | 7 months ago | 13 | February 09, 2021 | 63 | mit | Python | The implementation of DeBERTa |
| SapBERT | 117 | a year ago | | | | mit | Python | [NAACL'21 & ACL'21] SapBERT: self-alignment pretraining for BERT, and XL-BEL: cross-lingual biomedical entity linking |
| COCO-LM | 82 | 2 years ago | | | | mit | Python | [NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining |
| SELFormer | 55 | 2 months ago | | | 5 | | Python | SELFormer: Molecular Representation Learning via SELFIES Language Models |
| AMOS | 10 | 2 years ago | | | | mit | Python | [ICLR 2022] Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators |
| Hierarchical Language Modeling | 5 | a year ago | | | 2 | mit | Jupyter Notebook | Learns contextualized word, sentence, and document representations with a hierarchical language model: Transformer-based encoders are stacked first at the sentence level and then at the document level, trained with masked token prediction |
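The hierarchical design described for the last entry (a token-level Transformer per sentence, a second Transformer over pooled sentence vectors, and a masked-token prediction head) can be sketched in PyTorch. This is a minimal illustrative sketch, not the repository's actual code: the class name, pooling choice (mean), and dimensions are all assumptions.

```python
import torch
import torch.nn as nn


class HierarchicalEncoder(nn.Module):
    """Illustrative two-level encoder: sentences first, then the document.

    Hypothetical sketch of the idea, not the project's implementation.
    """

    def __init__(self, vocab_size=100, d_model=32, nhead=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Level 1: contextualize tokens within each sentence.
        sent_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.sentence_encoder = nn.TransformerEncoder(sent_layer, num_layers=1)
        # Level 2: contextualize pooled sentence vectors across the document.
        doc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.document_encoder = nn.TransformerEncoder(doc_layer, num_layers=1)
        # Masked-token prediction head over the vocabulary.
        self.mlm_head = nn.Linear(d_model, vocab_size)

    def forward(self, doc_tokens):
        # doc_tokens: (num_sentences, seq_len) token ids for one document
        x = self.sentence_encoder(self.embed(doc_tokens))   # (S, L, D)
        sent_vecs = x.mean(dim=1)                           # (S, D) mean-pooled sentences
        doc_ctx = self.document_encoder(sent_vecs.unsqueeze(0)).squeeze(0)  # (S, D)
        # Broadcast document-level context back to each token, then predict.
        return self.mlm_head(x + doc_ctx.unsqueeze(1))      # (S, L, vocab_size)
```

Training would mask a fraction of input tokens and apply a cross-entropy loss between the head's logits and the original ids at the masked positions, as in standard masked language modeling.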