Awesome Open Source
Search results for "bert" + "pre-training": 11 results found
Uer Py (⭐ 2,802): Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Spark (⭐ 1,355): [ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
Tencentpretrain (⭐ 951): Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Awesome Vision Language Pretraining Papers (⭐ 724): Recent Advances in Vision and Language Pre-Trained Models (VL-PTMs)
Vl Bert (⭐ 680): Code for the ICLR 2020 paper "VL-BERT: Pre-training of Generic Visual-Linguistic Representations"
Azureml Bert (⭐ 384): End-to-end recipes for pre-training and fine-tuning BERT using Azure Machine Learning Service
Kaleido Bert (⭐ 207): (CVPR 2021) Kaleido-BERT: Vision-Language Pre-training on the Fashion Domain
Sigir2020_peterrec (⭐ 194): Universal User Representation Pre-training for Cross-domain Recommendation and User Profiling
Awesome Mim (⭐ 178): [Survey] Masked Modeling for Self-supervised Representation Learning on Vision and Beyond (https://arxiv.org/abs/2401.00897)
Tupe (⭐ 163): Transformer with Untied Positional Encoding (TUPE); code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models such as BERT
Ontoprotein (⭐ 98): Code and datasets for the ICLR 2022 paper "OntoProtein: Protein Pretraining With Gene Ontology Embedding"
Bert Tickets (⭐ 94): [NeurIPS 2020] "The Lottery Ticket Hypothesis for Pre-trained BERT Networks", Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin
Tacl (⭐ 63): TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning (NAACL 2022)
Sigir2021_conure (⭐ 39): One Person, One Model, One World: Learning Continual User Representation without Forgetting
Zeldarose (⭐ 27): Train transformer-based models
Torch_study (⭐ 26): PyTorch tutorials and paper implementations, mainly about NLP
Universal_user_representation (⭐ 18): Papers on universal user representation learning for recommendation
Plmpapers (⭐ 12): A paper list of pre-trained language models (PLMs)
Knowqa (⭐ 8): Knowledge-probing competition for pre-trained models; baseline F1 0.35 using BERTForMaskedLM
Kr3 (⭐ 5): KR3: Korean Restaurant Reviews with Ratings / Experiments on Parameter-efficient Tuning and Task-adaptive Pre-training
Concept Based Curriculum Masking (⭐ 5): Efficient Pre-training of Masked Language Model via Concept-based Curriculum Masking
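Several of the repositories above (Uer Py, Spark, Tacl, Concept Based Curriculum Masking) build on BERT's masked-language-model pre-training objective. As a rough illustration of the original BERT corruption recipe (select about 15% of token positions; of those, replace 80% with [MASK], 10% with a random vocabulary token, and leave 10% unchanged), here is a minimal Python sketch. The function name and signature are illustrative, not taken from any of the listed projects:

```python
import random

def bert_style_mask(tokens, vocab, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Sketch of BERT-style masked-LM input corruption.

    Picks ~mask_prob of the positions as prediction targets; of those,
    80% become mask_token, 10% a random vocabulary token, and 10% keep
    the original token. Returns (corrupted_tokens, target_positions).
    """
    rng = random.Random(seed)
    out = list(tokens)
    targets = []
    for i in range(len(out)):
        if rng.random() < mask_prob:
            targets.append(i)
            r = rng.random()
            if r < 0.8:
                out[i] = mask_token            # 80%: replace with [MASK]
            elif r < 0.9:
                out[i] = rng.choice(vocab)     # 10%: replace with a random token
            # else: 10%: keep the original token (model must still predict it)
    return out, targets
```

During pre-training, the model is trained to predict the original token only at the returned target positions; non-target positions contribute nothing to the loss.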
Copyright 2018-2024 Awesome Open Source. All rights reserved.