Awesome Open Source
Search results for "language model pre training"
Active filters: language-model, pre-training
20 results found
Lmops (⭐ 3,145): General technology for enabling AI capabilities with LLMs and MLLMs
Bert_language_understanding (⭐ 886): Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Knowlm (⭐ 870): An open-source knowledgeable large language model framework
Azureml Bert (⭐ 384): End-to-end recipes for pre-training and fine-tuning BERT using the Azure Machine Learning service
Mathpile (⭐ 192): Generative AI for Math: MathPile
Dragon (⭐ 189): [NeurIPS 2022] DRAGON 🐲: Deep Bidirectional Language-Knowledge Graph Pretraining
Recommendation Systems Without Explicit Id Features A Literature Review (⭐ 171): Large pre-trained foundation recommender models
Tupe (⭐ 163): Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models such as BERT
Pretraining With Human Feedback (⭐ 97): Code accompanying the paper "Pretraining Language Models with Human Preferences"
Coco Lm (⭐ 82): [NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Molgen (⭐ 64): Code and pre-trained models for the paper "Domain-Agnostic Molecular Generation with Self-Feedback"
Linkbert (⭐ 63): [ACL 2022] LinkBERT: A Knowledgeable Language Model 😎 Pretrained with Document Links
Tacl (⭐ 63): [NAACL 2022] TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning
Powerfulpromptft (⭐ 59): [NeurIPS 2023 Main Track] Repository for the paper "Don't Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"
Kebiolm (⭐ 58): [BioNLP 2021] Improving Biomedical Pretrained Language Models with Knowledge
Relm_unmt (⭐ 26): Python source code for the EMNLP 2020 paper "Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT"
Ktelebert (⭐ 19): [ICDE 2023] Tele-Knowledge Pre-training for Fault Analysis
Amos (⭐ 10): [ICLR 2022] Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators
Concept Based Curriculum Masking (⭐ 5): Efficient Pre-training of Masked Language Model via Concept-based Curriculum Masking
Idforrec (⭐ 5): Is ID embedding necessary for multimodal recommender systems?
Related Searches
Python Language Model (540)
Jupyter Notebook Language Model (203)
Bert Language Model (102)
Deep Learning Language Model (100)
Corpus Language Model (92)
Python Pre Training (73)
1-20 of 20 search results