Awesome Open Source
Search results for pre trained language models
pre-trained-language-models
37 search results found
Chinese Llama Alpaca
⭐
15,877
Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
Llmsurvey
⭐
7,255
The official GitHub page for the survey paper "A Survey of Large Language Models".
Openprompt
⭐
4,006
An Open-Source Framework for Prompt-Learning.
Promptpapers
⭐
3,423
Must-read papers on prompt-based tuning for pre-trained language models.
Top2vec
⭐
2,847
Top2Vec learns jointly embedded topic, document and word vectors.
Roberta_zh
⭐
2,141
RoBERTa pre-trained models for Chinese (RoBERTa for Chinese)
Awesome Transformer Nlp
⭐
1,022
A curated list of NLP resources focused on Transformer networks, attention mechanism, GPT, BERT, ChatGPT, LLMs, and transfer learning.
Knowlm
⭐
870
An open-source knowledgeable large language model framework.
P Tuning
⭐
528
A novel method for tuning language models. Code and datasets for the paper "GPT Understands, Too".
Lmaas Papers
⭐
480
Awesome papers on Language-Model-as-a-Service (LMaaS)
Knowledgeeditingpapers
⭐
423
Must-read Papers on Knowledge Editing for Large Language Models.
Textpruner
⭐
314
A PyTorch-based model pruning toolkit for pre-trained language models
Hugnlp
⭐
237
HugNLP is a unified and comprehensive NLP library based on HuggingFace Transformers. Please go hugging for NLP now! 😊 HugNLP will be released to @HugAILab.
Awesome Scientific Language Models
⭐
166
A Curated List of Language Models in Scientific Domains
Awesome Efficient Plm
⭐
83
Must-read papers on improving efficiency for pre-trained language models.
Dart
⭐
76
Code for the ICLR2022 paper "Differentiable Prompt Makes Pre-trained Language Models Better Few-shot Learners"
Molgen
⭐
64
Code and pre-trained models for the paper "Domain-Agnostic Molecular Generation with Self-feedback."
Sifrank
⭐
61
The code of our paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model"
Mkg_analogy
⭐
56
Code and datasets for the ICLR2023 paper "Multimodal Analogical Reasoning over Knowledge Graphs."
Valm
⭐
46
VaLM: Visually-augmented Language Modeling. ICLR 2023.
Sifrank_zh
⭐
43
A Chinese keyphrase extraction method based on pre-trained models (Chinese-language code for the paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model")
Electra_crf_ner
⭐
37
A company-name recognition task starting from small-scale, low-quality training data, using techniques to speed up model training and improve prediction performance with minimal manual effort. The methods include lightweight pre-trained models such as ALBERT-small or ELECTRA-small on a financial corpus, knowledge distillation, and multi-stage learning. As a result, recall on the company-name recognition task improved from 0.73 to 0.92, with inference 4 times faster than BERT.
Dynamickd
⭐
30
Code for EMNLP 2021 main conference paper "Dynamic Knowledge Distillation for Pre-trained Language Models"
Gigabert
⭐
26
Zero-shot Transfer Learning from English to Arabic
Ares
⭐
21
SIGIR'22 paper: Axiomatically Regularized Pre-training for Ad hoc Search
Seetopic
⭐
17
Seed-Guided Topic Discovery with Out-of-Vocabulary Seeds (NAACL'22)
Llm Survey
⭐
9
The official GitHub page for the survey paper "A Survey on Large Language Models: Applications, Challenges, Limitations, and Practical Usage".
Linglong
⭐
9
LingLong (玲珑): a small-scale Chinese pre-trained language model
Valuezeroing
⭐
8
The official repo for the EACL 2023 paper "Quantifying Context Mixing in Transformers"
Cdgp
⭐
8
Code for Findings of EMNLP 2022 short paper "CDGP: Automatic Cloze Distractor Generation based on Pre-trained Language Model".
Chinese Llama Alpaca Usage
⭐
7
📔 Usage notes and core code annotations for Chinese-LLaMA-Alpaca
Cascadebert
⭐
7
Code for CascadeBERT, Findings of EMNLP 2021
Revisit Knn
⭐
7
Code for the CCL2023 paper "Revisiting k-NN for Fine-tuning Pre-trained Language Models."
Xlm Plus
⭐
7
Gigabert
⭐
6
Arabic Relation extraction system, named entity recognition, IE
Scimult
⭐
5
Pre-training Multi-task Contrastive Learning Models for Scientific Literature Understanding (Findings of EMNLP'23)
Pytorch Ko Ner
⭐
5
PLM-based Korean named entity recognition (NER)
Related Searches
Python Pre Trained Language Models (19)
Pytorch Pre Trained Language Models (8)
Llm Pre Trained Language Models (6)
Bert Pre Trained Language Models (6)
1-37 of 37 search results
Copyright 2018-2024 Awesome Open Source. All rights reserved.