Awesome Open Source
Search results for gpt 2 roberta
21 search results found
LoRA (⭐ 7,814): Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
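The core idea behind the loralib entry above can be sketched in a few lines: rather than fine-tuning a full weight matrix W, LoRA trains two small factors B and A of rank r and adds their scaled product to the frozen W. This is a conceptual numpy sketch using the paper's notation (W, A, B, r, alpha), not loralib's actual API.

```python
import numpy as np

# Conceptual sketch of LoRA (Low-Rank Adaptation).
# W (d_out x d_in) stays frozen; only A (r x d_in) and B (d_out x r)
# are trained, with r << min(d_in, d_out).
rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 64, 32, 4, 8

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection, zero-initialized

def lora_forward(x):
    # Equivalent to x @ (W + (alpha / r) * B @ A).T, but computed with the
    # low-rank factors, so only r * (d_in + d_out) extra parameters are stored.
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

x = rng.normal(size=(3, d_in))
# Because B starts at zero, the adapted layer initially matches the frozen layer.
assert np.allclose(lora_forward(x), x @ W.T)
```

Zero-initializing B means training starts exactly from the pretrained model and the low-rank update grows from there.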
BertViz (⭐ 5,547): Visualize attention in NLP models (BERT, GPT-2, BART, etc.)
Awesome Pretrained Chinese NLP Models (⭐ 3,738): A curated collection of high-quality Chinese pretrained models, large models, multimodal models, and large language models
UER-py (⭐ 2,802): Open-source pre-training model framework in PyTorch and pre-trained model zoo
rust-bert (⭐ 2,300): Rust-native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT-2, ...)
roberta_zh (⭐ 2,141): RoBERTa pretrained models for Chinese (RoBERTa for Chinese)
TurboTransformers (⭐ 1,322): A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU
TencentPretrain (⭐ 951): Tencent pre-training framework in PyTorch and pre-trained model zoo
bert_seq2seq (⭐ 890): PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also supports automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging
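The UniLM scheme mentioned above lets an encoder-only model like BERT do seq2seq by shaping the attention mask: source tokens attend bidirectionally among themselves, while target tokens see the full source plus only earlier target tokens. A minimal numpy sketch of that mask (function name and layout are illustrative, not the repo's API):

```python
import numpy as np

def unilm_mask(src_len, tgt_len):
    """Boolean attention mask for UniLM-style seq2seq with a bidirectional model.

    mask[i, j] is True when position i may attend to position j.
    Positions [0, src_len) are source tokens; the rest are target tokens.
    """
    n = src_len + tgt_len
    mask = np.zeros((n, n), dtype=bool)
    mask[:, :src_len] = True  # every position sees the whole source
    # Target positions attend causally within the target segment.
    mask[src_len:, src_len:] = np.tril(np.ones((tgt_len, tgt_len), dtype=bool))
    return mask

m = unilm_mask(3, 2)  # 3 source tokens, 2 target tokens -> 5x5 mask
```

With this mask, the same BERT weights can be trained with a language-modeling loss on the target segment only, which is how the repo reuses BERT for summarization and other generation tasks.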
Texar-PyTorch (⭐ 711): Integrating the best of TF into PyTorch, for machine learning, natural language processing, and text generation. Part of the CASL project: http://casl-project.ai/
Japanese Pretrained Models (⭐ 479): Code for producing Japanese pretrained models provided by rinna Co., Ltd.
MinT (⭐ 192): Minimal Transformer library and tutorials
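The minimal-transformer tutorials above all reduce to one core operation shared by BERT, GPT-2, and RoBERTa: scaled dot-product attention. A self-contained numpy sketch (variable names are generic, not MinT's API):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_q, n_k) similarity matrix
    weights = softmax(scores)        # each row is a distribution over keys
    return weights @ V               # weighted average of value vectors

rng = np.random.default_rng(1)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
out = attention(Q, K, V)  # shape (4, 8)
```

The 1/sqrt(d_k) scaling keeps the logits from growing with head dimension, which would otherwise push the softmax into near-one-hot saturation.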
AdaMix (⭐ 94): Implementation of the paper "AdaMix: Mixture-of-Adaptations for Parameter-efficient Model Tuning" (https://arxiv.org/abs/2205.12410)
Text Summarization (⭐ 76): Abstractive and extractive text summarization using Transformers
Amazon ML Challenge 2021 (⭐ 28): Scripts and approach for the Amazon ML Challenge
Transformer QG on SQuAD (⭐ 26): A question generator implemented with state-of-the-art pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
DistilKoBiLSTM (⭐ 17): Distilling task-specific knowledge from a teacher model into a BiLSTM
bert_seq2seq_ddp (⭐ 16): DDP (DistributedDataParallel) version of bert_seq2seq; supports BERT, RoBERTa, NEZHA, T5, GPT-2, and other models
Security Intelligence On Exchanged Multimedia Messages Based On Deep Learning (⭐ 6): Deep learning (DL) approaches use various processing layers to learn hierarchical representations of data. Recently, many methods and designs of natural language processing (NLP) models have shown significant development, especially in text mining and analysis. Famous models for learning vector-space representations of text include Word2vec, GloVe, and fastText. NLP took a big step forward when BERT and, more recently, GPT-3 came out. Deep learning algorithms are unable to deal with te…
Perceptivepyro (⭐ 6): Run and train Transformer-based large language models (LLMs) natively in .NET using TorchSharp
Scratch2lm (⭐ 5): Training transformer models (e.g. RoBERTa, GPT-2, and GPT-J) from scratch
1-21 of 21 search results
Copyright 2018-2024 Awesome Open Source. All rights reserved.