Awesome Open Source
Search results for "gpt 2 albert" (filters: albert, gpt-2) — 4 search results found
Uer Py (⭐ 2,802)
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Turbotransformers (⭐ 1,322)
A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.
Tencentpretrain (⭐ 951)
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Nlpgnn (⭐ 310)
1. Use BERT, ALBERT, and GPT-2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN, and GraphSAGE based on message passing.
Ai_and_memory_wall (⭐ 121)
AI and Memory Wall blog post
Transformer Qg On Squad (⭐ 26)
Implement a question generator with SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
Security Intelligence On Exchanged Multimedia Messages Based On Deep Learning (⭐ 6)
Deep learning (DL) approaches use various processing layers to learn hierarchical representations of data. Recently, many methods and designs of natural language processing (NLP) models have shown significant progress, especially in text mining and analysis. For learning vector-space representations of text, there are well-known models such as Word2vec, GloVe, and fastText. NLP took a big step forward when BERT and, more recently, GPT-3 came out. Deep learning algorithms are unable to deal with te…
Copyright 2018-2024 Awesome Open Source. All rights reserved.