Awesome Open Source
Search results for transfer learning roberta
10 search results found
Farm (⭐ 1,706)
🏡 Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
Promptclue (⭐ 592)
PromptCLUE: a zero-shot learning model supporting all Chinese-language tasks.
Dialog Nlu (⭐ 78)
TensorFlow and Keras implementations of state-of-the-art research in dialog system NLU.
Xlm Roberta Ner (⭐ 46)
Named Entity Recognition with Pretrained XLM-RoBERTa
Efficient Task Transfer (⭐ 34)
Research code for "What to Pre-Train on? Efficient Intermediate Task Selection", EMNLP 2021
Bert_for_longer_texts (⭐ 25)
BERT classification model for texts longer than 512 tokens. The text is first divided into smaller chunks; each chunk is fed to BERT, and the intermediate results are pooled. The implementation supports fine-tuning.
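The chunk-and-pool scheme this entry describes can be sketched in plain Python. This is a minimal illustration only: the window size, stride, and mean pooling are assumptions made for the sketch, not parameters taken from the repository.

```python
from typing import List

def chunk_token_ids(ids: List[int], max_len: int = 510, stride: int = 384) -> List[List[int]]:
    # Split a long token-id sequence into overlapping windows so each
    # chunk (plus special tokens such as [CLS]/[SEP]) fits BERT's
    # 512-token limit. The overlap (max_len - stride) preserves context
    # across chunk boundaries.
    chunks = []
    start = 0
    while start < len(ids):
        chunks.append(ids[start:start + max_len])
        if start + max_len >= len(ids):
            break
        start += stride
    return chunks

def mean_pool(chunk_vectors: List[List[float]]) -> List[float]:
    # Pool the per-chunk embeddings (e.g. BERT [CLS] outputs, one vector
    # per chunk) into a single document vector by element-wise averaging.
    n = len(chunk_vectors)
    dim = len(chunk_vectors[0])
    return [sum(vec[i] for vec in chunk_vectors) / n for i in range(dim)]

# Example: a 1,200-token document split into 510-token windows with overlap.
chunks = chunk_token_ids(list(range(1200)))
print([len(c) for c in chunks])  # → [510, 510, 432]
```

In a real pipeline each chunk would be run through BERT and the resulting per-chunk vectors passed to `mean_pool` before the classification head; pooling logits instead of embeddings is an equally common variant.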
Security Intelligence On Exchanged Multimedia Messages Based On Deep Learning (⭐ 6)
Deep learning (DL) approaches use multiple processing layers to learn hierarchical representations of data. Many natural language processing (NLP) models have recently shown significant progress, especially in text mining and analysis. Well-known models for learning vector-space representations of text include Word2vec, GloVe, and fastText; NLP took a further step forward with BERT and, more recently, GPT-3.
Mmm Mcqa (⭐ 6)
Source code for our "MMM" paper at AAAI 2020
Tweet Sentiment Extraction (⭐ 5)
(Silver medal - 60th place - Top 3%) Repository for the "Tweet Sentiment Extraction" Kaggle competition.
Smaberta (⭐ 5)
Wrapper for a stable version of RoBERTa language models.
Copyright 2018-2024 Awesome Open Source. All rights reserved.