Awesome Open Source
Search results for bert transfer learning
45 search results found
Leedl Tutorial
⭐
8,682
Hung-yi Lee's deep learning tutorial (LeeDL Tutorial); PDF download: https://github.com/datawhalech
Kashgari
⭐
2,141
Kashgari is a production-level NLP Transfer learning framework built on top of tf.keras for text-labeling and text-classification, includes Word2Vec, BERT, and GPT2 Language Embedding.
Easynlp
⭐
1,871
EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
Farm
⭐
1,706
🏡 Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
Jiant
⭐
1,603
jiant is an NLP toolkit
Spacy Transformers
⭐
1,302
🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
Awesome Transformer Nlp
⭐
1,022
A curated list of NLP resources focused on Transformer networks, attention mechanism, GPT, BERT, ChatGPT, LLMs, and transfer learning.
Getting Things Done With Pytorch
⭐
873
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Easytransfer
⭐
795
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Primeqa
⭐
677
The prime repository for state-of-the-art Multilingual Question Answering research and development.
Promptclue
⭐
592
PromptCLUE: a zero-shot learning model supporting all Chinese-language NLP tasks
Nlp Paper
⭐
579
NLP Paper
Ner Bert
⭐
403
BERT-NER (nert-bert) with Google BERT (https://github.com/google-research).
Abstractive Summarization With Transfer Learning
⭐
387
Abstractive summarization using BERT as the encoder and a Transformer decoder
Backprop
⭐
239
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Sigir2020_peterrec
⭐
194
Universal User Representation Pre-training for Cross-domain Recommendation and User Profiling
Tldr Transformers
⭐
163
The "tl;dr" on a few notable transformer papers.
Bert Sklearn
⭐
119
a sklearn wrapper for Google's BERT model
Dialog Nlu
⭐
78
TensorFlow and Keras implementations of state-of-the-art research in dialog system NLU
Wechsel
⭐
54
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Dl_notebooks
⭐
42
This repo contains all the notebooks mentioned in the blog.
Sigir2021_conure
⭐
39
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Bert Qna Squad_2.0_finetuned_model
⭐
38
BERT, which stands for Bidirectional Encoder Representations from Transformers, is the SOTA in transfer learning for NLP.
Efficient Task Transfer
⭐
34
Research code for "What to Pre-Train on? Efficient Intermediate Task Selection", EMNLP 2021
Bert_for_longer_texts
⭐
25
BERT classification model for processing texts longer than 512 tokens. Text is first divided into smaller chunks and after feeding them to BERT, intermediate results are pooled. The implementation allows fine-tuning.
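The chunk-and-pool approach described above can be sketched in a few lines. Note this is an illustrative sketch, not the repository's actual code: `encode_fn` is a hypothetical stand-in for a fine-tuned BERT classifier that returns per-chunk logits, and the chunk size of 510 assumes two positions are reserved for the `[CLS]`/`[SEP]` special tokens within BERT's 512-token limit.

```python
import numpy as np

def chunk_tokens(token_ids, chunk_size=510, stride=255):
    """Split a long token sequence into overlapping chunks.

    chunk_size=510 leaves room for [CLS]/[SEP] inside BERT's
    512-token window; stride < chunk_size gives overlap so no
    sentence is cut off without context.
    """
    chunks = []
    for start in range(0, len(token_ids), stride):
        chunks.append(token_ids[start:start + chunk_size])
        if start + chunk_size >= len(token_ids):
            break
    return chunks

def classify_long_text(token_ids, encode_fn, chunk_size=510, stride=255):
    """Encode each chunk separately, then mean-pool the per-chunk
    logits into one prediction for the whole document."""
    chunks = chunk_tokens(token_ids, chunk_size, stride)
    per_chunk = np.stack([encode_fn(c) for c in chunks])  # (n_chunks, n_classes)
    return per_chunk.mean(axis=0)
```

Mean pooling is the simplest aggregation; max pooling over chunks is a common alternative when the signal is concentrated in one passage.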
Task Transferability
⭐
24
Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Oreilly Bert Nlp
⭐
20
This repository contains code for the O'Reilly Live Online Training for BERT
Recommendation_transfer_learning_pretraining
⭐
18
Pre-training and Transfer learning papers for recommendation
Universal_user_representation
⭐
18
papers of universal user representation learning for recommendation
Multi Label Classification Of Pubmed Articles
⭐
14
Traditional machine learning models struggle when we lack sufficient labeled data for the specific task or domain we care about. Transfer learning deals with these scenarios by leveraging existing labeled data from a related task or domain: the knowledge gained in solving the source task in the source domain is stored and applied to the problem of interest. In this work, I have utilized transfer learning.
Browser Bert
⭐
13
Using BERT for transfer learning - but just in the browser.
Filipino Text Benchmarks
⭐
13
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Parsbigbird
⭐
13
Persian Bert For Long-Range Sequences
Bert Coref Resolution Lee
⭐
11
Coreference model experimentation (TensorFlow and PyTorch), mainly using transfer learning and the Transformer model BERT
Oreilly Bert Hands On Nlp
⭐
11
This repository contains code for the O'Reilly Live Online Training for Hands on Transfer Learning with BERT
Donkeybot
⭐
11
🤖 Question Answering Bot for Rucio User Support (GSoC Project)
Sample Machine Learning Projects
⭐
10
Example projects, mostly built with TensorFlow. This repository contains the projects I experimented with when I was new to deep learning.
Writing With Bert
⭐
10
Using BERT for conditional natural language generation by fine-tuning pre-trained BERT on a custom dataset.
T2ner
⭐
8
T2NER: Transformers based Transfer Learning Framework for Named Entity Recognition (EACL 2021)
Intent_and_slot_classification
⭐
8
Two of the main NLU tasks are understanding intents (sequence classification) and slots (entities within the sequence). This repo helps classify both together using a joint (multitask) model. BERT_SMALL is used, which can be changed to any other BERT variant.
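The joint intent/slot setup can be illustrated with a toy two-headed classifier over a shared encoder output. Everything below is a hypothetical placeholder, not the repo's actual BERT_SMALL model: the `[CLS]` vector feeds intent classification while the per-token vectors feed slot tagging, so both tasks share one encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

class JointHead:
    """Toy multitask head over a shared encoder's hidden states
    (hypothetical dimensions; a real model would sit on BERT outputs)."""

    def __init__(self, hidden, n_intents, n_slots):
        self.W_intent = rng.normal(size=(hidden, n_intents))
        self.W_slot = rng.normal(size=(hidden, n_slots))

    def forward(self, hidden_states):
        # hidden_states: (seq_len, hidden) from the shared encoder
        intent_logits = hidden_states[0] @ self.W_intent  # [CLS] position only
        slot_logits = hidden_states @ self.W_slot         # one prediction per token
        return intent_logits, slot_logits
```

In training, the two cross-entropy losses (one per head) are typically summed, which is what makes the model "joint": gradients from both tasks update the shared encoder.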
Bert Ner Conll
⭐
7
This repository implements BERT for NER, following the paper, using the Transformers library.
Mmm Mcqa
⭐
6
Source code for our "MMM" paper at AAAI 2020
Security Intelligence On Exchanged Multimedia Messages Based On Deep Learning
⭐
6
Deep learning (DL) approaches use various processing layers to learn hierarchical representations of data. Recently, many methods and designs of natural language processing (NLP) models have shown significant development, especially in text mining and analysis. For learning vector-space representations of text, there are well-known models like Word2vec, GloVe, and fastText. NLP took a big step forward when BERT and, more recently, GPT-3 came out.
Bert_imdb
⭐
5
A repo for NLP model Bert.
Kr3
⭐
5
KR3: Korean Restaurant Review with Ratings / Experiments on Parameter-efficient Tuning and Task-adaptive Pre-training
Copyright 2018-2024 Awesome Open Source. All rights reserved.