Awesome Open Source
Search results for bert pretrained models
44 search results found
Transformers ⭐ 124,049
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

Paddlenlp ⭐ 10,908
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis, etc.

Awesome Pretrained Chinese Nlp Models ⭐ 3,738
Awesome Pretrained Chinese NLP Models: a collection of high-quality Chinese pre-trained models, large models, multimodal models, and large language models.

Albert_zh ⭐ 3,723
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models.

Clue ⭐ 3,345
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard.

Easynlp ⭐ 1,871
EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit.

Awesome Sentence Embedding ⭐ 1,831
A curated list of pretrained sentence and word embedding models.

Chineseglue ⭐ 1,765
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard.

Farm ⭐ 1,706
🏡 Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.

Gpt2 Ml ⭐ 1,674
GPT-2 for multiple languages, including pretrained models; provides a 1.5B-parameter Chinese pre-trained model.

Spark ⭐ 1,355
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling".
Chinese Electra ⭐ 1,253
Pre-trained Chinese ELECTRA models.

Bert Ner ⭐ 872
PyTorch Named Entity Recognition with BERT.

Uform ⭐ 729
Pocket-sized multimodal AI for content understanding and generation across multilingual texts, images, and 🔜 video, up to 5x faster than OpenAI CLIP and LLaVA 🖼️ & 🖋️.

Patrickstar ⭐ 631
PatrickStar enables larger, faster, greener pretrained models for NLP and democratizes AI for everyone.

Promptclue ⭐ 592
PromptCLUE: a zero-shot learning model supporting all Chinese-language tasks.

M3tl ⭐ 544
BERT for Multitask Learning.

Treasure Of Transformers ⭐ 541
💁 Awesome treasure of Transformer models for Natural Language Processing: papers, videos, blogs, and official repos, along with Colab notebooks. 🛫☑️

Cluepretrainedmodels ⭐ 536
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models.

Bertwithpretrained ⭐ 428
An implementation of the BERT model and its related downstream tasks based on the PyTorch framework.

Azureml Bert ⭐ 384
End-to-end recipes for pre-training and fine-tuning BERT using Azure Machine Learning Service.

Tod Bert ⭐ 277
Pre-trained models for ToD-BERT.

Pert ⭐ 249
PERT: Pre-training BERT with Permuted Language Model.

Pytorch Nlu ⭐ 226
Pytorch-NLU: a Chinese text classification and sequence labeling toolkit supporting multi-class and multi-label classification of long and short Chinese texts, as well as sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Bert Squad ⭐ 217
SQuAD Question Answering using BERT, PyTorch.

Bert Ner Tf ⭐ 184
Named Entity Recognition with BERT using TensorFlow 2.0.

Hugsvision ⭐ 170
HugsVision is an easy-to-use Hugging Face wrapper for state-of-the-art computer vision.

Transformer Models ⭐ 150
Deep Learning Transformer models in MATLAB.

Electra ⭐ 112
Chinese pre-trained ELECTRA model based on adversarial learning.

Minirbt ⭐ 102
MiniRBT: a series of small Chinese pre-trained models.

Ontoprotein ⭐ 98
Code and datasets for the ICLR 2022 paper "OntoProtein: Protein Pretraining With Gene Ontology Embedding".

Transfomers Silicon Research ⭐ 97
Research and materials on hardware implementation of the Transformer model.

Recurrent Vln Bert ⭐ 90
Code of the CVPR 2021 Oral paper "A Recurrent Vision-and-Language BERT for Navigation".

Dialogue Understanding ⭐ 82
PyTorch implementation of the baseline models from the paper "Utterance-level Dialogue Understanding: An Empirical Study".

Roberta Wwm Base Distill ⭐ 64
A RoBERTa-wwm-base model distilled from RoBERTa-wwm using RoBERTa-wwm-large as the teacher.

Syntaxdot ⭐ 62
Neural syntax annotator supporting sequence labeling, lemmatization, and dependency parsing.

Transformers Keras ⭐ 61
Transformer-based models implemented in TensorFlow 2.x (using Keras).
Cybertron ⭐ 44
MindSpore implementation of Transformers.

Bert Qna Squad_2.0_finetuned_model ⭐ 38
BERT, which stands for Bidirectional Encoder Representations from Transformers, is the SOTA in transfer learning for NLP.

Sentilare ⭐ 34
Code for the paper "SentiLARE: Sentiment-Aware Language Representation Learning with Linguistic Knowledge" (EMNLP 2020).

Transformers Embedder ⭐ 34
A word-level Transformer layer based on PyTorch and 🤗 Transformers.

Scibert_cn ⭐ 29
Pretrained model for Chinese scientific text.

Aispace ⭐ 28
AiSpace: better practices for deep learning model development and deployment for TensorFlow 2.0.

Ares ⭐ 21
SIGIR'22 paper: "Axiomatically Regularized Pre-training for Ad hoc Search".

Recommendation_transfer_learning_pretraining ⭐ 18
Pre-training and transfer learning papers for recommendation.

Psychwordvec ⭐ 15
🔜 Integrative toolbox of word embedding research for psychological science.

Vietnamese Roberta ⭐ 12
A Robustly Optimized BERT Pretraining Approach for Vietnamese.

Language Model Pretraining For Text Generation ⭐ 11
LM pretraining for generation: reading list, resources, and conference mappings.

Meta_xlm ⭐ 10
Cross-lingual Language Model (XLM) pretraining and Model-Agnostic Meta-Learning (MAML) for fast adaptation of deep networks.

Eatn ⭐ 8
Dataset and code for "EATN: An Efficient Adaptive Transfer Network for Aspect-level Sentiment Analysis".

Knowqa ⭐ 8
Baseline (F1 0.35, BERTForMaskedLM) for a competition measuring the knowledge capacity of pre-trained models.

Anchors ⭐ 7
Source code of the CIKM 2021 paper "Pre-training for Ad-hoc Retrieval: Hyperlink is Also You Need".

Childtuning ⭐ 5
Source code for the EMNLP'21 paper "Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning".

Polibertweet ⭐ 5
A Transformer-based language model trained on politics-related Twitter data. The official resource of the paper "PoliBERTweet: A Pre-trained Language Model for Analyzing Political Content on Twitter" (LREC 2022).
Copyright 2018-2024 Awesome Open Source. All rights reserved.