Awesome Open Source
Search results for python t5
44 search results found
Pycorrector (⭐ 4,928): pycorrector is a toolkit for text error correction. Implements Kenlm, T5, MacBERT, ChatGLM3, LLaMA, and other models for error-correction scenarios.
Uer Py (⭐ 2,802): Open source pre-training model framework in PyTorch & pre-trained model zoo.
Tencentpretrain (⭐ 951): Tencent pre-training framework in PyTorch & pre-trained model zoo.
Textgen (⭐ 842): TextGen: implementation of text generation models, including LLaMA, ChatGLM, BLOOM, GPT2, Seq2Seq, BART, T5, SongNet, and so on.
Nlu (⭐ 775): One line for thousands of state-of-the-art NLP models in hundreds of languages. The fastest and most accurate way to solve text problems.
Simplet5 (⭐ 305): simpleT5, built on top of PyTorch Lightning ⚡️ and Transformers 🤗, lets you quickly train T5 models.
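Several of the T5 tools in this list build on T5's text-to-text convention: every task is signalled by a plain-text prefix on the input, and the answer is decoded as text. A minimal sketch in plain Python of that input format (the helper name is illustrative, not part of any library's API; the prefixes follow the original T5 paper's conventions):

```python
def make_t5_input(task_prefix: str, text: str) -> str:
    """Build a T5-style input string by prepending the task prefix."""
    return f"{task_prefix}: {text}"

# T5 reuses one seq2seq model for many tasks; only the prefix changes.
summarize_input = make_t5_input("summarize",
                                "The quick brown fox jumps over the lazy dog.")
translate_input = make_t5_input("translate English to German", "Hello, world!")

print(summarize_input)  # summarize: The quick brown fox jumps over the lazy dog.
print(translate_input)  # translate English to German: Hello, world!
```

Libraries such as simpleT5 wrap the tokenization, training loop, and decoding, but the task framing is this same prefixed-string idea.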
Fastt5 (⭐ 280): ⚡ Boost inference speed of T5 models by 5x and reduce model size by 3x.
Lm Question Generation (⭐ 207): Multilingual/multi-domain question generation datasets, models, and Python library for question generation.
Mint (⭐ 192): MinT: minimal Transformer library and tutorials.
Question_generator (⭐ 157): An NLP system for generating reading comprehension questions.
Smarty Gpt (⭐ 95): A wrapper for LLMs that biases their behaviour using prompts and contexts in a way that is transparent to end users.
Asp (⭐ 90): PyTorch implementation and pre-trained models for ASP (Autoregressive Structured Prediction with Language Models, EMNLP 2022). https://arxiv.org/pdf/2210.14698.pdf
Banglanlg (⭐ 70): Official release of the "BanglaT5" model with associated downstream fine-tuning code and datasets from the paper "BanglaNLG: Benchmarks and Resources for Evaluating Low-Resource Natural Language Generation in Bangla".
Text2keywords (⭐ 54): Trained T5 and T5-large models for generating keywords from text.
Dst As Prompting (⭐ 52): Source code for "Dialogue State Tracking with a Language Model using Schema-Driven Prompting".
Few Shot Lm (⭐ 40): Source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021).
Ratransformers (⭐ 38): RATransformers 🐭: make your transformer (BERT, RoBERTa, GPT-2, T5, and the like) relation-aware!
Ftpipe (⭐ 37): FTPipe and related pipeline model parallelism research.
T5 Flax Gcp (⭐ 37): Tutorial to pre-train and fine-tune a 🤗 Flax T5 model on a TPU v3-8 with GCP.
Frame Semantic Transformer (⭐ 37): Frame semantic parser based on T5 and FrameNet.
Johnsnowlabs (⭐ 35): Gateway into the John Snow Labs ecosystem.
Ttt (⭐ 30): A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+.
User Simulation T5 (⭐ 30): Official code for the SIGIR 2022 paper "A Multi-task Based Neural Model to Simulate Users in Goal Oriented Dialogue Systems". The user simulator generates user-side utterances and predicts the user's next action and satisfaction level.
Chef Transformer (⭐ 29): Chef Transformer 🍲.
Flan Alpaca Lora (⭐ 23): Code to train Flan-T5 with Alpaca instructions and low-rank adaptation.
Turkish Question Generation (⭐ 22): Automated question generation and question answering from Turkish texts using text-to-text transformers.
Instructionner (⭐ 21): Unofficial implementation of the paper "InstructionNER: A Multi-Task Instruction-Based Generative Framework for Few-shot NER" (https://arxiv.org/pdf/2203.03903v1.pdf).
T5 Japanese (⭐ 19): Code to pre-train Japanese T5 models.
Distilkobilstm (⭐ 17): Distilling task-specific knowledge from a teacher model into a BiLSTM.
Bert_seq2seq_ddp (⭐ 16): DDP version of bert_seq2seq; supports bert, roberta, nezha, t5, gpt2, and other models.
Text Summarizer (⭐ 16): Comparing state-of-the-art models for text summary generation.
T5 Encoder (⭐ 13): An extension of the Transformers library adding a T5ForSequenceClassification class.
Book Summarizer (⭐ 11): Using a pretrained T5 model for abstractive summarization of books.
Semeval2022 Task6 Sarcasm Detection (⭐ 11): Sarcasm, the use of words to mock, irritate, or amuse, is common on social media, and its metaphorical, creative nature poses a significant challenge for sentiment analysis systems based on affective computing. This paper presents the approach and results of team UTNLP in the SemEval-2022 shared task 6 on sarcasm detection.
Serving Model Cards (⭐ 11): Collection of OSS models containerized into a serving container.
Academic Paper Title Recommendation (⭐ 9): Supervised text summarization (title generation/recommendation) from academic paper abstracts, using a Seq2Seq LSTM and transfer learning with T5.
Lm Vocab Trimmer (⭐ 9): Vocabulary Trimming (VT) is a model compression technique that reduces a multilingual LM's vocabulary to a target language by deleting irrelevant tokens. This repository contains vocabtrimmer, a Python library that removes irrelevant tokens from a multilingual LM vocabulary for the target language.
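The vocabulary-trimming idea above can be sketched in a few lines of plain Python: keep only the tokens that occur in a target-language corpus (plus special tokens), then reassign contiguous ids. This is a toy illustration on a token-to-id dict, not the vocabtrimmer API, which operates on real multilingual tokenizers and their embedding matrices:

```python
def trim_vocab(vocab: dict, corpus_tokens: set,
               specials=frozenset({"<pad>", "</s>", "<unk>"})) -> dict:
    """Return a new vocab keeping specials plus tokens seen in the corpus,
    with ids renumbered contiguously in the original id order."""
    kept = [tok for tok in sorted(vocab, key=vocab.get)
            if tok in corpus_tokens or tok in specials]
    return {tok: i for i, tok in enumerate(kept)}

# Hypothetical multilingual vocab; trimming to an English-only corpus
# drops the French and Spanish tokens and shrinks the id space.
full = {"<pad>": 0, "</s>": 1, "hola": 2, "hello": 3, "bonjour": 4, "world": 5}
english = trim_vocab(full, {"hello", "world"})
print(english)  # {'<pad>': 0, '</s>': 1, 'hello': 2, 'world': 3}
```

In a real model the same renumbering is mirrored in the embedding and output-projection matrices, which is where the size savings come from.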
T5 Jax (⭐ 8): JAX implementation of the T5 model: "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer".
Research Assistant Mini (⭐ 7): Research Assistant mini app.
Famesumm (⭐ 6): [EMNLP 2023] FaMeSumm: Investigating and Improving Faithfulness of Medical Summarization. Supports BART, PEGASUS, T5, mT5, BioBART, etc.
Domain Robustness Prompt Tuning (⭐ 6): Implementation of the report "On the Domain Robustness of Prefix and Prompt Tuning".
Glm Open Dialogue (⭐ 5): An enhanced open-dialogue context generator backed by "General Language Model Pretraining with Autoregressive Blank Infilling".
Audiolizr (⭐ 5): A BentoML-powered API to transcribe audio and make sense of it.
Amiok (⭐ 5): [11th ToBigs Conference] AM I OK? - Psychological diagnosis AI based on answers from medical specialists.
Copyright 2018-2024 Awesome Open Source. All rights reserved.