Project Name | Description | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language
---|---|---|---|---|---|---|---|---|---|---|---
AutoGluon | Fast and accurate ML in 3 lines of code | 7,167 | 14 | | | 3 days ago | 1,168 | December 10, 2023 | 299 | apache-2.0 | Python
Kashgari | A production-level NLP transfer learning framework built on top of tf.keras for text labeling and text classification; includes Word2Vec, BERT, and GPT-2 language embeddings | 2,141 | 1 | 1 | | 3 years ago | 11 | October 18, 2019 | 32 | apache-2.0 | Python
SparseML | Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models | 1,910 | 5 | | | 4 months ago | 37 | December 04, 2023 | 60 | apache-2.0 | Python
EasyNLP | A comprehensive and easy-to-use NLP toolkit | 1,871 | | | | 4 months ago | 1 | April 27, 2022 | 34 | apache-2.0 | Python
FARM | :house_with_garden: Fast & easy transfer learning for NLP. Harvesting language models for the industry, with a focus on question answering | 1,706 | 1 | 2 | | 5 months ago | 25 | September 14, 2020 | 6 | apache-2.0 | Python
jiant | An NLP toolkit | 1,608 | | | | 10 months ago | 6 | May 10, 2021 | 73 | mit | Python
Transfer Learning Conv AI | 🦄 State-of-the-art conversational AI with transfer learning | 1,499 | | | | 2 years ago | | | 63 | mit | Python
spaCy Transformers | 🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy | 1,302 | 6 | | | 5 months ago | 7 | May 25, 2023 | | mit | Python
Awesome Transformer NLP | A curated list of NLP resources focused on Transformer networks, attention mechanisms, GPT, BERT, ChatGPT, LLMs, and transfer learning | 1,022 | | | | a month ago | | | 1 | mit | |
bert_language_understanding | Pre-training of deep bidirectional Transformers for language understanding: pre-train TextCNN | 886 | | | | 5 years ago | | | 9 | | Python