- **Nlp.js** — 5,784 stars · 1 download · used by 91 repos · last commit 24 days ago · 40 releases (latest January 12, 2023) · 100 open issues · MIT · JavaScript
  An NLP library for building bots, with entity extraction, sentiment analysis, automatic language identification, and more.
- **Pytorch Sentiment Analysis** — 2,905 stars · last commit 2 years ago · 16 open issues · MIT · Jupyter Notebook
  Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
- **Sentiment** — 2,208 stars · 980 downloads · used by 115 repos · last commit 3 years ago · 26 releases (latest August 22, 2019) · 14 open issues · MIT · JavaScript
  AFINN-based sentiment analysis for Node.js.
- **Absa Pytorch** — 1,782 stars · last commit 3 months ago · 91 open issues · MIT · Python
  Aspect-based sentiment analysis, PyTorch implementations.
- **Stocksight** — 1,557 stars · last commit 10 months ago · 8 open issues · Apache-2.0 · Python
  Stock market analyzer and predictor using Elasticsearch, Twitter, news headlines, and Python natural language processing and sentiment analysis.
- **Twitter Sentiment Analysis** — 1,322 stars · last commit 7 months ago · 20 open issues · MIT · Python
  Sentiment analysis on tweets using Naive Bayes, SVM, CNN, LSTM, etc.
- **Senta** — 1,272 stars · last commit 2 years ago · 6 releases (latest May 21, 2020) · 50 open issues · Apache-2.0 · Python
  Baidu's open-source sentiment analysis system.
- **Text Analytics With Python** — 1,073 stars · last commit 3 years ago · Apache-2.0 · Jupyter Notebook
  Learn how to process, classify, cluster, summarize, and understand the syntax, semantics, and sentiment of text data with the power of Python! This repository contains code and datasets used in my book, "Text Analytics with Python", published by Apress/Springer.
- **Wink Nlp** — 1,017 stars · last commit 2 months ago · 24 releases (latest May 13, 2022) · MIT · JavaScript
  Developer-friendly Natural Language Processing ✨
- **Getting Things Done With Pytorch** — 873 stars · last commit 2 years ago · 13 open issues · Apache-2.0 · Jupyter Notebook
  Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: face detection with Detectron 2, time-series anomaly detection with LSTM autoencoders, object detection with YOLO v5, building your first neural network, time-series forecasting for daily coronavirus cases, and sentiment analysis with BERT.
Code for our NAACL 2019 paper "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis", our COLING 2020 paper "Understanding Pre-trained BERT for Aspect-based Sentiment Analysis", and (draft code of) the Findings of EMNLP 2020 paper "DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis".
We found that BERT domain post-training (e.g., 1 day of training) is an economical way to boost BERT's performance: learning general knowledge shared across domains is much harder (e.g., 10 days of training) and, meanwhile, loses long-tailed domain-specific knowledge.
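Domain post-training here continues BERT's own masked-language-modeling objective on in-domain review text. As a rough, self-contained illustration of the masking step (the function, token choices, and masking rate below are ours for exposition, not the repo's actual implementation, which also applies BERT's 80/10/10 replacement rule):

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=1):
    """Toy sketch of masked-LM data preparation: randomly hide a fraction
    of review tokens; the model is trained to recover the originals,
    absorbing domain-specific knowledge in the process."""
    rng = random.Random(seed)  # seeded for reproducibility in this demo
    masked, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            targets.append(tok)   # token the model must predict
        else:
            masked.append(tok)
            targets.append(None)  # position not predicted
    return masked, targets

masked, targets = mask_tokens("the retina display is great".split())
print(masked)
print(targets)
```

Post-training then simply runs this objective over a large corpus of reviews instead of general-domain text, which is what makes it cheap relative to pre-training from scratch.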
- The code base for "Understanding Pre-trained BERT for Aspect-based Sentiment Analysis" is released.
- A code base built on Hugging Face `transformers` is under `transformers/`, with more cross-domain models.
- Preprocessing of the ABSA XMLs is organized into a separate repo.
- Want post-trained models for other review domains? Check out a cross-domain review BERT or download from HERE.
- A conversational dataset for RRC can be found here.
- If you only care about ASC, a more formal code base can be found in a similar repo focusing on ASC.

**Feedback on any missing instructions is welcome.**
We focus on three review-based tasks: review reading comprehension (RRC), aspect extraction (AE), and aspect sentiment classification (ASC).

- RRC: given a question ("how is the retina display?") and a review ("The retina display is great."), find an answer span ("great") in that review.
- AE: given a review sentence ("The retina display is great."), find its aspects ("retina display").
- ASC: given an aspect ("retina display") and a review sentence ("The retina display is great."), detect the polarity of that aspect (positive).
- E2E-ABSA: AE and ASC combined into a single sequence labeling task.

We also show how a BERT model pre-trained on reviews can be prepared for these tasks.
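To make the task formats concrete, here is a minimal sketch of how each task's inputs and targets are typically framed for BERT. All function names are illustrative, not the repo's API; real code would use a subword tokenizer rather than whitespace splitting:

```python
def rrc_input(question, review):
    """RRC: a sentence-pair input; the model predicts the start/end of
    an answer span inside the review segment."""
    return f"[CLS] {question} [SEP] {review} [SEP]"

def ae_labels(tokens, aspect_tokens):
    """AE: sequence labeling — tag each review token with B/I/O marks
    so that aspect spans can be read off the tag sequence."""
    labels = ["O"] * len(tokens)
    for i in range(len(tokens) - len(aspect_tokens) + 1):
        if tokens[i:i + len(aspect_tokens)] == aspect_tokens:
            labels[i] = "B"
            for j in range(i + 1, i + len(aspect_tokens)):
                labels[j] = "I"
    return labels

def asc_input(aspect, review):
    """ASC: an aspect/sentence pair; the model classifies the [CLS]
    representation into a polarity (positive/negative/neutral)."""
    return f"[CLS] {aspect} [SEP] {review} [SEP]"

tokens = "The retina display is great .".split()
print(ae_labels(tokens, ["retina", "display"]))  # ['O', 'B', 'I', 'O', 'O', 'O']
```

E2E-ABSA merges AE and ASC by extending the tag set (e.g., `B-positive`, `I-positive`), so a single tagger extracts aspects and their polarities in one pass.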
For the post-training code of the NAACL 2019 paper, the code base is split into two versions: `transformers/` (instructions) and `pytorch-pretrained-bert/` (instructions). For the analysis of the pre-trained BERT model for ABSA (COLING 2020), see these instructions. Please check the corresponding instructions for details.
If you find this work useful, please cite it as follows.
```bibtex
@inproceedings{xu_bert2019,
    title = "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis",
    author = "Xu, Hu and Liu, Bing and Shu, Lei and Yu, Philip S.",
    booktitle = "Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics",
    month = "jun",
    year = "2019",
}

@inproceedings{xu_understanding2020,
    title = "Understanding Pre-trained BERT for Aspect-based Sentiment Analysis",
    author = "Xu, Hu and Shu, Lei and Yu, Philip S. and Liu, Bing",
    booktitle = "The 28th International Conference on Computational Linguistics",
    month = "dec",
    year = "2020",
}
```