|Project Name|Stars|Most Recent Commit|Open Issues|License|Language|Description|
|---|---|---|---|---|---|---|
|Socialsentiment|302|5 years ago|9|MIT|Python|Sentiment analysis application created with Python and Dash, hosted at socialsentiment.net|
|Reddit Sentiment|6|2 years ago| | |Python|VADER Reddit sentiment analysis using Dash in Python|
|Sotu Db|5|4 years ago|18|GPL-3.0|HTML|State of the Union database for performing sentiment analysis on the text of annual presidential addresses|
|Machine Learning Applications|4|2 years ago| |MIT|C#|A collection of real-world machine learning web applications built with ML.NET, ASP.NET Core, Azure Cosmos DB, and React, usable as a starting point for new projects|
|Twitter Sentiment Live|4|4 years ago| |Apache-2.0|Python|Sentiment analysis for tweets written in Brazilian Portuguese|
|Kdd Lingpipe|3|10 years ago| | |Java|Uses LingPipe to do sentiment analysis|
|Twittersentimentanalysis|3|2 years ago| |MIT|C#|Bachelor project: an ASP.NET app that lets users see the sentiment and key phrases of interesting tweets|
|Twitter Sentiment Analysis|2|9 years ago| | |Python|Python script that stores all tweets for one or more given keywords in an ElasticSearch DB and performs simple sentiment analysis with TextBlob|
|Twitter Sentiment Analysis| | | | | |Twitter sentiment analysis of a keyword with a React UI|
|Twitter Scripts|2|9 years ago| | |Python|Collection of Python scripts for streaming and processing Twitter data|
Live-streaming sentiment analysis application created with Python and Dash, hosted at SocialSentiment.net.
This application was created in conjunction with the Dash tutorial series.
`dash_mess.py` - Currently the main front-end application code. Contains the Dash application layouts, the logic for the graphs, the interface to the database, etc. The name is descriptive of the overall state of the code :) ...this code is set up to run on a Flask instance; if you clone this and run it locally, use `dev_server.py` instead.

`dev_server.py` - If you wish to run this application locally, on the dev server, run it via this script instead.

`twitter_stream.py` - This should run in the background of your application. It streams tweets from Twitter and stores them in the SQLite database that `dash_mess.py` reads from.
`config.py` - Intended for general configuration, but right now it only contains stop words: words we never want counted among the "trending" terms.
`cache.py` - Handles caching, in an effort to make things run faster.
`db-truncate.py` - A script to truncate the otherwise infinitely growing SQLite database. Expect roughly 3.5 million tweets per day, depending on how fast you can process them. You can keep them all, but as the database grows, search times will dramatically suffer.
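The files above boil down to a simple pipeline: the streamer scores incoming tweets and writes them into SQLite, and the front end queries them back out. A minimal sketch of that shared table, with hypothetical names (the `sentiment` table and `score_tweet` helper are illustrative; the real project uses its own schema and a VADER analyzer):

```python
import sqlite3
import time

def create_table(conn):
    # Mirrors the idea behind twitter_stream.py's table:
    # timestamped tweets, each with a sentiment score.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sentiment (unix REAL, tweet TEXT, score REAL)"
    )

def score_tweet(text):
    # Hypothetical stand-in for a real analyzer (the project uses VADER):
    # crude positive/negative keyword counting, just for illustration.
    positive, negative = {"good", "great", "love"}, {"bad", "awful", "hate"}
    words = text.lower().split()
    return sum(w in positive for w in words) - sum(w in negative for w in words)

def store_tweet(conn, text):
    # What the streamer does per tweet; what dash_mess.py later reads back.
    conn.execute(
        "INSERT INTO sentiment (unix, tweet, score) VALUES (?, ?, ?)",
        (time.time(), text, score_tweet(text)),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
create_table(conn)
store_tweet(conn, "I love this great project")
store_tweet(conn, "what an awful bug")
rows = conn.execute("SELECT tweet, score FROM sentiment ORDER BY unix").fetchall()
```

The `unix` timestamp column is what makes time-windowed graphs and age-based truncation cheap later on.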
To run this yourself:

1. Install the requirements: `pip install -r requirements.txt`
2. Fill in your Twitter API credentials in `twitter_stream.py`. Go to apps.twitter.com to set that up if you need to.
3. Run `twitter_stream.py` to build the database.
4. Run the `dev_server.py` script. If you want to deploy this to a web server, see my tutorial on deploying Dash applications.
`sudo add-apt-repository ppa:jonathonf/backports`

`sudo apt-get update && sudo apt-get install sqlite3`
Run `db-truncate.py` from time to time (or via a cron job) to keep the database reasonably sized. In its current state, the application most likely doesn't need to store more than 2-3 days of data.
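A truncation pass like the one `db-truncate.py` performs can be sketched as deleting everything older than a cutoff and then reclaiming the disk space (the `sentiment` table and `unix` column names here are hypothetical, not the project's actual schema):

```python
import sqlite3
import time

def truncate_old_rows(db_path, keep_days=3):
    """Delete rows older than keep_days and reclaim the freed disk space."""
    cutoff = time.time() - keep_days * 86400
    conn = sqlite3.connect(db_path)
    with conn:  # commits the DELETE on success
        deleted = conn.execute(
            "DELETE FROM sentiment WHERE unix < ?", (cutoff,)
        ).rowcount
    conn.execute("VACUUM")  # shrink the database file after the bulk delete
    conn.close()
    return deleted
```

Scheduled via cron, a line like `0 3 * * * python db-truncate.py` would run it nightly.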
`gunicorn dash_mess:server -b 0.0.0.0:80 -w 4`
Want to help contribute?
The speed of the application, especially with a database holding tens of millions of records, is thanks entirely to Daniel Kukiela, who helped us convert from regular SQLite tables to SQLite's full-text search (FTS), and helped with the queries, the new database structure, caching, and more.
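The FTS conversion mentioned above can be sketched with SQLite's FTS5 extension (the `sentiment_fts` table name is hypothetical; the real schema differs). A `MATCH` query against an FTS virtual table uses a token index instead of scanning every row the way `LIKE '%word%'` does, which is what keeps keyword searches fast at tens of millions of rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# An FTS5 virtual table indexes the tweet text for fast keyword lookups.
conn.execute("CREATE VIRTUAL TABLE sentiment_fts USING fts5(tweet)")
conn.executemany(
    "INSERT INTO sentiment_fts (tweet) VALUES (?)",
    [("python is great",), ("dash makes dashboards easy",), ("sqlite fts is fast",)],
)
# MATCH hits the full-text index; a LIKE filter would scan the whole table.
rows = conn.execute(
    "SELECT tweet FROM sentiment_fts WHERE tweet MATCH ?", ("python",)
).fetchall()
```

FTS5 ships enabled in the SQLite bundled with standard CPython builds, so no extra install is needed to try this.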