| Project Name | Description | Stars | Most Recent Commit | Open Issues | License | Language |
|---|---|---|---|---|---|---|
| Nyaa | BitTorrent software for cats | 2,861 | 2 years ago | 61 | gpl-3.0 | Python |
| Ipmagnet | Check which IP addresses your BitTorrent client is handing out to trackers | 220 | 2 years ago | 2 | other | JavaScript |
| Peertracker | Latest known SVN revision of the trigunflame peertracker, a lightweight PHP/SQL BitTorrent tracker | 101 | 7 years ago | 4 | gpl-3.0 | PHP |
| Chaosbay | BitTorrent tracker with upload & browsing, written at the 25th Chaos Communication Congress | 34 | 10 years ago | 1 | agpl-3.0 | Erlang |
| Biotorrents | A BitTorrent tracker for legal open sharing of scientific data | 15 | 12 years ago | 7 | gpl-2.0 | PHP |
| Bitstorm | An updated version of BitStorm that uses the PHP 5.5 database API | 14 | 11 years ago | | | PHP |
| Bitstorm | A fork of Bitstorm, a BitTorrent tracker written in PHP | 11 | 8 years ago | | gpl-3.0 | PHP |
| Gobt | A BitTorrent service based on https://github.com/shiyanhui/dht and https://github.com/btlike/repository | 10 | 6 years ago | | mit | Go |
| Pantor | Torrent indexer | 6 | 9 years ago | 2 | gpl-3.0 | Standard ML |
| Chihaya Privtrak Middleware | A private tracker middleware for the chihaya BitTorrent tracker | 6 | 4 years ago | 3 | mit | Go |
## Setting up for development
This project uses Python 3.7. There are features used that do not exist in 3.6, so make sure to use Python 3.7.
This guide also assumes you 1) are using Linux and 2) are somewhat comfortable with the command line.
It's not impossible to run Nyaa on Windows, but this guide doesn't focus on that.
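If you want a quick sanity check before going further, something like this (a throwaway snippet, not part of the repository) confirms the interpreter is new enough:

```python
# check_python.py - hypothetical helper: fail fast on interpreters older than 3.7.
import sys

if sys.version_info < (3, 7):
    raise SystemExit("Python 3.7+ required, found " + sys.version.split()[0])
print("Python version OK:", sys.version.split()[0])
```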
### Code quality
- Run `./dev.py lint` before committing to see a list of warnings/problems.
- You may also use `./dev.py fix && ./dev.py isort` to automatically fix some of the issues reported by the previous command. (One way to automate the lint step is sketched below.)
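For instance, a git pre-commit hook can run the linter for you. A minimal sketch (hypothetical, not shipped with the repository; save as `.git/hooks/pre-commit` and mark it executable):

```python
#!/usr/bin/env python3
# Hypothetical .git/hooks/pre-commit: block the commit if ./dev.py lint reports problems.
import subprocess
import sys

result = subprocess.run(["./dev.py", "lint"])
if result.returncode != 0:
    print("Lint failed; consider running ./dev.py fix && ./dev.py isort")
sys.exit(result.returncode)
```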
### Running tests
The `tests` folder contains tests for the `nyaa` module and the webserver. To run the tests, run `./dev.py test` while in the repository directory.

### Setting up pyenv
pyenv eases the use of different Python versions, and as not all Linux distros offer 3.7 packages, it's right up our alley.
- Install pyenv: https://github.com/pyenv/pyenv/blob/master/README.md#installation
- Install pyenv-virtualenv: https://github.com/pyenv/pyenv-virtualenv/blob/master/README.md
- Install Python 3.7.2 with pyenv and create a virtualenv for the project:
  ```
  pyenv install 3.7.2
  pyenv virtualenv 3.7.2 nyaa
  pyenv activate nyaa
  ```
- Install the project dependencies: `pip install -r requirements.txt`
- Copy `config.example.py` into `config.py`
  - Change `SITE_FLAVOR` in your `config.py` depending on which instance you want to host (see the sketch below)
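As an illustration, the flavor setting might end up looking like this; the `nyaa`/`sukebei` values are our assumption, inferred from the two instance names used later in this guide, so check `config.example.py` for the authoritative options:

```python
# config.py (excerpt) - choose which instance this deployment serves.
# 'nyaa' and 'sukebei' are assumed values, inferred from the index names later in this guide.
SITE_FLAVOR = 'nyaa'  # or 'sukebei'
```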
### Database setup
You may use SQLite, but the current support for it in this project is outdated and rather unsupported.
- Enable the `USE_MYSQL` flag in `config.py`
- Install MariaDB or MySQL (tested with `mysql Ver 15.1 Distrib 10.0.30-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2`)
- Log in to your MySQL/MariaDB instance and run the following (change the name/password to your `config.py` values if desired; a connection check is sketched after this list):
  ```sql
  CREATE USER 'test'@'localhost' IDENTIFIED BY 'test123';
  GRANT ALL PRIVILEGES ON *.* TO 'test'@'localhost';
  FLUSH PRIVILEGES;
  CREATE DATABASE nyaav2 DEFAULT CHARACTER SET utf8 COLLATE utf8_bin;
  ```
- Run `python db_create.py` to create the database and import categories
  - Follow the advice of `db_create.py` and run `./db_migrate.py stamp head` to mark the database version for Alembic
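To double-check the user and database created above, a quick connection test helps. This sketch assumes the `PyMySQL` package (`pip install pymysql`; it may or may not already be in `requirements.txt`) and the example credentials:

```python
# check_db.py - hypothetical sanity check for the MySQL/MariaDB setup above.
import pymysql  # assumed dependency: pip install pymysql

conn = pymysql.connect(host='localhost', user='test', password='test123', database='nyaav2')
with conn.cursor() as cur:
    cur.execute('SELECT DATABASE(), VERSION()')
    print(cur.fetchone())  # e.g. ('nyaav2', '10.0.30-MariaDB...')
conn.close()
```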
### Finishing up
- Run `python run.py` to verify the site works
- Deactivate the virtualenv with `pyenv deactivate` or `source deactivate` (or just close your shell session)

You're now ready for simple testing and development!
Continue below to learn about database migrations and enabling the advanced search engine, Elasticsearch.
## Database migrations
Database migrations are handled with Alembic.

If someone else has changed the database schema:
- If your database has not been stamped by Alembic yet, run `./db_migrate.py stamp head` before pulling the new migration script(s).
  - If you're unsure whether your database matches the latest schema, run `./db_migrate.py history` instead and choose a hash that matches your current database state, then run `./db_migrate.py stamp <hash>`.
- Update your branch (e.g. `git fetch && git rebase origin/master`)
- Run `./db_migrate.py upgrade head` to run the migration. Done!

When you need to change the database schema yourself:
- Update `models.py` and ensure the database schema matches the previous version (i.e. your new tables/columns are not added to the live database)
- Run `./db_migrate.py migrate -m "Short description of changes"` to automatically generate a migration script for the changes
  - Check the script (`migrations/versions/...`) and make sure it works! Alembic may not be able to notice all changes. (A sketch of such a script follows below.)
- Run `./db_migrate.py upgrade` to run the migration and verify the upgrade works
  - (run `./db_migrate.py downgrade` to verify the downgrade works as well, then upgrade again)
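For reference, an auto-generated script in `migrations/versions/` looks roughly like the following sketch (the revision ids, table, and column here are made up for illustration):

```python
"""Add trusted flag to torrents (hypothetical example)

Revision ID: 1a2b3c4d5e6f
Revises: 9f8e7d6c5b4a
"""
import sqlalchemy as sa
from alembic import op

# Revision identifiers used by Alembic (values are illustrative only).
revision = '1a2b3c4d5e6f'
down_revision = '9f8e7d6c5b4a'


def upgrade():
    # Hypothetical schema change: add a nullable column to an existing table.
    op.add_column('torrents', sa.Column('trusted', sa.Boolean(), nullable=True))


def downgrade():
    op.drop_column('torrents', 'trusted')
```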
## Setting up and enabling Elasticsearch
### Installing Elasticsearch
- Install the JDK: `sudo apt-get install openjdk-8-jdk`
- Install Elasticsearch and make sure the service is running:
  - `sudo systemctl enable elasticsearch.service`
  - `sudo systemctl start elasticsearch.service`
- Run `curl -XGET 'localhost:9200'` and make sure ES is running
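The same health check can be done from Python with only the standard library, e.g.:

```python
# es_ping.py - confirm Elasticsearch answers on its default port (stdlib only).
import json
from urllib.request import urlopen

with urlopen('http://localhost:9200') as resp:
    info = json.load(resp)
print('Elasticsearch is up, cluster:', info.get('cluster_name'))
```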
### Enabling MySQL binlogging
- Edit your MariaDB/MySQL server configuration and add the following under `[mariadb]`:
  ```
  log-bin
  server_id=1
  log-basename=master1
  binlog-format=row
  ```
- Restart MariaDB/MySQL (`sudo service mysql restart`)
- Copy the example config (`es_sync_config.example.json`) as `es_sync_config.json` and adjust options in it to your liking (verify the connection options!)
- Verify that the output of `SHOW VARIABLES LIKE 'binlog_format';` is `ROW` (a Python version of this check follows below)
- Execute `GRANT REPLICATION SLAVE ON *.* TO 'username'@'localhost';` to allow your configured user access to the binlog
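The binlog check can also be scripted; this sketch again assumes `PyMySQL` and the example `test` user from the database setup:

```python
# check_binlog.py - hypothetical check that the server logs row-based binlogs.
import pymysql  # assumed dependency: pip install pymysql

conn = pymysql.connect(host='localhost', user='test', password='test123')
with conn.cursor() as cur:
    cur.execute("SHOW VARIABLES LIKE 'binlog_format'")
    print(cur.fetchone())  # expect: ('binlog_format', 'ROW')
conn.close()
```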
### Setting up ES
- Run `./create_es.sh` to create the indices for the torrents: `nyaa` and `sukebei`
  - The output should show `acknowledged: true` twice
- Run `python import_to_es.py` to import all the torrents (on `nyaa` and `sukebei`) into the ES indices.
Enable the `USE_ELASTIC_SEARCH` flag in `config.py` and (re)start the application.
Elasticsearch should now be functional! The ES indices won't be updated "live" with the current setup; continue below for instructions on how to hook Elasticsearch up to the MySQL binlog.
However, take note that binlogging is not necessary for simple ES testing and development; you can simply run `import_to_es.py` from time to time to reindex all the torrents.
### Setting up sync_es.py
`sync_es.py` keeps the Elasticsearch indices updated by reading the binlog and pushing the changes to the ES indices.
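Conceptually, the flow looks like the sketch below. This is a simplified illustration built on the `python-mysql-replication` package, not the actual `sync_es.py` implementation, and it reuses the example credentials from the database setup:

```python
# Simplified sketch of the binlog -> Elasticsearch flow; NOT the real sync_es.py.
# Assumes the python-mysql-replication package (pip install mysql-replication).
from pymysqlreplication import BinLogStreamReader
from pymysqlreplication.row_event import DeleteRowsEvent, UpdateRowsEvent, WriteRowsEvent

stream = BinLogStreamReader(
    connection_settings={'host': 'localhost', 'port': 3306, 'user': 'test', 'passwd': 'test123'},
    server_id=2,  # must differ from the MySQL server's own server_id
    only_events=[WriteRowsEvent, UpdateRowsEvent, DeleteRowsEvent],
    blocking=True,  # keep waiting for new binlog events
)
for event in stream:
    # The real script would translate each row change into an ES index/update/delete.
    for row in event.rows:
        print(type(event).__name__, row)
```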
- Make sure `es_sync_config.json` is configured with the user you granted the `REPLICATION` permissions
- Run `import_to_es.py` and copy the outputted JSON into the file specified by `save_loc` in your `es_sync_config.json`
- Run `sync_es.py` as-is or, for actual deployment, set it up as a service and run it, preferably as the system/root user
  - Make sure `sync_es.py` runs within the venv with the right dependencies!

You're done! The script should now be feeding updates from the database to Elasticsearch.
Take note, however, that the specified ES index refresh interval is 30 seconds, which may feel like a long time in local development. Feel free to adjust it (see the sketch below) or poke Elasticsearch yourself!
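For example, the refresh interval can be lowered at runtime through Elasticsearch's index settings API. A stdlib-only sketch (the `nyaa` index name comes from the setup above; `5s` is just an arbitrary faster value):

```python
# set_refresh.py - lower the refresh_interval of the 'nyaa' index for local development.
import json
from urllib.request import Request, urlopen

body = json.dumps({'index': {'refresh_interval': '5s'}}).encode()
req = Request('http://localhost:9200/nyaa/_settings', data=body, method='PUT',
              headers={'Content-Type': 'application/json'})
with urlopen(req) as resp:
    print(json.load(resp))  # expect: {'acknowledged': True}
```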