|Project Name|Stars|Most Recent Commit|Total Releases|Latest Release|Open Issues|License|Language|Description|
|---|---|---|---|---|---|---|---|---|
|Pyinstrument|5,342|20 hours ago|38|July 03, 2022|18|bsd-3-clause|Python|🚴 Call stack profiler for Python. Shows you why your code is slow!|
|Learning|1,205|8 days ago||||Python||
|Django Cachalot|1,037|6 days ago|39|February 25, 2022|27|bsd-3-clause|Python|No effort, no worry, maximum performance.|
|Django Bulk Update|416|a year ago|24|August 12, 2017|17|mit|Python|Bulk update using one query over Django ORM|
|Collectfast|375|a year ago|36|June 05, 2020|14|mit|Python|A faster collectstatic command.|
|Django Perf Rec|324|21 hours ago|39|June 05, 2022|7|mit|Python|Keep detailed records of the performance of your Django code.|
|Django Speedbar|320|4 years ago|7|January 26, 2016|6|mit|Python|Instrumentation for django page loads|
|Django Postgres Metrics|133|4 months ago|20|April 02, 2022|5|bsd-3-clause|Python|A Django application that exposes a bunch of PostgreSQL database metrics.|
|Django Virtual Models|130|a day ago||||mit|Python|Improve performance and maintainability with a prefetching layer in your Django project|
|Django Admin Lightweight Date Hierarchy|88|8 months ago|5|February 03, 2022|2|mit|Python|Django Admin date_hierarchy with zero queries|
Caches your Django ORM queries and automatically invalidates them.
Cachalot officially supports Python 3.7-3.10 and Django 2.2, 3.2, and 4.0-4.1, with PostgreSQL, SQLite, and MySQL as supported databases.
Note: an upper limit on Django version is set for your safety. Please do not ignore it.
pip install django-cachalot
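As a minimal sketch of what enabling cachalot typically looks like after installing the package (the Redis backend and location are illustrative assumptions; check cachalot's documentation for the exact settings your setup needs):

```python
# settings.py (sketch) -- assumes Django >= 4.0 for the built-in Redis backend.
# The cache LOCATION is a placeholder; point it at your own cache server.

INSTALLED_APPS = [
    "django.contrib.contenttypes",
    "django.contrib.auth",
    "cachalot",  # adds cachalot's ORM query caching
]

CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379",
    }
}
```

With something like this in place, ORM queries are cached in the default cache and invalidated when the underlying tables change.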
To start developing, install the requirements and run the tests via tox.
Make sure you have the required database and cache services (e.g. PostgreSQL, MySQL, Redis, Memcached) installed and running:
pip install -r requirements/hacking.txt
CREATE ROLE cachalot LOGIN SUPERUSER;
Run tox --current-env to run the test suite on your current Python version.
tox -e py38-django3.1-postgresql-redis
Currently, benchmarks are supported on Linux and Mac/Darwin. You will need a database called "cachalot" on both MySQL and PostgreSQL. Additionally, on PostgreSQL, you will need to create a role called "cachalot". You can also simply run the benchmark, and it will raise errors with specific instructions on how to fix them.
pip install -r requirements/benchmark.txt
The output will be in benchmark/TODAY'S_DATE/
TODO: create a docker-compose file to make running the benchmarks easier.
There are three main third party caches: cachalot, cache-machine, and cache-ops. Which do you use? We suggest a mix:
TL;DR: use cachalot for cold caches or tables modified fewer than ~50 times per minute. Most people should stick with cachalot alone, since you most likely won't need to scale to the point of adding cache-machine to the mix. If you're an enterprise that already has detailed usage statistics, then combining cachalot for your cold caches with cache-machine for your hot caches is the best mix. However, when performing joins with
prefetch_related, you can
get a nearly 100x speed-up for your initial deployment.
Recall that cachalot caches THE ENTIRE TABLE. That's where its inefficiency stems from: if you keep updating the records, cachalot constantly invalidates the table and re-caches it. Luckily, caching itself is very efficient; it's the cache invalidation part that kills all our systems. Look at Note 1 below to see how Reddit deals with it.
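To make the cost concrete, here's a toy model of table-level caching. This is only an illustration of the strategy described above, not cachalot's actual implementation:

```python
# Toy model of table-level query caching: all cached queries for a table
# live in one bucket, and ANY write to the table empties that bucket.

class TableCache:
    def __init__(self):
        self.store = {}   # table name -> {query: result}
        self.hits = 0
        self.misses = 0

    def get(self, table, query, run_query):
        results = self.store.setdefault(table, {})
        if query in results:
            self.hits += 1
        else:
            self.misses += 1
            results[query] = run_query()
        return results[query]

    def invalidate(self, table):
        # A single write throws away EVERY cached query for the table.
        self.store.pop(table, None)

cache = TableCache()
cache.get("auth_user", "SELECT ... WHERE id=1", lambda: {"id": 1})
cache.get("auth_user", "SELECT ... WHERE id=1", lambda: {"id": 1})  # hit
cache.invalidate("auth_user")  # one UPDATE wipes the whole table's cache
cache.get("auth_user", "SELECT ... WHERE id=1", lambda: {"id": 1})  # miss again
print(cache.hits, cache.misses)  # 1 hit, 2 misses
```

On a rarely written (cold) table the bucket survives for a long time and the hit rate climbs; on a hot table the bucket keeps getting emptied, which is exactly the failure mode described above.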
Cachalot is more or less intended for cold caches or "just-right" conditions. If you find a partitioning library for Django (one is also being authored, as a work in progress, by Andrew Chen Wang), then caching will work better, since shards holding the cold/least-accessed records aren't invalidated as often.
Cachalot is good when there are fewer than ~50 modifications per minute on a hot cached table. This is mostly due to cache invalidation; it's the same with any cache, which is why we suggest you use cache-machine for hot caches. Cache-machine caches individual objects, taking up more space in the memory store, but it invalidates those individual objects instead of the entire table like cachalot does.
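The difference in invalidation granularity can be sketched in a few lines (toy dictionaries for illustration, not either library's real code):

```python
# Two caches holding the same two objects, invalidated in the two styles
# described above when a write to user 1 arrives.

table_cache = {"user:1": "...", "user:2": "..."}   # cachalot-style: one bucket per table
object_cache = {"user:1": "...", "user:2": "..."}  # cache-machine-style: one entry per object

# A write to user 1 arrives:
table_cache.clear()               # table-level: everything for the table goes
object_cache.pop("user:1", None)  # object-level: only user 1 goes

print(len(table_cache), len(object_cache))  # 0 1
```

Under a write-heavy workload the object-level cache keeps its untouched entries warm, at the cost of storing (and managing) one entry per object.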
Yes, the bane of our entire existence lies in cache invalidation and naming variables. Why does cachalot struggle with a huge table that's modified rapidly? Since your cold records (90% of the table) are mixed with your hot records (the other 10%), you're caching and invalidating the entire table at once. It's like trying to boil one ton of noodles in ONE pot instead of splitting them across 100 pots. Splitting them up is far more efficient.
Note 1: My personal experience with caches stems from Reddit's: https://redditblog.com/2017/01/17/caching-at-reddit/
Note 2: Technical comparison: https://django-cachalot.readthedocs.io/en/latest/introduction.html#comparison-with-similar-tools
Help? Technical chat? It's here on Discord.