| Project Name | Description | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
|---|---|---|---|---|---|---|---|---|
| Interviews.ai | "It is my belief that you, the postgraduate students and job-seekers for whom the book is primarily meant, will benefit from reading it; however, it is my hope that even the most experienced researchers will find it fascinating as well." | 3,146 | 2 years ago | | | 4 | | |
| Awesome Maths Learning | 😎 📜 Collection of awesome maths learning resources: notes, videos, and cheat sheets. | 269 | 3 months ago | | | | gpl-3.0 | |
| Hsic Bottleneck | The HSIC Bottleneck: Deep Learning without Back-Propagation | 56 | 3 years ago | | | 2 | mit | Python |
| Mcr2 | Official implementation of "Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction" (2020) | 52 | 3 years ago | | | | | Python |
| Pisac | TensorFlow 2 source code for the PI-SAC agent from "Predictive Information Accelerates Learning in RL" (NeurIPS 2020) | 39 | a year ago | | | 1 | apache-2.0 | Python |
| Information Dropout | Implementation of Information Dropout | 34 | 7 years ago | | | 2 | other | Python |
| Information Bottleneck | Demonstration of the information bottleneck theory for deep learning | 20 | 7 years ago | | | | | Jupyter Notebook |
| Unsupervisedattentionmechanism | Code for the paper "Regularity Normalization: Neuroscience-Inspired Unsupervised Attention across Neural Network Layers" | 19 | 2 years ago | | | | | Jupyter Notebook |
| Infotopopy | Computes most of the common information functions (joint entropy, conditional entropy, mutual information, total correlation, information distance) and deep information networks | 17 | 2 years ago | 11 | October 11, 2020 | | other | Python |
| Deep Bottleneck | Repository of the study project "Understanding learning in deep neural networks with the help of information theory" | 17 | 4 years ago | | | 11 | other | Jupyter Notebook |
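The Hsic Bottleneck entry above trains networks using the Hilbert-Schmidt Independence Criterion instead of back-propagated gradients. As a minimal sketch of the underlying statistic (not the repository's API), the biased empirical HSIC estimate between two samples can be computed with Gaussian kernels; the function names and the default bandwidth below are illustrative assumptions:

```python
import numpy as np

def gaussian_gram(x, sigma=1.0):
    """Gram matrix of a Gaussian (RBF) kernel over the rows of x."""
    sq = np.sum(x ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * x @ x.T  # pairwise squared distances
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC estimate between paired samples x and y."""
    n = x.shape[0]
    k = gaussian_gram(x, sigma)
    l = gaussian_gram(y, sigma)
    h = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(k @ h @ l @ h) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
z = rng.normal(size=(100, 1))  # independent of x
# HSIC of a variable with itself far exceeds HSIC with independent noise.
print(hsic(x, x), hsic(x, z))
```

Larger HSIC indicates stronger statistical dependence; the HSIC-bottleneck objective rewards each layer for high HSIC with the labels and low HSIC with the raw input.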
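The information functions listed in the Infotopopy entry (joint entropy, conditional entropy, mutual information) all reduce to Shannon entropies of a joint distribution. As a toy sketch of those relationships, independent of the package, using a perfectly correlated pair of bits (the 2×2 distribution is an illustrative example):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution of two bits X, Y that are always equal (rows: x, cols: y).
pxy = np.array([[0.5, 0.0],
                [0.0, 0.5]])

px = pxy.sum(axis=1)  # marginal of X
py = pxy.sum(axis=0)  # marginal of Y

h_x = entropy(px)                    # H(X)  = 1 bit
h_y = entropy(py)                    # H(Y)  = 1 bit
h_xy = entropy(pxy.ravel())          # H(X,Y) = 1 bit
mi = h_x + h_y - h_xy                # I(X;Y) = 1 bit
h_x_given_y = h_xy - h_y             # H(X|Y) = 0 bits
print(mi, h_x_given_y)
```

Because Y determines X exactly, the mutual information equals the full entropy of X and the conditional entropy vanishes; for independent variables the same identities give I(X;Y) = 0.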