NNGeometry


NNGeometry allows you to:

  • compute Fisher Information Matrices (FIM) or derived quantities, using efficient approximations such as low-rank matrices, KFAC, diagonal, and so on.
  • compute finite-width Neural Tangent Kernels (Gram matrices), even for networks with multiple outputs.
  • compute per-example Jacobians of the loss w.r.t. network parameters, or of any function such as the network's output.
  • easily and efficiently compute linear algebra operations involving these matrices, regardless of their approximation.
  • compute implicit operations on these matrices, which do not require explicitly storing large matrices that would not fit in memory.
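To make the Gram-matrix view of the finite-width NTK concrete: if J stacks the per-example Jacobians of a scalar-output network (one row per example, one column per parameter), the NTK Gram matrix is K = J Jᵀ. A minimal NumPy sketch of that relationship (this illustrates the math only, not NNGeometry's API; the array values are made up):

```python
import numpy as np

# Per-example Jacobians of a scalar-output network: one row per example,
# one column per parameter (values are made up for illustration).
jac = np.array([[1.0, 0.0, 2.0],
                [0.0, 1.0, 1.0]])

# Finite-width NTK Gram matrix: K[i, j] = <df(x_i)/dw, df(x_j)/dw>
gram = jac @ jac.T

print(gram)  # 2x2 symmetric matrix
```

NNGeometry's NTK objects compute and store this kind of Gram matrix for you, including the multi-output case where each entry becomes a block.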

Example

In the Elastic Weight Consolidation continual learning technique, you want to compute the regularizer (w - w_a)ᵀ F (w - w_a), where w_a are the parameters learned on a previous task and F is the Fisher Information Matrix computed at w_a. It can be obtained with a diagonal approximation of the FIM using:

from nngeometry.metrics import FIM
from nngeometry.object import PMatDiag

F = FIM(model=model,
        loader=loader,
        representation=PMatDiag,
        n_output=10)

regularizer = F.vTMv(w - w_a)
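For the diagonal representation, vTMv reduces to a weighted squared distance: with diagonal Fisher entries Fᵢ and d = w - w_a, it computes Σᵢ Fᵢ dᵢ². A NumPy sketch of that quantity, using the empirical Fisher (mean of squared per-example gradients) as the diagonal estimate — the numbers are made up and this is not NNGeometry code:

```python
import numpy as np

# Per-example gradients of the log-likelihood (made-up values):
# one row per example, one column per parameter.
grads = np.array([[0.5, -1.0],
                  [1.5,  0.0]])

# Diagonal empirical Fisher: mean of squared per-example gradients.
fisher_diag = (grads ** 2).mean(axis=0)      # -> [1.25, 0.5]

d = np.array([0.2, -0.4])                    # plays the role of w - w_a
regularizer = float(fisher_diag @ (d ** 2))  # v^T M v for a diagonal M

print(regularizer)
```

With richer representations such as KFAC, vTMv instead involves Kronecker-factored matrix products, but the call site stays the same.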

If the diagonal approximation is not sufficiently accurate, you can instead choose a KFAC approximation by simply changing PMatDiag to PMatKFAC in the snippet above. Note that very different operations are involved internally, depending on the chosen representation (e.g. KFAC, EKFAC, ...).

Documentation

You can visit the documentation at https://nngeometry.readthedocs.io.

More usage examples are available in the repository tfjgeorge/nngeometry-examples.

Feature requests, bugs or contributions

We welcome any feature request or bug report in the issue tracker.

We also welcome contributions, please submit your PRs!

Citation

If you use NNGeometry in a published project, please cite our work using the following BibTeX entry:

@software{george_nngeometry,
  author       = {Thomas George},
  title        = {{NNGeometry: Easy and Fast Fisher Information 
                   Matrices and Neural Tangent Kernels in PyTorch}},
  month        = feb,
  year         = 2021,
  publisher    = {Zenodo},
  version      = {v0.2.1},
  doi          = {10.5281/zenodo.4532597},
  url          = {https://doi.org/10.5281/zenodo.4532597}
}

License

This project is distributed under the MIT license (see the LICENSE file). It also includes code licensed under the BSD 3-Clause license, as it borrows some code from owkin/grad-cnns.
