NNGP: Deep Neural Network Kernel for Gaussian Process

TensorFlow open source implementation of

Deep Neural Networks as Gaussian Processes

by Jaehoon Lee*, Yasaman Bahri*, Roman Novak, Sam Schoenholz, Jeffrey Pennington, Jascha Sohl-dickstein

Presented at the International Conference on Learning Representations (ICLR) 2018.

UPDATE (September 2020):

See also Neural Tangents: Fast and Easy Infinite Neural Networks in Python (ICLR 2020), available at github.com/google/neural-tangents, for more up-to-date progress on computing NNGP as well as NT kernels, supporting a wide variety of architectural components.


A deep neural network with i.i.d. priors over its parameters is equivalent to a Gaussian process in the limit of infinite network width. The Neural Network Gaussian Process (NNGP) is fully described by a covariance kernel determined by the corresponding architecture.

This code constructs the covariance kernel for the Gaussian process that is equivalent to an infinitely wide, fully connected, deep neural network.
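For a ReLU network, each layer of this kernel has a closed form (the arc-cosine kernel of Cho & Saul), so the depth recursion can be sketched in a few lines of NumPy. The function below is an illustrative sketch, not the repo's implementation; the default `weight_var`/`bias_var` values are taken from the example command in this README.

```python
import numpy as np

def nngp_kernel_relu(X, depth, weight_var=1.79, bias_var=0.83):
    """Sketch of the NNGP covariance for an infinitely wide, fully
    connected ReLU network, via the arc-cosine kernel recursion."""
    d = X.shape[1]
    # Layer-0 covariance induced by i.i.d. Gaussian weights and biases.
    K = bias_var + weight_var * (X @ X.T) / d
    for _ in range(depth):
        diag = np.sqrt(np.diag(K))
        norm = np.outer(diag, diag)
        cos_t = np.clip(K / norm, -1.0, 1.0)
        theta = np.arccos(cos_t)
        # E[relu(u) relu(v)] for (u, v) jointly Gaussian with covariance K.
        K = bias_var + weight_var * norm * (
            np.sin(theta) + (np.pi - theta) * cos_t) / (2 * np.pi)
    return K
```

The recursion only propagates second moments, which is why the infinite-width network is fully characterized by this single matrix.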


To use the code, run run_experiments.py, which uses the NNGP kernel to make a full Bayesian prediction on the MNIST dataset.

python run_experiments.py \
       --num_train=100 \
       --num_eval=10000 \
       --hparams='nonlinearity=relu,depth=100,weight_var=1.79,bias_var=0.83'
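Once the kernel matrices are computed, "full Bayesian prediction" is standard Gaussian process regression: condition the GP prior on the training labels. The sketch below shows the posterior mean and variance under an assumed observation-noise parameter; the function name and signature are illustrative, not the script's actual interface.

```python
import numpy as np

def gp_predict(K_train, K_cross, K_test_diag, y_train, noise=1e-8):
    """GP posterior given precomputed kernel blocks.
    K_train: (n, n) train/train kernel, K_cross: (m, n) test/train kernel,
    K_test_diag: (m,) diagonal of the test/test kernel."""
    n = K_train.shape[0]
    L = np.linalg.cholesky(K_train + noise * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_cross @ alpha                       # posterior mean
    v = np.linalg.solve(L, K_cross.T)
    var = K_test_diag - np.sum(v**2, axis=0)     # posterior variance
    return mean, var
```

Because the NNGP kernel is fixed by the architecture and hyperparameters, no gradient training is involved; prediction reduces to this single linear solve.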


Code authors: Jaehoon Lee, Yasaman Bahri, Roman Novak

Pull requests and issues: @jaehlee


If you use this code, please cite our paper:

    @article{lee2018deep,
        title={Deep Neural Networks as Gaussian Processes},
        author={Lee, Jaehoon and Bahri, Yasaman and Novak, Roman and Schoenholz, Sam and Pennington, Jeffrey and Sohl-Dickstein, Jascha},
        journal={International Conference on Learning Representations},
        year={2018}
    }


This is not an official Google product.
