Implementation of the Convolutional Conditional Neural Process

Demonstration of a ConvCNP

Convolutional Conditional Neural Processes

This repository contains code for the 1-dimensional experiments from Convolutional Conditional Neural Processes. The code for the 2-dimensional experiments can be found here.



Requirements

  • Python 3.6 or higher.

  • gcc and gfortran: On OS X, these are both installed with brew install gcc. On Linux, gcc is most likely already available, and gfortran can be installed with apt-get install gfortran.
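The Python version requirement above can be checked before setting up the environment. This is a convenience snippet, not part of the repository itself:

```python
import sys

# The repo requires Python 3.6 or higher; raise early if the interpreter
# is too old rather than failing later during installation.
required = (3, 6)
if sys.version_info < required:
    raise RuntimeError("Python %d.%d or higher is required" % required)
print("Python version OK: %d.%d" % sys.version_info[:2])
```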

To begin with, clone and enter the repo.

git clone
cd convcnp

Then make a virtual environment and install the requirements.

virtualenv -p python3 venv
source venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt

This will install the latest version of torch. If your installed CUDA version is older than the one the latest torch build targets, you may need to install an earlier version of torch.

You should now be ready to go! If you encounter any problems, feel free to open an issue, and we will try to help you resolve it as soon as possible.

Common issues:

  • fatal error: Python.h: No such file or directory: Python libraries seem to be missing. Try sudo apt-get install python3.X-dev with X replaced by your particular version.

Expository Notebooks

For a tutorial-style exposition of ConvCNPs, see the following two expository notebooks:

Reproducing the 1D Experiments

To reproduce the numbers from the 1d experiments, run python <data> <model> --train. The first argument, <data>, specifies the data that the model will be trained on and should be one of the following:

  • eq: samples from a GP with an exponentiated quadratic (EQ) kernel;
  • matern: samples from a GP with a Matérn-5/2 kernel;
  • noisy-mixture: samples from a GP with a mixture of two EQ kernels and some noise;
  • weakly-periodic: samples from a GP with a weakly-periodic kernel; or
  • sawtooth: random sawtooth functions.
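To give a feel for what these data sources produce, the following is a minimal NumPy sketch (not the repository's data generator) of drawing one sample from a GP with an EQ kernel; the length scale and jitter values are assumptions for illustration:

```python
import numpy as np

def eq_kernel(x1, x2, scale=0.25):
    # Exponentiated quadratic (EQ / RBF) kernel; `scale` is an assumed
    # length scale, not a value taken from the repository.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / scale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 50)
K = eq_kernel(x, x) + 1e-6 * np.eye(len(x))  # jitter for numerical stability
y = rng.multivariate_normal(np.zeros(len(x)), K)  # one GP sample
print(y.shape)  # (50,)
```

Replacing the kernel function yields the Matérn, mixture, and weakly-periodic variants in the same way.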

The second argument, <model>, specifies the model that will be trained, and should be one of the following:

  • convcnp: small architecture for the Convolutional Conditional Neural Process;
  • convcnpxl: large architecture for the Convolutional Conditional Neural Process;
  • cnp: Conditional Neural Process; or
  • anp: Attentive Conditional Neural Process.
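What distinguishes the ConvCNP models from the others is that they embed the context set as a function on a uniform grid before applying a CNN. Below is a hypothetical NumPy sketch of that encoding step (a density channel plus a normalised data channel via an RBF kernel); the function name and length scale are illustrative assumptions, not the repository's API:

```python
import numpy as np

def set_conv_encode(x_ctx, y_ctx, x_grid, scale=0.1):
    # Hypothetical sketch of a ConvCNP-style set encoder: represent the
    # context set as two channels on a uniform grid. `scale` is an assumed
    # RBF length scale.
    w = np.exp(-0.5 * ((x_grid[:, None] - x_ctx[None, :]) / scale) ** 2)
    density = w.sum(axis=1)                          # channel 0: point density
    signal = w @ y_ctx / np.maximum(density, 1e-8)   # channel 1: normalised values
    return np.stack([density, signal], axis=0)       # shape (2, len(x_grid))

x_ctx = np.array([-1.0, 0.0, 0.5])   # context inputs
y_ctx = np.array([0.3, -0.2, 1.0])   # context outputs
x_grid = np.linspace(-2, 2, 64)
emb = set_conv_encode(x_ctx, y_ctx, x_grid)
print(emb.shape)  # (2, 64)
```

A CNN applied to this grid-valued embedding gives the model its translation equivariance.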

Upon calling python <data> <model> --train, the specified model will first be trained on the specified data source. Afterwards, the script prints the average log-likelihood on unseen data.
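The reported metric is an average log-likelihood under the model's Gaussian predictions. As a hedged sketch (function name and example values are illustrative, not the repository's code), it amounts to:

```python
import math

def avg_log_likelihood(y, mean, sigma):
    # Mean Gaussian log density of targets y under predicted means and
    # standard deviations -- the kind of metric the script reports.
    ll = 0.0
    for t, m, s in zip(y, mean, sigma):
        ll += -0.5 * math.log(2 * math.pi * s * s) - (t - m) ** 2 / (2 * s * s)
    return ll / len(y)

# A confident, accurate prediction receives a high average log-likelihood.
print(avg_log_likelihood([0.0, 1.0], [0.0, 1.0], [0.1, 0.1]))
```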

To reproduce the numbers from all the 1d experiments from the paper at once, you can use ./

For more options, please see python --help:

usage: [-h] [--root ROOT] [--train] [--epochs EPOCHS]
                [--learning_rate LEARNING_RATE] [--weight_decay WEIGHT_DECAY]

positional arguments:
                        Data set to train the CNP on.
                        Choice of model.

optional arguments:
  -h, --help            show this help message and exit
  --root ROOT           Experiment root, which is the directory from which the
                        experiment will run. If it is not given, a directory
                        will be automatically created.
  --train               Perform training. If this is not specified, the script
                        will attempt to load the model from the experiment root.
  --epochs EPOCHS       Number of epochs to train for.
  --learning_rate LEARNING_RATE
                        Learning rate.
  --weight_decay WEIGHT_DECAY
                        Weight decay.
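The help output above corresponds to a standard argparse setup. The following is a hypothetical reconstruction for illustration; the default values are assumptions, not taken from the repository:

```python
import argparse

# Hypothetical parser mirroring the --help output above; defaults are
# illustrative assumptions.
parser = argparse.ArgumentParser()
parser.add_argument('data', help='Data set to train the CNP on.')
parser.add_argument('model', help='Choice of model.')
parser.add_argument('--root', help='Experiment root directory.')
parser.add_argument('--train', action='store_true', help='Perform training.')
parser.add_argument('--epochs', type=int, default=100)
parser.add_argument('--learning_rate', type=float, default=1e-3)
parser.add_argument('--weight_decay', type=float, default=0.0)

args = parser.parse_args(['eq', 'convcnp', '--train', '--epochs', '10'])
print(args.model, args.epochs)  # convcnp 10
```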


Gordon, J., Bruinsma, W. P., Foong, A. Y. K., Requeima, J., Dubois, Y., and Turner, R. E. (2020). "Convolutional Conditional Neural Processes," 8th International Conference on Learning Representations (ICLR).


@inproceedings{gordon2020convolutional,
    title = {Convolutional Conditional Neural Processes},
    author = {Jonathan Gordon and Wessel P. Bruinsma and Andrew Y. K. Foong and James Requeima and Yann Dubois and Richard E. Turner},
    year = {2020},
    booktitle = {International Conference on Learning Representations},
    url = {}
}