MAML-Pytorch

An elegant PyTorch implementation of the paper Model-Agnostic Meta-Learning (MAML).
PyTorch implementation of the supervised learning experiments from the paper: Model-Agnostic Meta-Learning (MAML).

Version 1.0: both the MiniImagenet and Omniglot datasets are supported. Have fun!

Version 2.0: rewrote the meta learner and the basic learner, and fixed several serious bugs in version 1.0.

For a TensorFlow implementation, please visit the official version HERE and a simpler version HERE.

For a first-order approximation implementation, namely Reptile, please visit HERE.
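
As background for the distinction above: full MAML backpropagates the query-set loss through the inner adaptation step, while first-order MAML (FOMAML) simply applies the query gradient, evaluated at the adapted weights, to the meta-parameters (Reptile instead moves the meta-weights toward the adapted weights). Here is a minimal FOMAML sketch on a toy 1-D linear-regression task family; every name and the toy task are illustrative, not code from this repository:

```python
# Minimal first-order MAML (FOMAML) sketch on a toy 1-D regression family.
# Everything here is illustrative; it is not code from this repository.
import random

random.seed(0)

def loss_and_grad(w, xs, ys):
    """Mean squared error of the model y = w * x, and d(loss)/dw."""
    n = len(xs)
    loss = sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / n
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
    return loss, grad

def sample_task():
    """Each task is regression toward a different hidden slope w_true."""
    w_true = random.uniform(-2.0, 2.0)
    xs = [random.uniform(-1.0, 1.0) for _ in range(10)]
    ys = [w_true * x for x in xs]
    return xs[:5], ys[:5], xs[5:], ys[5:]  # support set, then query set

meta_w, inner_lr, meta_lr = 0.0, 0.1, 0.05
for step in range(2000):
    xs_s, ys_s, xs_q, ys_q = sample_task()
    # Inner loop: one gradient step on the support set.
    _, g_support = loss_and_grad(meta_w, xs_s, ys_s)
    adapted_w = meta_w - inner_lr * g_support
    # Outer loop, first-order: the query gradient at the adapted weights
    # updates the meta-parameters directly (no second-order terms).
    _, g_query = loss_and_grad(adapted_w, xs_q, ys_q)
    meta_w -= meta_lr * g_query
```

Full MAML would additionally differentiate `g_query` through the `adapted_w = meta_w - inner_lr * g_support` step, which is what makes it a second-order method.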



  • Python: 3.x
  • PyTorch: 0.4+



For the 5-way 1-shot experiment, it allocates nearly 6 GB of GPU memory.

  1. Download the MiniImagenet dataset from here, and the train/val/test.csv split files from here.
  2. Extract it like:
miniimagenet/
├── images/
│   ├── n0210891500001298.jpg
│   ├── n0287152500001298.jpg
│   └── ...
├── test.csv
├── val.csv
└── train.csv

  3. Modify the path in
        mini = MiniImagenet('miniimagenet/', mode='train', n_way=args.n_way, k_shot=args.k_spt,
                            batchsz=10000, resize=args.imgsz)
        mini_test = MiniImagenet('miniimagenet/', mode='test', n_way=args.n_way, k_shot=args.k_spt,
                                 batchsz=100, resize=args.imgsz)

to your actual data path.

  4. Just run the training script; a screenshot of a running session: screenshot-miniimagenet
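
The n_way, k_shot, and batchsz arguments above control how episodes are sampled. As a rough illustration of what an N-way K-shot episode looks like, here is a hypothetical sampler over a {class name: image paths} index; the function, index, and file names are made up for this sketch and are not this repository's actual loader:

```python
# Hypothetical N-way K-shot episode sampler; names are illustrative only.
import random

def sample_episode(index, n_way, k_shot, k_query, rng=random):
    """Pick n_way classes; give each k_shot support and k_query query images."""
    classes = rng.sample(sorted(index), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        # Draw k_shot + k_query distinct images from this class.
        paths = rng.sample(index[cls], k_shot + k_query)
        support += [(p, label) for p in paths[:k_shot]]
        query += [(p, label) for p in paths[k_shot:]]
    rng.shuffle(support)
    rng.shuffle(query)
    return support, query

# Toy index standing in for the images/ folder plus the train.csv split.
index = {f"class{c:02d}": [f"class{c:02d}_{i}.jpg" for i in range(20)]
         for c in range(10)}
support, query = sample_episode(index, n_way=5, k_shot=1, k_query=15)
# 5-way 1-shot: 5 support images and 5 * 15 query images per episode.
```

A real loader additionally reads and resizes the images and stacks them into tensors, but the episode structure is the same.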

If your reproduced performance is not as good, you can increase the number of training epochs to train longer. MAML is notorious for being hard to train, so this implementation only provides a basic starting point for your research. The performance below is real and was achieved on my machine.


| Model         | Fine Tune | 5-way 1-shot | 5-way 5-shot | 20-way 1-shot | 20-way 5-shot |
|---------------|-----------|--------------|--------------|---------------|---------------|
| Matching Nets | N         | 43.56%       | 55.31%       | 17.31%        | 22.69%        |
| Meta-LSTM     |           | 43.44%       | 60.60%       | 16.70%        | 26.06%        |
| MAML          | Y         | 48.7%        | 63.11%       | 16.49%        | 19.29%        |
| Ours          | Y         | 46.2%        | 60.3%        | -             | -             |



Run the training script; the program will download the Omniglot dataset automatically.

Decrease the value of args.task_num to fit your GPU memory capacity.

For the 5-way 1-shot experiment, it allocates nearly 3 GB of GPU memory.

Refer to this repo.

@misc{MAML_Pytorch,
  author = {Liangqu Long},
  title = {MAML-Pytorch Implementation},
  year = {2018},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{}},
  commit = {master}
}