RandAugment


Unofficial PyTorch reimplementation of RandAugment. Most of the code is taken from Fast AutoAugment.


Models can be trained with RandAugment on the dataset of interest with no need for a separate proxy task. By tuning only two hyperparameters (N, M), you can achieve performance competitive with AutoAugment.
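To make those two hyperparameters concrete, here is a minimal sketch of the RandAugment idea (a toy for illustration, not this repository's implementation): each call samples N operations uniformly at random and applies each one at a shared magnitude M. The four ops and the 0-30 magnitude scale below are assumptions chosen for brevity.

import random
from PIL import ImageEnhance, ImageOps

# Illustrative op list; each op maps (PIL image, magnitude in [0, 30]) -> PIL image.
OPS = [
    lambda img, m: ImageOps.autocontrast(img),                         # ignores magnitude
    lambda img, m: img.rotate(m),                                      # magnitude as degrees, max 30
    lambda img, m: ImageEnhance.Color(img).enhance(0.1 + 1.8 * m / 30.0),
    lambda img, m: ImageOps.solarize(img, 256 - int(256 * m / 30.0)),
]

class ToyRandAugment:
    def __init__(self, n, m):
        self.n, self.m = n, m  # n: ops applied per image, m: shared magnitude

    def __call__(self, img):
        # Sample n ops with replacement and apply them sequentially.
        for op in random.choices(OPS, k=self.n):
            img = op(img, self.m)
        return img

The real op list is richer (the paper uses 14 transformations), but the control flow is exactly this: no search phase, just two scalars to tune.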


Install

$ pip install git+


Usage

from torchvision.transforms import transforms
from RandAugment import RandAugment

# CIFAR-10 channel statistics (commonly used values)
_CIFAR_MEAN, _CIFAR_STD = (0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)
N, M = 3, 5  # example values; tune both for your dataset

transform_train = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.ToTensor(),  # Normalize expects a tensor, not a PIL image
    transforms.Normalize(_CIFAR_MEAN, _CIFAR_STD),
])

# Add RandAugment with N, M (hyperparameters) as the first transform,
# so it runs on the PIL image before ToTensor.
transform_train.transforms.insert(0, RandAugment(N, M))
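
As a quick end-to-end check, the resulting pipeline plugs straight into a torchvision dataset; this is a sketch, and the dataset root and loader settings are arbitrary choices.

from torch.utils.data import DataLoader
from torchvision.datasets import CIFAR10

# Uses transform_train from above; downloads CIFAR-10 on first use.
dataset = CIFAR10(root='./data', train=True, download=True, transform=transform_train)
loader = DataLoader(dataset, batch_size=256, shuffle=True, num_workers=4)
images, labels = next(iter(loader))  # augmented batch of shape (256, 3, 32, 32)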


We used the same hyperparameters as mentioned in the paper and observed similar results to those reported.

You can run an experiment with:

$ python RandAugment/ -c confs/wresnet28x10_cifar10_b256.yaml --save cifar10_wres28x10.pth

CIFAR-10 Classification

Model               Paper's Result   Ours
Wide-ResNet 28x10   97.3             97.4
Shake26 2x96d       98.0             98.1
Pyramid272          98.5             -

CIFAR-100 Classification

Model               Paper's Result   Ours
Wide-ResNet 28x10   83.3             83.3

SVHN Classification

Model               Paper's Result   Ours
Wide-ResNet 28x10   98.9             98.8

ImageNet Classification

I have experienced some difficulties while reproducing the paper's results.

Model             Paper's Result (top-1 / top-5)   Ours
ResNet-50         77.6 / 92.8                      TODO
EfficientNet-B5   83.2 / 96.7                      TODO
EfficientNet-B7   84.4 / 97.1                      TODO

