MNIST Neural Network Plain C

A neural network implementation for the MNIST dataset, written in plain C.

MNIST Neural Network in C

This source code seeks to replicate the (now removed) MNIST For ML Beginners tutorial from the TensorFlow website using plain C code.

The task is to recognise digits, such as the ones below, as accurately as possible.

MNIST digits

By AndrewCarterUK (Twitter)

The implementation is split across three source files:
  • mnist.c: Glue code that runs the algorithm steps and reports algorithm accuracy
  • mnist_file.c: Retrieves images and labels from the MNIST dataset
  • neural_network.c: Implements training and prediction routines for a simple neural network

The neural network has a single output layer and no hidden layers, which makes it equivalent to multinomial logistic regression. Softmax activation ensures that the output activations form a probability distribution over the ten digit labels, and cross entropy is used as the loss function.

The algorithm reaches an accuracy of around 92% over 1000 steps.

Expected Output

Step 0000	Average Loss: 4.36	Accuracy: 0.152
Step 0001	Average Loss: 3.42	Accuracy: 0.188
Step 0002	Average Loss: 2.97	Accuracy: 0.298
Step 0003	Average Loss: 2.53	Accuracy: 0.319
Step 0004	Average Loss: 2.19	Accuracy: 0.412
Step 0005	Average Loss: 2.08	Accuracy: 0.437
Step 0006	Average Loss: 1.73	Accuracy: 0.468
Step 0007	Average Loss: 1.51	Accuracy: 0.447
Step 0008	Average Loss: 1.57	Accuracy: 0.496
Step 0009	Average Loss: 1.45	Accuracy: 0.516
Step 0010	Average Loss: 1.78	Accuracy: 0.559

training evolution
