| Project Name | Description | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
|---|---|---|---|---|---|---|---|---|
| Tensorflow Examples | TensorFlow Tutorial and Examples for Beginners (support TF v1 & v2) | 42,312 | 5 months ago | | | 218 | other | Jupyter Notebook |
| Fashion Mnist | A MNIST-like fashion product database. Benchmark :point_down: | 9,856 | a year ago | | | 24 | mit | Python |
| Alae | [CVPR2020] Adversarial Latent Autoencoders | 2,850 | 2 years ago | | | 31 | | Python |
| Tensorflow 101 | TensorFlow Tutorials | 2,450 | 3 years ago | | | 15 | mit | Jupyter Notebook |
| Ganomaly | GANomaly: Semi-Supervised Anomaly Detection via Adversarial Training | 767 | 3 months ago | | | 44 | mit | Python |
| Medmnist | 18 MNIST-like Datasets for 2D and 3D Biomedical Image Classification: pip install medmnist | 701 | 2 months ago | 3 | May 06, 2022 | 2 | apache-2.0 | Python |
| Science_rcn | Reference implementation of a two-level RCN model | 664 | a year ago | | | 21 | mit | Python |
| Tf Dann | Domain-Adversarial Neural Network in Tensorflow | 580 | a year ago | | | 10 | mit | Jupyter Notebook |
| Free Spoken Digit Dataset | A free audio dataset of spoken digits. Think MNIST for audio. | 518 | 3 months ago | | | 7 | | Python |
| Kmnist | Repository for Kuzushiji-MNIST, Kuzushiji-49, and Kuzushiji-Kanji | 490 | 2 years ago | | | 8 | cc-by-sa-4.0 | Python |
This source code replicates the (now removed) MNIST For ML Beginners tutorial from the TensorFlow website in plain C.
The task is to recognise handwritten digits as accurately as possible.
To build and run:

```
make
./mnist
```
The network has a single fully connected output layer and no hidden layers. A softmax activation ensures that the output activations form a probability vector over the ten digit labels, and cross-entropy is used as the loss function.
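For concreteness, here is a minimal standalone C sketch of the softmax and cross-entropy computations described above. This is not the repository's code; the function names and sample logits are illustrative only (compile with `-lm`):

```c
#include <math.h>
#include <stdio.h>

#define NUM_LABELS 10

/* Softmax: exponentiate each logit and normalise so the outputs sum to 1.
 * Subtracting the max logit first keeps exp() from overflowing. */
static void softmax(const double logits[NUM_LABELS], double probs[NUM_LABELS])
{
    double max = logits[0];
    for (int i = 1; i < NUM_LABELS; i++)
        if (logits[i] > max)
            max = logits[i];

    double sum = 0.0;
    for (int i = 0; i < NUM_LABELS; i++) {
        probs[i] = exp(logits[i] - max);
        sum += probs[i];
    }
    for (int i = 0; i < NUM_LABELS; i++)
        probs[i] /= sum;
}

/* Cross-entropy against a one-hot label reduces to -log of the
 * probability assigned to the correct class. */
static double cross_entropy(const double probs[NUM_LABELS], int label)
{
    return -log(probs[label]);
}

int main(void)
{
    /* Illustrative logits, not real network output. */
    double logits[NUM_LABELS] = {0.1, 2.0, -1.3, 0.0, 0.5,
                                 0.2, -0.7, 1.1, 0.0, 0.3};
    double probs[NUM_LABELS];

    softmax(logits, probs);
    printf("loss for label 1: %f\n", cross_entropy(probs, 1));
    return 0;
}
```

Subtracting the maximum logit before exponentiating does not change the softmax result but is the standard way to keep `exp()` numerically stable.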
The network reaches an accuracy of around 92% after 1000 training steps. Sample output:
```
Step 0000 Average Loss: 4.36 Accuracy: 0.152
Step 0001 Average Loss: 3.42 Accuracy: 0.188
Step 0002 Average Loss: 2.97 Accuracy: 0.298
Step 0003 Average Loss: 2.53 Accuracy: 0.319
Step 0004 Average Loss: 2.19 Accuracy: 0.412
Step 0005 Average Loss: 2.08 Accuracy: 0.437
Step 0006 Average Loss: 1.73 Accuracy: 0.468
Step 0007 Average Loss: 1.51 Accuracy: 0.447
Step 0008 Average Loss: 1.57 Accuracy: 0.496
Step 0009 Average Loss: 1.45 Accuracy: 0.516
Step 0010 Average Loss: 1.78 Accuracy: 0.559
...
```
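The updates behind that log are plain gradient descent. Because the gradient of cross-entropy through a softmax with respect to logit `j` is simply `probs[j] - target[j]`, the per-example weight update can be written directly. The sketch below assumes 28×28 inputs scaled to [0, 1]; the names `sgd_step` and `learning_rate` are illustrative, not taken from this repository:

```c
#define NUM_PIXELS (28 * 28)
#define NUM_LABELS 10

/* One stochastic-gradient-descent update for the single-layer softmax
 * model: for cross-entropy over a softmax, the gradient of the loss
 * with respect to logit j is (probs[j] - target[j]), where target is
 * the one-hot encoding of the correct label. */
void sgd_step(double weights[NUM_LABELS][NUM_PIXELS],
              double biases[NUM_LABELS],
              const double image[NUM_PIXELS],  /* pixels scaled to [0, 1] */
              const double probs[NUM_LABELS],  /* softmax output for image */
              int label,                       /* correct digit, 0-9 */
              double learning_rate)
{
    for (int j = 0; j < NUM_LABELS; j++) {
        double grad = probs[j] - (j == label ? 1.0 : 0.0);
        biases[j] -= learning_rate * grad;
        for (int i = 0; i < NUM_PIXELS; i++)
            weights[j][i] -= learning_rate * grad * image[i];
    }
}
```

Repeating this step over shuffled training examples (or averaging the gradient over a mini-batch) is what drives the loss curve above.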