Project Name | Description | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language
---|---|---|---|---|---|---|---|---
Tensorflow Examples | TensorFlow Tutorial and Examples for Beginners (support TF v1 & v2) | 42,312 | 5 months ago | | | 218 | other | Jupyter Notebook
Fashion Mnist | A MNIST-like fashion product database. Benchmark :point_down: | 9,856 | a year ago | | | 24 | mit | Python
Alae | [CVPR2020] Adversarial Latent Autoencoders | 2,850 | 2 years ago | | | 31 | | Python
Tensorflow 101 | TensorFlow Tutorials | 2,450 | 3 years ago | | | 15 | mit | Jupyter Notebook
Ganomaly | GANomaly: Semi-Supervised Anomaly Detection via Adversarial Training | 767 | 3 months ago | | | 44 | mit | Python
Medmnist | 18 MNIST-like Datasets for 2D and 3D Biomedical Image Classification: pip install medmnist | 701 | 2 months ago | 3 | May 06, 2022 | 2 | apache-2.0 | Python
Science_rcn | Reference implementation of a two-level RCN model | 664 | a year ago | | | 21 | mit | Python
Tf Dann | Domain-Adversarial Neural Network in Tensorflow | 580 | a year ago | | | 10 | mit | Jupyter Notebook
Free Spoken Digit Dataset | A free audio dataset of spoken digits. Think MNIST for audio. | 518 | 3 months ago | | | 7 | | Python
Kmnist | Repository for Kuzushiji-MNIST, Kuzushiji-49, and Kuzushiji-Kanji | 490 | 2 years ago | | | 8 | cc-by-sa-4.0 | Python

This source code seeks to replicate the (now removed) MNIST For ML Beginners tutorial from the TensorFlow website using straightforward PHP code. Hopefully, this example will make that tutorial a bit more approachable for PHP developers.

The task is to recognise handwritten digits, such as the ones below, as accurately as possible.
Run the example from the command line:

    php mnist.php
The neural network implemented has a single output layer and no hidden layers, which makes it equivalent to multinomial logistic regression. Softmax activation ensures that the output activations form a probability vector over the ten digit labels, and cross-entropy is used as the loss function.
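The two building blocks described above can be sketched in a few lines of PHP. This is a standalone illustration of softmax and cross-entropy, not the actual `mnist.php` implementation; the function names here are assumptions for the example only.

```php
<?php
// Softmax: turn raw scores (logits) into a probability vector that sums to 1.
function softmax(array $logits): array
{
    $max = max($logits); // subtract the max for numerical stability
    $exps = array_map(fn($z) => exp($z - $max), $logits);
    $sum = array_sum($exps);
    return array_map(fn($e) => $e / $sum, $exps);
}

// Cross-entropy loss of predicted probabilities against a one-hot label vector.
function crossEntropy(array $probs, array $oneHot): float
{
    $loss = 0.0;
    foreach ($oneHot as $i => $y) {
        if ($y > 0) {
            $loss -= $y * log($probs[$i]);
        }
    }
    return $loss;
}

// The largest logit receives the largest probability.
$probs = softmax([2.0, 1.0, 0.1]);
$loss = crossEntropy($probs, [1, 0, 0]);
```

Because the label vector is one-hot, the cross-entropy reduces to the negative log-probability the model assigns to the correct digit.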
This algorithm can achieve an accuracy of around 92% (with a batch size of 100 and 1000 training steps). That said, you are likely to get bored well before that point with PHP.
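With no hidden layers, training is simple: the gradient of the cross-entropy with respect to each logit is just `p_k - y_k` (predicted probability minus the one-hot label), which is then multiplied by the input pixels to update the weights. The sketch below shows one training step in that spirit; it is illustrative PHP under assumed variable shapes, not the actual `mnist.php` code, and for brevity it applies updates per example rather than accumulating a batch gradient first.

```php
<?php
// One pass of gradient descent for softmax regression.
// $weights: one row of inputs-many weights per class; $biases: one per class.
// $batch: list of [input vector, integer label] pairs. Returns the average loss.
function trainStep(array &$weights, array &$biases, array $batch, float $lr): float
{
    $n = count($batch);
    $totalLoss = 0.0;
    foreach ($batch as [$pixels, $label]) {
        // Forward pass: logits for each class, then softmax probabilities.
        $probs = [];
        foreach ($weights as $k => $row) {
            $z = $biases[$k];
            foreach ($pixels as $j => $x) {
                $z += $row[$j] * $x;
            }
            $probs[$k] = $z;
        }
        $max = max($probs);
        $sum = 0.0;
        foreach ($probs as $k => $z) { $probs[$k] = exp($z - $max); $sum += $probs[$k]; }
        foreach ($probs as $k => $e) { $probs[$k] = $e / $sum; }

        $totalLoss -= log($probs[$label]);

        // Backward pass: dL/dz_k = p_k - y_k; nudge weights and biases downhill.
        foreach ($probs as $k => $p) {
            $err = $p - ($k === $label ? 1.0 : 0.0);
            $biases[$k] -= $lr * $err / $n;
            foreach ($pixels as $j => $x) {
                $weights[$k][$j] -= $lr * $err * $x / $n;
            }
        }
    }
    return $totalLoss / $n;
}
```

Repeated calls on the same data should drive the average loss down, mirroring the falling loss in the sample output further down.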
Sample output:

    Loading training dataset... (may take a while)
    Loading test dataset... (may take a while)
    Starting training...
    Step 0001 Average Loss 4.12 Accuracy: 0.19
    Step 0002 Average Loss 3.21 Accuracy: 0.23
    Step 0003 Average Loss 2.59 Accuracy: 0.32
    Step 0004 Average Loss 2.43 Accuracy: 0.36
    Step 0005 Average Loss 1.87 Accuracy: 0.45
    Step 0006 Average Loss 2.06 Accuracy: 0.47
    Step 0007 Average Loss 1.67 Accuracy: 0.51
    Step 0008 Average Loss 1.81 Accuracy: 0.46
    Step 0009 Average Loss 1.74 Accuracy: 0.55
    Step 0010 Average Loss 1.24 Accuracy: 0.56
    ...