A Binary Multi-Layer Neural Network

An elementary feedforward multilayer perceptron neural network designed to pattern-match basic binary operators such as AND, OR, and XOR. The network will usually converge successfully when "hinted" values or express settings are used; this is the best option for users interested in the functionality of the program who do not want to do a deep dive into the actual calculations and granular details of neural network training.

Eventually the program will be fully implemented using TensorFlow in Python for maximum performance for testing purposes, while the ground-up Java code in this repository will be used as a teaching and experimental low-level tool.

What it does:

The binary operators AND, OR, and XOR are the basis of propositional logic, which is fundamental to a great many topics in mathematics and computer science. When university students take their first course in propositional logic or discrete mathematics, they learn the truth tables for these operators; this neural network does the same. Given a truth table, the neural network cannot correctly apply the operators in its initial (untrained) state. After proper training, however, the network can "learn" the correct pattern for each operator and apply it with a high degree of accuracy.
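The three truth tables the network is asked to learn can be written out directly; a minimal sketch in Python (the dictionary layout is illustrative, not the project's actual data format):

```python
# Truth tables for the three binary operators the network learns.
# Each maps an input pair (a, b) to the expected output.
TRUTH_TABLES = {
    "AND": {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1},
    "OR":  {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1},
    "XOR": {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0},
}

for op, table in TRUTH_TABLES.items():
    for (a, b), out in sorted(table.items()):
        print(f"{a} {op} {b} = {out}")
```

Note that AND and OR are linearly separable and learnable by a single perceptron, while XOR is not, which is why a hidden layer is needed at all.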

As an example, if a user selects the "XOR" (exclusive or) option from the console and does not use custom weights, the results for the 4 possible inputs to the neural network (i.e. 0,1 0,0 1,0 1,1) will simply be random floating-point values x such that 0 <= x <= 1. These outputs occur because the initial weights in the network are set to random values. The network can be trained easily by selecting "express" training, which also sets key training parameters to ideal values for successful training (i.e. a sufficiently low error rate). Training adjusts the network's weights until the network produces outputs within the specified margin of error. (Note: this process can take anywhere from thousands to billions of iterative adjustments, depending on the margin of error the network must meet.) Once trained, the network should reproduce "XOR" to a high degree of accuracy: given the inputs 0,1 it is likely to produce a result such as 0.99xxx, which is considered sufficiently close to the correct answer of 1.
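The "random floating-point values" from an untrained network follow directly from random initial weights feeding sigmoid units. A minimal sketch (the 2-2-1 layer sizes and the weight range are assumptions for illustration, not the project's actual settings):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(1)

# Untrained 2-2-1 network: every weight and bias is random.
w_hidden = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_hidden = [random.uniform(-1, 1) for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(2)]
b_out = random.uniform(-1, 1)

def forward(a, b):
    hidden = [sigmoid(a * w[0] + b * w[1] + bias)
              for w, bias in zip(w_hidden, b_hidden)]
    return sigmoid(sum(h * w for h, w in zip(hidden, w_out)) + b_out)

for inputs in [(0, 1), (0, 0), (1, 0), (1, 1)]:
    out = forward(*inputs)
    # The sigmoid guarantees 0 < out < 1, but with random weights
    # the value carries no information about XOR.
    print(inputs, "->", round(out, 4))
```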

Note: as of October 2017, trained neural networks can be saved for later analysis. This allows for ensemble-style learning as well as other research opportunities.

How it works:

The basis of any neural network is a series of neurons and weights (connections with synaptic strengths between neurons) [see diagram]. A neural network may have i input neurons, o output neurons, and some number of hidden neurons arranged in layers between the input and output neurons. When a neural network is initialized, its weights are set to random floating-point values, so when given an input a it is unlikely to produce the desired output b. Through the process of training, however, the network's weights can be adjusted so that it produces an output very near the desired output b. This is achieved by iteratively combining several discrete processes. On each iteration, the error for the network is first calculated using the mean squared error (MSE). If the error is above the user-selected threshold, the contribution of each weight to the error is calculated for gradient descent (a process which can be likened to a ball rolling in a hilly valley, trying to find the lowest point). The weights are then updated backwards across the network, from output to input (the reverse of the feed-forward direction), using backpropagation, and the process repeats.
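The iterative loop described above (forward pass, MSE check, gradient descent, backpropagation) can be sketched end-to-end on XOR. This is a generic sigmoid-MLP implementation, not the project's Java code; the hidden-layer size, learning rate, seed, and error threshold are all assumed values:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(42)

# XOR training data: input pairs and desired outputs.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

N_HIDDEN = 4
LEARNING_RATE = 0.5
ERROR_THRESHOLD = 0.001   # user-selected MSE threshold

# Random initial weights: 2 inputs -> N_HIDDEN hidden -> 1 output.
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(N_HIDDEN)]
b_h = [random.uniform(-1, 1) for _ in range(N_HIDDEN)]
w_o = [random.uniform(-1, 1) for _ in range(N_HIDDEN)]
b_o = random.uniform(-1, 1)

def forward(x):
    hidden = [sigmoid(x[0] * w[0] + x[1] * w[1] + b)
              for w, b in zip(w_h, b_h)]
    out = sigmoid(sum(h * w for h, w in zip(hidden, w_o)) + b_o)
    return hidden, out

for epoch in range(100000):
    mse = 0.0
    for x, target in DATA:
        hidden, out = forward(x)
        err = target - out
        mse += err * err
        # Backpropagation: output delta first, then hidden deltas
        # (computed before the output weights are changed).
        delta_o = err * out * (1 - out)
        delta_h = [delta_o * w_o[j] * hidden[j] * (1 - hidden[j])
                   for j in range(N_HIDDEN)]
        # Gradient-descent weight updates.
        for j in range(N_HIDDEN):
            w_o[j] += LEARNING_RATE * delta_o * hidden[j]
            w_h[j][0] += LEARNING_RATE * delta_h[j] * x[0]
            w_h[j][1] += LEARNING_RATE * delta_h[j] * x[1]
            b_h[j] += LEARNING_RATE * delta_h[j]
        b_o += LEARNING_RATE * delta_o
    mse /= len(DATA)
    if mse < ERROR_THRESHOLD:
        break

for x, target in DATA:
    print(x, "->", round(forward(x)[1], 3), "(target", target, ")")
```

The stopping condition mirrors the "express" behaviour described earlier: training runs until the MSE drops below the threshold, and a tighter threshold means more iterations.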

Save functionality for trained networks operates using the Java Serializable interface, an addition made by u/Blackspade741.
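For the planned Python version, the analogous save/load round trip could use the standard pickle module (Python's counterpart to Java object serialization). The `NeuralNetwork` class and file name here are hypothetical stand-ins, not names from the project:

```python
import os
import pickle
import tempfile

class NeuralNetwork:
    """Hypothetical stand-in for the project's trained network state."""
    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias

net = NeuralNetwork(weights=[0.3, -1.2], bias=0.5)

# Serialize the trained network to disk for later analysis,
# analogous to Java's Serializable / ObjectOutputStream.
path = os.path.join(tempfile.gettempdir(), "trained_net.pkl")
with open(path, "wb") as f:
    pickle.dump(net, f)

with open(path, "rb") as f:
    restored = pickle.load(f)

print("round trip ok:", restored.weights, restored.bias)
```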

Development Plan:

The Java implementation currently functions completely while the TensorFlow implementation (tf) is in development; once the TensorFlow version is fully functional, it will inherit the Java version's user interface and be used purely for demonstration purposes. At that point the Java version will be given more granular controls for use in experimentation and for teaching purposes. In addition, performance-testing functionality will be added and used to compare the two implementations of the neural network, both the bottom-up one and the one using TensorFlow. Note: at present the TensorFlow version can be used to generate potential weights for use in our neural network; however, this functionality is in its initial stages and has yet to be validated for accuracy.


This project is largely inspired by, and based on information and techniques from, Jeff Heaton's videos and lectures on neural networks posted on YouTube.

I would also like to acknowledge Ray Kurzweil's excellent book "How to Create a Mind", which provided inspiration for this project as well as a broad overview of machine learning.

For more technical details, and to fill in gaps and expand on information from the previously mentioned lectures and videos, I used Wikipedia's excellent in-depth pages on backpropagation, perceptrons, multi-layer perceptrons, activation functions, the logistic curve, and gradient descent.
