Neural Financial Prediction Encog Java

We report on our analysis of the feasibility of using artificial neural networks (ANNs) to perform financial time series prediction. Using subsets of 7 different financial time series, we generate and train 29,718 multi-layer perceptron (MLP) NNs and then evaluate them on test data consisting of S&P 500 end-of-day time series data. Our evaluation metric is the accuracy with which each NN predicts the direction of the S&P 500's end-of-day price on the subsequent day.
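
The repository does not spell out the exact preprocessing, but the sketch below shows one way such a direction-labeled data set could be assembled with Encog. The class name `DirectionDataset`, the use of simple daily returns over the lag window as inputs, and the ±1 direction label are illustrative assumptions, not the project's actual pipeline.

```java
import org.encog.ml.data.MLDataSet;
import org.encog.ml.data.basic.BasicMLDataSet;

/** Sketch: turn an end-of-day price series into a direction-labeled data set. */
public class DirectionDataset {

    /**
     * Builds an Encog data set in which each input row holds the previous
     * `lag` daily returns and the single ideal value is +1 when the next
     * day's close is higher than today's close, -1 otherwise.
     */
    public static MLDataSet fromCloses(double[] closes, int lag) {
        int rows = closes.length - lag - 1;           // last usable window
        double[][] input = new double[rows][lag];
        double[][] ideal = new double[rows][1];

        for (int row = 0; row < rows; row++) {
            for (int j = 0; j < lag; j++) {
                // simple daily return over the lag window
                input[row][j] = closes[row + j + 1] / closes[row + j] - 1.0;
            }
            double today = closes[row + lag];
            double tomorrow = closes[row + lag + 1];
            ideal[row][0] = tomorrow > today ? 1.0 : -1.0;   // direction label
        }
        return new BasicMLDataSet(input, ideal);
    }
}
```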

Our analysis proceeds in two stages. First, we generate 29,718 different NNs by allowing numerous parameters to vary, including the temporal lag, how many days back in time the training data starts, the number of hidden layers, the neuron count within each hidden layer, the training algorithm, and the subset of the 7 financial time series used as inputs. From this analysis, we find that, with the single exception of the backpropagation training algorithm, all of the training algorithms have mean accuracies between 48% and 52% but maximum accuracies clustered between 60% and 70%. The various Resilient Propagation (RPROP) and Quick Propagation (QPROP) training algorithms have the highest maximum accuracies. In the second stage of the analysis, we reduce the NN parameter space to only the RPROP and QPROP training algorithms, and we focus on extracting finer-granularity results with respect to the remaining parameters. We find that mean accuracy increases roughly linearly with the temporal lag of the training data. Additionally, we find that increasing the number of neurons in the first and second hidden layers increases mean accuracy, but adding neurons in a third or fourth hidden layer does not significantly affect the results. Finally, we use these results to train and test 1,000 optimized NNs and compare their mean accuracies with those from 1,000 baseline NNs. We find that the optimized NNs produce a mean directional accuracy that is 5.6% greater than the baseline case.
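
As a rough illustration of the kind of experiment described above, the sketch below builds an MLP with a configurable number of hidden layers, trains it with Encog's RPROP trainer, and scores directional accuracy on a held-out set. The class name `DirectionModel`, the tanh activations, and the stopping criterion are assumptions made for illustration (Encog's `QuickPropagation` trainer could be substituted for the `ResilientPropagation` instance); this is not the parameter sweep the project actually ran.

```java
import org.encog.engine.network.activation.ActivationTANH;
import org.encog.ml.data.MLDataPair;
import org.encog.ml.data.MLDataSet;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;
import org.encog.neural.networks.training.propagation.Propagation;
import org.encog.neural.networks.training.propagation.resilient.ResilientPropagation;

public class DirectionModel {

    /** Build an MLP: `lag` inputs, the given hidden layer sizes, one output. */
    public static BasicNetwork build(int lag, int... hiddenSizes) {
        BasicNetwork network = new BasicNetwork();
        network.addLayer(new BasicLayer(null, true, lag));                 // input layer
        for (int size : hiddenSizes) {
            network.addLayer(new BasicLayer(new ActivationTANH(), true, size));
        }
        network.addLayer(new BasicLayer(new ActivationTANH(), false, 1)); // direction output
        network.getStructure().finalizeStructure();
        network.reset();
        return network;
    }

    /** Train with RPROP until the error target or the epoch budget is hit. */
    public static void train(BasicNetwork network, MLDataSet training,
                             double targetError, int maxEpochs) {
        Propagation trainer = new ResilientPropagation(network, training);
        int epoch = 0;
        do {
            trainer.iteration();
            epoch++;
        } while (trainer.getError() > targetError && epoch < maxEpochs);
        trainer.finishTraining();
    }

    /** Fraction of test days on which the predicted direction matches the actual one. */
    public static double directionalAccuracy(BasicNetwork network, MLDataSet test) {
        int correct = 0;
        int total = 0;
        for (MLDataPair pair : test) {
            double predicted = network.compute(pair.getInput()).getData(0);
            double actual = pair.getIdeal().getData(0);
            if (Math.signum(predicted) == Math.signum(actual)) {
                correct++;
            }
            total++;
        }
        return total == 0 ? 0.0 : (double) correct / total;
    }
}
```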
