sgd is an R package for large-scale estimation. It features many stochastic gradient methods, built-in models, visualization tools, automated hyperparameter tuning, model checking, interval estimation, and convergence diagnostics.
At the core of the package is the function

```r
sgd(formula, data, model, model.control, sgd.control)
```

It estimates parameters for a given data set and model using stochastic gradient descent. The optional arguments `model.control` and `sgd.control` specify attributes of the model and of the stochastic gradient method. Taking advantage of the bigmemory package, sgd also operates on data sets which are too large to fit in RAM, as well as on streaming data.
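As an illustration of the big-data path, here is a sketch using bigmemory's file-backed matrices. The fitting call at the end is hypothetical and left commented out: consult the package vignette for the exact interface sgd expects for out-of-core data.

```r
library(bigmemory)

# Write a data set to disk, then attach it as a file-backed big.matrix,
# so that only a small window of it needs to live in RAM at once.
dat <- data.frame(y = rnorm(1e4), x = rnorm(1e4))
write.csv(dat, "dat.csv", row.names = FALSE)
big.dat <- read.big.matrix("dat.csv", header = TRUE,
                           backingfile = "dat.bin",
                           descriptorfile = "dat.desc")

# Hypothetical fit on the file-backed data; see the vignette for the
# call signature sgd actually supports:
# fit <- sgd(y ~ x, data = big.dat, model = "lm")
```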
Example of large-scale linear regression:

```r
library(sgd)

# Dimensions
N <- 1e5  # number of data points
d <- 1e2  # number of features

# Generate data.
X <- matrix(rnorm(N * d), ncol = d)
theta <- rep(5, d + 1)
eps <- rnorm(N)
y <- cbind(1, X) %*% theta + eps
dat <- data.frame(y = y, x = X)

sgd.theta <- sgd(y ~ ., data = dat, model = "lm")
```
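For intuition about what such a call does, here is a minimal base-R sketch of explicit stochastic gradient descent for the same linear model, at a smaller scale. This is an illustration only; the package itself also offers implicit updates, averaging, and other refinements.

```r
set.seed(42)

# Simulate data from the same linear model as above, at a smaller scale.
N <- 1e4
d <- 10
X <- cbind(1, matrix(rnorm(N * d), ncol = d))  # design matrix with intercept
theta_true <- rep(5, d + 1)
y <- as.vector(X %*% theta_true + rnorm(N))

# One pass of explicit SGD for least squares with a decaying step size.
theta <- rep(0, d + 1)
for (n in seq_len(N)) {
  a_n <- 1 / (20 + n)                    # Robbins-Monro step size
  resid <- y[n] - sum(X[n, ] * theta)    # scalar residual for observation n
  theta <- theta + a_n * resid * X[n, ]  # gradient step on the squared loss
}

max(abs(theta - theta_true))  # typically small after a single pass
```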
Any loss function may be specified. For convenience, a number of models and stochastic gradient methods are built in; see the package documentation for the full lists.
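The method and its settings can be chosen through `sgd.control`. A sketch follows; the particular option names used here (`method = "ai-sgd"` for averaged implicit SGD and `npasses`) are assumptions from memory, so check `?sgd` for the authoritative list.

```r
library(sgd)

# Small linear-regression data set, as in the example above.
N <- 1e4
d <- 10
X <- matrix(rnorm(N * d), ncol = d)
y <- cbind(1, X) %*% rep(5, d + 1) + rnorm(N)
dat <- data.frame(y = y, x = X)

# Hypothetical control settings: averaged implicit SGD, three passes.
fit <- sgd(y ~ ., data = dat, model = "lm",
           sgd.control = list(method = "ai-sgd", npasses = 3))
```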
Check out the vignette in `vignettes/` or the examples in `demo/`. In R, the equivalent commands are `vignette(package="sgd")` and `demo(package="sgd")`.
To install the latest version from CRAN:

```r
install.packages("sgd")
```

To install the latest development version from GitHub:

```r
# install.packages("devtools")
devtools::install_github("airoldilab/sgd")
```
sgd is written by Dustin Tran and Panos Toulis, and is under active development. Please feel free to contribute by submitting issues or feature requests, or by solving any open issues!
We thank all other members of the Airoldi Lab (led by Prof. Edo Airoldi) for their feedback and contributions.
To cite the methodology behind the package:

```bibtex
@article{tran2015stochastic,
  author  = {Tran, Dustin and Toulis, Panos and Airoldi, Edoardo M.},
  title   = {Stochastic gradient descent methods for estimation with large data sets},
  journal = {arXiv preprint arXiv:1509.06459},
  year    = {2015}
}
```