sgd

sgd is an R package for large scale estimation. It features many stochastic gradient methods, built-in models, visualization tools, automated hyperparameter tuning, model checking, interval estimation, and convergence diagnostics.

Features

At the core of the package is the function

sgd(formula, data, model, model.control, sgd.control)

It estimates parameters for a given data set and model using stochastic gradient descent. The optional arguments model.control and sgd.control specify attributes of the model and of the stochastic gradient method, respectively. Taking advantage of the bigmemory package, sgd also operates on data sets too large to fit in RAM, as well as on streaming data.
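
For data on disk, a minimal sketch of the bigmemory path follows; the file name is hypothetical, and whether sgd accepts a big.matrix directly depends on the installed version, so check ?sgd first:

library(bigmemory)

# Hypothetical CSV on disk; read.big.matrix creates a file-backed matrix
# that never has to fit entirely in RAM.
dat.big <- read.big.matrix("large_data.csv", header=TRUE, type="double",
                           backingfile="large_data.bin",
                           descriptorfile="large_data.desc")
# Fit as usual once the data are attached; see ?sgd for the interface your
# version exposes for big.matrix input.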

Example of large-scale linear regression:

library(sgd)

# Dimensions
N <- 1e5  # number of data points
d <- 1e2  # number of features

# Generate data (seed fixed so the example is reproducible).
set.seed(42)
X <- matrix(rnorm(N*d), ncol=d)
theta <- rep(5, d+1)
eps <- rnorm(N)
y <- cbind(1, X) %*% theta + eps
dat <- data.frame(y=y, x=X)

sgd.theta <- sgd(y ~ ., data=dat, model="lm")
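
As a quick sanity check, the estimates can be compared against the true theta; that the fitted object stores them in $coefficients is an assumption, so inspect str(sgd.theta) if your version differs:

# Root mean squared error of the estimates against the true parameters
# (assumes the fit stores its estimates in $coefficients).
sqrt(mean((sgd.theta$coefficients - theta)^2))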

Any loss function may be specified. For convenience, the following models are built in (a logistic-regression sketch follows the list):

  • Linear models
  • Generalized linear models
  • Method of moments
  • Generalized method of moments
  • Cox proportional hazards model
  • M-estimation
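
For example, a minimal logistic-regression sketch with the built-in GLM model; passing a glm-style family through model.control, and the simulated-data names below, are assumptions made for illustration:

# Simulate binary outcomes from the same design matrix X as above.
theta.bin <- rnorm(d + 1, sd=0.1)  # modest effects keep probabilities interior
p <- plogis(cbind(1, X) %*% theta.bin)
y.bin <- rbinom(N, 1, p)
dat.bin <- data.frame(y=y.bin, x=X)

# Assumes model.control accepts a glm-style family object.
sgd.logit <- sgd(y ~ ., data=dat.bin, model="glm",
                 model.control=list(family=binomial()))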

The following stochastic gradient methods are available, selected through sgd.control (see the sketch after the list):

  • (Standard) stochastic gradient descent
  • Implicit stochastic gradient descent
  • Averaged stochastic gradient descent
  • Averaged implicit stochastic gradient descent
  • Classical momentum
  • Nesterov's accelerated gradient
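
A hedged sketch of switching methods; the method label "ai-sgd" and the npasses option are assumptions about what sgd.control recognizes, so consult ?sgd for the exact names:

# Refit the linear model with averaged implicit SGD over several passes
# (assumes sgd.control recognizes `method` and `npasses`).
sgd.ai <- sgd(y ~ ., data=dat, model="lm",
              sgd.control=list(method="ai-sgd", npasses=5))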

Check out the vignette in vignettes/ or examples in demo/. In R, the equivalent commands are vignette(package="sgd") and demo(package="sgd").

Installation

To install the latest version from CRAN:

install.packages("sgd")

To install the latest development version from GitHub:

# install.packages("devtools")
devtools::install_github("airoldilab/sgd")

Authors

sgd is written by Dustin Tran and Panos Toulis, and is under active development. Please feel free to contribute by submitting issues or feature requests, or by tackling open issues!

We thank all other members of the Airoldi Lab (led by Prof. Edo Airoldi) for their feedback and contributions.

Citation

@article{tran2015stochastic,
  author = {Tran, Dustin and Toulis, Panos and Airoldi, Edoardo M},
  title = {Stochastic gradient descent methods for estimation with large data sets},
  journal = {arXiv preprint arXiv:1509.06459},
  year = {2015}
}