Probabilistic programming via source rewriting

Soss is a library for probabilistic programming.

Let's look at an example. First we'll load things:

using MeasureTheory
using Soss

MeasureTheory.jl is designed specifically with PPLs like Soss in mind, though you can also use Distributions.jl.
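
As a hedged sketch (the model below is illustrative, not taken from the Soss docs), a model can use Distributions.jl distributions in place of MeasureTheory's measures:

```julia
using Soss
using Distributions

# Illustrative only: Normal and Exponential here come from
# Distributions.jl rather than MeasureTheory.jl.
m_dists = @model begin
    μ ~ Normal(0, 1)
    σ ~ Exponential(1)
    y ~ Normal(μ, σ)
    return y
end
```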

Now for a model. Here's a linear regression:

m = @model x begin
    α ~ Lebesgue(ℝ)
    β ~ Normal()
    σ ~ Exponential()
    y ~ For(x) do xj
        Normal(α + β * xj, σ)
    end
    return y
end

Next we'll generate some fake data to work with. For x-values, let's use

x = randn(20)

Now loosely speaking, Lebesgue(ℝ) is uniform over the real numbers, so we can't really sample from it. Instead, let's transform the model and make α an argument:

julia> predα = predictive(m, :α)
@model (x, α) begin
        σ ~ Exponential()
        β ~ Normal()
        y ~ For(x) do xj
                Normal(α + β * xj, σ)
            end
        return y
    end

Now we can do

julia> y = rand(predα(x=x,α=10.0))
20-element Vector{Float64}:

Now for inference! Let's use DynamicHMC, which we have wrapped in SampleChainsDynamicHMC.

julia> using SampleChainsDynamicHMC
[ Info: Precompiling SampleChainsDynamicHMC [6d9fd711-e8b2-4778-9c70-c1dfb499d4c4]

julia> post = sample(m(x=x) | (y=y,), dynamichmc())
4000-element MultiChain with 4 chains and schema (σ = Float64, β = Float64, α = Float64)
(σ = 1.0±0.15, β = 0.503±0.26, α = 10.2±0.25)
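
Once sampled, per-parameter draws can be summarized directly. This is a hedged sketch assuming the property-style access suggested by the schema printed above (`σ`, `β`, `α` as columns of the MultiChain):

```julia
using Statistics

# post.β is assumed to give the vector of posterior draws for β,
# matching the (σ = Float64, β = Float64, α = Float64) schema above.
β_draws = post.β
mean(β_draws), std(β_draws)
```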

How is Soss different from Turing?

First, a fine point: When people say "the Turing PPL" they usually mean what's technically called "DynamicPPL".

  • In Soss, models are first class, and can be composed or nested. For example, you can define a model and later nest it inside another model, and inference will handle both together. DynamicPPL can also handle nested models (see this PR) though I'm not aware of a way to combine independently-defined DynamicPPL models for a single inference pass.
  • Soss has been updated to use MeasureTheory.jl, though everything from Distributions.jl is still available.
  • Soss allows model transformations. This can be used, for example, to easily express predictive distributions or a Markov blanket as a new model.
  • Most of the focus of Soss is at the syntactic level; inference works in terms of "primitives" that transform the model's abstract syntax tree (AST) to new code. This adds the same benefits as using Julia's macros and generated functions, as opposed to higher-order functions alone.
  • Soss can evaluate log-densities symbolically, which can then be used to produce optimized evaluations for much faster inference. This capability is in relatively early stages, and will be made more robust in our ongoing development.
  • The Soss team is much smaller than that of DynamicPPL, but I hope that will change (contributors welcome!).
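
The first point above (first-class, composable models) can be sketched as follows; the syntax is illustrative and the model names are hypothetical:

```julia
using Soss
using MeasureTheory

# Hypothetical sketch: one Soss model nested inside another.
inner = @model begin
    μ ~ Normal()
    return μ
end

outer = @model begin
    m ~ inner()          # the inner model is sampled as a component
    y ~ Normal(m, 1.0)
    return y
end

# Inference on `outer` then handles both models together.
```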

Soss and DynamicPPL are both maturing and becoming more complete, so the above will change over time. It's also worth noting that we (the Turing team and I) hope to move toward a natural way of using these systems together to arrive at the best of both.

How can I get involved?

I'm glad you asked! There are lots of ways to contribute; for details, please see the documentation.
