mgplvm.likelihoods module

mgplvm.likelihoods.FA_init(Y, d=None)[source]
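FA_init carries no docstring here; given its name and arguments, it presumably produces a factor-analysis-style initialization of d latent dimensions from the data matrix Y. The idea can be sketched with a truncated SVD of the mean-centered data. The function name `fa_init_sketch` and the (n x m) = (neurons x time bins) shape convention are assumptions for illustration, not mgplvm's actual implementation:

```python
import numpy as np

def fa_init_sketch(Y, d):
    """Project mean-centered data onto its top-d principal subspace.

    Y : array of shape (n, m)  -- neurons x time bins (convention assumed)
    d : number of latent dimensions
    Returns latent trajectories of shape (d, m).
    """
    Yc = Y - Y.mean(axis=1, keepdims=True)            # center each neuron
    U, S, Vt = np.linalg.svd(Yc, full_matrices=False)  # thin SVD
    return (S[:d, None] * Vt[:d]) / np.sqrt(Y.shape[0])  # scaled scores

rng = np.random.default_rng(0)
Y = rng.normal(size=(10, 50))
X = fa_init_sketch(Y, 3)
print(X.shape)  # (3, 50)
```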
class mgplvm.likelihoods.Gaussian(n, sigma=None, n_gh_locs=20, learn_sigma=True, Y=None, d=None)[source]

Bases: mgplvm.likelihoods.Likelihood

dist(fs)[source]
Parameters
fs : Tensor

GP mean function values (n_mc x n_samples x n x m)

Returns
dist : distribution

resulting Gaussian distributions

dist_mean(fs)[source]
Parameters
fs : Tensor

GP mean function values (n_mc x n_samples x n x m)

Returns
mean : Tensor

means of the resulting Gaussian distributions (n_mc x n_samples x n x m); for a Gaussian, this is simply fs

log_prob(y)[source]
property msg
name = 'Gaussian'
property prms: torch.Tensor
Return type

Tensor

sample(f_samps)[source]
Parameters
f_samps : Tensor

GP output samples (n_mc x n_samples x n x m)

Returns
y_samps : Tensor

samples from the resulting Gaussian distributions (n_mc x n_samples x n x m)

Return type

Tensor

property sigma: torch.Tensor
Return type

Tensor

training: bool
variational_expectation(y, fmu, fvar)[source]
Parameters
y : Tensor

observed data (n_samples x n x m)

fmu : Tensor

GP mean (n_mc x n_samples x n x m)

fvar : Tensor

GP diagonal variance (n_mc x n_samples x n x m)

Returns
Log likelihood : Tensor

SVGP likelihood term per MC, neuron, sample (n_mc x n_samples x n)
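For a Gaussian likelihood this expectation is available in closed form: with q(f) = N(fmu, fvar), E_q[log N(y | f, sigma^2)] = -0.5 log(2*pi*sigma^2) - ((y - fmu)^2 + fvar) / (2*sigma^2), using E_q[(y - f)^2] = (y - fmu)^2 + fvar. A standalone plain-Python check of that identity against a dense numerical integral (a sketch, not the class's implementation):

```python
import math

def gaussian_variational_expectation(y, fmu, fvar, sigma):
    """E_{f~N(fmu,fvar)}[log N(y | f, sigma^2)], elementwise, in closed form."""
    var = sigma ** 2
    return -0.5 * math.log(2 * math.pi * var) - ((y - fmu) ** 2 + fvar) / (2 * var)

def numerical(y, fmu, fvar, sigma, n=20001, width=10.0):
    """Trapezoid-rule integral of q(f) * log N(y|f, sigma^2) over f."""
    s = math.sqrt(fvar)
    lo = fmu - width * s
    h = 2 * width * s / (n - 1)
    total = 0.0
    for i in range(n):
        f = lo + i * h
        q = math.exp(-0.5 * ((f - fmu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
        loglik = -0.5 * math.log(2 * math.pi * sigma ** 2) - (y - f) ** 2 / (2 * sigma ** 2)
        w = 0.5 if i in (0, n - 1) else 1.0   # trapezoid endpoint weights
        total += w * q * loglik * h
    return total

closed = gaussian_variational_expectation(1.3, 0.4, 0.25, 0.7)
approx = numerical(1.3, 0.4, 0.25, 0.7)
print(closed, approx)  # the two agree to numerical precision
```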

class mgplvm.likelihoods.Likelihood(n, n_gh_locs=20)[source]

Bases: mgplvm.base.Module

abstract static dist(self, x)[source]
abstract static dist_mean(self, x)[source]
abstract property log_prob
abstract property msg
abstract static sample(self, x)[source]
training: bool
abstract property variational_expectation
class mgplvm.likelihoods.NegativeBinomial(n, inv_link=<function id_link>, binsize=1, total_count=None, c=None, d=None, fixed_total_count=False, fixed_c=True, fixed_d=False, n_gh_locs=20, Y=None)[source]

Bases: mgplvm.likelihoods.Likelihood

dist(fs)[source]
Parameters
fs : Tensor

GP mean function values (n_mc x n_samples x n x m)

Returns
dist : distribution

resulting negative binomial distributions

dist_mean(fs)[source]
Parameters
fs : Tensor

GP mean function values (n_mc x n_samples x n x m)

Returns
mean : Tensor

means of the resulting negative binomial distributions (n_mc x n_samples x n x m)

log_prob(total_count, rate, y)[source]
property msg
name = 'Negative binomial'
property prms
sample(f_samps)[source]
Parameters
f_samps : Tensor

GP output samples (n_mc x n_samples x n x m)

Returns
y_samps : Tensor

samples from the resulting negative binomial distributions (n_mc x n_samples x n x m)

property total_count
training: bool
variational_expectation(y, fmu, fvar)[source]
Parameters
y : Tensor

observed data (n_samples x n x m)

fmu : Tensor

GP mean (n_mc x n_samples x n x m)

fvar : Tensor

GP diagonal variance (n_mc x n_samples x n x m)

Returns
Log likelihood : Tensor

SVGP likelihood term per MC, neuron, sample (n_mc x n_samples x n)
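A negative binomial likelihood is typically preferred over a Poisson when counts are overdispersed (variance exceeds the mean). Under one common parametrization (number of successes, each with probability p, before the total_count-th failure, as in torch.distributions.NegativeBinomial), the mean is total_count * p / (1 - p). A standalone plain-Python check of the pmf and its mean (a sketch, not the class's implementation):

```python
import math

def nb_pmf(k, r, p):
    """P(K = k) for a negative binomial: number of successes (each with
    probability p) observed before the r-th failure."""
    return math.comb(k + r - 1, k) * (1 - p) ** r * p ** k

r, p = 5, 0.3
mass = sum(nb_pmf(k, r, p) for k in range(200))       # total probability
mean = sum(k * nb_pmf(k, r, p) for k in range(200))   # E[K]
print(round(mass, 6), round(mean, 4))  # ≈ 1.0 and r*p/(1-p) ≈ 2.1429
```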

class mgplvm.likelihoods.Poisson(n, inv_link=<function exp_link>, binsize=1, c=None, d=None, fixed_c=True, fixed_d=False, n_gh_locs=20)[source]

Bases: mgplvm.likelihoods.Likelihood

dist(fs)[source]
Parameters
fs : Tensor

GP mean function values (n_mc x n_samples x n x m)

Returns
dist : distribution

resulting Poisson distributions

dist_mean(fs)[source]
Parameters
fs : Tensor

GP mean function values (n_mc x n_samples x n x m)

Returns
mean : Tensor

means of the resulting Poisson distributions (n_mc x n_samples x n x m)

log_prob(lamb, y)[source]
property msg
name = 'Poisson'
property prms
sample(f_samps)[source]
Parameters
f_samps : Tensor

GP output samples (n_mc x n_samples x n x m)

Returns
y_samps : Tensor

samples from the resulting Poisson distributions (n_mc x n_samples x n x m)

training: bool
variational_expectation(y, fmu, fvar)[source]
Parameters
y : Tensor

observed data (n_samples x n x m)

fmu : Tensor

GP mean (n_mc x n_samples x n x m)

fvar : Tensor

GP diagonal variance (n_mc x n_samples x n x m)

Returns
Log likelihood : Tensor

SVGP likelihood term per MC, neuron, sample (n_mc x n)
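With the default exponential link, the Poisson variational expectation is available in closed form, since E[exp(f)] = exp(fmu + fvar/2) for f ~ N(fmu, fvar): it equals y*fmu - exp(fmu + fvar/2) - log(y!) per bin (ignoring binsize; other links would presumably fall back to Gauss-Hermite quadrature with n_gh_locs points). A standalone plain-Python check against a dense numerical integral (a sketch, not the class's implementation):

```python
import math

def poisson_varexp_exp_link(y, fmu, fvar):
    """E_{f~N(fmu,fvar)}[log Poisson(y | rate=exp(f))] in closed form,
    using E[exp(f)] = exp(fmu + fvar/2)."""
    return y * fmu - math.exp(fmu + 0.5 * fvar) - math.lgamma(y + 1)

def numerical(y, fmu, fvar, n=20001, width=10.0):
    """Trapezoid-rule integral of q(f) * log Poisson(y | exp(f)) over f."""
    s = math.sqrt(fvar)
    lo = fmu - width * s
    h = 2 * width * s / (n - 1)
    total = 0.0
    for i in range(n):
        f = lo + i * h
        q = math.exp(-0.5 * ((f - fmu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
        loglik = y * f - math.exp(f) - math.lgamma(y + 1)
        w = 0.5 if i in (0, n - 1) else 1.0   # trapezoid endpoint weights
        total += w * q * loglik * h
    return total

closed = poisson_varexp_exp_link(3, 0.5, 0.2)
approx = numerical(3, 0.5, 0.2)
print(closed, approx)  # the two agree to numerical precision
```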

class mgplvm.likelihoods.ZIPoisson(n, inv_link=<function exp_link>, binsize=1, c=None, d=None, fixed_c=True, fixed_d=False, alpha=None, learn_alpha=True, n_gh_locs=20)[source]

Bases: mgplvm.likelihoods.Likelihood

https://en.wikipedia.org/wiki/Zero-inflated_model

dist(fs)[source]
Parameters
fs : Tensor

GP mean function values (n_mc x n_samples x n x m)

Returns
dist : distribution

resulting Poisson distributions (for use internally)

dist_mean(fs)[source]
Parameters
fs : Tensor

GP mean function values (n_mc x n_samples x n x m)

Returns
mean : Tensor

means of the resulting ZIP distributions (n_mc x n_samples x n x m)

log_prob(lamb, y, alpha)[source]
.. math::
   :nowrap:

   \begin{eqnarray}
   P(N=0) &= \alpha + (1-\alpha)\,\text{Poisson}(N=0) \\
   P(N>0) &= (1-\alpha)\,\text{Poisson}(N)
   \end{eqnarray}

property msg
name = 'Zero-inflated Poisson'
property prms
sample(f_samps)[source]
Parameters
f_samps : Tensor

GP output samples (n_mc x n_samples x n x m)

Returns
y_samps : Tensor

samples from the resulting ZIP distributions (n_mc x n_samples x n x m)

training: bool
variational_expectation(y, fmu, fvar)[source]
Parameters
y : Tensor

observed data (n_samples x n x m)

fmu : Tensor

GP mean (n_mc x n_samples x n x m)

fvar : Tensor

GP diagonal variance (n_mc x n_samples x n x m)

Returns
Log likelihood : Tensor

SVGP likelihood term per MC, neuron, sample (n_mc x n)
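The two cases in log_prob above (a structural zero with probability alpha, otherwise a Poisson draw with rate lambda) can be checked directly with a plain-Python pmf; the implied mean is (1 - alpha) * lambda. A standalone sketch, not the class's implementation:

```python
import math

def zip_pmf(k, lamb, alpha):
    """Zero-inflated Poisson: with probability alpha emit a structural zero,
    otherwise draw k from Poisson(lamb)."""
    pois = math.exp(-lamb) * lamb ** k / math.factorial(k)
    return alpha + (1 - alpha) * pois if k == 0 else (1 - alpha) * pois

lamb, alpha = 2.5, 0.3
mass = sum(zip_pmf(k, lamb, alpha) for k in range(100))       # total probability
mean = sum(k * zip_pmf(k, lamb, alpha) for k in range(100))   # E[K]
print(round(mass, 6), round(mean, 4))  # ≈ 1.0 and (1 - alpha) * lamb = 1.75
```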

exp_link: exponential link function used for positive observations

id_link: identity link function used for negative binomial data