Based on the intermediate representation built by AePPL:

import aemcmc
import aesara
import aesara.tensor as at

srng = at.random.RandomStream(0)

X = at.matrix("X")

# Horseshoe prior for `beta_rv`
tau_rv = srng.halfcauchy(0, 1, name="tau")
lmbda_rv = srng.halfcauchy(0, 1, size=X.shape[1], name="lambda")
beta_rv = srng.normal(0, lmbda_rv * tau_rv, size=X.shape[1], name="beta")

a = at.scalar("a")
b = at.scalar("b")
h_rv = srng.gamma(a, b, name="h")

# Negative-binomial regression
eta = X @ beta_rv
p = at.sigmoid(-eta)
Y_rv = srng.nbinom(h_rv, p, name="Y")

y_vv = Y_rv.clone()
y_vv.name = "y"

sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)


Rewrites on the logprob

We have discussed this, and it should happen: some models have bad posterior geometry induced by a large deterministic function in the graph, and rewrites on the log-probability (e.g. reparameterizations) can repair it (see the models used by Marcus, who wrote the MUSE paper).
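A classic instance of such a rewrite is the non-centered reparameterization of a scale hierarchy like the horseshoe above: instead of sampling a coefficient whose prior scale is itself random, sample a standard normal and multiply by the scale, which removes the "funnel" from the prior geometry. A minimal sketch in plain Python (the specific priors here are illustrative, not taken from the model above):

```python
import math

# Centered parameterization: log_tau ~ Normal(0, 3) and
# beta | log_tau ~ Normal(0, exp(log_tau)).  As log_tau -> -inf the
# conditional scale of beta collapses, producing a funnel that is
# hard for samplers to traverse.
def logp_centered(log_tau, beta):
    lp = -0.5 * (log_tau / 3.0) ** 2  # prior on log_tau, up to a constant
    lp += -log_tau - 0.5 * (beta * math.exp(-log_tau)) ** 2  # Normal(0, exp(log_tau))
    return lp

# Non-centered rewrite: beta_raw ~ Normal(0, 1) with
# beta = exp(log_tau) * beta_raw.  The two coordinates are independent
# a priori, so the funnel disappears from the prior geometry.
def logp_noncentered(log_tau, beta_raw):
    lp = -0.5 * (log_tau / 3.0) ** 2
    lp += -0.5 * beta_raw ** 2
    return lp

# The two densities agree after the change of variables, up to the
# log-Jacobian term -log_tau of beta = exp(log_tau) * beta_raw:
lt, br = -2.0, 1.5
lhs = logp_centered(lt, math.exp(lt) * br)
rhs = logp_noncentered(lt, br) - lt
assert abs(lhs - rhs) < 1e-12
```

This is exactly the kind of graph-level rewrite a logprob IR makes mechanical: the transformation and its Jacobian correction can be applied automatically rather than by hand.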

Using copulas
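As a sketch of the idea (my reading of the heading, not something this note specifies): a Gaussian copula couples arbitrary marginals through correlated normals, so a rewrite could separate the dependence structure, which lives in a well-behaved Gaussian block, from the marginal transforms. A stdlib-only illustration with Exponential(1) marginals:

```python
import math
import random
import statistics

nd = statistics.NormalDist()
rng = random.Random(0)
rho = 0.8  # correlation of the underlying Gaussian copula


def sample_pair():
    # Correlated standard normals...
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho**2) * rng.gauss(0.0, 1.0)
    # ...mapped through the normal CDF to uniforms (the copula)...
    u1, u2 = nd.cdf(z1), nd.cdf(z2)
    # ...then through any inverse CDF; here Exponential(1) marginals.
    return -math.log1p(-u1), -math.log1p(-u2)


pairs = [sample_pair() for _ in range(20000)]
xs, ys = zip(*pairs)


def corr(xs, ys):
    # Pearson correlation, computed by hand to stay stdlib-only.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys))
```

The marginals stay Exponential(1) (sample mean near 1) while the copula induces strong positive dependence between the draws.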
