- PoRB-Nets: Poisson Process Radial Basis Function Networks
- Scope:
- Data is scarce
- Uncertainty must be quantified
- Related work:
- Gap - Common weight priors encode little functional knowledge. Solution - Bayesian neural networks are flexible function priors (suited to settings where data is scarce and uncertainty must be quantified).
- Gap - Common BNN priors are highly non-stationary.
- Background:
- Bayesian neural networks (BNNs) assume a prior over weights and biases, $w, b \sim p(w, b)$.
- Radial basis function networks (RBFNs) are classical shallow neural networks that approximate arbitrary nonlinear functions through a linear combination of radial basis kernels.
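The RBFN definition above can be sketched numerically. This is a minimal illustrative sketch, not the paper's model: the sizes, kernel width, and the standard-normal weight/bias prior are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K Gaussian RBF units on 1-D inputs.
K = 10
centers = rng.uniform(-3.0, 3.0, size=K)  # RBF centers c_k (assumed range)
bandwidth = 1.0                           # shared kernel precision (assumed)
w = rng.normal(0.0, 1.0, size=K)          # weight prior w ~ N(0, 1)
b = rng.normal(0.0, 1.0)                  # bias prior b ~ N(0, 1)

def rbf_net(x):
    """f(x) = b + sum_k w_k * exp(-bandwidth * (x - c_k)^2)."""
    phi = np.exp(-bandwidth * (x[:, None] - centers[None, :]) ** 2)
    return b + phi @ w

xs = np.linspace(-3.0, 3.0, 5)
print(rbf_net(xs))  # one function drawn from this simple prior
```

Each fresh draw of `w` and `b` gives a different function, which is what makes the network a prior over functions rather than a single fitted model.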
- Contributions:
- Novel prior over radial basis function (RBF) networks - Allows for independent specification of functional amplitude variance and lengthscale (the inverse lengthscale corresponds to the concentration of RBFs).
- Prove consistency and approximate variance stationarity when the lengthscale is uniform over the input space.
- Infer the input dependence of the lengthscale when it is unknown.
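The idea behind the contribution above can be sketched: RBF centers are drawn from a Poisson process, and a higher intensity packs units more densely, which behaves like a shorter lengthscale. This is a simplified sketch under assumed parameterizations (tying kernel width to intensity, scaling weight variance by unit count), not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_porb_prior(intensity, lo=-3.0, hi=3.0, amp_var=1.0):
    """Draw one function from a simplified PoRB-Net-style prior.

    Centers come from a homogeneous Poisson process with rate
    `intensity` on [lo, hi]; larger intensity means more, narrower
    RBFs (shorter lengthscale). `amp_var` controls the amplitude
    variance independently of the intensity.
    """
    n = rng.poisson(intensity * (hi - lo))     # number of RBF units
    centers = rng.uniform(lo, hi, size=n)      # Poisson process: uniform given n
    # Scale weight variance by 1/max(n, 1) so the functional amplitude
    # variance stays roughly fixed as intensity (hence n) grows.
    w = rng.normal(0.0, np.sqrt(amp_var / max(n, 1)), size=n)
    width = intensity  # tie kernel precision to intensity (assumption)

    def f(x):
        if n == 0:
            return np.zeros_like(x)
        phi = np.exp(-width * (x[:, None] - centers[None, :]) ** 2)
        return phi @ w

    return f

f = sample_porb_prior(intensity=5.0)
xs = np.linspace(-3.0, 3.0, 7)
print(f(xs))
```

Separating `intensity` from `amp_var` mirrors the contribution's point: how wiggly the function is and how large its values get can be controlled independently.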