Bayes Net Toolbox (BNT)
Category Intelligent Software>Bayesian Network Systems/Tools
Abstract The Bayes Net Toolbox (BNT) is an open-source MATLAB package (see ‘System Requirements’ below…) for directed graphical models.
BNT supports many kinds of nodes (probability distributions), exact and approximate inference, parameter and structure learning, and static and dynamic models.
The source code is extensively documented, object-oriented (OO), and free, making it an excellent tool for teaching, research and rapid prototyping.
Bayes Net Toolbox (BNT) major features/capabilities --
1) BNT supports many types of ‘conditional probability distributions’ (nodes), and it is easy to add more (a construction sketch follows this list) --
Tabular (multinomial); Gaussian; Softmax (logistic / sigmoid);
Multi-layer Perceptron (neural network); Noisy-OR; and Deterministic.
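For example, the following sketch builds the small ‘sprinkler’ network used throughout the BNT documentation and attaches a tabular CPD to every node; continuous or hybrid nodes are attached the same way through their CPD constructors (e.g., gaussian_CPD). This is a minimal sketch that assumes BNT is on the MATLAB path and uses the function names from the package's documented examples (mk_bnet, tabular_CPD); the probabilities are illustrative values.
    % Sprinkler network: Cloudy -> {Sprinkler, Rain} -> WetGrass (all nodes binary)
    N = 4;
    C = 1; S = 2; R = 3; W = 4;
    dag = zeros(N, N);
    dag(C, [S R]) = 1;
    dag(R, W) = 1;
    dag(S, W) = 1;
    node_sizes = 2 * ones(1, N);
    bnet = mk_bnet(dag, node_sizes, 'discrete', 1:N);
    % Attach a conditional probability distribution (CPD) to each node.
    bnet.CPD{C} = tabular_CPD(bnet, C, 'CPT', [0.5 0.5]);
    bnet.CPD{R} = tabular_CPD(bnet, R, 'CPT', [0.8 0.2 0.2 0.8]);
    bnet.CPD{S} = tabular_CPD(bnet, S, 'CPT', [0.5 0.9 0.5 0.1]);
    bnet.CPD{W} = tabular_CPD(bnet, W, 'CPT', [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);
    % A continuous node would use gaussian_CPD(bnet, i, 'mean', mu, 'cov', Sigma) instead.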
2) BNT supports ‘decision and utility nodes’ in addition to chance nodes, i.e., ‘influence diagrams’ as well as Bayesian Networks (BNs).
3) BNT supports static and dynamic BNs (useful for modeling ‘dynamical systems’ and sequence data); an HMM-as-DBN sketch follows.
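The sketch below defines a two-slice DBN equivalent to an HMM, following the pattern of the toolbox's DBN documentation (mk_dbn with intra-slice and inter-slice adjacency matrices). The parameter values are random placeholders, and the remark that the slice-2 observation node shares parameters with the slice-1 node reflects the documented default; treat the details as assumptions to verify against the docs.
    % HMM expressed as a DBN: hidden state X(t) emits Y(t); X(t-1) -> X(t)
    intra = zeros(2);  intra(1, 2) = 1;    % within-slice arc  X -> Y
    inter = zeros(2);  inter(1, 1) = 1;    % between-slice arc X(t-1) -> X(t)
    Q = 2;                                 % number of hidden states
    O = 3;                                 % number of observation symbols
    ns = [Q O];
    bnet = mk_dbn(intra, inter, ns, 'discrete', 1:2, 'observed', 2);
    % Three CPDs: initial state prior, observation matrix, transition matrix
    % (the slice-2 observation node is parameter-tied to the slice-1 one by default).
    prior0    = normalise(rand(Q, 1));
    obsmat0   = mk_stochastic(rand(Q, O));
    transmat0 = mk_stochastic(rand(Q, Q));
    bnet.CPD{1} = tabular_CPD(bnet, 1, prior0);
    bnet.CPD{2} = tabular_CPD(bnet, 2, obsmat0);
    bnet.CPD{3} = tabular_CPD(bnet, 3, transmat0);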
4) BNT supports many different ‘inference algorithms’, and it is easy to add more (a junction-tree example follows these lists) --
‘Exact inference’ for static BNs -
- a) junction tree;
- b) variable elimination;
- c) brute force enumeration (for discrete nets);
- d) linear algebra (for Gaussian nets);
- e) Pearl's algorithm (for poly-trees); and
- f) quickscore (for QMR).
‘Approximate inference’ for static BNs -
- a) likelihood weighting;
- b) Gibbs sampling; and
- c) loopy belief propagation.
‘Exact inference’ for Dynamic Bayesian Networks (DBNs) -
- a) junction tree;
- b) frontier algorithm;
- c) forwards-backwards [for Hidden Markov Models (HMMs)]; and
- d) Kalman filter/Rauch-Tung-Striebel (RTS) smoother [for ‘Linear Dynamical Systems’ (LDSs)].
‘Approximate inference’ for DBNs -
- a) Boyen-Koller and
- b) factored-frontier/loopy belief propagation.
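As an illustration of exact inference, the sketch below runs the junction-tree engine on a tiny two-node network; any of the other engines listed above is used by constructing a different engine object around the same bnet. The function names (jtree_inf_engine, enter_evidence, marginal_nodes) are those of the toolbox's inference documentation, and the CPT values are arbitrary.
    % Tiny net: Cloudy -> Rain, both binary, with arbitrary CPTs
    dag = zeros(2);  dag(1, 2) = 1;
    bnet = mk_bnet(dag, [2 2], 'discrete', 1:2);
    bnet.CPD{1} = tabular_CPD(bnet, 1, 'CPT', [0.5 0.5]);
    bnet.CPD{2} = tabular_CPD(bnet, 2, 'CPT', [0.8 0.2 0.2 0.8]);
    % Exact inference with the junction-tree engine
    engine = jtree_inf_engine(bnet);
    evidence = cell(1, 2);
    evidence{2} = 2;                        % observe Rain = true (state 2)
    [engine, loglik] = enter_evidence(engine, evidence);
    marg = marginal_nodes(engine, 1);       % posterior over Cloudy
    disp(marg.T)                            % P(Cloudy | Rain = true)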
5) BNT supports several methods for ‘parameter learning’, and it is easy to add more (an EM sketch follows this item) --
- a) Batch Maximum-Likelihood Estimation (MLE)/Maximum A Posteriori (MAP) ‘parameter learning’ using Expectation-Maximization (EM). (Each ‘node type’ has its own M method, e.g., Softmax nodes use IRLS, and each ‘inference engine’ has its own E method, so the code is fully modular.)
- b) Sequential/batch Bayesian ‘parameter learning’ (for fully observed tabular nodes only).
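A minimal EM sketch for item 5a, assuming the documented sample_bnet / learn_params_em interface; the training cases here are synthetic placeholders with one node always hidden, so in practice real data would be substituted.
    % Tiny net: X1 -> X2, both binary; CPTs start at random values
    dag = zeros(2);  dag(1, 2) = 1;
    bnet = mk_bnet(dag, [2 2], 'discrete', 1:2);
    bnet.CPD{1} = tabular_CPD(bnet, 1);
    bnet.CPD{2} = tabular_CPD(bnet, 2);
    % Partially observed training cases: cases{i,m} holds node i in case m, [] = hidden
    ncases = 50;
    cases = cell(2, ncases);
    for m = 1:ncases
      s = sample_bnet(bnet);                % a fully observed sample (cell array)
      cases{2, m} = s{2};                   % keep the observed node, hide node 1
    end
    % EM: any inference engine supplies the E step; each node type supplies its M step
    engine = jtree_inf_engine(bnet);
    max_iter = 10;
    [bnet_learned, LLtrace] = learn_params_em(engine, cases, max_iter);
    % For fully observed data, learn_params / bayes_update_params cover item 5b
    % (function names assumed from the BNT documentation).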
6) BNT supports several methods for ‘regularization’, and it is easy to add more --
- a) Any node can have its parameters clamped (made non-adjustable).
- b) Any set of compatible nodes can have their parameters tied (cf. weight sharing in a neural network).
- c) Some node types (e.g., tabular) support priors for MAP estimation.
- d) Gaussian ‘covariance matrices’ can be declared full or diagonal, and can be tied across states of their ‘discrete parents’ (if any).
7) BNT supports several methods for ‘structure learning’, and it is easy to add more (a sketch follows this list) --
- a) ‘Bayesian structure learning’, using Markov Chain Monte Carlo (MCMC) or local search (for ‘fully observed’ tabular nodes only).
- b) ‘Constraint-based structure learning’ (IC/PC and IC*/FCI).
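For example, given fully observed discrete data, structure learning looks roughly like the sketch below. The function names learn_struct_K2 and learn_struct_mcmc come from the package's structure-learning documentation, but the option names ('nsamples', 'burnin') and the nodes-by-cases data layout are assumptions that should be checked against the current docs.
    % Fully observed discrete data: data(i, m) = value (1 or 2) of node i in case m
    n = 3;  ncases = 200;
    data = randi(2, n, ncases);             % placeholder data; substitute real cases
    ns = 2 * ones(1, n);                    % node sizes
    % Score-based local search: K2 greedy search given a node ordering
    order = 1:n;
    dag_k2 = learn_struct_K2(data, ns, order);
    % Bayesian structure learning by MCMC over DAGs (option names are assumptions)
    [sampled_dags, accept_ratio] = learn_struct_mcmc(data, ns, 'nsamples', 500, 'burnin', 100);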
Bayes Net Toolbox (BNT) ‘supported probabilistic models’ --
According to the manufacturer, it is trivial to implement all of the following probabilistic models using BNT (a factor-analysis sketch follows the list).
1) Static -
- a) Linear regression, logistic regression, hierarchical mixtures of experts;
- b) Naive Bayes classifiers, mixtures of Gaussians, sigmoid belief nets;
- c) Factor analysis, probabilistic Principal Component Analysis (PCA), probabilistic Independent Component Analysis (ICA), and mixtures of these models.
2) Dynamic -
- a) HMMs, Factorial HMMs, coupled HMMs, input-output HMMs, and DBNs;
- b) Kalman filters, AutoRegressive Moving Average with eXogenous inputs (ARMAX) models, switching Kalman filters, tree-structured Kalman filters, and multiscale AR models.
3) Many other combinations, for which there are (as yet) no names.
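To illustrate how such models reduce to small BNs: factor analysis is just a two-node linear-Gaussian network, a hidden low-dimensional Gaussian factor feeding an observed Gaussian node whose mean is a learned linear function (the 'weights' of gaussian_CPD) of the factor. The sketch below follows the factor-analysis recipe in the BNT documentation; option names such as 'clamp_mean' and 'clamp_cov' are recalled from those docs and should be treated as assumptions.
    % Factor analysis: hidden factor z (dim k) -> observed x (dim D), both Gaussian
    k = 2;  D = 5;
    dag = zeros(2);  dag(1, 2) = 1;
    ns = [k D];                             % node "sizes" are the vector dimensions
    bnet = mk_bnet(dag, ns, 'discrete', []);
    % Standard-normal factor with fixed (clamped) parameters
    bnet.CPD{1} = gaussian_CPD(bnet, 1, 'mean', zeros(k, 1), 'cov', eye(k), ...
                               'clamp_mean', 1, 'clamp_cov', 1);
    % Observation: x = W*z + noise; loading matrix W and diagonal noise are learned
    bnet.CPD{2} = gaussian_CPD(bnet, 2, 'cov_type', 'diag');
    % The free parameters of node 2 can then be fitted with learn_params_em, as in item 5.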
Bayes Net Toolbox (BNT) documentation --
The Bayes Net Toolbox (BNT) includes extensive HTML-based documentation: a “How to use the Bayes Net Toolbox” guide and “A Brief Introduction to Graphical Models and Bayesian Networks,” which is extremely informative.
System Requirements
Note: According to the manufacturer, as of January 2010 it should be possible to run most of BNT in Octave (an open-source MATLAB clone).
‘GNU Octave’ is a high-level language, primarily intended for numerical computations.
GNU Octave provides a convenient ‘command line’ interface for solving linear and nonlinear problems numerically, and for performing other numerical experiments using a language that is mostly compatible with MATLAB. It may also be used as a batch-oriented language.
Manufacturer
- Kevin P. Murphy
- Contact the manufacturer by subscribing to the “BNT Email List” on the manufacturer's Web site (see below...)
Manufacturer Web Site Kevin P. Murphy BNT
Price Contact manufacturer.
G6G Abstract Number 20577
G6G Manufacturer Number 104181