Neural Network Toolbox 5.1

Category Intelligent Software > Neural Network Systems/Tools

Abstract The Neural Network Toolbox extends MATLAB (see Note 1) with tools for designing, implementing, visualizing, and simulating neural networks. Neural networks (NNs) are invaluable for applications where formal analysis would be difficult or impossible, such as pattern recognition and nonlinear system identification and control. Product provides comprehensive support for many proven network paradigms, as well as graphical user interfaces (GUIs) that enable you to design and manage your networks. The modular, open, and extensible design of the toolbox simplifies the creation of customized functions and networks.

Like its counterpart in the biological nervous system, a neural network (NN) can learn, and therefore can be trained to find solutions, recognize patterns, classify data, and forecast future events. The behavior of a NN is defined by the way its individual computing elements are connected and by the strength of those connections, or weights. The weights are automatically adjusted by training the network according to a specified learning rule until it performs the desired task correctly.

Product's GUIs make it easy to work with neural networks. The Neural Network Fitting Tool is a wizard that leads you through the process of fitting data using NNs. You can use the tool to import large and complex data sets, quickly create and train networks, and evaluate network performance.

Network Architectures - The Neural Network Toolbox supports both supervised and unsupervised networks.

Supervised Networks - Supervised NNs are trained to produce desired outputs in response to sample inputs, making them particularly well suited to modeling and controlling dynamic systems, classifying noisy data, and predicting future events.

1) Feedforward networks have one-way connections from input to output layers. They are most commonly used for prediction, pattern recognition, and nonlinear function fitting;

2) Radial basis networks provide an alternative, fast method for designing nonlinear feedforward networks. Supported variations include generalized regression and probabilistic NNs;

3) Dynamic networks use memory and recurrent feedback connections to recognize spatial and temporal patterns in data. They are commonly used for time-series prediction, nonlinear dynamic system modeling, and control system applications;

4) Learning Vector Quantization (LVQ) is a powerful method for classifying patterns that are not linearly separable. LVQ lets you specify class boundaries and the granularity of classification.
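As a concrete picture of the feedforward case above, here is a minimal one-hidden-layer network in NumPy (a generic sketch, not toolbox code; all names and sizes are illustrative):

```python
import numpy as np

def feedforward(x, W1, b1, W2, b2):
    """One-hidden-layer feedforward network: tanh hidden layer, linear output."""
    h = np.tanh(W1 @ x + b1)   # hidden-layer activations
    return W2 @ h + b2         # linear output layer

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((5, 2)), np.zeros(5)   # 2 inputs -> 5 hidden units
W2, b2 = rng.standard_normal((1, 5)), np.zeros(1)   # 5 hidden -> 1 output
y = feedforward(np.array([0.3, -0.7]), W1, b1, W2, b2)
```

Training such a network for function fitting amounts to adjusting W1, b1, W2, and b2 to minimize the error between y and the targets.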

Unsupervised Networks - Unsupervised NNs are trained by letting the network continually adjust itself to new inputs. They find relationships within data and can automatically define classification schemes. Product supports two (2) types of self-organizing, unsupervised networks - competitive layers and self-organizing maps.

1) Competitive layers recognize and group similar input vectors. By using these groups, the network automatically sorts the inputs into categories;

2) Self-organizing maps learn to classify input vectors according to similarity. Unlike competitive layers, they also preserve the topology of the input vectors, assigning nearby inputs to nearby categories.
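The competitive-layer idea above can be sketched in a few lines of NumPy (an illustrative sketch, not toolbox code): the neuron whose weight vector is nearest the input wins, and the Kohonen rule pulls the winner toward that input.

```python
import numpy as np

def competitive_step(weights, x, lr=0.1):
    """One competitive-learning step: the neuron whose weight vector is
    closest to the input wins, and only the winner moves toward the input."""
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    weights[winner] += lr * (x - weights[winner])   # Kohonen update for the winner
    return winner

prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])     # two competing neurons
w = competitive_step(prototypes, np.array([0.9, 1.1]))
```

Repeating this step over many inputs makes each weight vector drift toward the center of one cluster of inputs, which is how the layer sorts inputs into categories.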

Training and Learning Functions - These functions are mathematical procedures used to automatically adjust the network's weights and biases. The training function dictates a global algorithm that affects all the weights and biases of a given network. The learning function can be applied to individual weights and biases within a network. Product supports a variety of training algorithms, including several gradient descent methods, conjugate gradient methods, the Levenberg-Marquardt algorithm (LM), and the resilient backpropagation algorithm (Rprop). A suite of learning functions, including gradient descent, Hebbian learning, LVQ, Widrow-Hoff, and Kohonen, is also provided.
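To make the learning-function idea concrete, here is the Widrow-Hoff (LMS) rule for a single linear neuron, sketched in NumPy (illustrative only, not the toolbox's implementation):

```python
import numpy as np

def widrow_hoff_step(w, b, x, target, lr=0.05):
    """One Widrow-Hoff (LMS) step: the weight and bias changes are
    proportional to the output error times the input."""
    e = target - (w @ x + b)              # error for this sample
    return w + lr * e * x, b + lr * e, e

w, b = np.zeros(2), 0.0
w, b, e0 = widrow_hoff_step(w, b, np.array([1.0, 2.0]), 1.0)
w, b, e1 = widrow_hoff_step(w, b, np.array([1.0, 2.0]), 1.0)
```

Each application of the rule shrinks the error on the presented sample; a training function would orchestrate many such updates across the whole network.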

Simulink Support - Product provides a set of blocks for building NNs in Simulink (see Note 2). These blocks are divided into three (3) libraries -

1) Transfer function blocks, which take a net-input vector and generate a corresponding output vector;

2) Net input function blocks, which take any number of weighted input vectors, weight layer output vectors, and bias vectors, and return a net-input vector;

3) Weight function blocks, which apply a neuron's weight vector to an input vector (or a layer output vector) to get a weighted input value for a neuron.

Alternatively, you can create and train your networks in the MATLAB environment and automatically generate network simulation blocks for use with Simulink. This approach also enables you to view your networks graphically.
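The three block roles compose into a single layer computation; a NumPy sketch (function names are illustrative, not Simulink or toolbox identifiers):

```python
import numpy as np

def weight_fn(W, p):
    """Weight function: apply a layer's weight matrix to an input vector."""
    return W @ p

def net_input_fn(z, b):
    """Net input function: combine weighted inputs with the bias vector."""
    return z + b

def transfer_fn(n):
    """Transfer function: map the net-input vector to the layer's output."""
    return np.tanh(n)

W = np.array([[0.5, -0.2], [0.1, 0.4]])
out = transfer_fn(net_input_fn(weight_fn(W, np.array([1.0, 2.0])), np.array([0.0, 0.1])))
```

Chaining such layer computations, one per layer, reproduces the network's forward pass.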

Control System Applications - Product lets you apply NNs to the identification and control of nonlinear systems. The toolbox includes descriptions, demonstrations, and Simulink blocks for three (3) popular control applications - model predictive control, feedback linearization, and model reference adaptive control.

Pre- and Post-Processing Functions - Pre-processing the network inputs and targets improves the efficiency of NN training. Post-processing enables detailed analysis of network performance. Product provides pre- and post-processing functions that enable you to -

1) Reduce the dimensions of the input vectors using principal component analysis;

2) Perform regression analysis between the network response and the corresponding targets;

3) Scale inputs and targets so that they fall in the range [-1,1];

4) Normalize the mean and standard deviation of the training set.
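Three of the operations above (PCA dimension reduction, [-1,1] scaling, and mean/std normalization) can be sketched in NumPy (illustrative code, not the toolbox's functions; variables are stored one per row, samples in columns):

```python
import numpy as np

def minmax_scale(x):
    """Scale each variable (row) into the range [-1, 1]."""
    lo = x.min(axis=1, keepdims=True)
    hi = x.max(axis=1, keepdims=True)
    return 2 * (x - lo) / (hi - lo) - 1

def standardize(x):
    """Normalize each variable to zero mean and unit standard deviation."""
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

def pca_reduce(x, k):
    """Project inputs onto the top-k principal components."""
    xc = x - x.mean(axis=1, keepdims=True)
    u, _, _ = np.linalg.svd(xc, full_matrices=False)
    return u[:, :k].T @ xc

data = np.random.default_rng(1).standard_normal((4, 20))
```

The same scaling parameters computed on the training set would then be reused on any new inputs presented to the trained network.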

Improving Generalization - Improving the network's ability to generalize helps prevent overfitting, a common problem in NN design. Overfitting occurs when a network has memorized the training set but has not learned to generalize to new inputs. Overfitting produces a relatively small error on the training set but a much larger error when new data is presented to the network.

Product provides two (2) solutions to improve generalization - regularization and early stopping -

1) Regularization modifies the network's performance function (the measure of error that the training process minimizes). By including the sizes of the weights and biases in this measure, training produces a network that performs well on the training data and exhibits smoother behavior when presented with new data.

2) Early stopping uses two different data sets - the training set, to update the weights and biases, and the validation set, to stop training when the network begins to overfit the data.
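Both aids can be sketched generically in Python (illustrative code, not the toolbox's implementation): a common regularized performance measure blends mean squared error with mean squared weights, and early stopping halts once validation error stops improving.

```python
import numpy as np

def regularized_perf(errors, weights, gamma=0.9):
    """Performance = gamma * mean squared error + (1 - gamma) * mean squared
    weights; the weight term penalizes large weights, smoothing the fit."""
    return gamma * np.mean(np.square(errors)) + (1 - gamma) * np.mean(np.square(weights))

def train_with_early_stopping(step, val_error, max_epochs=100, patience=3):
    """Run `step` (one training update on the training set) until `val_error`
    (error on the validation set) has not improved for `patience` epochs."""
    best, since_best, epoch = float("inf"), 0, 0
    for epoch in range(max_epochs):
        step()
        e = val_error()
        if e < best:
            best, since_best = e, 0
        else:
            since_best += 1
            if since_best >= patience:
                break          # validation error stopped improving: likely overfitting
    return best, epoch
```

Here `step` and `val_error` are assumed callables supplied by the caller; the loop itself is agnostic to the underlying training algorithm.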

Note 1: MATLAB is a high-level language and interactive environment that enables you to perform computationally intensive tasks faster than with traditional programming languages such as C, C++, and Fortran.

Note 2: Simulink is an environment for multidomain simulation and Model-Based Design for dynamic and embedded systems.

System Requirements

Product Requirements

General System Requirements for


Manufacturer Web Site The MathWorks, Inc.

Price Contact manufacturer.

G6G Abstract Number 20029

G6G Manufacturer Number 102625