In computer science, the forward-backward algorithm is an inference algorithm for hidden Markov models. Given a sequence of observations, it computes the posterior probability of each hidden state at each point in the sequence (the "smoothed" values).
The algorithm comprises three steps:
1. computing forward probabilities
2. computing backward probabilities
3. computing smoothed values
The forward and backward steps are often called the "forward message pass" and "backward message pass". The names come from the order in which the algorithm processes the observation sequence: it first moves forward through the sequence, from the first observation to the last, and then returns back to the first. At each observation it computes probabilities that will be used in the calculation at the next observation. During the backward pass the algorithm simultaneously performs the smoothing step. This step allows the estimate at each point to take the entire observation sequence into account, including later observations, producing more accurate results.
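The three steps can be sketched on a toy two-state HMM. Everything below (the two states, the transition and emission probabilities, and the integer encoding of observations) is an illustrative assumption, not something given in the text.

```python
# Toy two-state HMM; every number here is an illustrative assumption.
init = [0.5, 0.5]                    # initial state distribution
trans = [[0.7, 0.3], [0.3, 0.7]]     # trans[i][j] = P(next state j | state i)
emit = [[0.9, 0.1], [0.2, 0.8]]      # emit[i][o] = P(observation o | state i)

def forward_backward(obs):
    n, k = len(obs), len(init)
    # Step 1 -- forward pass: fwd[t][s] = P(obs[0..t], state at t is s)
    fwd = [[0.0] * k for _ in range(n)]
    for s in range(k):
        fwd[0][s] = init[s] * emit[s][obs[0]]
    for t in range(1, n):
        for s in range(k):
            fwd[t][s] = emit[s][obs[t]] * sum(fwd[t-1][p] * trans[p][s] for p in range(k))
    # Step 2 -- backward pass: bwd[t][s] = P(obs[t+1..n-1] | state at t is s)
    bwd = [[1.0] * k for _ in range(n)]
    for t in range(n - 2, -1, -1):
        for s in range(k):
            bwd[t][s] = sum(trans[s][q] * emit[q][obs[t+1]] * bwd[t+1][q] for q in range(k))
    # Step 3 -- smoothing: combine the two passes and normalize at each time step
    smoothed = []
    for t in range(n):
        raw = [fwd[t][s] * bwd[t][s] for s in range(k)]
        z = sum(raw)
        smoothed.append([r / z for r in raw])
    return smoothed
```

Calling forward_backward([0, 0, 1]) returns one posterior distribution per observation; each row sums to 1, and the smoothed value for a given time step reflects both earlier and later observations.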
http://en.wikipedia.org/wiki/Forward-backward_algorithm
The Viterbi algorithm is a dynamic programming algorithm for finding the most likely sequence of hidden states (called the "Viterbi path") that results in a sequence of observed events, especially in the context of Markov information sources and, more generally, hidden Markov models. The closely related forward algorithm computes the probability of a sequence of observed events. Both algorithms belong to the realm of information theory.
The algorithm makes a number of assumptions.
1. Both the observed events and hidden events must be in a sequence. This sequence often corresponds to time.
2. These two sequences need to be aligned, and an instance of an observed event needs to correspond to exactly one instance of a hidden event.
3. Computing the most likely hidden sequence up to a certain point t must depend only on the observed event at point t, and the most likely sequence at point t - 1.
These assumptions are all satisfied in a first-order hidden Markov model.
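Under these assumptions, a Viterbi decoder can be sketched as follows. The two-state model and all of its probabilities are illustrative assumptions; the key point is that each step uses only the current observation and the best scores from the previous step (assumption 3).

```python
# Toy two-state HMM for Viterbi decoding; all numbers are illustrative assumptions.
init = [0.5, 0.5]                    # initial state distribution
trans = [[0.7, 0.3], [0.3, 0.7]]     # trans[i][j] = P(next state j | state i)
emit = [[0.9, 0.1], [0.2, 0.8]]      # emit[i][o] = P(observation o | state i)

def viterbi(obs):
    n, k = len(obs), len(init)
    # best[t][s]: probability of the most likely state sequence ending in s at time t
    best = [[0.0] * k for _ in range(n)]
    back = [[0] * k for _ in range(n)]   # backpointers for recovering the path
    for s in range(k):
        best[0][s] = init[s] * emit[s][obs[0]]
    for t in range(1, n):
        for s in range(k):
            # Only best[t-1] and obs[t] are consulted here (assumption 3)
            cands = [best[t-1][p] * trans[p][s] for p in range(k)]
            back[t][s] = max(range(k), key=lambda p: cands[p])
            best[t][s] = cands[back[t][s]] * emit[s][obs[t]]
    # Follow the backpointers from the best final state to recover the Viterbi path
    path = [max(range(k), key=lambda s: best[n-1][s])]
    for t in range(n - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]
```

Unlike the forward-backward algorithm, which returns a distribution over states at each time step, viterbi returns a single state sequence: the one hidden-state alignment with the highest joint probability.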
The terms "Viterbi path" and "Viterbi algorithm" are also applied to related dynamic programming algorithms that discover the single most likely explanation for an observation. For example, in statistical parsing a dynamic programming algorithm can be used to discover the single most likely context-free derivation (parse) of a string, which is sometimes called the "Viterbi parse".
Dynamical systems are mathematical objects used to model physical phenomena whose state (or instantaneous description) changes over time. These models are used in financial and economic forecasting, environmental modeling, medical diagnosis, industrial equipment diagnosis, and a host of other applications.
For the most part, applications fall into three broad categories:
1. Predictive (also referred to as generative), in which the objective is to predict future states of the system from observations of the past and present states of the system,
2. Diagnostic, in which the objective is to infer what possible past states of the system might have led to the present state of the system (or observations leading up to the present state), and, finally,
3. Applications in which the objective is neither to predict the future nor to explain the past but rather to provide a theory for the physical phenomena.
These three categories correspond roughly to the need to predict, explain, and understand physical phenomena.