Stochastic matrix

In mathematics, a stochastic matrix is a square matrix used to describe the transitions of a Markov chain. Each of its entries is a nonnegative real number representing a probability.[1][2] It is also called a probability matrix, transition matrix, substitution matrix, or Markov matrix. The stochastic matrix was first developed by Andrey Markov at the beginning of the 20th century, and has found use throughout a wide variety of scientific fields, including probability theory, statistics, mathematical finance and linear algebra, as well as computer science and population genetics. There are several different definitions and types of stochastic matrices:

A right stochastic matrix is a square matrix of nonnegative real numbers, with each row summing to 1.
A left stochastic matrix is a square matrix of nonnegative real numbers, with each column summing to 1.
A doubly stochastic matrix is a square matrix of nonnegative real numbers with each row and column summing to 1.
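The three definitions above can be checked mechanically. The following is a minimal sketch in Python with NumPy; the helper names are our own, not part of any standard library:

```python
import numpy as np

def is_right_stochastic(P, tol=1e-9):
    """True if P is square, nonnegative, and every row sums to 1."""
    P = np.asarray(P, dtype=float)
    return (P.ndim == 2 and P.shape[0] == P.shape[1]
            and (P >= 0).all()
            and np.allclose(P.sum(axis=1), 1.0, atol=tol))

def is_left_stochastic(P, tol=1e-9):
    """True if every column of the square nonnegative matrix P sums to 1."""
    return is_right_stochastic(np.asarray(P, dtype=float).T, tol)

def is_doubly_stochastic(P, tol=1e-9):
    """True if both rows and columns sum to 1."""
    return is_right_stochastic(P, tol) and is_left_stochastic(P, tol)

# Rows sum to 1, but the columns do not (1.3 and 0.7):
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
```

Here `P` is right stochastic but not left stochastic; a matrix such as `[[0.5, 0.5], [0.5, 0.5]]` satisfies both conditions and is doubly stochastic.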

In the same vein, one may define a probability vector as a vector whose elements are nonnegative real numbers which sum to 1. Thus, each row of a right stochastic matrix (or column of a left stochastic matrix) is a probability vector. Right stochastic matrices act upon row vectors of probabilities by multiplication from the right, and left stochastic matrices act upon column vectors of probabilities by multiplication from the left. This article follows the former convention. In addition, a substochastic matrix is a real square matrix whose row sums are all at most 1.
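The row-vector convention can be illustrated with a short sketch (the matrix and vector here are assumptions chosen for illustration): a right stochastic matrix `P` acts on a row probability vector `x` from the right, and the result is again a probability vector, the distribution after one Markov step.

```python
import numpy as np

# Right stochastic transition matrix: each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Row probability vector: nonnegative entries summing to 1.
x = np.array([0.5, 0.5])

# P multiplies from the right; x_next is the distribution one step later.
x_next = x @ P
# x_next = [0.65, 0.35], which again sums to 1
```

Iterating `x @ P @ P @ ...` gives the distribution after successive steps of the chain.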

  1. ^ Asmussen, S. R. (2003). "Markov Chains". Applied Probability and Queues. Stochastic Modelling and Applied Probability. Vol. 51. pp. 3–8. doi:10.1007/0-387-21525-5_1. ISBN 978-0-387-00211-8.
  2. ^ Lawler, Gregory F. (2006). Introduction to Stochastic Processes (2nd ed.). CRC Press. ISBN 1-58488-651-X.

