• Markov chain
    A Markov chain or Markov process is a stochastic process describing a sequence of possible events in which the probability of each event depends only on...
    93 KB (12,531 words) - 08:05, 16 September 2024
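The defining property above — the next state is drawn from a distribution that depends only on the current state — can be sketched in Python. The two-state transition matrix below is a made-up example, not from any of the listed articles:

```python
import random

# Hypothetical two-state chain: 0 = sunny, 1 = rainy.
# P[i][j] = probability of moving from state i to state j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Draw the next state using only the current state's row of P."""
    return rng.choices([0, 1], weights=P[state])[0]

def simulate(n_steps, start=0, seed=0):
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate(10)
```

Note that `step` never looks at earlier states in `path` — that restriction is exactly the Markov property.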
  • A Markov decision process (MDP), also called a stochastic dynamic program or stochastic control problem, is a model for sequential decision making when...
    34 KB (5,068 words) - 00:34, 2 October 2024
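A common way to solve an MDP is value iteration, which repeatedly applies the Bellman optimality update. A minimal sketch on a tiny, made-up two-state, two-action MDP (the transition table `T`, rewards, and discount factor below are all invented for illustration):

```python
# T[s][a] = list of (probability, next_state, reward) triples.
T = {
    0: {0: [(1.0, 0, 0.0)], 1: [(0.8, 1, 5.0), (0.2, 0, 0.0)]},
    1: {0: [(1.0, 0, 1.0)], 1: [(1.0, 1, 2.0)]},
}
gamma = 0.9  # discount factor

V = {0: 0.0, 1: 0.0}
for _ in range(200):  # iterate the Bellman optimality update to a fixed point
    V = {s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
                for outcomes in T[s].values())
         for s in T}
```

Each pass improves the value estimate of every state given the current estimates of its successors; with a discount factor below 1 the updates contract to a unique fixed point.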
  • Markov renewal processes are a class of random processes in probability and statistics that generalize the class of Markov jump processes. Other classes...
    4 KB (834 words) - 02:10, 13 July 2023
  • Gauss–Markov stochastic processes (named after Carl Friedrich Gauss and Andrey Markov) are stochastic processes that satisfy the requirements for both...
    4 KB (473 words) - 21:31, 5 July 2023
  • A continuous-time Markov chain (CTMC) is a continuous stochastic process in which, for each state, the process will change state according to an exponential...
    23 KB (4,241 words) - 01:52, 27 June 2024
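The exponential holding times that characterize a CTMC can be sketched directly. A made-up two-state example where each state simply jumps to the other (the rates `q` are invented for illustration):

```python
import random

# Hypothetical two-state CTMC: q[i] is the rate of leaving state i.
# Holding times in state i are exponential with rate q[i].
q = [2.0, 0.5]

def simulate(t_end, state=0, seed=0):
    rng = random.Random(seed)
    t, jumps = 0.0, 0
    while True:
        t += rng.expovariate(q[state])  # exponential holding time
        if t >= t_end:
            return jumps
        state = 1 - state               # jump to the other state
        jumps += 1

n_jumps = simulate(100.0)
```

The memorylessness of the exponential distribution is what makes the continuous-time process Markov: the remaining holding time never depends on how long the process has already waited.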
  • A hidden Markov model (HMM) is a Markov model in which the observations are dependent on a latent (or hidden) Markov process (referred to as X)...
    51 KB (6,799 words) - 21:37, 23 September 2024
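The two-layer structure — a hidden Markov chain driving the observations — can be sketched as a sampler. The transition matrix `A` and emission matrix `B` below are made-up parameters, not from the article:

```python
import random

# Made-up HMM: hidden states 0/1 with transition matrix A; each hidden
# state emits observation 'a' or 'b' according to its row of B.
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]   # B[state] = (P(obs='a'), P(obs='b'))
OBS = ['a', 'b']

def sample(n, seed=0):
    rng = random.Random(seed)
    state, hidden, obs = 0, [], []
    for _ in range(n):
        hidden.append(state)
        obs.append(rng.choices(OBS, weights=B[state])[0])  # emission depends only on hidden state
        state = rng.choices([0, 1], weights=A[state])[0]   # hidden chain moves on
    return hidden, obs

hidden, obs = sample(5)
```

In practice only `obs` is visible; inferring the `hidden` sequence from it is what algorithms such as the forward and Viterbi algorithms do.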
  • Discrete-time Markov chain
    In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable...
    25 KB (4,252 words) - 01:57, 27 June 2024
  • A partially observable Markov decision process (POMDP) is a generalization of a Markov decision process (MDP). A POMDP models an agent decision process in which it...
    22 KB (3,309 words) - 00:39, 23 July 2024
  • Markov property
    In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process, which means that its future evolution...
    9 KB (1,211 words) - 07:29, 3 April 2024
  • The Kolmogorov equations characterize continuous-time Markov processes. In particular, they describe how the probability of a continuous-time Markov process in a certain state changes...
    9 KB (1,405 words) - 01:15, 31 August 2024
  • Stochastic process
    Examples include Markov processes, Lévy processes, Gaussian processes, random fields, renewal processes, and branching processes. The study of stochastic processes uses...
    166 KB (18,416 words) - 19:57, 9 October 2024
  • In probability theory, a Markov kernel (also known as a stochastic kernel or probability kernel) is a map that in the general theory of Markov processes plays the role...
    11 KB (2,052 words) - 14:25, 11 September 2024
  • In probability theory and statistics, diffusion processes are a class of continuous-time Markov processes with almost surely continuous sample paths. A diffusion process is stochastic in...
    2 KB (171 words) - 04:03, 9 October 2024
  • Andrey Markov
    Topics named after Andrey Markov: Chebyshev–Markov–Stieltjes inequalities, Gauss–Markov theorem, Gauss–Markov process, hidden Markov model, Markov blanket, Markov chain, Markov decision...
    10 KB (1,072 words) - 17:02, 13 June 2024
  • In probability theory, a Markov model is a stochastic model used to model pseudo-randomly changing systems. It is assumed that future states depend only...
    10 KB (1,175 words) - 07:55, 18 July 2024
  • In statistics, Markov chain Monte Carlo (MCMC) is a class of algorithms used to draw samples from a probability distribution. Given a probability distribution...
    29 KB (3,091 words) - 22:08, 27 September 2024
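The core idea of MCMC — building a Markov chain whose stationary distribution is the target — can be sketched with a random-walk Metropolis sampler. The target below (an unnormalized standard normal density) is a made-up example; note that only density ratios are used, so the normalizing constant never appears:

```python
import math
import random

def log_density(x):
    """Log of an unnormalized density proportional to exp(-x^2 / 2)."""
    return -0.5 * x * x

def metropolis(n, step=1.0, seed=0):
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n):
        proposal = x + rng.uniform(-step, step)
        # Accept with probability min(1, p(proposal) / p(x)).
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis(20000)
```

After enough steps the empirical mean and variance of `samples` approach those of the standard normal (0 and 1), which is the sense in which the chain "draws samples from a probability distribution".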
  • A Markov process is called a reversible Markov process or reversible Markov chain if there exists a positive stationary...
    36 KB (5,847 words) - 13:18, 3 September 2024
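Reversibility is equivalent to the detailed-balance condition pi_i P[i][j] = pi_j P[j][i] for all pairs of states. A sketch that checks it for a made-up birth–death chain (birth–death chains are always reversible); the matrix `P` and distribution `pi` below are invented for illustration:

```python
# Made-up birth-death chain on {0, 1, 2}.
P = [[0.5,  0.5,  0.0],
     [0.25, 0.5,  0.25],
     [0.0,  0.5,  0.5]]

# Candidate stationary distribution (solves pi P = pi).
pi = [0.25, 0.5, 0.25]

def satisfies_detailed_balance(pi, P, tol=1e-12):
    """Check pi_i * P[i][j] == pi_j * P[j][i] for every pair of states."""
    n = len(pi)
    return all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < tol
               for i in range(n) for j in range(n))

ok = satisfies_detailed_balance(pi, P)
```

Detailed balance says the stationary probability flow between every pair of states is equal in both directions, which is why a reversible chain run backwards is statistically indistinguishable from one run forwards.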
  • In probability theory, a piecewise-deterministic Markov process (PDMP) is a process whose behaviour is governed by random jumps at points in time, but...
    6 KB (671 words) - 14:56, 31 August 2024
  • In probability, a Markov additive process (MAP) is a bivariate Markov process where the future states depend only on one of the variables...
    3 KB (402 words) - 03:32, 13 March 2024
  • Markov chain, Markov chain central limit theorem, continuous-time Markov process, Markov process, semi-Markov process, Gauss–Markov processes: processes that...
    5 KB (407 words) - 21:21, 25 August 2023
  • The phrase Gauss–Markov is used in two different ways: Gauss–Markov processes in probability theory The Gauss–Markov theorem in mathematical statistics...
    293 bytes (67 words) - 18:03, 5 February 2018
  • Processing (programming language), an open-source language and integrated development environment. In probability theory: branching process, a Markov process...
    6 KB (686 words) - 04:06, 5 July 2024
  • Mixing (mathematics)
    Colloquially, the process, in a strong sense, forgets its history. Suppose (X_t) were a stationary Markov process with stationary...
    26 KB (4,728 words) - 23:25, 17 September 2024
  • This article contains examples of Markov chains and Markov processes in action. All examples are in the countable state space. For an overview of Markov chains in general...
    14 KB (2,429 words) - 21:01, 8 July 2024
  • Poisson point process
    More complicated processes with the Markov property, such as Markov arrival processes, have been defined where the Poisson process is a special case...
    118 KB (15,491 words) - 14:51, 3 October 2024
  • Gauss–Markov theorem, Gauss–Markov process, Markov blanket, Markov boundary, Markov chain, Markov chain central limit theorem, additive Markov chain, Markov additive...
    2 KB (229 words) - 07:10, 17 June 2024
  • Ornstein–Uhlenbeck process
    The Ornstein–Uhlenbeck process is a stationary Gauss–Markov process, which means that it is a Gaussian process, a Markov process, and is temporally homogeneous...
    30 KB (4,605 words) - 12:58, 23 August 2024
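The Ornstein–Uhlenbeck process solves the SDE dX = theta (mu - X) dt + sigma dW, and an Euler–Maruyama discretization makes the mean-reverting behaviour concrete. The parameter values, starting point, and step size below are made-up choices for illustration:

```python
import math
import random

# Euler-Maruyama discretization of dX = theta*(mu - X) dt + sigma dW.
theta, mu, sigma = 1.0, 0.0, 0.3   # made-up parameters

def simulate(x0, t_end, dt=0.01, seed=0):
    rng = random.Random(seed)
    x, n = x0, int(t_end / dt)
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))       # Brownian increment
        x += theta * (mu - x) * dt + sigma * dw  # drift pulls x back toward mu
    return x

x_final = simulate(5.0, t_end=10.0)
```

Starting far from `mu`, the drift term dominates at first and the path decays toward `mu`, after which it fluctuates around it with stationary standard deviation sigma / sqrt(2 theta).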
  • A stochastic matrix is a square matrix used to describe the transitions of a Markov chain. Each of its entries is a nonnegative real number representing a probability...
    18 KB (2,726 words) - 14:06, 6 September 2024
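Because each row of a stochastic matrix is a probability distribution over next states, repeatedly pushing a distribution through the matrix converges (for a well-behaved chain) to the stationary distribution pi = pi P. A sketch with a made-up 2x2 matrix:

```python
# Made-up 2x2 stochastic matrix: every row sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def evolve(dist, P):
    """One step of the chain on distributions: new_j = sum_i dist_i * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Power iteration: the distribution converges to the stationary pi = pi P.
dist = [1.0, 0.0]
for _ in range(200):
    dist = evolve(dist, P)
```

For this matrix the fixed point can be checked by hand: pi = (5/6, 1/6) satisfies pi P = pi.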
  • Chapman–Kolmogorov equation
    See also: Examples of Markov chains; Category of Markov kernels. References: Perrone (2024), pp. 10–11; Pavliotis, Grigorios A. (2014), "Markov Processes and the Chapman–Kolmogorov...
    6 KB (1,003 words) - 10:11, 1 October 2024
  • Several subjects are named for Andrey Markov: a Markov chain or Markov process, a stochastic model describing a sequence of possible events; the Markov property, the memoryless...
    576 bytes (101 words) - 21:25, 3 June 2022