What is a Markov Chain?
A stochastic process which is memoryless.
Explanation:
Suppose you have a system that changes state over time with some random variability. The time sequence of states of the system is called a stochastic process. A stochastic process is a Markov chain if, at any point in time, the probability distribution of future states depends only on the current state, not on anything that came before. Another term for this is memoryless: the system does not remember what happened before, in the sense that future states are conditionally independent of past states given the present state.
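The memoryless property can be sketched in code. Below is a minimal two-state chain (the "Sunny"/"Rainy" states and their probabilities are illustrative assumptions, not from the answer above); note that `step` consults only the current state, never the history:

```python
import random

# Hypothetical transition probabilities: transition[s][t] is the
# probability of moving from state s to state t. The next state is
# conditioned only on the current state -- no history is consulted.
transition = {
    "Sunny": {"Sunny": 0.9, "Rainy": 0.1},
    "Rainy": {"Sunny": 0.5, "Rainy": 0.5},
}

def step(state):
    """Sample the next state using only the current state (memorylessness)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transition[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n):
    """Run the chain for n steps, returning the visited states."""
    states = [start]
    for _ in range(n):
        states.append(step(states[-1]))
    return states

print(simulate("Sunny", 10))
```

However long the simulation runs, the distribution of the next state is determined entirely by the last entry in `states`.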
Consider a model of Brownian motion:
- There are #n# particles of pollen which we keep track of.
- The particles of pollen tend to keep moving in the same direction with the same velocity.
- The movement is changed by the random impact of unseen smaller particles which we are not tracking.
If our model maintained a state which only described the instantaneous position of each particle of pollen at each time, then it would not be a Markov chain: the next position depends on the current velocity, which is not part of the state and can only be inferred from past positions.
If, however, the model maintained a state which described not only the position of each pollen particle but also its current velocity and angular momentum, then it probably is a Markov chain: everything needed to predict the next state is contained in the current one.
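The contrast above can be sketched as follows. This is a simplified one-dimensional version of the pollen model (the step sizes and noise scale are illustrative assumptions): velocity persists and is perturbed by random impacts, so the pair `(position, velocity)` is Markov even though position alone is not.

```python
import random

def next_state(position, velocity, noise=0.1):
    """Advance one time step.

    The next (position, velocity) pair depends only on the current
    pair, so the (position, velocity) process is a Markov chain.
    """
    velocity = velocity + random.gauss(0.0, noise)  # random impacts of unseen particles
    position = position + velocity                  # inertia carries the motion on
    return position, velocity

pos, vel = 0.0, 1.0
for _ in range(5):
    pos, vel = next_state(pos, vel)

# Knowing only `pos` is not enough to predict the next move: the
# velocity, which is inferred from past positions, also matters.
print(pos, vel)
```

If we tracked only `pos`, the process would still be perfectly well defined, but it would not be a Markov chain, because its future would depend on the history of positions (through the hidden velocity).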