What is a Markov chain?
The simplest example is a drunkard's walk (also called a random walk). The drunk might stumble in any direction but will move only one step from the current position. Crucially, the next position depends only on where the drunk is now, not on the path taken to get there; that is the Markov property.
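Here is a minimal sketch of the one-dimensional case in Python (the function name and step count are just illustrative):

```python
import random

def drunkards_walk(steps: int) -> int:
    """1D random walk: each step moves +1 or -1 with equal probability."""
    position = 0
    for _ in range(steps):
        position += random.choice((-1, 1))
    return position

print(drunkards_walk(100))  # final position varies from run to run
```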
Another classic example is an ink drop in a glass of water: each ink particle wanders randomly, step by step, from wherever it currently is, and the drop slowly spreads out.
Imagine a traffic light with three states: green, yellow, and red. Instead of going green -> yellow -> red at fixed intervals, it can switch to any color at any time, at random. Picture a die with three colors: throw it, and the result decides the next color. Alternatively, suppose you are on a certain color, say green, and the light is not allowed to stay on the same color: flip a coin, and go to red on heads or yellow on tails.
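A quick sketch of both variants, assuming equal chances for each allowed color (all the names here are illustrative):

```python
import random

COLORS = ("green", "yellow", "red")

def next_color_die(_current: str) -> str:
    """Die variant: the next color is any of the three, uniformly at random."""
    return random.choice(COLORS)

def next_color_coin(current: str) -> str:
    """Coin variant: never repeat the current color; pick one of the other two."""
    others = [c for c in COLORS if c != current]
    return random.choice(others)

state = "green"
for _ in range(10):
    state = next_color_coin(state)
    print(state)
```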
So to make a "chain", we just feed tomorrow's result back in as today's state. Then we get a long sequence like: rain, rain, rain, no rain, no rain, no rain, rain, rain, rain, no rain, no rain, no rain. A pattern emerges: long runs ("chains") of rain or no rain, whose typical length depends on how we set up our chances, or probabilities.
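A sketch of that feedback loop, with an assumed "stickiness" of 0.8 (the chance that tomorrow repeats today); it is this sticky probability that produces the long runs:

```python
import random

P_STAY = 0.8  # assumed chance that tomorrow's weather repeats today's

def tomorrow(today: str) -> str:
    if random.random() < P_STAY:
        return today
    return "no rain" if today == "rain" else "rain"

day = "rain"
days = []
for _ in range(20):
    days.append(day)
    day = tomorrow(day)  # feed tomorrow's result back in as the new "today"
print(" ".join(days))
```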
- Hidden blueprints of nature and the objects around us
- Once you begin, each sequence converges to one ratio: the long-run fraction of time spent in each state settles down no matter where you start (see the first sketch after this list)
- First-order and second-order models, as defined by Claude Shannon (see the second sketch after this list)
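To see the convergence claim concretely, here is a sketch that runs the sticky rain chain from above for many steps and measures the fraction of rainy days; the 0.8 is the same assumed value as before:

```python
import random
from collections import Counter

P_STAY = 0.8  # same sticky two-state chain as the rain example

def tomorrow(today: str) -> str:
    if random.random() < P_STAY:
        return today
    return "no rain" if today == "rain" else "rain"

counts = Counter()
day = "no rain"  # try starting from "rain" instead: the ratio comes out the same
for _ in range(100_000):
    counts[day] += 1
    day = tomorrow(day)

print(counts["rain"] / sum(counts.values()))  # settles near 0.5 for this chain
```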
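And a minimal sketch of a Shannon-style second-order model, where the next word depends on the previous two words (the toy corpus is made up purely for illustration):

```python
import random
from collections import defaultdict

# Toy corpus; any longer text gives a more interesting model.
words = "the rain falls and the rain stops and the sun shines and the rain falls".split()

# Second-order model: record which word follows each pair of words.
followers = defaultdict(list)
for a, b, c in zip(words, words[1:], words[2:]):
    followers[(a, b)].append(c)

pair = ("the", "rain")
output = list(pair)
for _ in range(8):
    choices = followers.get(pair)
    if not choices:  # dead end: this pair never occurred mid-corpus
        break
    nxt = random.choice(choices)
    output.append(nxt)
    pair = (pair[1], nxt)

print(" ".join(output))
```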