Markov chain

[mahr-kawf] /ˈmɑr kɔf/
noun, Statistics.
1.
a Markov process restricted to discrete random events or to discrete (rather than continuous) time sequences.
Also, Markoff chain.
Origin
1940–45; see Markov process
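
In symbols (an illustration added here, not part of the dictionary entry), the defining property of a chain with states X_0, X_1, X_2, ... observed at discrete time steps n = 0, 1, 2, ... can be written as

    P(X_{n+1} = x | X_n = x_n, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n),

that is, the probability of the next state depends only on the current state, not on the earlier history.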
British Dictionary definitions for Markov chain

Markov chain

/ˈmɑːkɒf/
noun
1.
(statistics) a sequence of events in which the probability of each event depends only on the event immediately preceding it
Word Origin
C20: named after Andrei Markov (1856–1922), Russian mathematician
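
To make the "depends only on the event immediately preceding it" idea concrete, here is a minimal Python sketch (an illustration added to this entry, not from either dictionary); the two weather states and their transition probabilities are purely hypothetical:

    import random

    # Hypothetical two-state chain: the next state is drawn using only the
    # current state, which is exactly the property in the definition above.
    transitions = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def next_state(current):
        # Sample the next state from the probability row for the current state.
        states = list(transitions[current])
        weights = list(transitions[current].values())
        return random.choices(states, weights=weights)[0]

    state = "sunny"
    chain = [state]
    for _ in range(10):
        state = next_state(state)
        chain.append(state)
    print(chain)  # one possible sample path, e.g. ['sunny', 'sunny', 'rainy', ...]

Running the script prints one random path of length 11 through the two states; at every step the draw uses only the current state, never the earlier part of the path.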