Markov chain: Meaning and Definition

Mar′kov chain

Pronunciation: (mär′kôf)
— Statistics.
  1. a Markov process restricted to discrete random events or to discontinuous time sequences.
Random House Unabridged Dictionary, Copyright © 1997, by Random House, Inc., on Infoplease.
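For context (this is not part of the Random House entry): the "discrete random events" and "discontinuous time sequences" in the definition are commonly formalized by the Markov property for a sequence of random variables X_0, X_1, X_2, … taking values in a countable state space, where the distribution of the next state depends only on the present state:

P(X_{n+1} = j \mid X_n = i_n, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i_n)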