Markov Chain
Definition of Markov Chain
1.
A Markov process for which the parameter takes discrete time values
Noun
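As a brief illustration of this definition (notation assumed, not from the source): for states X_0, X_1, X_2, ... observed at discrete time steps, the Markov property requires

P(X_{n+1} = x | X_n = x_n, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)

that is, the next state depends only on the current state, not on the earlier history.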
Synonyms for the word "Markov chain"
Markoff chain
Semantically linked words for "Markov chain"
Markoff process
Markov process
Hypernyms for the word "Markov chain"
Markoff process
Markov process