Markoff Chain
Definition of Markoff Chain
1.
A Markov process for which the time parameter takes discrete values (see the formal sketch below)
Noun
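A minimal formal sketch of the definition above, using notation that is not part of the original entry: writing X_n for the state at discrete time step n, the defining (Markov) property is

\[
P(X_{n+1} = j \mid X_n = i,\; X_{n-1} = i_{n-1},\; \dots,\; X_0 = i_0) = P(X_{n+1} = j \mid X_n = i), \qquad n = 0, 1, 2, \dots
\]

The "discrete time values" in the definition are these integer steps n.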
Synonyms for the word "markoff chain"
markov chain
Semantically linked words for "markoff chain"
markoff process
markov process
Hypernyms for the word "markoff chain"
markoff process
markov process