SynonymX
Markov Process
Definition of Markov Process
1. A simple stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived at that state.
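The defining property above (the "memoryless" or Markov property) can be illustrated with a small sketch. This example is not part of the dictionary entry; the two weather states and their transition probabilities are invented for illustration. The next state is sampled using only the current state, never the earlier history:

```python
import random

# Hypothetical two-state Markov chain: each row gives the distribution
# of the next state conditioned solely on the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state; only `state` matters, not the path taken to it."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def walk(start, n, seed=0):
    """Generate a trajectory of n transitions from `start`."""
    random.seed(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1]))
    return states
```

Because `step` receives only the current state, any two trajectories that reach the same state have identical distributions over their futures, which is exactly the property the definition describes.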
Noun
Synonyms for the word "Markov process"
Markoff process
Words semantically linked with "Markov process"
Markoff chain
Markov chain
stochastic process
Hypernyms for the word "Markov process"
stochastic process