Markov chain
Markov chain Definition
(n) a Markov process for which the parameter is discrete time values
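The definition above can be illustrated with a small sketch (a hypothetical example, not part of the dictionary entry): a two-state discrete-time Markov chain, where the "parameter" is the time index n = 0, 1, 2, ... and the next state depends only on the current state through fixed transition probabilities.

```python
import random

# Hypothetical two-state weather chain; the states and probabilities
# are illustrative assumptions, not from the dictionary entry.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample the next state given only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Return the chain's path over n_steps discrete time values."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path
```

Because the time values are discrete, the whole path is just the sequence of states visited at n = 0, 1, ..., n_steps.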
Markov chain Synonyms
Markoff chain
© Art Branch Inc.