Markov process
Markov process Definition
(n) a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
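The defining property above, that the next state depends only on the present state, can be sketched in a short simulation. The two weather states and their transition probabilities below are illustrative assumptions, not part of this entry:

```python
import random

# Hypothetical two-state Markov process: each row gives the probability
# of moving to each next state, conditioned ONLY on the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state from the current state alone (the Markov property)."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps; the history is recorded but never consulted."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate("sunny", 10)
```

Note that `step` receives only the current state, never the path that led to it, which is exactly the condition in the definition.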
Markov process Synonyms
Markoff process
Markov process
© Art Branch Inc.