Markov chain

  • February 29, 2016, from wordnet.princeton.edu

    Markov chain, noun, English

    a Markov process for which the parameter is discrete time values
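The definition above (a Markov process indexed by discrete time steps) can be sketched as a short simulation. The two-state "weather" chain and its transition probabilities below are hypothetical, chosen only to illustrate the idea that the next state depends solely on the current state:

```python
import random

# Hypothetical two-state chain; the transition probabilities are
# illustrative and not taken from the source.
STATES = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Advance the chain one discrete time step.

    The next state depends only on the current state (the Markov
    property), sampled from the row P[state].
    """
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, steps, seed=0):
    """Return a path of `steps + 1` states, starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Because the time index advances in whole steps (0, 1, 2, ...), this is the "discrete time values" case the definition singles out, as opposed to a continuous-time Markov process.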