Concept information

Preferred term

markov chains

Definition

  • Markov chains are a well-developed topic in probability: stochastic processes in which the probability of the next state depends only on the current state. There are many fine expositions of Markov chains (e.g., Bremaud, 2008; Feller, 1968; Hoel, Port, & Stone, 1972; Kemeny & Snell, 1960). [Source: Encyclopedia of Research Design; Markov Chains]
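The definition above can be illustrated with a minimal sketch: a hypothetical two-state "weather" chain (states, transition probabilities, and the simulation helper below are illustrative assumptions, not part of the source entry).

```python
import random

def simulate_markov_chain(transition, start, steps, rng=None):
    """Simulate a finite-state Markov chain.

    transition: dict mapping each state to a list of (next_state, probability)
                pairs whose probabilities sum to 1.
    start: initial state.
    steps: number of transitions to take.
    Returns the list of visited states, including the start state.
    """
    rng = rng or random.Random()
    state = start
    path = [state]
    for _ in range(steps):
        r = rng.random()
        cumulative = 0.0
        # Sample the next state from the current state's transition row.
        for next_state, p in transition[state]:
            cumulative += p
            if r < cumulative:
                state = next_state
                break
        path.append(state)
    return path

# Hypothetical chain: tomorrow's weather depends only on today's
# (the Markov property), not on the earlier history.
weather = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

path = simulate_markov_chain(weather, "sunny", 10_000, random.Random(0))
share_sunny = path.count("sunny") / len(path)
```

Over a long run, the fraction of time spent in each state approaches the chain's stationary distribution; for these transition probabilities the stationary share of "sunny" is 5/6, so `share_sunny` should come out close to 0.833.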

URI

https://concepts.sagepub.com/social-science/concept/markov_chains
