Markov model
Joe Segen, 2016-12-13
A model used in decision analysis for evaluating potential outcomes of a disease process, which are defined as specific health states, transitions among which are modeled iteratively.
In standard decision tree analysis, a patient moves through states in one direction, for example, from not treated to treated to final outcome; in a Markov process, a patient may move back and forth between states, e.g., between continuous ambulatory peritoneal dialysis and haemodialysis. Some states (so-called "absorbing states") cannot be left once entered; death is the ultimate absorbing state.
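The iterative transitions described above can be sketched as a cohort simulation. The states and transition probabilities below are purely illustrative assumptions, not clinical data; the example only shows the mechanics of cycling a distribution through a transition matrix with one absorbing state.

```python
# Minimal Markov cohort model sketch. States and per-cycle transition
# probabilities are hypothetical, for illustration only.
STATES = ["CAPD", "HD", "Death"]  # peritoneal dialysis, haemodialysis, death

P = {
    "CAPD":  {"CAPD": 0.80, "HD": 0.15, "Death": 0.05},
    "HD":    {"CAPD": 0.10, "HD": 0.82, "Death": 0.08},
    "Death": {"CAPD": 0.00, "HD": 0.00, "Death": 1.00},  # absorbing: cannot be left
}

def step(dist):
    """Advance the cohort distribution over states by one Markov cycle."""
    return {s: sum(dist[r] * P[r][s] for r in STATES) for s in STATES}

def run(cycles, start="CAPD"):
    """Start the whole cohort in `start` and iterate for `cycles` cycles."""
    dist = {s: (1.0 if s == start else 0.0) for s in STATES}
    for _ in range(cycles):
        dist = step(dist)
    return dist
```

Because "Death" is absorbing, repeated cycling moves all probability mass into it; in a real evaluation each cycle would also accrue costs and utilities per state.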
Synonyms: Markov decision-making model, state-transition model