Markov model


EVIDENCE-BASED MEDICINE

A model used in decision analysis to evaluate the potential outcomes of a disease process. The outcomes are defined as specific health states, and transitions among those states are modelled iteratively.

In standard decision tree analysis, a patient moves through states in one direction, for example from not treated to treated to final outcome; in a Markov process, a patient may move back and forth between states, e.g., between continuous ambulatory peritoneal dialysis and haemodialysis. Some states (so-called “absorbing states”) cannot be left once entered; death is the ultimate absorbing state.
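The state-transition process described above can be sketched as a small Monte Carlo simulation. This is a minimal illustration only: the state names follow the dialysis example, but the transition probabilities are invented for demonstration and are not clinical data from the source.

```python
import random

# Illustrative health states and per-cycle transition probabilities.
# The probabilities are made up for demonstration, not clinical data.
STATES = ["CAPD", "haemodialysis", "dead"]  # "dead" is the absorbing state
TRANSITIONS = {
    "CAPD":          {"CAPD": 0.80, "haemodialysis": 0.15, "dead": 0.05},
    "haemodialysis": {"CAPD": 0.10, "haemodialysis": 0.83, "dead": 0.07},
    "dead":          {"dead": 1.0},  # absorbing: cannot be left once entered
}

def simulate(start="CAPD", cycles=120, rng=None):
    """Run one patient through the model for a fixed number of cycles."""
    rng = rng or random.Random(0)
    state, history = start, [start]
    for _ in range(cycles):
        if state == "dead":  # absorbing state reached; stop transitioning
            break
        probs = TRANSITIONS[state]
        state = rng.choices(list(probs), weights=list(probs.values()))[0]
        history.append(state)
    return history

history = simulate()
```

Note how this differs from a one-way decision tree: the patient can return from haemodialysis to CAPD in a later cycle, while the "dead" state, once entered, ends the trajectory.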

Synonyms: Markov decision-making model, state-transition model

Reference: www.nsc.nhs.uk/glossary/glossary_ind.htm
