Medical Subject Headings

Last uploaded: January 31, 2024
Preferred Name

Markov Chains

Synonyms

Chain, Markov

Markov Process

Processes, Markov

Chains, Markov

Markov Processes

Markov Chain

Process, Markov

Definitions

A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
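
Stated formally (a standard textbook formulation, not part of the MeSH record), the Markov property for a discrete-time process with states $X_0, X_1, X_2, \ldots$ is

$$P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n).$$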

ID

http://purl.bioontology.org/ontology/MESH/D008390
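
As an illustration only (not part of the record), a class with a PURL identifier like the one above can usually be retrieved through the BioPortal REST API. The sketch below is a hedged example assuming the data.bioontology.org endpoint and a personal API key; YOUR_API_KEY is a placeholder, and the response keys used (prefLabel, synonym) reflect typical BioPortal class JSON rather than a guaranteed contract.

    # Hypothetical sketch: fetch this MeSH class from the BioPortal REST API.
    import json
    import urllib.parse
    import urllib.request

    API_KEY = "YOUR_API_KEY"  # placeholder; use your own BioPortal key
    CLASS_ID = "http://purl.bioontology.org/ontology/MESH/D008390"

    url = ("https://data.bioontology.org/ontologies/MESH/classes/"
           + urllib.parse.quote(CLASS_ID, safe="")
           + "?apikey=" + API_KEY)

    with urllib.request.urlopen(url) as resp:
        record = json.load(resp)

    print(record.get("prefLabel"))  # expected: Markov Chains
    print(record.get("synonym"))    # alternative labels, if present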

altLabel

Chain, Markov

Markov Process

Processes, Markov

Chains, Markov

Markov Processes

Markov Chain

Process, Markov

cui

C0024828

DC

1

definition

A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
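
As a concrete illustration of this definition (not part of the MeSH record), the minimal sketch below simulates a two-state Markov chain with made-up states and transition probabilities; each step is drawn using only the current state, never the earlier history.

    # Minimal sketch: two-state Markov chain with hypothetical transition probabilities.
    import random

    # TRANSITIONS[current][next] = P(next state | current state); each row sums to 1.
    TRANSITIONS = {
        "healthy": {"healthy": 0.9, "ill": 0.1},
        "ill":     {"healthy": 0.5, "ill": 0.5},
    }

    def step(current):
        # The next state depends only on the current state (the Markov property);
        # the path taken to reach this state is never consulted.
        r = random.random()
        cumulative = 0.0
        for nxt, p in TRANSITIONS[current].items():
            cumulative += p
            if r < cumulative:
                return nxt
        return nxt  # guard against floating-point rounding

    state = "healthy"
    path = [state]
    for _ in range(10):
        state = step(state)
        path.append(state)
    print(path)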

DX

19910101

HN

91(75); was see under PROBABILITY 1975-90

91; was see under PROBABILITY 1975-90

MDA

19990101

MMR

20080708

MN

E05.318.740.600.500

N06.850.520.830.996.500

N05.715.360.750.770.500

N06.850.520.830.600.500

E05.318.740.996.500

N05.715.360.750.625.500

G17.830.500

notation

D008390

OL

search PROBABILITY 1968-74

prefLabel

Markov Chains

TERMUI

T024959

T024957

T024958

TH

NLM (1990)

NLM (1975)

POPLINE (1978)

tui

T081

subClassOf

http://purl.bioontology.org/ontology/MESH/D011336

http://purl.bioontology.org/ontology/MESH/D013269
