Word: Homogeneous Markov chain
Definition

Homogeneous Markov chain

English encyclopedia

Markov chain

[Figure: A simple two-state Markov chain]
[Figure: Russian mathematician Andrey Markov]

A Markov chain (discrete-time Markov chain or DTMC), named after Andrey Markov, is a random process that undergoes transitions from one state to another on a state space. It must possess a property that is usually characterized as "memorylessness": the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property. Markov chains have many applications as statistical models of real-world processes.
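The headword refers to the time-homogeneous case: a Markov chain is (time-)homogeneous when the one-step transition probability P(X_{n+1} = j | X_n = i) is the same at every step n, so a single transition matrix describes the whole process. The following sketch, which is not part of the original entry, simulates a two-state chain like the one in the figure; the state names and transition probabilities are illustrative assumptions only.

import random

# Illustrative transition probabilities (assumed, not from the entry).
# Each row gives P(next state | current state); rows sum to 1.
# Because this table never changes with the step index, the chain is
# time-homogeneous.
transitions = {
    "A": {"A": 0.6, "E": 0.4},
    "E": {"A": 0.7, "E": 0.3},
}

def step(state):
    # The next state depends only on the current state (the Markov
    # property); the earlier history is never consulted.
    r = random.random()
    cumulative = 0.0
    for next_state, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def simulate(start, n_steps):
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("A", 10))

The step function receives nothing but the current state, which is exactly the "memorylessness" described above; swapping in a transition table that varied with the step number would give a non-homogeneous chain instead.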
