
Word: Markov time
Definition

Markov time

English encyclopedia

Stopping time

Figure: Example of a stopping time: a hitting time of Brownian motion. The process starts at 0 and is stopped as soon as it hits 1.
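A minimal simulation sketch of the hitting time described in the figure, assuming a standard Euler discretisation of Brownian motion; the function name first_hitting_time and its parameters are illustrative and not part of the original entry:

    import numpy as np

    def first_hitting_time(level=1.0, dt=1e-3, t_max=50.0, rng=None):
        """Simulate a standard Brownian motion started at 0 and return the
        first time it reaches `level`, or None if it has not hit by t_max."""
        rng = np.random.default_rng() if rng is None else rng
        n_steps = int(t_max / dt)
        w = 0.0
        for i in range(1, n_steps + 1):
            # Brownian increment over one step: Normal(0, dt)
            w += rng.normal(0.0, np.sqrt(dt))
            # The stopping rule looks only at the path observed so far,
            # which is what makes the hitting time a stopping time.
            if w >= level:
                return i * dt
        return None  # not hit within the simulated horizon

    if __name__ == "__main__":
        print("first hitting time of level 1:", first_hitting_time())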

In probability theory, in particular in the study of stochastic processes, a stopping time (also Markov time) is a specific type of “random time”: a random variable whose value is interpreted as the time at which a given stochastic process exhibits a certain behavior of interest. A stopping time is often defined by a stopping rule, a mechanism for deciding whether to continue or stop a process on the basis of the present position and past events, and which will almost always lead to a decision to stop at some finite time.
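Formally (a standard definition, stated here for reference rather than quoted from the entry): given a filtration $(\mathcal{F}_t)_{t \ge 0}$ on a probability space, a random variable $\tau$ with values in $[0, \infty]$ is a stopping time if

    \{\tau \le t\} \in \mathcal{F}_t \quad \text{for all } t \ge 0,

so that whether the process has been stopped by time $t$ can be decided from the information available at time $t$. In discrete time the equivalent condition is $\{\tau = n\} \in \mathcal{F}_n$ for every $n$.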
