
How to solve / fit a geometric Brownian motion process in Python?

  •  3
  •  Greg  ·  6 years ago

    For example, the code below simulates a geometric Brownian motion (GBM) process, which satisfies the following stochastic differential equation:


    dS_t = mu S_t dt + sigma S_t dB_t

    The code is a condensed version of the code in this Wikipedia article .

    import numpy as np
    np.random.seed(1)
    
    def gbm(mu=1, sigma = 0.6, x0=100, n=50, dt=0.1):
        step = np.exp( (mu - sigma**2 / 2) * dt ) * np.exp( sigma * np.random.normal(0, np.sqrt(dt), (1, n)))
        return x0 * step.cumprod()
    
    series = gbm()
    

    How can I fit a GBM process in Python? That is, how can I estimate mu and sigma and solve the stochastic differential equation given the time series series ?

    1 Answer
  •  2
  •  user3658307  ·  6 years ago

    Parameter estimation for SDEs is a research-level area and therefore rather non-trivial. Whole books exist on the topic; feel free to look into those for more details.

    But here is a simple way to tackle the problem. First, note that the log of a GBM is an affinely transformed Wiener process (i.e. a linear Ito drift-diffusion process). So

    d ln(S_t) = (mu - sigma^2/2) dt + sigma dB_t

    Hence we can estimate the parameters of the log-process and translate them to match the original process. Check out [1] , [2] , [3] , [4] , for example.
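
    As a quick sanity check of the relation above (and essentially the same idea as the estimators in the script below, just in closed form), the parameters can be read directly off the increments of the log-series: under that SDE the increments are i.i.d. normal with mean (mu - sigma^2/2) * dt and variance sigma^2 * dt. The following is a minimal sketch, assuming equally spaced observations with step dt; the function name is my own, not from any library.

    import numpy as np

    def log_increment_estimates(series, dt):
        """Estimate (mu, sigma) of a GBM from the increments of its log-series.

        Under d ln(S_t) = (mu - sigma^2/2) dt + sigma dB_t, the log-increments
        are i.i.d. N((mu - sigma^2/2) * dt, sigma^2 * dt).
        """
        incs = np.diff(np.log(series))
        sigma_hat = incs.std(ddof=1) / np.sqrt(dt)       # increment std is sigma * sqrt(dt)
        mu_hat = incs.mean() / dt + 0.5 * sigma_hat**2   # undo the -sigma^2/2 Ito correction
        return mu_hat, sigma_hat

    Dividing the increment mean by dt recovers the log-drift mu - sigma^2/2, and adding sigma_hat^2/2 translates it back to the GBM drift; that is exactly the translation that gbm_drift performs in the script below.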

    Here is a script that does this in two simple ways for the drift (just to see the difference), but only one way for the diffusion (sorry). The drift of the log-process is estimated by (X_T - X_0) / T and by the incremental MLE (see the code). The diffusion parameter is estimated (in a biased way) via its definition as the infinitesimal variance.

    import numpy as np
    
    np.random.seed(9713)
    
    # Parameters
    mu = 1.5
    sigma = 0.9
    x0 = 1.0
    n = 1000
    dt = 0.05
    
    # Times
    T = dt*n
    ts = np.linspace(dt, T, n)
    
    # Geometric Brownian motion generator
    def gbm(mu, sigma, x0, n, dt):
        step = np.exp( (mu - sigma**2 / 2) * dt ) * np.exp( sigma * np.random.normal(0, np.sqrt(dt), (1, n)))
        return x0 * step.cumprod()
    
    # Estimate mu just from the series end-points
    # Note this is for a linear drift-diffusion process, i.e. the log of GBM
    def simple_estimate_mu(series):
        return (series[-1] - x0) / T
    
    # Use all the increments combined (maximum likelihood estimator)
    # Note this is for a linear drift-diffusion process, i.e. the log of GBM
    def incremental_estimate_mu(series):
        total = (1.0 / dt) * (ts**2).sum()
        return (1.0 / total) * (1.0 / dt) * ( ts * series ).sum()
    
    # This just estimates the sigma by its definition as the infinitesimal variance (simple Monte Carlo)
    # Note this is for a linear drift-diffusion process, i.e. the log of GBM
    # One can do better than this of course (MLE?)
    def estimate_sigma(series):
        return np.sqrt( ( np.diff(series)**2 ).sum() / (n * dt) )
    
    # Estimator helper
    all_estimates0 = lambda s: (simple_estimate_mu(s), incremental_estimate_mu(s), estimate_sigma(s))
    
    # Since log-GBM is a linear Ito drift-diffusion process (scaled Wiener process with drift), we
    # take the log of the realizations, compute mu and sigma, and then translate the mu and sigma
    # to that of the GBM (instead of the log-GBM). (For sigma, nothing is required in this simple case).
    def gbm_drift(log_mu, log_sigma):
        return log_mu + 0.5 * log_sigma**2
    
    # Translates all the estimates from the log-series
    def all_estimates(es):
        lmu1, lmu2, sigma = all_estimates0(es)
        return gbm_drift(lmu1, sigma), gbm_drift(lmu2, sigma), sigma
    
    print('Real Mu:', mu)
    print('Real Sigma:', sigma)
    
    ### Using one series ###
    series = gbm(mu, sigma, x0, n, dt)
    log_series = np.log(series)
    
    print('Using 1 series: mu1 = %.2f, mu2 = %.2f, sigma = %.2f' % all_estimates(log_series) )
    
    ### Using K series ###
    K = 10000
    s = [ np.log(gbm(mu, sigma, x0, n, dt)) for i in range(K) ]
    e = np.array( [ all_estimates(si) for si in s ] )
    avgs = np.mean(e, axis=0)
    
    print('Using %d series: mu1 = %.2f, mu2 = %.2f, sigma = %.2f' % (K, avgs[0], avgs[1], avgs[2]) )
    

    Output:

    Real Mu: 1.5
    Real Sigma: 0.9
    Using 1 series: mu1 = 1.56, mu2 = 1.54, sigma = 0.96
    Using 10000 series: mu1 = 1.51, mu2 = 1.53, sigma = 0.93
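
    As for the "solve the stochastic differential equation" part of the question: once mu and sigma have been estimated, one way to produce a fitted path is simply to feed the estimates back into the gbm generator from the script. This is a minimal sketch of my own (not part of the original answer), assuming the script above has already been run so that gbm, x0, n and dt are defined; mu_hat and sigma_hat are placeholder names.

    # Plug the single-series estimates printed above back into the generator
    mu_hat, sigma_hat = 1.56, 0.96

    # "Solving" the SDE here just means simulating a GBM path with the fitted
    # parameters, reusing gbm, x0, n and dt from the script above
    fitted_path = gbm(mu_hat, sigma_hat, x0, n, dt)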