Compute Expectation of Autoregressive Moving-Average Process from Its Definition
This example explores an ARMA process with initial values. It constructs the process values in terms of the sequence of innovations and uses the enhanced support for random processes in Expectation to compute the mean and covariance of process slices. Furthermore, the stationary time series process is reinterpreted as a time series process with random initial values.
Define the autoregressive moving-average process values via the defining relation, as functions of the driving white noise process values.
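A possible Wolfram Language sketch of this step (the names x, y, a1, a2, b1 and the use of four process values are illustrative, not the original inputs):

    (* ARMA(2,1) defining relation: each value in terms of the two previous values
       and the driving white noise innovations y[t]; memoized for reuse below *)
    x[t_Integer /; t >= 1] := x[t] = a1 x[t - 1] + a2 x[t - 2] + y[t] + b1 y[t - 1]

    (* the first few process values, written out in terms of the initial values
       x[0], x[-1] and the innovations *)
    Table[Expand[x[t]], {t, 1, 4}]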
Process values for the ARMA(2,1) process.
Compute the mean of the process values at the first few positive times, given the past process values and zero values of the innovations in the past.
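A sketch of one way to set this up, assuming Gaussian white noise innovations with standard deviation σ and fixed but symbolic past values x0, xm1 (all names illustrative):

    (* fix the past: given values for x[0], x[-1] and zero past innovations *)
    conds = {x[0] -> x0, x[-1] -> xm1, y[0] -> 0, y[-1] -> 0};

    (* conditional means of x[1], ..., x[4]; the remaining innovations are slices of
       a white noise process, using Expectation's support for random processes *)
    meanValues = Table[Expectation[x[t] /. conds,
        y \[Distributed] WhiteNoiseProcess[NormalDistribution[0, σ]]], {t, 1, 4}]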
Compare to the values of the process mean function.
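One way to cross-check this (a sketch, not necessarily the original comparison): since the innovations have zero mean, the conditional mean obeys the same recursion with the innovation terms dropped.

    (* deterministic mean recursion started from the given past values *)
    m[-1] = xm1; m[0] = x0;
    m[t_Integer /; t >= 1] := m[t] = a1 m[t - 1] + a2 m[t - 2]

    Expand[meanValues - Table[m[t], {t, 1, 4}]]  (* expected: {0, 0, 0, 0} *)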
Use Expectation to compute the covariance function of the process values under the same conditions.
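Continuing the sketch, each covariance is the expectation of a product of centered process values under the same white noise specification:

    (* centered process values under the same conditioning *)
    centered = Table[Expand[(x[t] /. conds) - meanValues[[t]]], {t, 1, 4}];

    covMatrix = Simplify @ Table[Expectation[centered[[s]] centered[[t]],
        y \[Distributed] WhiteNoiseProcess[NormalDistribution[0, σ]]], {s, 1, 4}, {t, 1, 4}]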
Covariance function for particular values of the process parameters.
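For instance, with illustrative parameter values (exact rationals keep the later stationarity equations exact):

    params = {a1 -> 1/2, a2 -> -3/10, b1 -> 1/5, σ -> 1};
    MatrixForm[covMatrix /. params]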
Compare the computed covariance matrix with the values of CovarianceFunction.
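The original check uses CovarianceFunction; as an alternative sanity check that stays within the recursion above, a short Monte Carlo simulation at the same parameter values can be compared against the computed matrix:

    SeedRandom[1];
    paths = Table[
       Module[{y = Prepend[RandomVariate[NormalDistribution[0, 1], 4], 0.],
           xv = {0., 0.}},  (* y[0] = 0; the fixed past values do not affect the covariance *)
         Do[AppendTo[xv, xv[[-1]]/2 - 3 xv[[-2]]/10 + y[[t + 1]] + y[[t]]/5], {t, 4}];
         Drop[xv, 2]], {10^5}];

    Max[Abs[Covariance[paths] - (covMatrix /. params)]]  (* small, of order 10^-2 *)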
Compute the mean and covariance of the process values, assuming a joint Gaussian distribution for the past values and past innovations.
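A sketch of one possible parametrization: take the past values x[0], x[-1] and the past innovation y[0] to be jointly Gaussian with unknown mean μ, variance v0, lag-one covariance v1, and covariance w between x[0] and y[0]; by causality x[-1] is taken to be uncorrelated with y[0], and y[-1] does not enter x[1], ..., x[4] at all. The parameter values from above are reused, and Expectation is assumed to evaluate polynomial moments of MultinormalDistribution with symbolic entries.

    (* past (X0, Xm1, Y0) and new innovations (Y1, ..., Y4) as one jointly Gaussian vector *)
    vars = {X0, Xm1, Y0, Y1, Y2, Y3, Y4};
    jointMean = {μ, μ, 0, 0, 0, 0, 0};
    jointCov = ArrayFlatten[{{{{v0, v1, w}, {v1, v0, 0}, {w, 0, 1}}, 0},
        {0, IdentityMatrix[4]}}];  (* innovation variance σ^2 = 1 under params *)
    dist = MultinormalDistribution[jointMean, jointCov];

    slice = Table[Expand[x[t]], {t, 1, 4}] /. params /. {x[0] -> X0, x[-1] -> Xm1,
        y[0] -> Y0, y[1] -> Y1, y[2] -> Y2, y[3] -> Y3, y[4] -> Y4};

    meanSlice = Table[Expectation[slice[[i]], vars \[Distributed] dist], {i, 4}];
    covSlice = Table[Expectation[
        Expand[(slice[[i]] - meanSlice[[i]]) (slice[[j]] - meanSlice[[j]])],
        vars \[Distributed] dist], {i, 4}, {j, 4}];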
The weak stationarity condition implies that the mean values should be equal and that the covariance matrix entries along each subdiagonal should be the same. These conditions determine the parameters of the joint distribution of the past values.
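Under the same sketch, these requirements become linear equations for μ, v0, v1, w:

    eqs = Join[
       Thread[meanSlice == μ],                  (* equal means, matching the past mean *)
       Thread[Diagonal[covSlice] == v0],        (* equal variances, matching the past variance *)
       Thread[Diagonal[covSlice, 1] == v1],     (* equal lag-one covariances *)
       {covSlice[[1, 3]] == covSlice[[2, 4]]}]; (* equal lag-two covariances *)

    statRules = First[Solve[eqs, {μ, v0, v1, w}]]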
Compare with the covariance function of the weakly stationary ARMA(2,1) process.
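With the solved parameters substituted, the slice covariance should reproduce the stationary autocovariances; a sketch of the comparison, using CovarianceFunction of the weakly stationary ARMAProcess at the same parameter values:

    gammas = Table[CovarianceFunction[ARMAProcess[{a1, a2}, {b1}, σ^2] /. params, h],
        {h, 0, 3}];

    Chop[N[(covSlice /. statRules) - ToeplitzMatrix[gammas]]]  (* expected: zero matrix *)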