Improved Predictive Deep Temporal Neural Networks

with Trend Filtering

 

Youngjin Park1, Deokjun Eom2, Byoungki Seo1 and Jaesik Choi23*

1 Ulsan National Institute of Science and Technology, UNIST

2 Korea Advanced Institute of Science and Technology, KAIST

3 INEEJI

* Corresponding Author

 

 

Multivariate time series forecasting, which aims to predict future values from past and current observations of several univariate time series, has been studied for decades; ARIMA is one classical example. Because it is difficult to measure the extent to which noise is mixed with informative signals in rapidly fluctuating financial time series data, designing a good predictive model is not a simple task. Recently, many researchers have become interested in recurrent neural networks and attention-based neural networks and have applied them to financial forecasting. Many attempts have been made to use these methods to capture long-term temporal dependencies and to select the more informative features in multivariate time series data in order to make accurate predictions.

In this paper, we propose a new prediction framework based on deep neural networks and trend filtering, which converts noisy time series data into a piecewise linear form. We show that the predictive performance of deep temporal neural networks improves when the training data is temporally processed by trend filtering. To verify the effect of our framework, three deep temporal neural networks, state-of-the-art models for financial time series prediction, are compared with the same models augmented with trend filtering as an input feature. Extensive experiments on real-world multivariate time series data show that the proposed method is effective and significantly better than existing baseline methods.
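As a rough sketch of this framework, the trend-filtered series can be supplied to a deep temporal model as an additional input feature alongside the raw values. The exact feature layout (here, concatenating the filtered series with the originals) and the helper name `add_trend_feature` are illustrative assumptions rather than the authors' code; the filtering step itself is sketched in the trend-filtering section below.

```python
import numpy as np

def add_trend_feature(series, trend_filter):
    """Append a trend-filtered copy of each input series as an extra feature.

    series:       array of shape (n_steps, n_series) with the raw observations.
    trend_filter: any callable mapping a 1-D array to its estimated trend,
                  e.g. the L1 trend filtering sketch later on this page.
    Returns an array of shape (n_steps, 2 * n_series) that can be fed to a
    deep temporal model (DA-RNN, etc.) in place of the raw input alone.
    """
    trends = np.column_stack(
        [trend_filter(series[:, j]) for j in range(series.shape[1])]
    )
    return np.concatenate([series, trends], axis=1)
```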

 

 

L1 Trend Filtering

Suppose a univariate time series \(y_{t}\), \(t=1,\dots,n\), consists of a slowly varying trend component \(x_{t}\) and a rapidly varying random component \(z_{t}\). The goal of L1 trend filtering is to estimate the trend component \(x_{t}\) or, equivalently, the random component \(z_{t} = y_{t}-x_{t}\). The method adjusts \(x_{t}\) to be smooth and the residual \(z_{t}\) to be small. Its key property is that it produces trend estimates that are smooth in the sense of being piecewise linear. The trend estimate is chosen as the minimizer of the following objective function,

$$ \frac{1}{2}\sum_{t=1}^{n}{(y_{t}-x_{t})^2}+\lambda \sum_{t=2}^{n-1} \left| x_{t-1}-2x_{t}+x_{t+1} \right|  $$

which can be written in matrix form as

$$    \frac{1}{2}\left \| \mathbf{y}-\mathbf{x} \right \|_{2}^{2} + \lambda \left \| D\mathbf{x} \right \|_{1}, $$

where \( \left \| \mathbf{u}\right\|_{1} = \sum_{i} \left| u_{i} \right| \) denotes the \(l_{1}\) norm of the vector \(\mathbf{u}\) and \(D\in\mathbb{R}^{(n-2)\times n}\) is the second-order difference matrix.

$$        D = \begin{bmatrix} 1 & -2 & 1 &   & &\\ & 1 & -2 & 1 \\ &   & \ddots & \ddots & \ddots \\ &   &  & 1 & -2 & 1 \end{bmatrix} $$

 

In addition, \(\lambda\) is a non-negative parameter that controls the trade-off between the smoothness of \(x\) and the size of the residual. Figure 2 shows that, after trend filtering, the fluctuating behavior of the original time series is stabilized and the filtered series simply follows the general trend between two adjacent knot points. This means that time series models can detect important signals more easily and that filtering simplifies the prediction of noisy time series; here, an important signal is a change point (knot) at which the trend of the time series changes. Extensive experiments in this paper show the effect of trend filtering on three deep temporal neural network models.
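The objective above is convex, so the trend estimate can be computed with a generic convex-optimization solver. The following is a minimal sketch using cvxpy, which is an assumption for illustration (the original L1 trend filtering work provides specialized solvers); the function name `l1_trend_filter` and the example value of `lam` are likewise illustrative.

```python
import numpy as np
import cvxpy as cp

def l1_trend_filter(y, lam=50.0):
    """Minimize 0.5 * ||y - x||_2^2 + lam * ||D x||_1 over the trend x."""
    y = np.asarray(y, dtype=float)
    n = y.size
    # Second-order difference matrix D, shape (n-2, n), rows [1, -2, 1].
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    x = cp.Variable(n)
    objective = cp.Minimize(0.5 * cp.sum_squares(y - x) + lam * cp.norm1(D @ x))
    cp.Problem(objective).solve()
    return x.value

# Example: recover a piecewise-linear trend from a noisy synthetic series.
t = np.arange(200, dtype=float)
y = np.where(t < 100, 0.5 * t, 50.0 - 0.3 * (t - 100)) + 2.0 * np.random.randn(200)
trend = l1_trend_filter(y, lam=100.0)
```

Larger values of `lam` yield smoother trends with fewer knots; smaller values let the estimate follow the data more closely.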

 

 

The figure above shows the prediction results of DA-RNN+L1TF compared with other methods. Each row corresponds to one dataset, and the two columns show two different time spans for that dataset. As shown in the figure, DA-RNN+L1TF outperforms the other methods.

 

Citation

Youngjin Park, Deokjun Eom, Byoungki Seo, and Jaesik Choi. 2020. Improved Predictive Deep Temporal Neural Networks with Trend Filtering. In ACM International Conference on AI in Finance (ICAIF ’20), October 15–16, 2020, New York, NY, USA. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3383455.3422565