(1)

Forecasting Wavelet Transformed Time Series with Attentive Neural Networks

ICDM 2018

Yi Zhao¹, Yanyan Shen*¹, Yanmin Zhu¹, Junjie Yao²

¹Shanghai Jiao Tong University

²East China Normal University

(2)

Outline

Motivation

Preliminaries

Model

Experiments

Conclusion


(3)

Motivation

Forecasting complex time series demands time-domain & frequency-domain information.

e.g., stock prices, web traffic, etc.

Various methods exist to extract local time-frequency features, which are important for predicting future values:

Fourier Transform

Short-time Fourier Transform

Wavelet Transform

Our idea: use the varying global trend to identify the most salient parts of the local time-frequency information, so as to better predict future values.


(4)

Preliminaries


Problem Statement

Given a time series $\mathbf{x} = \langle x_1, x_2, \dots, x_T \rangle$, predict $x_{T+\tau}$, the future value at time $T+\tau$, via a function $f$: $\hat{x}_{T+\tau} = f(x_1, x_2, \dots, x_T)$

Wavelets

Given a basic wavelet function $h$, we can get the wavelets by scaling and shifting: $h_{a,b}(t) = \frac{1}{\sqrt{a}}\, h\!\left(\frac{t-b}{a}\right)$, where $a > 0$ is the scale and $b$ is the translation

Continuous Wavelet Transform (CWT)

The continuous wavelet transform measures the "similarity" between the signal and the basis functions: $\mathrm{CWT}_x(a, b) = \int x(t)\, h_{a,b}^{*}(t)\, dt = \frac{1}{\sqrt{a}} \int x(t)\, h^{*}\!\left(\frac{t-b}{a}\right) dt$
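For illustration, the CWT coefficients of a short series can be computed with the PyWavelets library; the library choice, the Morlet wavelet, and the scale range are assumptions for this sketch, not choices stated in the slides.

```python
# A minimal sketch of computing CWT coefficients with PyWavelets.
# The Morlet wavelet and scales 1..32 are illustrative assumptions.
import numpy as np
import pywt

T = 64
x = np.sin(np.linspace(0, 8 * np.pi, T)) + 0.1 * np.random.randn(T)

scales = np.arange(1, 33)                    # scales a = 1, ..., 32
coeffs, freqs = pywt.cwt(x, scales, "morl")  # coeffs[i, j] ~ CWT(a_i, b_j)
print(coeffs.shape)                          # (32, 64): one row per scale
```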

(5)

Model Overview

[Figure: model pipeline. 1. Input time series; 2. Scalogram (wavelet transform); 3. CNN feature extraction; 4. Attention module $f_{\mathrm{att}}(\cdot\,; W)$ combining the CNN features with the LSTM hidden state; 5. Fusion & prediction of $\hat{x}_{T+\tau}$.]

Preprocessing

Given the input time series $\mathbf{x}$, we denote by $W$ the wavelet transform coefficient matrix, with one row per scale and one column per time step.

The scalogram is defined as the squared magnitude of the wavelet coefficients: $S(a, b) = |\mathrm{CWT}_x(a, b)|^2$

Source: Wavelet Tutorial by Robi Polikar, http://users.rowan.edu/~polikar/WTpart3.html
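Continuing the sketch above, the coefficients can be turned into a normalized scalogram image for the CNN; the 224×224 size is an assumption (a typical VGG input), not a size stated in the slides.

```python
# A sketch of converting CWT coefficients into a scalogram image.
# The 224x224 target size is an assumption (typical VGG input).
import numpy as np
import pywt
from PIL import Image

def to_scalogram_image(x, scales=np.arange(1, 33), wavelet="morl",
                       size=(224, 224)):
    coeffs, _ = pywt.cwt(x, scales, wavelet)
    s = np.abs(coeffs) ** 2                          # scalogram |W(a, b)|^2
    s = (s - s.min()) / (s.max() - s.min() + 1e-8)   # scale to [0, 1]
    img = Image.fromarray((s * 255).astype(np.uint8)).resize(size)
    return np.asarray(img, dtype=np.float32) / 255.0
```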

(6)

Model

[Figure: model architecture. The scalogram is fed to a VGG-style CNN, yielding local feature vectors $v_1, \dots, v_T$; an LSTM over the input series produces hidden states $h_1, \dots, h_T$; the AttentionNet scores each local feature against the last hidden state, producing weights $\alpha_1, \dots, \alpha_T$ and a context vector $c$; fusion & prediction combines them to output $\hat{x}_{T+\tau}$.]

(7)

Model

CNN: extract local time-frequency features

Feed the scalogram to a stack of convolutional layers (VGG-style), obtaining local feature vectors $v_1, \dots, v_T$.

LSTM: learn the global long-term trend and take the hidden state $h_T$ of the last step

Attention module: dynamically discriminate the importance of the local features

Given the local time-frequency features $v_1, \dots, v_T$ and the global hidden state $h_T$:

Attention score: $e_i = f_{\mathrm{att}}(v_i, h_T; W)$, normalized via softmax: $\alpha_i = \exp(e_i) \big/ \sum_{j=1}^{T} \exp(e_j)$

Weighted sum of local time-frequency features: $c = \sum_{i=1}^{T} \alpha_i v_i$

Fusion & Prediction: combine the local context $c$ and the global state $h_T$ to predict $\hat{x}_{T+\tau}$ (see the sketch below)

Objective Function

Squared Loss: $\mathcal{L} = \sum_{n} \big( \hat{x}_{T+\tau}^{(n)} - x_{T+\tau}^{(n)} \big)^2$
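A minimal PyTorch sketch of the pipeline above, as referenced from the Fusion & Prediction bullet; the layer sizes, the small conv stack, and the one-layer attention scorer are illustrative assumptions, not the paper's reported configuration.

```python
# A sketch of the CNN + LSTM + attention forecaster described above.
# Layer sizes and the exact form of f_att are illustrative assumptions.
import torch
import torch.nn as nn

class AttentiveWaveletNet(nn.Module):
    def __init__(self, hidden=64, feat=64):
        super().__init__()
        # CNN over the scalogram: collapse the scale axis, keep time axis.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, feat, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, None)),          # -> (B, feat, 1, T)
        )
        self.lstm = nn.LSTM(1, hidden, batch_first=True)  # global trend
        self.att = nn.Linear(feat + hidden, 1)        # e_i = f_att(v_i, h_T)
        self.out = nn.Linear(feat + hidden, 1)        # fusion & prediction

    def forward(self, scalogram, series):
        # scalogram: (B, 1, n_scales, T); series: (B, T, 1)
        v = self.cnn(scalogram).squeeze(2).transpose(1, 2)  # (B, T, feat)
        _, (h, _) = self.lstm(series)
        h_T = h[-1]                                         # (B, hidden)
        h_rep = h_T.unsqueeze(1).expand(-1, v.size(1), -1)  # (B, T, hidden)
        alpha = torch.softmax(self.att(torch.cat([v, h_rep], -1)), dim=1)
        c = (alpha * v).sum(dim=1)                          # context vector
        return self.out(torch.cat([c, h_T], -1)).squeeze(-1)
```

Training would minimize the squared loss, e.g. `nn.MSELoss()` between the network output and the true future value.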


(8)

Datasets

Stock opening prices

Collected from Yahoo! Finance.

Daily opening prices of 50 stocks across 10 sectors from 2007 to 2016.

Each stock has 2518 daily opening prices. Daily opening prices from 2007 to 2014 are used as training data, and those in 2015 and 2016 are used for validation and testing, respectively (a sketch of this split appears at the end of this slide).

Power consumption

Electric power consumption in one household over 4 years.

Sampled at a one-minute rate.

475,023 data points in year 2010.
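As referenced in the stock-data bullet above, the chronological split can be expressed in a few lines of pandas; the file name and column names are assumptions for this sketch.

```python
# A sketch of the chronological train/validation/test split described
# above. The CSV file name and column names are assumptions.
import pandas as pd

prices = pd.read_csv("stock_opening_prices.csv", parse_dates=["date"])

train = prices[prices["date"].dt.year <= 2014]  # 2007-2014: training
val = prices[prices["date"].dt.year == 2015]    # 2015: validation
test = prices[prices["date"].dt.year == 2016]   # 2016: testing
```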


(9)

Main Results


Metric

Mean Squared Error: $\mathrm{MSE} = \frac{1}{N} \sum_{i=1}^{N} \big( \hat{x}_i - x_i \big)^2$

Baselines

Naïve: take the last value in the series as the predicted value (see the sketch below)

Ensemble of LSTM & CNN: directly feed the concatenation of the VGGNet features and the last hidden state of the LSTM into the fusion & prediction module.
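For concreteness, the Naïve baseline and the MSE metric referenced above can be written in a few lines; this is a sketch, not the paper's evaluation code.

```python
# A sketch of the Naive baseline and the MSE metric described above.
import numpy as np

def naive_forecast(series):
    """Predict the next value as the last observed value."""
    return series[-1]

def mse(y_pred, y_true):
    y_pred, y_true = np.asarray(y_pred), np.asarray(y_true)
    return float(np.mean((y_pred - y_true) ** 2))
```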

(10)

Case Study


Illustration of attention mechanism

Given an input of 20 stock prices, we show the scalogram and the attention weights.

The model attends to the local features that are similar to the global trend, which helps in predicting the future value.
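A plotting sketch of such a visualization, assuming matplotlib; the figure layout is an assumption, not the paper's actual plot.

```python
# A sketch of visualizing a scalogram together with attention weights,
# in the spirit of the case study. Plot layout is an assumption.
import matplotlib.pyplot as plt

def plot_attention(scalogram, alpha):
    # scalogram: (n_scales, T) array; alpha: (T,) attention weights
    fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True, figsize=(8, 5))
    ax1.imshow(scalogram, aspect="auto", origin="lower")
    ax1.set_ylabel("scale")
    ax2.bar(range(len(alpha)), alpha)
    ax2.set_xlabel("time step")
    ax2.set_ylabel("attention weight")
    plt.tight_layout()
    plt.show()
```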

(11)

Conclusion

The wavelet transform is able to explicitly disclose the latent components at different frequencies in a complex time series.

We develop a novel attention-based neural network that leverages a CNN to extract local time-frequency features and simultaneously applies an LSTM to capture the long-term global trend.

The experimental results on two real-life datasets verify the usefulness of the time-frequency information in wavelet-transformed time series and the effectiveness of our method in terms of prediction accuracy.


(12)

THANK YOU!

Q&A
