FIR NNs and Time Series Prediction: Applications to Stock Market Forecasting

Antonio d'Acierno, Salvatore Palma and Walter Ripullone
I.R.S.I.P. - C.N.R., via P. Castellino 111, 80131 Napoli, Italy

Abstract

This paper deals with the time series prediction problem, that is, the prediction of a system's future evolution given a certain knowledge about it and its past behavior. The neural network based approach offers a new and very promising concept of non-linear prediction. The use of FIR neural networks and of the Temporal Back-Propagation learning algorithm is proposed for Stock Market forecasting; results on real-world series from the "Milano Piazza Affari" Stock Exchange are shown and commented.

Keywords: Time Series Forecasting, FIR NNs, Temporal BP.

1 Introduction

The time series prediction problem can be simply stated as follows: given a sequence $y(1), y(2), \ldots, y(N)$ up to time $N$, find the continuation $y(N+1), y(N+2), \ldots$ The series may arise either from discrete time systems or from the sampling of a continuous system, and may be either pseudo-stochastic or deterministic in origin. Applications of prediction in signal processing range from adaptive line enhancers to differential pulse code modulation schemes for telecommunications. Prediction is also used in modelling turbulence and solar flux, forecasting the weather and managing stock market portfolios. The standard prediction approach involves constructing an underlying model which gives rise to the observed sequence. In this sense (i.e. prediction as system identification) one is usually interested in estimating a single time step and, moreover, there typically exists an exogenous input which drives the system. When this solution cannot be applied since, for example, the underlying model is unknown, or it does not perform satisfactorily, or one is interested in estimating several time steps, different approaches such as radial basis function networks and neural networks can be used. Among neural network based methods, we can use standard algorithms or neural paradigms specifically proposed for the problem at hand.

Corresponding author: [email protected]

[Figure 1 diagram: the input $x(n)$ feeds a tapped delay line of unit delays $q^{-1}$; the delayed samples $x(n), x(n-1), x(n-2), \ldots, x(n-M)$ are weighted by $w(0), w(1), w(2), \ldots, w(M)$ and summed to give $s(n)$.]

Figure 1: A simple connection in a FIR neural network.

In this paper we use a neural network paradigm specifically devised for time series prediction; we apply such a model to real-world series from the "Milano Piazza Affari" Stock Exchange and we show results for medium term forecasting.

2 FIR NNs and Temporal Back-Propagation

FIR neural networks [1] have been proposed for time series prediction; these networks are classical multilayer perceptrons whose connections are Finite-duration Impulse Response (FIR) filters (fig. 1). Let $w^l_{ji}(k)$ denote the weight that connects neuron $i$ in layer $l-1$ to neuron $j$ in layer $l$ (the index $k$ ranges from 0 to $M_l$, where $M_l$ is the total number of delay units of the FIR filters between layers $l-1$ and $l$). The signal $s^l_{ji}(n)$ appearing at the output of the $i$-th incoming synapse of neuron $j$ is given by the linear combination of delayed values of the input signal $x^{l-1}_i(n)$ (convolution sum):

$$s^l_{ji}(n) = \sum_{k=0}^{M_l} w^l_{ji}(k) \cdot x^{l-1}_i(n-k) \qquad (1)$$
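For a fixed time step, the convolution sum (1) is simply a dot product between the filter coefficients and the window of current and past inputs. A minimal NumPy sketch (the variable names are our own, not the paper's):

```python
import numpy as np

def fir_synapse(x_hist, w):
    """Output of one FIR synapse at time n (eq. 1):
    s(n) = sum_{k=0}^{M} w(k) * x(n - k).
    x_hist holds the current and past inputs [x(n), x(n-1), ..., x(n-M)]."""
    return float(np.dot(w, x_hist))

# M = 2 delay taps, hence 3 coefficients w(0), w(1), w(2)
w = np.array([0.5, 0.3, 0.2])
x_hist = np.array([1.0, 2.0, 4.0])   # x(n)=1, x(n-1)=2, x(n-2)=4
print(fir_synapse(x_hist, w))        # 0.5*1 + 0.3*2 + 0.2*4 = 1.9
```

With $M_l = 0$ the synapse degenerates to an ordinary scalar weight, recovering the standard multilayer perceptron as a special case.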

where $n$ denotes the discrete time step. Hence, summing the contributions of the complete set of $p$ synapses, we may describe the output $x^l_j$ of neuron $j$ by using the following equation:

$$x^l_j(n) = \varphi\left(\sum_{i=1}^{p} s^l_{ji}(n) - \theta_j\right) \qquad (2)$$

where $\theta_j$ denotes the externally applied threshold, and $\varphi(\cdot)$ denotes the non-linear activation function of the neuron. The classical Back-Propagation algorithm cannot be applied as it is since, by unfolding the network in time into its static equivalent, we obtain a structure with duplicated weights, so that (i) the necessary calculations expand and (ii) the locally distributed nature of the process is lost, since individual gradient terms must be recombined to obtain the total gradient for each unique filter coefficient.
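Equation (2) composes the synaptic filters of eq. (1) with a squashing non-linearity. The sketch below assumes a logistic activation, which the paper does not fix; it is an illustrative reading of eq. (2), not the authors' code:

```python
import numpy as np

def fir_neuron(x_hists, W, theta):
    """Output of a FIR neuron (eq. 2): each incoming synapse i applies its
    filter W[i] to its tap vector x_hists[i]; the filtered signals are
    summed, the threshold theta is subtracted, and the activation applied."""
    s = sum(float(np.dot(W[i], x_hists[i])) for i in range(len(W)))
    return 1.0 / (1.0 + np.exp(-(s - theta)))   # logistic phi (assumed)

W = [np.array([0.5, 0.5]), np.array([1.0, -1.0])]     # p = 2 synapses, M = 1
x_hists = [np.array([1.0, 1.0]), np.array([2.0, 1.0])]
print(fir_neuron(x_hists, W, theta=1.0))
```

Here each synapse contributes 1.0, so the neuron computes $\varphi(2.0 - 1.0)$, i.e. the logistic function evaluated at 1.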

To train the network, Wan proposed a generalised version of the standard Back-Propagation, for which he suggested the name Temporal Back-Propagation. The key concept is the following. Consider the FIR filter shown in fig. 1: what happens in the backward step of the standard Back-Propagation algorithm, when the error terms of, say, the last hidden layer have to be evaluated using the error terms of the output neurons and the backward connections? Wan suggested the concept of the reverse FIR filter, where delay units are replaced with advance units. The learning algorithm [1] can now be summarised as follows:

$$\Delta w^l_{ji}(n) = -\eta \cdot \delta^l_j(n) \cdot x^{l-1}_i(n) \qquad (3)$$

$$\delta^l_j(n) = -2 \cdot e_j(n) \cdot \varphi'(x^L_j(n)) \quad \text{for } l = L \qquad (4)$$

$$\delta^l_j(n) = \varphi'(x^l_j(n)) \cdot \sum_{m=1}^{N_{l+1}} \sum_{k=0}^{M_{l+1}} \delta^{l+1}_m(n+k) \cdot w^{l+1}_{mj}(k) \quad \text{for } 1 \le l \le L-1 \qquad (5)$$
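For the output layer, eqs. (3) and (4) amount to an ordinary delta-rule step applied tap by tap. The sketch below again assumes a logistic activation (with its derivative evaluated at the net input) and invented variable names; it illustrates a single-synapse update, not the authors' implementation:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def output_delta(target, out, net):
    """Output-layer error term (eq. 4): delta = -2 * e * phi'(.),
    with e = target - out and phi the assumed logistic function."""
    e = target - out
    return -2.0 * e * sigmoid(net) * (1.0 - sigmoid(net))

def update(w, delta, x_hist, eta):
    """Gradient step for one FIR synapse (eq. 3):
    w(k) <- w(k) - eta * delta * x(n - k) for every tap k."""
    return w - eta * delta * x_hist

w = np.array([0.5, 0.3])              # M = 1: taps w(0), w(1)
x_hist = np.array([1.0, 2.0])         # x(n), x(n-1)
net = float(np.dot(w, x_hist))
delta = output_delta(1.0, sigmoid(net), net)
w_new = update(w, delta, x_hist, eta=0.1)
print(w_new)
```

Since the target exceeds the output, $e_j(n) > 0$, the delta is negative and both taps move upward, as expected for gradient descent on the squared error. For hidden layers, eq. (5) propagates the deltas through the reverse (advance-unit) filters before the same update (3) is applied.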