Hacettepe Journal of Mathematics and Statistics, Volume 37(2) (2008), 185–200

A NEW ARCHITECTURE SELECTION STRATEGY IN SOLVING SEASONAL AUTOREGRESSIVE TIME SERIES BY ARTIFICIAL NEURAL NETWORKS
Cagdas Hakan Aladag∗, Erol Egrioglu† and Suleyman Gunay∗

Received 07:08:2008 : Accepted 31:10:2008
Abstract
The only suggestions given in the literature for determining the architecture of neural networks are based on observations, and a simulation study to determine the architecture has not yet been reported. Based on the results of the simulation study described in this paper, a new architecture selection strategy is proposed and shown to work well. It is noted that although in some studies the period of a seasonal time series has been taken as the number of inputs of the neural network model, it is found in this study that the period of a seasonal time series is not a parameter in determining the number of inputs.
Keywords: Architecture selection, Seasonal autoregressive time series, Neural networks, Forecasting, Simulation.

2000 AMS Classification: 62–04, 62M10, 82C32, 90C59.
1. Introduction
There are many studies in the literature that use artificial neural networks (ANN) to analyze time series. However, a simulation study does not seem to have been used to determine the architecture of an ANN. Due to the lack of theoretical knowledge about determining the architecture, empirical results are widely employed by researchers. Tang and Fishwick [13] suggested that the number of inputs can be taken as the number of terms of an autoregressive (AR) model. Lachtermacher and Fuller [8] claimed that when one neuron is used in the output layer, a number of inputs greater than one could affect the results in a negative way. Sharda and Patil [11] and Tang et al. [12] heuristically took the period of the time series as the number of inputs in the analysis of seasonal time series. Buhamra et al. [2] suggested that the number of inputs should be determined according to the Box-Jenkins approach. The method most often applied in determining the number of neurons in the hidden layer is the trial and error method given by Zhang [15]. In addition, when the number of inputs is n, Lippmann [9] and Hecht-Nielsen [5] took 2n + 1 as the number of neurons in the hidden layer, while Wong [14], Tang and Fishwick [13], and Kang [7] took it to be 2n, n and n/2, respectively.

∗ Department of Statistics, Hacettepe University, Ankara 06800, Turkey. E-mail: (C.H. Aladag) aladag@hacettepe.edu.tr, (S. Gunay) sgunay@hacettepe.edu.tr
† Department of Statistics, Ondokuz Mayis University, Samsun 55139, Turkey. E-mail: erole@omu.edu.tr

In this study, a simulation study has been performed to determine an architecture for analyzing seasonal autoregressive time series with different periods when one output and one hidden layer are used. A new selection strategy is proposed to decide the number of neurons in the input layer and in the hidden layer. In contrast to the trial and error method of Zhang [15], the proposed method saves time and attains an acceptable error level without trying all combinations of architectures. In addition, the error that can be caused by the random selection of architectures is avoided. Section 2 gives some brief information about ANN. Section 3 includes detailed information about the simulation study and its results. The new selection strategy is introduced in Section 4. Section 5 consists of a discussion of the new method and the results.
2. Artificial Neural Networks
“What is an artificial neural network?” is the first question that should be answered. Picton [10] answered this question by separating it into two parts. The first part is why it is called an artificial neural network. It is called an artificial neural network because it is a network of interconnected elements. These elements were inspired by studies of biological nervous systems. In other words, artificial neural networks are an attempt at creating machines that work in a similar way to the human brain, by building these machines using components that behave like biological neurons. The second part is what an artificial neural network does. The function of an artificial neural network is to produce an output pattern when presented with an input pattern. In forecasting, artificial neural networks are mathematical models that imitate biological neural networks. Artificial neural networks consist of several elements. Determining the elements of an artificial neural network is an issue that affects the forecasting performance of the network, and should be considered carefully. The elements of an artificial neural network are generally given as the network architecture, the learning algorithm and the activation function [4]. One critical decision is to determine the appropriate architecture, that is, the number of layers, the number of nodes in each layer and the number of arcs that interconnect the nodes [18]. However, in the literature, there are no general rules for determining the best architecture. Therefore, many architectures have to be tried to get correct results. There are various types of artificial neural network. One of these is known as the feed forward neural network. Feed forward neural networks have been used successfully in many studies [4]. In feed forward neural networks, there are no feedback connections. The broad feed forward neural network architecture that has a single hidden layer and a single output is illustrated below.

Determining the learning algorithm of an artificial neural network for a specific task is equivalent to finding the values of all the weights such that the desired output is generated by the corresponding input. Various training algorithms have been used for the determination of the optimal weight values. The most widely used training method is the back propagation algorithm. In the back propagation algorithm, learning consists of adjusting all the weights according to the measure of error between the desired output and the actual output [4].
[Figure: A broad feed forward neural network architecture]
Another element of an artificial neural network is the activation function. This determines the relationship between the inputs and outputs of the network. In general, the activation function introduces a degree of non-linearity that is valuable in most artificial neural network applications. The well-known activation functions are the logistic, hyperbolic tangent, sine (or cosine) and linear functions. Among these, the logistic activation function is the most popular [15].
3. The simulation study
The computer code, called NN-Back Propagation, given in [1], is employed in the simulation study. Using the expression below,

(1) Z_t = φ Z_{t−s} + a_t

ten time series, each with a length of 100, were generated with parameters s = 4, 6, 12 and φ = 0.5 for the first order seasonal autoregressive model (SAR(1)), where s and φ represent the period and the autoregressive parameter, respectively. Thus, the total number of time series was 30. In the literature, a small number of time series are employed for simulation studies in order to complete them in an acceptable time frame when ANN is the method used to analyze the time series. For example, Zhang et al. [16] used a total of 8 series, and a similar study was conducted with 5 time series by Hwarng [6]. In this study, the choice of a total of 30 time series is aimed at obtaining more reliable results. Each of the generated time series was analyzed with ANN. Then 95 percent of the whole data was taken as training data, and the remaining 5 percent was used as test data. The components of the ANN used in the simulation study are described below.
Architecture structure: For each case examined, the number of neurons in the input layer varied from 1 through 12, the number of neurons in the hidden layer likewise varied from 1 through 12, and there was one neuron in the output layer, so a total of 144 architectures were used for each time series. Hence a total of 4320 different architectures were examined for the 30 time series generated. The feed forward neural network architecture structure, which includes a direct link between the neurons in the input layer and the output neuron, was used.
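The grid of candidate architectures described above can be enumerated directly; the variable name is illustrative:

```python
# Enumerate the 12 x 12 grid of (inputs, hidden) pairs tried for each
# series; the output layer always has a single neuron.
architectures = [(inputs, hidden)
                 for inputs in range(1, 13)
                 for hidden in range(1, 13)]

print(len(architectures))        # 144 architectures per series
print(len(architectures) * 30)   # 4320 across the 30 series
```

This exhaustive grid is exactly what the proposed selection strategy aims to avoid having to search in full.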
The learning algorithm: The Back Propagation Algorithm, in which the learning parameter is updated at each iteration, was used to find the best values of the weights.
The activation function: The logistic function given below was used as the activation function.

(2) f(x) = (1 + exp(−γx))^{−1}
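Equation (2) translates directly into code; γ = 1 recovers the standard logistic function:

```python
import math

def logistic(x, gamma=1.0):
    """Logistic activation f(x) = (1 + exp(-gamma * x))^(-1),
    as in equation (2)."""
    return 1.0 / (1.0 + math.exp(-gamma * x))

print(logistic(0.0))  # 0.5
```

Because its output lies in (0, 1), series values are typically rescaled into this range before training, and γ controls how sharply the function saturates.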
[Figure 1. The graphs of calculated values based on the number of neurons in the hidden layer for time series with period 4]
