
Forecasting in high order fuzzy times series by using neural networks to define fuzzy relations
Cagdas H. Aladag a,*, Murat A. Basaran b, Erol Egrioglu c, Ufuk Yolcu c, Vedide R. Uslu c

a Department of Statistics, University of Hacettepe, Ankara 06800, Turkey
b Department of Mathematics, University of Nigde, Nigde 51000, Turkey
c Department of Statistics, University of Ondokuz Mayis, Samsun 55139, Turkey
Keywords: Forecasting; Fuzzy relation; Fuzzy set; High order fuzzy time series; Neural networks
Abstract

In general, a given observation in a time series depends not only on the preceding one but also on earlier ones. Therefore, the high order fuzzy time series approach might obtain better forecasts than the first order fuzzy time series approach does. However, defining the fuzzy relation in the high order approach is more complicated than in the first order approach. A new approach, which uses feed forward neural networks to define the fuzzy relation in high order fuzzy time series, is introduced in this paper. The proposed approach is applied to the well-known enrollment data of the University of Alabama, and the obtained results are compared with other methods proposed in the literature. It is found that the proposed method produces better forecasts than the other methods.
2008 Elsevier Ltd. All rights reserved.
1. Introduction
In recent years, the fuzzy time series approach introduced by Song and Chissom (1993a, 1993b) has been used widely. Chen (1996) proposed a method for forecasting fuzzy time series which is simpler than that of Song and Chissom (1993a, 1993b). The method proposed by Chen (1996) does not include complex matrix operations in defining the fuzzy relation. Huarng and Yu (2006) use a simple feed forward neural network to define the fuzzy relation. Because they implement the first order fuzzy time series approach, their proposed method includes a simple feed forward neural network model which has one input neuron, two hidden layer neurons, and one output neuron. So as not to lose the generalization ability of the neural network model, Huarng and Yu used two neurons in the hidden layer.

Hwang, Chen, and Lee (1998) and Chen (2002) used high order fuzzy time series models. Chen's (2002) model consists of defining the fuzzy relation based on previous observations. The implementation of Chen's approach becomes more difficult as the order of the fuzzy time series increases. However, neural networks can easily be used for high order fuzzy time series. In this study, feed forward neural networks are employed to define the fuzzy relation by trying various architectures for high order fuzzy time series. The proposed approach based on neural networks is applied to the well-known enrollment data of the University of Alabama. The obtained results are compared with those of other methods, and it is clearly seen that the proposed method has better forecasting accuracy than other methods proposed in the literature.

Section 2 includes the definitions of first and high order fuzzy time series. Brief information on neural networks is given in Section 3. The proposed method is introduced and the implementation results for the enrollment data are given in Sections 4 and 5, respectively. The final section is the conclusion.
2. Fuzzy time series
The definition of fuzzy time series was first introduced by Song and Chissom (1993a, 1993b). The fuzzy time series approach does not need the various theoretical assumptions required by conventional time series procedures. The most important advantages of fuzzy time series approaches are that they can work with a very small set of data and do not require the linearity assumption. Some general definitions of fuzzy time series are given as follows.

Let U be the universe of discourse, where U = {u_1, u_2, ..., u_b}. A fuzzy set A_i of U is defined as

A_i = f_{A_i}(u_1)/u_1 + f_{A_i}(u_2)/u_2 + ... + f_{A_i}(u_b)/u_b,

where f_{A_i} is the membership function of the fuzzy set A_i, f_{A_i}: U → [0, 1]. Here u_a is a generic element of the fuzzy set A_i, f_{A_i}(u_a) is the degree of belongingness of u_a to A_i, f_{A_i}(u_a) ∈ [0, 1], and 1 ≤ a ≤ b.
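The discrete fuzzy set above is simply a mapping from the elements of U to membership degrees in [0, 1]. A minimal sketch in Python (the element names and membership values are illustrative assumptions, not taken from the paper):

```python
# A fuzzy set A_i over U = {u_1, ..., u_b} represented as a dict of membership
# degrees, mirroring A_i = f_Ai(u_1)/u_1 + ... + f_Ai(u_b)/u_b.
def make_fuzzy_set(memberships):
    """memberships: {element: degree}; every degree must lie in [0, 1]."""
    assert all(0.0 <= d <= 1.0 for d in memberships.values())
    return dict(memberships)

def membership(fuzzy_set, element):
    """Degree of belongingness f_Ai(u_a); 0.0 for elements outside the support."""
    return fuzzy_set.get(element, 0.0)
```

For instance, for A_2 = 0.5/u_1 + 1.0/u_2 + 0.5/u_3, membership degrees are read off directly, and any element not listed belongs with degree 0.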
Definition 1 (Fuzzy time series). Let Y(t) (t = ..., 0, 1, 2, ...), a subset of the real numbers, be the universe of discourse on which fuzzy sets f_j(t) are defined. If F(t) is a collection of f_1(t), f_2(t), ..., then F(t) is called a fuzzy time series defined on Y(t).
0957-4174/$ - see front matter © 2008 Elsevier Ltd. All rights reserved. doi:10.1016/j.eswa.2008.04.001
* Corresponding author. E-mail address: chaladag@gmail.com (C.H. Aladag).
Expert Systems with Applications 36 (2009) 4228–4231. Contents lists available at ScienceDirect. Journal homepage: www.elsevier.com/locate/eswa
Definition 2. Fuzzy time series relationships assume that F(t) is caused only by F(t-1); then the relationship can be expressed as F(t) = F(t-1) * R(t, t-1), which is the fuzzy relationship between F(t) and F(t-1), where * represents an operator. To sum up, let F(t-1) = A_i and F(t) = A_j. The fuzzy logical relationship between F(t) and F(t-1) can then be denoted as A_i → A_j, where A_i refers to the left-hand side and A_j to the right-hand side of the fuzzy logical relationship. Furthermore, these fuzzy logical relationships can be grouped to establish different fuzzy relationships.
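The grouping of fuzzy logical relationships mentioned above amounts to collecting all right-hand sides that share a left-hand side. A small sketch (the index pairs are illustrative, not the paper's data):

```python
# Group fuzzy logical relationships A_i -> A_j by their left-hand side index i.
from collections import defaultdict

def group_flrs(relations):
    """relations: iterable of (lhs, rhs) index pairs; returns {lhs: [rhs, ...]}."""
    groups = defaultdict(list)
    for lhs, rhs in relations:
        groups[lhs].append(rhs)
    return dict(groups)
```

For example, the relationships A_1 → A_2, A_1 → A_3, and A_2 → A_2 are grouped as {1: [2, 3], 2: [2]}.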
Definition 3. Let F(t) be a fuzzy time series. If F(t) is caused by F(t-1), F(t-2), ..., F(t-m), then this fuzzy logical relationship is represented by

F(t-m), ..., F(t-2), F(t-1) → F(t),

and it is called the m-th order fuzzy time series forecasting model.
3. Artificial neural networks
'What is an artificial neural network?' is the first question that should be answered. Picton (1994) answered this question by separating it into two parts. The first part is why it is called an artificial neural network. It is called an artificial neural network because it is a network of interconnected elements. These elements were inspired by studies of biological nervous systems. In other words, artificial neural networks are an attempt at creating machines that work in a similar way to the human brain, by building these machines using components that behave like biological neurons. The second part is what an artificial neural network does. The function of an artificial neural network is to produce an output pattern when presented with an input pattern. In forecasting, artificial neural networks are mathematical models that imitate biological neural networks.

Artificial neural networks consist of several elements, and determining the elements that affect the forecasting performance of the network should be considered carefully. The elements of artificial neural networks are generally given as the network architecture, the learning algorithm, and the activation function. One critical decision is to determine the appropriate architecture, that is, the number of layers, the number of nodes in each layer, and the number of arcs which interconnect the nodes (Zurada, 1992). However, there are no general rules in the literature for determining the best architecture. Therefore, many architectures should be tried to obtain correct results. There are various types of artificial neural networks. One of them is the feed forward neural network, which has been used successfully in many studies. In feed forward neural networks, there are no feedback connections. Fig. 1 depicts a broad feed forward neural network architecture with a single hidden layer and a single output.

Learning of an artificial neural network for a specific task is equivalent to finding the values of all weights such that the desired output is generated by the corresponding input. Various training algorithms have been used to determine the optimal weight values. The most popularly used training method is the back propagation algorithm presented by Smith (2002). In the back propagation algorithm, learning consists of adjusting all weights considering the error measure between the desired output and the actual output (Cichocki & Unbehauen, 1993). Another element of artificial neural networks is the activation function. It determines the relationship between the inputs and outputs of a network. In general, the activation function introduces a degree of non-linearity that is valuable in most artificial neural network applications. The well-known activation functions are the logistic, hyperbolic tangent, sine (or cosine), and linear functions. Among them, the logistic activation function is the most popular one (Zhang, Patuwo, & Hu, 1998).

In the application, a feed forward neural network architecture which includes one hidden layer and one output is used to define the fuzzy relation. The back propagation learning algorithm is used to train the neural network models, and the logistic activation function is employed in all neurons.
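A network of the kind described in this section (one hidden layer, logistic activations in all neurons, back propagation on squared error) can be sketched in a few dozen lines. This is an illustrative implementation, not the authors' code; the weight initialization range, learning rate, and class name are assumptions:

```python
# Minimal sketch of a single-hidden-layer feed forward network with logistic
# activations, trained by back propagation (gradient descent on squared error).
import math
import random

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

class FeedForwardNet:
    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        # w_hidden[j]: weights from each input to hidden neuron j; last entry is bias
        self.w_hidden = [[rng.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
                         for _ in range(n_hidden)]
        # weights from each hidden neuron to the single output; last entry is bias
        self.w_out = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden + 1)]

    def forward(self, x):
        h = [logistic(sum(w[i] * xi for i, xi in enumerate(x)) + w[-1])
             for w in self.w_hidden]
        o = logistic(sum(w * hj for w, hj in zip(self.w_out, h)) + self.w_out[-1])
        return h, o

    def train_step(self, x, target, lr=0.5):
        h, o = self.forward(x)
        # output delta for squared error with a logistic output neuron
        delta_o = (target - o) * o * (1.0 - o)
        # hidden deltas, back-propagated through the hidden-to-output weights
        delta_h = [delta_o * self.w_out[j] * h[j] * (1.0 - h[j])
                   for j in range(len(h))]
        for j in range(len(h)):
            self.w_out[j] += lr * delta_o * h[j]
        self.w_out[-1] += lr * delta_o
        for j, w in enumerate(self.w_hidden):
            for i, xi in enumerate(x):
                w[i] += lr * delta_h[j] * xi
            w[-1] += lr * delta_h[j]
        return (target - o) ** 2
```

A 2-4-1 architecture, as used later in the paper, would be FeedForwardNet(2, 4); training repeatedly calls train_step on each (input, target) pair until the error stops improving.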
4. The proposed method
In order to construct the high order fuzzy time series model, various feed forward neural network architectures are employed to define the fuzzy relation. The stages of the proposed method based on neural networks are given below.

Stage 1. Define and partition the universe of discourse.
The universe of discourse for the observations, U = [starting, ending], is defined. After the length of intervals, l, is determined, U can be partitioned into equal-length intervals u_1, u_2, ..., u_b with corresponding midpoints m_1, m_2, ..., m_b, where

u_b = [starting + (b - 1)l, starting + bl],
m_b = [starting + (b - 1)l + starting + bl] / 2.

Stage 2. Define fuzzy sets.
Each linguistic observation, A_i, can be defined by the intervals u_1, u_2, ..., u_b:

A_i = f_{A_i}(u_1)/u_1 + f_{A_i}(u_2)/u_2 + ... + f_{A_i}(u_b)/u_b.

Stage 3. Fuzzify the observations.
For example, a datum is fuzzified to A_i if the maximal degree of membership of that datum is in A_i.

Stage 4. Establish the fuzzy relationship with a feed forward neural network.
An example for second order fuzzy time series is given to explain this stage more clearly. Because second order fuzzy time series are being dealt with, two inputs are employed in the neural network model, so the lagged variables F_{t-2} and F_{t-1} are obtained from the fuzzy time series F_t. These series are given in Table 1. The index numbers (i) of the A_i of the F_{t-2} and F_{t-1} series are taken as the input values for the neural network model; their titles are Input-1 and Input-2 in Table 1. Likewise, the index numbers of the A_i of the F_t series are taken as the target values, whose title is Target in Table 1. When the third observation is taken as an example, the input values for the learning sample [A_6, A_2] are 6 and 2, and the target value for this learning sample is 3.

Stage 5. Defuzzify the results.
The defuzzified forecasts are the middle points of the intervals which correspond to the fuzzy forecasts obtained by the neural networks in the previous stage.

Fig. 1. A broad feed forward neural network architecture.
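The data preparation in Stages 1 through 4 can be sketched as follows. This is an illustrative reading of the stages, not the authors' code; the function names and the handling of the upper boundary value are assumptions:

```python
# Stages 1-4 of the proposed method as plain data preparation.
def partition(starting, ending, l):
    """Stage 1: equal-length intervals u_1..u_b and their midpoints m_1..m_b."""
    b = int((ending - starting) / l)
    intervals = [(starting + (k - 1) * l, starting + k * l) for k in range(1, b + 1)]
    midpoints = [(lo + hi) / 2 for lo, hi in intervals]
    return intervals, midpoints

def fuzzify(value, intervals):
    """Stage 3: index i of the fuzzy set A_i whose interval contains the value."""
    for i, (lo, hi) in enumerate(intervals, start=1):
        if lo <= value < hi:
            return i
    return len(intervals)  # a value equal to 'ending' falls in the last interval

def lagged_samples(indices, order):
    """Stage 4: (inputs, target) index pairs for an m-th order model."""
    return [(indices[t - order:t], indices[t]) for t in range(order, len(indices))]
```

For the fuzzified series A_6, A_2, A_3, A_7, A_4, A_2 of Table 1, lagged_samples([6, 2, 3, 7, 4, 2], 2) reproduces the (Input-1, Input-2) → Target rows of the table; Stage 5 then maps each forecast index back to the midpoint of its interval.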
5. Application
The proposed method is applied to the enrollment data of the University of Alabama, which are shown in Table 2. First, second, third, and fourth order fuzzy time series models are used in the application. In the first stage of the proposed method, as Huarng (2001) did, the lengths of intervals are chosen as 200, 300, 400, 500, 600, 700, 800, 900, and 1000. After following Stages 2 and 3 of the algorithm given in Section 4, the number of neurons in the hidden layer is varied from 1 through 4 so as not to lose the generalization ability of the neural network model. Table 3 gives the results of the proposed method for first through fourth order fuzzy time series and the rounded mean square error (MSE) values. When Table 3 is examined, it is seen that the second order fuzzy time series model produces better results than those obtained from the other high order fuzzy time series models. Based on the proposed method, the best result is obtained from the second order fuzzy time series model in which the fuzzy relation is defined by a 2-4-1 neural network architecture and the length of interval is 200. Its MSE value is 78,073, which is the smallest one among the alternatives.

The MSE values of other methods available in the literature, such as Song and Chissom (1993a), Song and Chissom (1994), Sullivan and Woodall (1994), Chen (1996), Hwang et al. (1998), and Chen (2002), together with the MSE value of our proposed method, are given for comparison purposes in Table 4. The result of our proposed method has the smallest MSE value when compared with the other methods, so it can be said that the new proposed method produces better forecasts.
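Forecast accuracy throughout this section is measured by the mean square error. A one-line sketch of the criterion (the numbers in the example are illustrative, not the paper's forecasts):

```python
# Mean square error between an actual series and its forecasts.
def mse(actual, forecast):
    """MSE = average of squared residuals over equal-length series."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)
```

For example, mse([1, 2, 3], [1, 2, 5]) averages the squared residuals 0, 0, and 4, giving 4/3.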
6. Results and discussion
In general, a given observation in a time series depends not only on the preceding one but also on earlier ones. Therefore, the high order fuzzy time series approach might represent the fuzzy relation better than the first order fuzzy time series approach does. However, defining the fuzzy relation in the high order case is more difficult than in the first order case. In this study, we proposed a method to define the fuzzy relation by using neural networks for high order fuzzy time series. The proposed method is applied to the well-known enrollment data, and the obtained results are compared with those of other methods proposed in the literature. It is observed that the proposed method produces the lowest MSE value among all of them. It can be concluded that using neural networks to define the fuzzy relation can produce better forecasts for high order fuzzy time series.
Table 1
Notations for second order fuzzy time series

Observation no.  F_{t-2}  F_{t-1}  F_t   Input-1  Input-2  Target
1                –        –        A_6   –        –        –
2                –        A_6      A_2   –        –        –
3                A_6      A_2      A_3   6        2        3
4                A_2      A_3      A_7   2        3        7
5                A_3      A_7      A_4   3        7        4
6                A_7      A_4      A_2   7        4        2
Table 2
Enrollment data

Years  Actual   Years  Actual
1971   13,055   1982   15,433
1972   13,563   1983   15,497
1973   13,867   1984   15,145
1974   14,696   1985   15,163
1975   15,460   1986   15,984
1976   15,311   1987   16,859
1977   15,603   1988   18,150
1978   15,861   1989   18,970
1979   16,807   1990   19,328
1980   16,919   1991   19,337
1981   16,388   1992   18,876
Table 3
The results of the proposed method

Interval  Neurons in    MSE values
length    hidden layer  First order  Second order  Third order  Fourth order
200       1             288,659      197,413       764,552      1,174,929
300       1             293,983      227,093       301,715      1,216,368
400       1             299,602      221,413       278,531      1,221,962
500       1             377,040      310,943       242,315      1,289,190
600       1             355,897      270,833       282,721      1,319,485
700       1             516,602      420,843       307,600      927,535
800       1             475,659      394,693       509,394      1,187,540
900       1             429,297      285,593       368,094      1,196,301
1000      1             574,659      462,893       420,236      1,144,107
200       2             11,547,935   170,493       11,021,773   1,222,640
300       2             11,846,068   201,263       11,318,600   1,355,101
400       2             12,149,202   215,973       11,620,426   1,258,674
500       2             11,254,802   207,343       10,729,947   1,201,357
600       2             12,770,468   224,513       12,239,079   1,310,085
700       2             11,846,068   355,743       11,318,600   1,233,201
800       2             13,411,735   327,893       12,877,731   1,373,940
900       2             13,739,868   208,463       13,204,557   1,405,501
1000      2             12,770,468   364,193       12,239,079   942,662
200       3             176,545      10,479,673    369,015      1,392,885
300       3             188,011      10,763,903    392,063      1,128,468
400       3             202,040      11,053,133    508,594      1,112,585
500       3             217,421      207,343       459,368      1,163,079
600       3             225,783      11,646,593    515,394      1,526,485
700       3             334,335      367,013       355,126      1,012,779
800       3             302,554      269,093       571,457      1,688,429
900       3             344,868      253,913       460,652      1,360,001
1000      3             530,849      361,193       246,973      2,064,551
200       4             175,973      78,073        396,847      1,318,640
300       4             194,583      117,413       421,084      1,433,035
400       4             193,278      132,533       450,742      1,202,585
500       4             215,992      168,743       414,631      1,270,635
600       4             218,468      142,733       9,410,110    1,490,485
700       4             334,335      151,693       274,147      1,068,468
800       4             345,754      243,173       449,857      1,366,562
900       4             330,297      226,283       510,010      13,932,701
1000      4             522,754      387,193       246,973      12,917,551
Table 4
The comparison of the results

Method                       Order  MSE
Song and Chissom (1993a)     1      412,499
Song and Chissom (1994)      1      775,687
Sullivan and Woodall (1994)  1      386,055
Chen (1996)                  1      407,507
Hwang et al. (1998)          5      278,919
Chen (2002)                  3      86,694
Our proposed method          2      78,073
References

Chen, S. M. (1996). Forecasting enrollments based on fuzzy time-series. Fuzzy Sets and Systems, 81, 311–319.
Chen, S. M. (2002). Forecasting enrollments based on high order fuzzy time series. Cybernetics and Systems, 33, 1–16.
Cichocki, A., & Unbehauen, R. (1993). Neural networks for optimization and signal processing. New York: John Wiley & Sons.
Huarng, K. (2001). Effective length of intervals to improve forecasting in fuzzy time-series. Fuzzy Sets and Systems, 123, 387–394.
Huarng, K., & Yu, H.-K. (2006). The application of neural networks to forecast fuzzy time series. Physica A, 363, 481–491.
Hwang, J. R., Chen, S. M., & Lee, C. H. (1998). Handling forecasting problems using fuzzy time series. Fuzzy Sets and Systems, 100(2), 217–228.
Picton, P. D. (1994). Introduction to neural networks. Macmillan Press Ltd.
Smith, K. A. (2002). Neural networks in business: Techniques and applications. Hershey, PA: Idea Group Publishing.
Song, Q., & Chissom, B. S. (1993a). Fuzzy time series and its models. Fuzzy Sets and Systems, 54, 269–277.
Song, Q., & Chissom, B. S. (1993b). Forecasting enrollments with fuzzy time series – Part I. Fuzzy Sets and Systems, 54, 1–10.
Song, Q., & Chissom, B. S. (1994). Forecasting enrollments with fuzzy time series – Part II. Fuzzy Sets and Systems, 62(1), 1–8.
Sullivan, J., & Woodall, W. H. (1994). A comparison of fuzzy forecasting and Markov modeling. Fuzzy Sets and Systems, 64(3), 279–293.
Zhang, G., Patuwo, B. E., & Hu, Y. M. (1998). Forecasting with artificial neural networks: The state of the art. International Journal of Forecasting, 14, 35–62.
Zurada, J. M. (1992). Introduction to artificial neural systems. St. Paul: West Publishing.
