COMPGROUPS.NET

### forecasting time series with neural network (basics)


```Hello,
I'm trying to learn the basics of neural networks, i.e. time series prediction.

My code is:
------------------
P = [1:40];            % input series
T = [5:44];            % target: the same series 4 steps ahead
net = newff(P,T,1);    % feedforward network with 1 hidden neuron
net.trainParam.epochs = 400;
net = train(net,P,T);
P = [1:60];
Y = sim(net,P);
------------------
With this basic code and a simple series I've gotten very good results when simulating the network on a [1:60] series to predict 4 steps ahead. However, when I increase the number of neurons the results are worse. How come? Aren't more neurons needed when the series is more complex? Also, should I bother using a network with more hidden layers when the input series is only 1-dimensional? I take it more hidden layers are used when the input has more dimensions?

When I use a series that includes a trend and seasonality, I'm having trouble getting satisfying results. Training stops after a few epochs due to 'Validation stop', so I've increased net.trainParam.max_fail. Also, sometimes 'Maximum MU reached' occurs, and it's not clear to me what is responsible for this, although a fix seems to be increasing net.trainParam.mu_max to some ridiculously high number.
```

```On Sep 7, 5:45 pm, "Damian Falk" <date...@gmail.com> wrote:
> Hello,
> I'm trying to learn the basics of neural networks, i.e. time series prediction.
>
> My code is:
> ------------------
> P = [1:40];
> T = [5:44];
> net = newff(P,T,1);
> net.trainParam.epochs = 400;
> net = train(net,P,T);
> P = [1:60];
> Y = sim(net,P);
> ------------------
> With this basic code and a simple series I've gotten very good results
> when simulating the network on a [1:60] series to predict 4 steps
> ahead. However, when I increase the number of neurons the results are
> worse. How come?
>
> When I use a series that includes a trend and seasonality, I'm having
> trouble getting satisfying results.

It's hard to comment when you refer to code and functions that
have not been included in the post.

Greg

```

```Greg Heath <heath@alumni.brown.edu> wrote in message <f284d799-54d6-4fea-9166-835986055cd7@u6g2000yqh.googlegroups.com>...
> It's hard to comment when you refer to code and functions that
> have not been included in the post.
>
> Greg

If validation stops seem to happen too soon, you can try Bayesian Regularization (net.trainFcn = 'trainbr'), which promotes generalization by preferring solutions with low weight and bias values (which will result in a smoother network function), as opposed to using validation stops.
```
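Mark's point, that preferring small weights yields a smoother fitted function, can be illustrated without the toolbox. The sketch below is plain NumPy (not trainbr itself, and all names here are illustrative): it fits the same flexible polynomial model by ordinary least squares and by an L2-penalized ("weight decay") solve, and the penalized weights come out much smaller in norm, which is the mechanism behind the smoother fit he describes.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(30)

# Degree-9 polynomial feature matrix -- flexible enough to overfit noise.
A = np.vander(x, 10)

# Ordinary least squares vs. L2-regularized solution:
#   w_ridge = (A'A + lam*I)^-1 A'y   -- penalizes large weights,
# analogous to the low weight/bias values trainbr prefers.
w_ols = np.linalg.lstsq(A, y, rcond=None)[0]
lam = 1e-2
w_ridge = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ y)

# The regularized weight vector has a much smaller norm, so the
# fitted curve wiggles less between samples and generalizes better.
norm_ols = np.linalg.norm(w_ols)
norm_ridge = np.linalg.norm(w_ridge)
```

The same trade-off applies to a network's weights and biases: penalizing their magnitude caps how sharply the model can bend to fit noise.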

```On Sep 30, 2:16 am, "Mark Hudson Beale" <matlabcentral....@mhbinc.com>
wrote:
> If validation stops seem to happen too soon, you can try Bayesian
> Regularization (net.trainFcn = 'trainbr'), which promotes
> generalization by preferring solutions with low weight and bias values
> (which will result in a smoother network function), as opposed to
> using validation stops.

Back to basics:

Why didn't you respond by posting the training, validation and
testing set equations for your more complicated series?

Greg
```

```> > > On Sep 7, 5:45 pm, "Damian Falk" <date...@gmail.com> wrote:
> > > > My code is:
> > > > ------------------
> > > > P = [1:40];
> > > > T = [5:44];
> > > > net = newff(P,T,1);
> > > > net.trainParam.epochs = 400;
> > > > net = train(net,P,T);
> > > > P = [1:60];
> > > > Y = sim(net,P);
> > > > ------------------
> > > > With this basic code and a simple series I've gotten very good results when simulating the network on a [1:60] series to predict 4 steps ahead. However, when I increase the number of neurons the results are worse. How come? Aren't more neurons needed when the series is more complex? Also, should I bother using a network with more hidden layers when the input series is only 1-dimensional? I take it more hidden layers are used when the input has more dimensions?

The code above doesn't define a time series, if that is the intent.  TRAIN and SIM represent time series as columns of a cell array, to distinguish them from multiple independent samples, which are columns of a matrix.

Also, NEWFF returns a static network with no predict-ahead capability.

Try the following if your toolbox version is before 2010b/NNT 7.0:

P = con2seq(1:40);   % convert row vectors to cell-array time series
T = con2seq(5:44);
numDelayStates = 4;  % tapped delay line holding 4 past input values
numHiddenNeurons = 1;
net = newfftd(P,T,0:numDelayStates,numHiddenNeurons);
net.trainParam.epochs = 400;
net = train(net,P,T);
Y = sim(net,P);

I don't suggest using the network with input P values up to 60.  The reason is that the training inputs range from 1 to 40, so the network is not likely to do well outside this range.  Neural networks tend to generalize, but not extrapolate.  (Occasionally they do for toy problems, but it's not something that can be counted on.)

If you have the latest toolbox, the recommended function is TIMEDELAYNET instead of NEWFFTD:

net = timedelaynet(0:numDelayStates,numHiddenNeurons);

With regard to validation: NEWFFTD and other networks prior to 2010b didn't support validation across time steps (only across samples), but TIMEDELAYNET does.
```
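Mark's warning that networks "generalize, but not extrapolate" is easy to demonstrate with any flexible fit, not just newff. A small NumPy sketch (illustrative only, no toolbox required): a high-degree polynomial fit to samples on [0, 1] tracks the target well inside that range but goes badly wrong outside it, just as a network trained on inputs 1:40 cannot be trusted on inputs up to 60.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 20)     # training inputs, like 1:40
y = np.sin(2 * np.pi * x)         # target function over the training range

coeffs = np.polyfit(x, y, 7)      # flexible fit on the training range

# Inside the training range the fit is accurate ...
err_in = abs(np.polyval(coeffs, 0.5) - np.sin(2 * np.pi * 0.5))

# ... but outside it (like simulating on inputs up to 60) it diverges:
# the high-order terms dominate and the error explodes.
err_out = abs(np.polyval(coeffs, 2.0) - np.sin(2 * np.pi * 2.0))
```

The fitted model is only constrained where it saw data; beyond that, nothing anchors it, which is exactly why simulating the trained network on [1:60] is unreliable.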

```On Sep 7, 5:45 pm, "Damian Falk" <date...@gmail.com> wrote:
> My code is:
> ------------------
> P = [1:40];
> T = [5:44];
> net = newff(P,T,1);
> net.trainParam.epochs = 400;
> net = train(net,P,T);
> P = [1:60];
> Y = sim(net,P);
> ------------------
> When I use a series that includes a trend and seasonality, I'm having
> trouble getting satisfying results.

In order to use newff for time series you have to define:
1. The delay between output and input (n)
2. The number of input values (m) causing each output value

For a SISO n-step-ahead predictor, think in terms
of the model

y(k+n) = f( y(k-m+1:k) ), k >= m >= 1, n >= 1.

However, in many cases it may be practical to
predict all n future values instead of just
the nth future value, i.e.,

y(k+1:k+n) = f( y(k-m+1:k) ).

This is convenient when predicted values are fed
back as inputs for future predictions.

In more complex cases additional input
information is used:

y(k+1:k+n) = f( x(k-m+1:k), y(k-m+1:k) )

and

y(k+1:k+n) = f( x(k-m+1:k+1), y(k-m+1:k) ).

In many problems both m and n have to be
determined from the data, given some requirement
for prediction accuracy. Auto- and cross-
correlation information is very valuable for
these decisions.

If y is a column vector, the input and target
matrices are

p = [ y(1:m)     y(2:m+1)     ... y(N-n-m+1:N-n) ];
t = [ y(m+1:m+n) y(m+2:m+n+1) ... y(N-n+1:N)     ];

with

size(p) = [ m  N-n-m+1 ]
size(t) = [ n  N-n-m+1 ]

Hope this helps.

Greg
```
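For readers without the toolbox, Greg's sliding-window construction of p and t can be sketched in NumPy. The helper below is illustrative (its name and layout are my own), but it builds exactly the column-of-windows matrices he writes out: m past values per input column, n future values per target column.

```python
import numpy as np

def make_windows(y, m, n):
    """Build input matrix p (m past values per column) and target matrix
    t (n future values per column) from a 1-D series y, mirroring
      p = [ y(1:m) ... y(N-n-m+1:N-n) ]
      t = [ y(m+1:m+n) ... y(N-n+1:N) ]."""
    y = np.asarray(y, dtype=float)
    N = len(y)
    cols = N - n - m + 1  # number of input/target column pairs
    p = np.column_stack([y[k:k + m] for k in range(cols)])
    t = np.column_stack([y[k + m:k + m + n] for k in range(cols)])
    return p, t

y = np.arange(1, 11)              # toy series 1..10 (N = 10)
p, t = make_windows(y, m=3, n=2)
# Shapes match size(p) = [m N-n-m+1] and size(t) = [n N-n-m+1]:
# p is 3x6, t is 2x6.
```

Each column pair (p(:,k), t(:,k)) is one training example: m consecutive observations as input and the next n observations as target, which is the layout newff expects for this formulation.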

5 Replies
536 Views
