Forecasting time series with neural networks in R

February 10, 2017

I have been looking for a package to do time series modelling in R with neural networks for quite some time, with limited success. The only implementation I am aware of that takes care of autoregressive lags in a user-friendly way is the nnetar function in the forecast package, written by Rob Hyndman. In my view there is space for a more flexible implementation, so I decided to write a few functions for that purpose. For now these are included in the TStools package that is available on GitHub, but when I am happy with their performance and flexibility I will put them in a package of their own.

Here I will provide a quick overview of what is available right now. I plan to write a more detailed post about these functions when I get the time.

For this example I will model the AirPassengers time series available in R. I have kept the last 24 observations as a test set and will use the rest to fit the neural networks. Currently two types of neural network are available, both feed-forward: (i) multilayer perceptrons (function mlp); and (ii) extreme learning machines (function elm).
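For completeness, this is how such a split can be set up in base R (the names y, y.in, y.out and tst.n are just the conventions used in the rest of this post):

```r
# AirPassengers: monthly totals of international airline passengers,
# Jan 1949 - Dec 1960 (144 observations)
y <- AirPassengers
tst.n <- 24                                # hold out the last 24 months
y.in  <- window(y, end = c(1958, 12))      # fitting sample (120 observations)
y.out <- window(y, start = c(1959, 1))     # test set (24 observations)
```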

# Fit MLP
mlp.fit <- mlp(y.in)
print(mlp.fit)

This is the basic command to fit an MLP network to a time series. It will attempt to automatically specify the autoregressive inputs and any necessary pre-processing of the time series. With the pre-specified arguments it uses a single hidden layer with 5 nodes and trains 20 networks, which are combined into an ensemble forecast. You can override any of these settings. The output of print is a summary of the fitted network:

MLP fit with 5 hidden nodes and 20 repetitions.
Series modelled in differences: D1.
Univariate lags: (1,3,4,6,7,8,9,10,12)
Deterministic seasonal dummies included.
Forecast combined using the median operator.
MSE: 6.2011.

As you can see the function determined that level differences are needed to capture the trend. It also selected some autoregressive lags and decided to also use dummy variables for the seasonality. Using plot displays the architecture of the network (Fig. 1).

Fig. 1. Output of plot(mlp.fit).

The light red inputs represent the binary dummies used to code seasonality, while the grey ones are autoregressive lags. To produce forecasts you can type:

mlp.frc <- forecast(mlp.fit,h=tst.n)
plot(mlp.frc)

Fig. 2 shows the ensemble forecast, together with the forecasts of the individual neural networks. You can control the way that forecasts are combined (I recommend using the median or mode operators), as well as the size of the ensemble.

Fig. 2. Output of the plot function for the MLP forecasts.
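As a sketch of how these are controlled (comb and reps are the relevant mlp arguments in the current development code, and y.in is the fitting sample, so treat the exact names as provisional):

```r
# Larger ensemble of 50 networks, combined with the mode operator
mlp.fit2 <- mlp(y.in, reps = 50, comb = "mode")
mlp.frc2 <- forecast(mlp.fit2, h = tst.n)
plot(mlp.frc2)
```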

You can also let it choose the number of hidden nodes. There are various options for that, but all are computationally expensive (I plan to move the base code to CUDA at some point, so that computational cost stops being an issue).

# Fit MLP with automatic hidden layer specification
mlp2.fit <- mlp(y.in,hd.auto.type="valid",hd.max=10)

This will evaluate networks with 1 up to 10 hidden nodes and pick the size with the best validation-set MSE. You can also use cross-validation (if you have patience…). You can ask it to output the error for each size:

H.1  0.0083
H.2  0.0066
H.3  0.0065
H.4  0.0066
H.5  0.0071
H.6  0.0074
H.7  0.0061
H.8  0.0076
H.9  0.0083
H.10 0.0076
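Cross-validated selection follows the same pattern; a sketch (hd.auto.type and hd.max as in the current development code, y.in being the fitting sample):

```r
# Pick the number of hidden nodes by cross-validation
# instead of a single validation set
mlp.cv.fit <- mlp(y.in, hd.auto.type = "cv", hd.max = 10)
```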

There are a few experimental options for specifying various aspects of the neural networks, which are not fully documented, so it is probably best if you stay away from them for now!

ELMs work in pretty much the same way, although for these I have made the automatic specification of the hidden layer the default.

# Fit ELM
elm.fit <- elm(y.in)
print(elm.fit)

This gives the following network summary:

ELM fit with 100 hidden nodes and 20 repetitions.
Series modelled in differences: D1.
Univariate lags: (1,3,4,6,7,8,9,10,12)
Deterministic seasonal dummies included.
Forecast combined using the median operator.
Output weight estimation using: lasso.
MSE: 83.0044.

I appreciate that using 100 hidden nodes on such a short time series can make some people uneasy, but I am using a shrinkage estimator instead of conventional least squares to estimate the weights, which in fact eliminates most of the connections. This is apparent in the network architecture in Fig. 3. Only the nodes connected with the black lines to the output layer contribute to the forecasts. The remaining connection weights have been shrunk to zero.

Fig. 3. ELM network architecture.

Another nice thing about these functions is that you can call them from the thief package, which implements Temporal Hierarchies forecasting in R. You can do that in the following way:

# Use THieF
library(thief)
mlp.thief.frc <- thief(y.in,h=tst.n,forecastfunction=mlp.thief)

There is a similar function for using ELM networks: elm.thief.

Since I kept a test set for this simple example, I can benchmark the forecasts against exponential smoothing:

Method          MAE
MLP (5 nodes)   62.471
MLP (auto)      48.234
ELM             48.253
THieF-MLP       45.906
ETS             64.528
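The ETS benchmark comes from the forecast package, and the comparison itself is just the mean absolute error on the held-out observations, along these lines (y.in and y.out standing for the fitting sample and the 24 held-out observations):

```r
library(forecast)

# Exponential smoothing benchmark, fitted on the same training sample
ets.fit <- ets(y.in)
ets.frc <- forecast(ets.fit, h = tst.n)

# MAE on the held-out observations (same calculation for each method)
ets.mae <- mean(abs(y.out - ets.frc$mean))
```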

Temporal hierarchies, like MAPA, are great for making your forecasts more robust and often more accurate. However, with neural networks the additional computational cost is evident!

These functions are still in development, so the default values may change and there are a few experimental options that may give you good results or not!

30 thoughts on “Forecasting time series with neural networks in R”

  1. Dmitrii

    Hello Nikos, great post, thank you! Can I ask you some questions please:
    1. Do you know to what extent mlp{TStools} differs from mlp{RSNNS}, or do they essentially use a similar technique?
    2. Does mlp{TStools} need prior scaling of the input time series and/or exogenous regressors, or does it scale them automatically?
    3. Is there a maximum number of exogenous regressors that can be included?
    Thank you!

    1. Nikos Post author

      Hello! They are quite different, in that mlp{TStools} does all the preprocessing and model setup for time series forecasting automatically. So it takes care of scaling of the target and exogenous variables and differencing, it can do automatic selection of input lags and hidden nodes (unless otherwise specified), introduce seasonal dummy variables (if needed) and so on. The mlp{RSNNS} offers functionality to build and train a neural network, but you would have to do all the preprocessing manually. In some sense mlp{TStools} is built in the same philosophy as auto.arima, where the user does not have to worry about preprocessing the data or specifying the model details. There is no maximum number of regressors it can accommodate, but I should say that it is not very fast in training large neural networks (especially given that the output is ensemble-based by default). For that, GPU-based neural networks would be ideal, which I have not implemented in R yet.

      1. Dmitrii

        Thank you for reply, Nikos.
        I have recently read your article “Segmenting electrical load time series for forecasting? An empirical evaluation of daily UK load patterns”, where you wrote that you had trained your MLPs using Levenberg-Marquardt function and eventually you received quite low MAPEs. Is it something that is implemented within mlp{TStools}? And may I ask what R function did you use at that time for this analysis?
        In your research you came to the conclusion that forecasting the entire time series is better than forecasting decomposed series. Would you suggest using mlp{TStools} for forecasting consecutive electricity load with 100,000+ observations, or is it computationally unable to handle such a big dataset, so that it is worth using mlp{TStools} on decomposed time series instead?
        Thank you!

        1. Nikos Post author

          That analysis was done in MatLab. I do not expect that there will be too much difference due to the specific training algorithms.
          A time series with 100k+ observations will take a lot of time to train with most neural network implementations in R. mlp in TStools will most surely be slow… go-for-lunch-coffee-and-a-nice-walk slow. For such massive datasets you need very efficient implementations that make use of your GPU (assuming your graphics card is CUDA capable). Matlab allows that, and there are options for R as well, but these essentially connect to external implementations (e.g. Java).
          Unfortunately, neural networks are still not trivial to use and require the computer skills to put together a solution programmatically.

  2. Daniel

    Hello Nikos,

    Let’s see if you could give me a hand on this. I’m trying to build a prediction algorithm for mechanical failures. I have the data, and my doubt is about how to implement it. Is it a time series? I know I could use survival analysis, but that’s just statistics; I want to use ML, so I thought of using NNs, and that’s when I came across your article. What do you say?
    Thanks in advance 😉

    1. Nikos Post author

      Hi Daniel,
      Interesting question; as you suggest typically people model this as a survival analysis problem. What kind of variable do you have available? (and software restrictions?)

  3. Holger


    I like the forecast package, but I saw it is limited to ARIMA when you want to include exogenous variables. Can you include those here? Neural networks should be ideal for that problem…

    1. Nikos Post author

      You can, there is an xreg argument to help you do that. Just keep in mind the code is still in beta, hence not on CRAN yet!
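      A minimal sketch (x.mat is a hypothetical regressor matrix; it must cover both the fitting sample and the forecast horizon, so at least length(y.in) + tst.n rows):

```r
# x.mat: hypothetical matrix of explanatory variables,
# with length(y.in) + tst.n rows and one column per regressor
mlp.x.fit <- mlp(y.in, xreg = x.mat)
mlp.x.frc <- forecast(mlp.x.fit, h = tst.n, xreg = x.mat)
```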

  4. vivek

    Hi Nikos,

    Can we use the TStools or thief packages for intermittent data too?

    1. Nikos Post author

      Yes! You would use the tsintermittent and thief packages together, and for the thief forecast you would need to use the forecastfunction argument. For an example of how to do this, have a look at the smooth.thief function in TStools.

  5. sonia

    Hello Nikos,
    Hope you’re fine… I have a question about fitting a neural network with the forecast package for the sunspot.year data. I can’t understand how to specify a neural network of order 4x4x1… please reply soon… waiting for your response.

    1. Nikos Post author

      Hi Sonia, I think what you are looking for is the argument hd, which should be hd=c(4,4). The single output layer is implied. So you would write something along the lines: mlp(data,hd=c(4,4),…). Hope this helps!

  6. saarbaan

    hi sir ,
    I am a new learner of ANN models. In my study I am using R to forecast a time series of inflation data, but I do not know how to use an ANN model at a basic level. After reading the data into R and converting it to a time series, how do I bring in the ANN model for further processing?

    1. Nikos Post author

      Once the data is a time series, you could build a very basic model just by leaving everything on the default settings. So if your time series is in variable y, you could just write mlp(y). I do not expect this to give any good forecasts for inflation though! If you are new to ANNs, my suggestion would be to first familiarise yourself with how to use them on relatively simpler data!

      1. saarbaan

        Thanks a lot, sir, for your precious ideas and suggestions regarding my study.
        My study is on forecasting the inflation (CPI) rate using almost 50 years of annual data, and on comparing an ARIMA model with an ANN model.
        The ARIMA model is understandable in R; the problem is the ANN model, where I am a basic learner and its commands in R confuse me a lot.
        So, sir, kindly share any suggestion, book or paper where I can get some basic help.
        I will be thankful,

  7. eugene

    Hi nikos,
    I tried forecasting with the xreg component, to account for day-of-the-week seasonality. However, I encountered this error and I am trying to make some sense out of it. I have not encountered this problem before, although I have frequently been including xreg components in ARIMA modelling.
    Hope to get some advice! 🙂

    > mlp.frc <- forecast(mlp.fit, xreg=forecastmatrix, h=151)
    Error in, h = h, y = y, xreg = xreg, …) :
    Length of xreg must be longer that y + forecast horizon

    > dim(forecastmatrix)
    [1] 151 6

    As you can see I create this matrix to forecast 151 days in the future ~ 6 columns for the days of the week.

  8. manny

    What would this error imply? Can someone help

    Error in if ((decomposition == “multiplicative”) && (min(y) <= 0)) { :
    missing value where TRUE/FALSE needed

    1. Nikos Post author

      Hi Manny,
      Are you using the latest version? Use the one on GitHub.
      I think that bug is fixed there; if you still get the same error, please let me have the line of code you are trying to run.

      1. mannr

        Yes, I am using the latest package. I have also shared a data sample with you. I hope you can help me!!!

        1. Nikos Post author

          I think you are getting this because you are trying to run the networks on very short time series. I will program a slightly more understandable error message!

  9. Abdul

    Hi Niko,
    Thanks for the initiative.
    How do I include the xreg argument in forecast? It throws an error when I do it the exact same way as in the forecast package.
    I get the following error:

    Error in, h = h, y = y, xreg = xreg, …) :
    Length of xreg must be longer that y + forecast horizon.

    Thanks for your help!!

  10. sagar

    mlp.fit <- mlp(x)
    Error: could not find function "mlp"

    Every time it gives me the above error. I have installed all the required packages.

    1. Nikos Post author

      I cannot reproduce this error; I have used the package on multiple computers. Any additional information would be helpful!

  11. Pankaj Joshi

    Hello Professor,

    Hats off for your contributions.

    Is it possible to use an existing model (from a previous call to elm) with new time series in the elm() function?

    Thank you very much,

    1. Nikos Post author

      It now is! It took me a while to program it in, but you can now use the arguments ‘model’ and ‘retrain’ to reuse a model with or without retraining the weights.
      This is already available on GitHub, and I will push it to CRAN as well.
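      A sketch of the idea (y.new being some hypothetical new series, elm.fit a previously fitted network):

```r
# Reuse a previously fitted network on a new series:
# retrain = FALSE keeps the old weights, retrain = TRUE re-estimates them
elm.new <- elm(y.new, model = elm.fit, retrain = FALSE)
frc.new <- forecast(elm.new, h = tst.n)
```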

