# Multivariate LSTM with missing values

## Issue

I am working on a Time Series Forecasting problem using LSTM.
The input contains several features, so I am using a Multivariate LSTM.
The problem is that there are some missing values, for example:

```
     Feature 1    Feature 2    ...    Feature n
1    2            4                   nan
2    5            8                   10
3    8            8                   5
4    nan          7                   7
5    6            nan                 12
```

I would rather not interpolate the missing values, because there are often many consecutive timestamps with missing values in the same feature, and interpolation can introduce bias into the results. Is there a way to let the LSTM learn from the data despite the missing values, for example using a masking layer or something similar? What would be the best approach to this problem?
I am using Tensorflow and Keras.

## Solution

As suggested by François Chollet (creator of Keras) in his book, one way to handle missing values is to replace them with zero:

> In general, with neural networks, it’s safe to input missing values as 0, with the condition that 0 isn’t already a meaningful value. The network will learn from exposure to the data that the value 0 means missing data and will start ignoring the value. Note that if you’re expecting missing values in the test data, but the network was trained on data without any missing values, the network won’t have learned to ignore missing values! In this situation, you should artificially generate training samples with missing entries: copy some training samples several times, and drop some of the features that you expect are likely to be missing in the test data.
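The augmentation idea in the last sentence of the quote can be sketched roughly as follows; the array shapes, the number of copies, and the 20% drop rate are illustrative choices, not values from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# A clean training batch with no missing values:
# shape (samples, timesteps, features). Values are scaled to [1, 2]
# so that 0 is free to act as the "missing" sentinel.
x_train = rng.uniform(1.0, 2.0, size=(8, 5, 3)).astype("float32")

# Copy each sample several times and randomly replace ~20% of the
# feature values with the sentinel 0, so the network also sees
# "missing" entries during training.
copies = 3
x_aug = np.repeat(x_train, copies, axis=0)
drop = rng.random(x_aug.shape) < 0.2
x_aug[drop] = 0.0
```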

So you can assign zero to `NaN` elements, provided that zero does not already occur in your data. For example, normalize the data to a range such as [1, 2] and then replace `NaN` elements with zero; or, alternatively, normalize all values to the range [0, 1] and use -1 instead of zero as the replacement value.
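As a minimal sketch of the first variant, using per-feature min-max scaling to [1, 2] (the toy array below mirrors the table in the question):

```python
import numpy as np

# Toy data with missing values (shape: timesteps x features).
x = np.array([[2.0,    4.0,    np.nan],
              [5.0,    8.0,    10.0],
              [8.0,    8.0,    5.0],
              [np.nan, 7.0,    7.0],
              [6.0,    np.nan, 12.0]])

# Min-max scale each feature to [1, 2], ignoring NaNs, so that 0
# is guaranteed not to collide with any real value.
col_min = np.nanmin(x, axis=0)
col_max = np.nanmax(x, axis=0)
x_scaled = 1.0 + (x - col_min) / (col_max - col_min)

# Replace the remaining NaNs with the sentinel value 0.
x_scaled = np.nan_to_num(x_scaled, nan=0.0)
```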

Another alternative is to use a `Masking` layer in Keras. You give it a mask value, say 0, and it drops any timestep (i.e. row) whose features are all equal to the mask value. However, all subsequent layers must support masking, and you also need to pre-process your data so that every feature of a timestep containing one or more `NaN` values is set to the mask value. Example from the Keras documentation:

> Consider a Numpy data array `x` of shape `(samples, timesteps, features)`, to be fed to an LSTM layer. You want to mask timestep #3 and #5 because you lack data for these timesteps. You can:
>
> - set `x[:, 3, :] = 0.` and `x[:, 5, :] = 0.`
> - insert a `Masking` layer with `mask_value=0.` before the `LSTM` layer:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Masking, LSTM

model = Sequential()
model.add(Masking(mask_value=0., input_shape=(timesteps, features)))
model.add(LSTM(32))
```
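The pre-processing step mentioned above, where any timestep containing at least one `NaN` is zeroed out entirely so the `Masking` layer skips it, might look like this (a sketch; the shapes and the choice of which entries to blank are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Batch of sequences: (samples, timesteps, features), values in [1, 2)
# so that 0 is free to serve as the mask value.
x = rng.uniform(1.0, 2.0, size=(4, 6, 3)).astype("float32")
x[:, 3, 1] = np.nan  # simulate one missing feature at timestep 3
x[:, 5, :] = np.nan  # and a fully missing timestep 5

# Zero out every timestep that contains at least one NaN, so a
# Masking(mask_value=0.) layer will drop it entirely.
nan_rows = np.isnan(x).any(axis=-1)  # shape (samples, timesteps)
x[nan_rows] = 0.0
```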