Why does Keras sequential model return multiple predictions per test sample?

Issue

I don’t work with Keras or TensorFlow very often, so I’m just trying to understand how they work. For example, this is a bit confusing: we generate some points of a sine curve and try to make predictions for the remaining points:

import numpy as np
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential

a = np.array([np.sin(i) for i in np.arange(0, 1000, 0.1)])   # sine values (10000 points)
b = np.arange(0, 1000, 0.1)                                  # the corresponding angles

x_train = a[:8000]
x_test = a[8000:]
y_train = b[:8000]
y_test = b[8000:]

model = Sequential(layers.Dense(20, activation='relu'))   # the whole model is a single Dense layer with 20 units
model.compile(optimizer='adam', loss='mean_squared_error')
model.fit(x=x_train, y=y_train, epochs=200, validation_split=0.2)

Now, if I generate predictions either by simply calling model(x_test) or by using the predict(x_test) method, the array I get back has shape (2000, 20).
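To make this concrete, here is roughly how I’m checking the shapes (preds is just a name used for illustration):

preds = model.predict(x_test)   # batched inference over the 2000 test samples
print(preds.shape)              # (2000, 20) - 20 columns, one per unit of the Dense(20) layer
print(model(x_test).shape)      # calling the model directly gives the same (2000, 20)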

Why is this happening? Why do I get multiple predictions? And how do I get just a 1-dimensional array of predictions?

Solution

That’s because the last (and only) layer of your model is a Dense layer with 20 ReLU-activated units, so at inference time it produces 20 output features for every input sample. To get a single prediction per sample, as you requested, add a Dense layer with 1 unit as the last layer; for a regression problem like this it usually needs no activation.

Try this:

import numpy as np
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential

a = np.array([np.sin(i) for i in np.arange(0, 1000, 0.1)])
b = np.arange(0, 1000, 0.1)

x_train = a[:8000]
x_test = a[8000:]
y_train = b[:8000]
y_test = b[8000:]

model = Sequential(
    [
        layers.Dense(20, activation='relu'),
        layers.Dense(1, activation=None)   # single output unit, no activation (regression)
    ])
model.compile(optimizer='adam', loss='mean_squared_error')
model.fit(x=x_train, y=y_train, epochs=2, validation_split=0.2)   # a couple of epochs is enough to check the shape

model.predict(x_test).shape
# (2000, 1)
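Note that predict still returns a 2-D array of shape (2000, 1). If you specifically want a 1-dimensional array of predictions, you can flatten the result with NumPy, for example (preds is again just an illustrative name):

preds = model.predict(x_test).ravel()   # or .flatten() / np.squeeze(...)
preds.shape
# (2000,)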

Answered By – M.Innat

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
