How do I change activation function parameters within Keras models?


I am trying to add a layer to my model whose activation function is tf.keras.activations.relu with max_value=1. When I try doing it like this:

model.add(tf.keras.layers.Dense(2,  activation=tf.keras.activations.relu(max_value=1)))

I get the following error:

TypeError: relu() missing 1 required positional argument: 'x'

I obviously don't have any x to give it, as I am just setting up the layer. Is there a proper way to customize these activation functions?


You can use the tf.keras.layers.ReLU layer instead. Unlike tf.keras.activations.relu, which is a plain function that needs a tensor x when called, ReLU is a layer object that accepts max_value at construction time:

model.add(tf.keras.layers.Dense(2, activation=tf.keras.layers.ReLU(max_value=1)))
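A minimal sketch of both options, assuming TensorFlow 2.x: using the ReLU layer as above, or binding max_value yourself with a small wrapper function so the tensor argument is deferred until the layer actually calls it (model/variable names here are illustrative):

```python
import tensorflow as tf

# Option 1: tf.keras.layers.ReLU is a layer object, so max_value is a
# constructor argument and no tensor is needed up front.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, input_shape=(4,),
                          activation=tf.keras.layers.ReLU(max_value=1)),
])

# Option 2: wrap the plain function so max_value is bound but the
# tensor argument `x` is supplied later, when the layer runs.
def clipped_relu(x):
    return tf.keras.activations.relu(x, max_value=1)

model2 = tf.keras.Sequential([
    tf.keras.layers.Dense(2, input_shape=(4,), activation=clipped_relu),
])

# The wrapper clips activations to the range [0, 1]:
out = clipped_relu(tf.constant([-2.0, 0.5, 3.0]))
```

Either way, negative inputs map to 0 and anything above 1 is capped at 1.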

Answered By – Toan Tran

