I am trying to add a neuron layer to my model that uses tf.keras.activations.relu() with max_value = 1 as its activation function. When I try doing it roughly like this (the layer size below is just an example):
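```python
import tensorflow as tf

# reconstructed example of the failing attempt; the layer size is just a placeholder
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(
    64,
    # relu() is called here with only max_value, so it never receives its input x
    activation=tf.keras.activations.relu(max_value=1),
))
```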
It gives me the following error:
TypeError: relu() missing 1 required positional argument: 'x'
I don’t have any x to give it, obviously, as I am just trying to set up a neuron layer. Is there a way to customize these activation functions properly?
You can try this
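A minimal sketch of one way to do it, wrapping the call in a lambda so the input tensor x is supplied only when the layer is actually called (the Dense size and input shape below are just placeholders):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),  # placeholder input shape
    tf.keras.layers.Dense(
        64,
        # the lambda defers the relu() call until the layer receives its input x
        activation=lambda x: tf.keras.activations.relu(x, max_value=1.0),
    ),
])
model.summary()
```

Alternatively, tf.keras.layers.ReLU(max_value=1.0) can be added as its own layer after a linear Dense layer to get the same capped activation.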
Answered By – Toan Tran