NotImplementedError: Cannot convert a symbolic Tensor (2nd_target:0) to a numpy array

Issue

I am trying to pass two loss functions to a model, since Keras allows that. The documentation says:

loss: String (name of objective function) or objective function or Loss instance. See losses. If the model has multiple outputs, you can use a different loss on each output by passing a dictionary or a list of losses. The loss value that will be minimized by the model will then be the sum of all individual losses.
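
For reference, a minimal sketch of that list-of-losses usage on a two-output model might look like the following; the layer names, losses, optimizer, and random data are illustrative assumptions, not taken from the original post:

import numpy as np
from tensorflow import keras

# Toy two-output model; names and shapes are only for illustration.
inp = keras.Input(shape=(10,))
out_a = keras.layers.Dense(1, name="out_a")(inp)
out_b = keras.layers.Dense(1, name="out_b")(inp)
model = keras.Model(inp, [out_a, out_b])

# One loss per output; the value minimized is the sum of the individual losses.
model.compile(optimizer="adam", loss=["mse", "mae"])

x = np.random.rand(32, 10).astype("float32")
y_a = np.random.rand(32, 1).astype("float32")
y_b = np.random.rand(32, 1).astype("float32")
model.fit(x, [y_a, y_b], epochs=1, verbose=0)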

The two loss functions:

def l_2nd(beta):
    def loss_2nd(y_true, y_pred):
        ...
        return K.mean(t)

    return loss_2nd

and

def l_1st(alpha):
    def loss_1st(y_true, y_pred):
        ...
        return alpha * 2 * tf.linalg.trace(tf.matmul(tf.matmul(Y, L, transpose_a=True), Y)) / batch_size

    return loss_1st

Then I build the model:

l2 = K.eval(l_2nd(self.beta))
l1 = K.eval(l_1st(self.alpha))
self.model.compile(opt, [l2, l1])

When I train, it produces the error:

1.15.0-rc3
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/resource_variable_ops.py:1630: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating: If using Keras pass *_constraint arguments to layers.

NotImplementedError                       Traceback (most recent call last)
in <module>()
     47         create_using=nx.DiGraph(), nodetype=None, data=[('weight', int)])
     48 
---> 49 model = SDNE(G, hidden_size=[256, 128],)
     50 model.train(batch_size=100, epochs=40, verbose=2)
     51 embeddings = model.get_embeddings()

10 frames
in __init__(self, graph, hidden_size, alpha, beta, nu1, nu2)
     72         self.A, self.L = self._create_A_L(
     73             self.graph, self.node2idx)  # Adj Matrix, L Matrix
---> 74         self.reset_model()
     75         self.inputs = [self.A, self.L]
     76         self._embeddings = {}

in reset_model(self, opt)
---> 84         self.model.compile(opt, loss=[l2, l1])
     85         self.get_embeddings()
     86 

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
    455     self._self_setattr_tracking = False  # pylint: disable=protected-access
    456     try:
--> 457       result = method(self, *args, **kwargs)
    458     finally:
    459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

NotImplementedError: Cannot convert a symbolic Tensor (2nd_target:0) to a numpy array.

Please help, thanks!

Solution

I found the solution to this problem:

It happened because I mixed a symbolic tensor with a non-symbolic type, such as a NumPy array, inside the loss function. For example, it is NOT recommended to have something like this:

def my_mse_loss_b(b):
    def mseb(y_true, y_pred):
        ...
        a = np.ones_like(y_true)  # a NumPy array here is not recommended
        return K.mean(K.square(y_pred - y_true)) + a
    return mseb

Instead, keep everything as symbolic tensors, like this:

def my_mse_loss_b(b):
    def mseb(y_true, y_pred):
        ...
        a = K.ones_like(y_true)  # use the Keras backend so everything stays symbolic
        return K.mean(K.square(y_pred - y_true)) + a
    return mseb
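
For completeness, here is a minimal, self-contained sketch of wiring such a loss factory into model.compile; the toy model, the random data, and the b=0.5 value are assumptions for illustration, not part of the original post:

import numpy as np
from tensorflow import keras
from tensorflow.keras import backend as K

def my_mse_loss_b(b):
    def mseb(y_true, y_pred):
        # b is unused here, mirroring the snippet above.
        a = K.ones_like(y_true)  # stays symbolic alongside y_true / y_pred
        return K.mean(K.square(y_pred - y_true)) + a
    return mseb

# Toy single-output model trained on random data, purely to show the wiring.
model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="adam", loss=my_mse_loss_b(0.5))
model.fit(np.random.rand(16, 4).astype("float32"),
          np.random.rand(16, 1).astype("float32"),
          epochs=1, verbose=0)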

Hope this helps!

Answered By – T D Nguyen

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
