
Logits and Labels Mismatch in TensorFlow

There is a mismatch between the logits and labels in TensorFlow after one-hot encoding, and my batch size is 256. How can I get the batch size into the labels Tensor as well?
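
For reference, a minimal sketch of the shapes that have to line up, assuming TF 1.x graph mode and a hypothetical 10-class problem (the original post only mentions a batch size of 256):

import tensorflow as tf

batch_size, n_classes = 256, 10

# Logits coming out of the model have shape (batch_size, n_classes).
logits = tf.placeholder(tf.float32, [batch_size, n_classes])

# The one-hot labels fed alongside these logits must share the same leading
# batch dimension; any other shape triggers the logits/labels mismatch.
labels = tf.placeholder(tf.float32, [batch_size, n_classes])

loss = tf.losses.softmax_cross_entropy(onehot_labels=labels, logits=logits)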

Solution 1:

The issue was with tf.one_hot(le.fit_transform(labels), n_classes).

This passes a Tensor where a numpy array was needed. After calling eval() on this Tensor, the issue is resolved.
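
As a minimal sketch of that fix, assuming TF 1.x graph mode and a scikit-learn LabelEncoder (both assumptions, not stated in the original post):

import numpy as np
import tensorflow as tf
from sklearn.preprocessing import LabelEncoder

labels = np.array(["cat", "dog", "cat", "bird"])   # hypothetical raw labels
n_classes = 3

le = LabelEncoder()
int_labels = le.fit_transform(labels)              # numpy array of class indices

# tf.one_hot returns a Tensor, not a numpy array.
one_hot_tensor = tf.one_hot(int_labels, n_classes)

with tf.Session() as sess:
    # Evaluating the Tensor inside a session yields the numpy array
    # that the rest of the training code expected.
    one_hot_labels = one_hot_tensor.eval()

print(one_hot_labels.shape)                        # (4, 3)

In eager/TF 2.x code, the same conversion would be one_hot_tensor.numpy() instead of eval().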

