**Question:**

I saw some examples of Autoencoders (on images) which use sigmoid as the output layer and `BinaryCrossentropy` as the loss function. The input to the Autoencoder is normalized, and the sigmoid outputs the value of each pixel of the image. I tried to evaluate the output of `BinaryCrossentropy` and I'm confused.

Assume for simplicity we have an image, and we run the Autoencoder and get two results: one result is close to the true value, and the second is the same as the true value:

```python
import numpy as np
import tensorflow as tf

bce = tf.keras.losses.BinaryCrossentropy()
```

As you can see, the second example (which is the same as the true value) gets a low loss value, but it's still not 0 or close to 0. It seems that using `BinaryCrossentropy` as the loss function won't give us the best results; we never get values close to zero. Will the best value be close to 0.5?

**Answer:**

From your description, you represent your image as a vector of dimension 4 and then compute binary cross entropy between `y_pred` and `y_true`. First of all, cross entropy $H(p, q) = -\sum_i p_i \log q_i$ compares a distribution $q$ relative to a distribution $p$, where in machine learning $q$ is typically the estimate and $p$ the ground truth. In this sense, $p, q$ are distributions, and we don't need them to be binary. However, notice that the minimum value of cross entropy is the entropy of the ground truth. Furthermore, the vectors representing your image do not sum to 1, so that minimum cannot be attained. So, in your application, if you do not have a binary target, you won't get zero `BinaryCrossentropy` even if your `y_pred` is identical to your `y_true`. In particular, when `y_pred` equals `y_true`, the binary cross entropy equals the average binary entropy of the target pixels, which is zero only when every pixel is exactly 0 or 1.
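The answer's claim can be checked numerically. Below is a minimal NumPy sketch of the quantity `BinaryCrossentropy` computes (the target values here are illustrative assumptions, since the question's actual arrays were not preserved): even with `y_pred` identical to `y_true`, the loss equals the average binary entropy of the target, which is nonzero unless every target value is exactly 0 or 1.

```python
import numpy as np

def bce(y_true, y_pred):
    """Element-averaged binary cross entropy (what Keras computes, up to clipping)."""
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Illustrative non-binary target: a 4-pixel "image" with values 0.99 (assumed values)
y_true = np.array([0.99, 0.99, 0.99, 0.99])
y_pred = y_true.copy()  # prediction identical to the target

loss = bce(y_true, y_pred)
# With y_pred == y_true, the loss is exactly the binary entropy of the target,
# which is > 0 because the target values are not exactly 0 or 1.
entropy = -np.mean(y_true * np.log(y_true) + (1 - y_true) * np.log(1 - y_true))
print(loss)  # nonzero, even though the prediction is perfect

# With a truly binary target, the loss can get arbitrarily close to zero:
b_true = np.array([1.0, 1.0, 0.0, 0.0])
b_pred = np.clip(b_true, 1e-7, 1 - 1e-7)  # clip to avoid log(0)
print(bce(b_true, b_pred))  # effectively zero
```

This is why an autoencoder trained with binary cross entropy on non-binary pixel intensities never reaches a loss of zero: the irreducible floor is the entropy of the targets themselves, not 0.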