Hi There,
I trained a model (a classifier) to recognize handwritten digits (0-9). The model is not a CNN or any related type of network.
I tested the model's performance using accuracy, precision, and recall. Now I would like to do some error analysis using the cross-entropy error.
My question is: given the actual labels and the predicted output of the model, how do I find the cross entropy?
Example:
the target = [9 5 6 7 8 5 0 1]
the model output (predicted) = [5 5 9 7 8 5 0 1]
I want to use this information (only the target and the model output) to find the model's loss.
I tried the crossentropy example with dlarray given here: https://www.mathworks.com/help/deeplearning/ref/dlarray.crossentropy.html , but I don't think I am doing it the right way. Any suggestions?
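
In case it helps, here is a minimal sketch of how the crossentropy call from that page could be used. One caveat: cross entropy is defined on predicted probabilities, not on hard labels. If you only keep the predicted classes [5 5 9 7 8 5 0 1], every misclassified sample would contribute -log(0) = Inf, so you normally pass the per-class scores/probabilities your classifier produces. The variables scores and numClasses below are assumptions standing in for your own model output; the random scores are there only so the snippet runs on its own.

% the true labels from your example, digits 0-9
target = [9 5 6 7 8 5 0 1];
numClasses = 10;                        % assumption: 10 classes (digits 0-9)
numObs = numel(target);

% one-hot encode the true labels into a 10-by-8 matrix (class x observation)
T = zeros(numClasses, numObs);
T(sub2ind(size(T), target + 1, 1:numObs)) = 1;

% 'scores' should be your model's predicted probabilities (10-by-8, each
% column summing to 1). Random values are used here only so the sketch runs.
scores = rand(numClasses, numObs);
scores = scores ./ sum(scores, 1);

% 'C' = class (channel) dimension, 'B' = batch (observation) dimension
dlScores = dlarray(scores, 'CB');
loss = crossentropy(dlScores, T)        % cross-entropy loss over the batch

If your classifier really only gives hard labels and not probability scores, then cross entropy is not a meaningful measure for it, and accuracy/precision/recall (or a confusion matrix) are the appropriate error measures.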