I figured it out. Here is an example of a signal y and how to calculate its differential entropy (Hent) and its negentropy (J). You may get the values for y from a file, as in your code, where you read them from a spreadsheet. In this case, I generate the signal y by calling the uniform random number generator, rand().
N=1e5;                                 % number of samples (any large value works)
y=sqrt(12)*(rand(1,N)-.5);             % uniform noise with zero mean and unit standard deviation
h=histogram(y,'Normalization','pdf');  % histogram-based estimate of the pdf of y
Hent=-h.BinWidth*sum(h.Values(h.Values>0).*log(h.Values(h.Values>0))/log(2));  % differential entropy in bits
J=log(std(y))/log(2)+2.0471-Hent;      % negentropy: entropy of a Gaussian with the same std, minus Hent
fprintf('Uniform random noise: Hent=%.4f, J=%.4f\n',Hent,J);
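For reference, the constant 2.0471 in the negentropy line is 0.5*log2(2*pi*e), the differential entropy in bits of a Gaussian with unit standard deviation, so J is the entropy of a Gaussian with the same standard deviation as y minus the measured entropy; it is approximately zero for Gaussian data and positive otherwise. You can check the constant directly:

0.5*log2(2*pi*exp(1))   % returns 2.0471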
The attached document explains the rationale for the code above.
I am attaching a script which generates four signals: a square wave, a sine wave, uniform noise, and Gaussian noise, all with the same mean and standard deviation. The probability density of each one is displayed (below). It is evident from the plots that the density is most un-Gaussian at the top and gradually becomes Gaussian toward the bottom. The entropy and negentropy of each signal are computed using the formulas above, and the values are as expected: negentropy is largest for the top plot (most un-Gaussian) and smallest (approximately zero) for the bottom one.
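In case the attachment doesn't come through, here is a minimal sketch of what such a script can look like; the variable names, the 10-cycle frequency, and N are my choices, and the attached version may differ in the details:

N=1e5; t=(0:N-1)/N;
sigs={sign(sin(2*pi*10*t)), ...     % square wave, levels +/-1 -> std 1
    sqrt(2)*sin(2*pi*10*t), ...     % sine wave scaled to std 1
    sqrt(12)*(rand(1,N)-.5), ...    % uniform noise, std 1
    randn(1,N)};                    % Gaussian noise, std 1
names={'Square wave','Sine wave','Uniform noise','Gaussian noise'};
for k=1:4
    subplot(4,1,k);
    h=histogram(sigs{k},'Normalization','pdf');  % estimated pdf of signal k
    title(names{k});
    Hent=-h.BinWidth*sum(h.Values(h.Values>0).*log(h.Values(h.Values>0))/log(2));
    J=log(std(sigs{k}))/log(2)+2.0471-Hent;
    fprintf('%s: Hent=%.4f, J=%.4f\n',names{k},Hent,J);
end

Running it should print Hent and J for each signal, with J decreasing from the square wave down to the Gaussian noise, matching the plots.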