I am always fascinated by good programming techniques, and R contains a lot of
very good examples I have been learning from.
Since I am using some functions from the package "wmtsa", I thought I could
borrow the elegant example in the documentation of the function
"wavBestBasis" to compute the entropy of the wavelet transform
coefficients.
Below I have pasted the excerpt from the "wmtsa" on-line
documentation that implements the entropy calculation:
## define an entropy cost functional
"entropy" <- function(x) {
  iz <- which(x == 0)
  z <- -x^2 * log(x^2)   # unnormalized: operates on x^2 directly
  if (length(iz))
    z[iz] <- 0           # convention: 0 * log(0) contributes 0
  sum(z)
}
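As a quick sanity check of my own (not from the documentation), the
if-branch makes zero coefficients contribute nothing to the sum:

entropy(c(0, 1, 2))
# [1] -5.545177   (the zero entry yields 0 rather than NaN; -(4 * log(4)))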
To the best of my recollection, Shannon's entropy operates on probabilities,
which requires a normalization step: p_i = x_i^2 / sum(x^2), so that
sum(p_i) = 1, and H = -sum(p_i * log(p_i)).
I have written simple code of my own that implements Shannon's entropy:

# Shannon's entropy
# input: vector x (note: `pi` below masks R's built-in constant)
sum(x^2)               # total energy, the normalization constant
pi <- (x^2)/sum(x^2)   # probabilities: squared values over total energy
pi
-pi*log(pi)            # per-element contributions (NaN if any x is 0)
-sum(pi*log(pi))       # Shannon's entropy (natural logarithm)
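For reuse, here is a minimal sketch of the same computation wrapped as a
function; the name shannon_entropy is my own, and the zero handling mirrors
the 0 * log(0) = 0 convention of the wmtsa example:

shannon_entropy <- function(x) {
  p <- x^2 / sum(x^2)    # normalize energies to probabilities
  nz <- p > 0            # drop zeros: 0 * log(0) is taken as 0
  -sum(p[nz] * log(p[nz]))
}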
I have tested both entropy implementations on a simple case:
> x <- 1:10
> x
 [1]  1  2  3  4  5  6  7  8  9 10

# ENTROPY FROM WMTSA EXAMPLE
> entropy(x)
[1] -1552.495

# SHANNON ENTROPY
> sum(x^2)
[1] 385
> pi <- (x^2)/sum(x^2)
> pi
 [1] 0.002597403 0.010389610 0.023376623 0.041558442 0.064935065 0.093506494 0.127272727
 [8] 0.166233766 0.210389610 0.259740260
> -pi*log(pi)
 [1] 0.01546297 0.04744882 0.08780304 0.13218305 0.17755633 0.22158462 0.26236293 0.29828326
 [9] 0.32795410 0.35014887
> -sum(pi*log(pi))
[1] 1.920788
Needless to say, when I calculate the entropy of wavelet coefficients in
order to select the best wavelet family on the basis of the "entropy"
cost function, I get different results depending on which entropy
implementation I use.
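For what it is worth, a quick check of my own suggests the two are related
algebraically: writing E = sum(x^2) and p_i = x_i^2 / E, one gets
-sum(x^2 * log(x^2)) = E * (H - log(E)), where H is the normalized Shannon
entropy, so the unnormalized cost differs from H by an energy-dependent
scale and offset:

x <- 1:10
E <- sum(x^2)
H <- -sum((x^2 / E) * log(x^2 / E))
E * (H - log(E))   # -1552.495, matching entropy(x) above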
Your comments and thoughts are very welcome.