mnstn wrote:
> Hello All,
> My question is not directly related to R but rather about which statistical
> method I should look into for capturing the entropy of a data set as a
> number. In this figure http://www.twitpic.com/18sob5 are two data sets,
> blue and green (x-axis is time), that fluctuate between (-1, +1). Clearly,
> green has 4 jumps while blue has 1 (and maybe a small one?). Intuitively,
> green has more entropy than blue. Is there a robust statistical quantity
> that can capture their relative variability? Additionally, I am hoping this
> method will differentiate between two cases where both series spend 50% of
> the time in each of the states -1 and +1 but one has more jumps than the
> other. I am guessing the limits of that quantity are 0 (no change) and
> N-1 (for N time steps).
> Something like sum over t of |value(t) - value(t-1)| / 2? I am just
> thinking out loud here.
>
> I have about 200 such plots and I would like to arrange them in order of
> their entropy.
>
> Thanks and I sincerely apologize if you feel this is not the right place
> to ask this question.
> MoonStone
>
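The jump count you sketch at the end of your message can be computed
directly. A minimal R version, assuming each series is a numeric vector x
taking values near -1 and +1 (jump_count is just an illustrative name):

jump_count <- function(x) {
  s <- sign(x)              # snap noisy values to the -1/+1 states
  sum(abs(diff(s)) / 2)     # each -1 <-> +1 flip contributes exactly 1
}

## e.g., ordering your ~200 series, held as a list of vectors:
## ord <- order(sapply(series, jump_count), decreasing = TRUE)

For N time steps this ranges from 0 (no change) to N-1 (a flip at every
step), matching your guess.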
Google "information theory" and "entropy," and you'll
find lots of stuff. If
memory serves me, Henri Theil's book, "Economics and Information
Theory,"
has a good discussion of this kind of stuff.
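For a concrete entropy calculation, here is a rough R sketch (the names
shannon, state_entropy, and pair_entropy are mine). Note that the entropy
of the marginal state distribution ignores ordering, so it cannot separate
your two 50/50 cases; the entropy of consecutive pairs can, because it
differs between series with the same marginal distribution but different
switching behaviour:

shannon <- function(p) -sum(p[p > 0] * log2(p[p > 0]))

## entropy of the marginal state distribution (blind to ordering)
state_entropy <- function(x) shannon(prop.table(table(sign(x))))

## entropy of consecutive pairs; sensitive to how often the series switches
pair_entropy <- function(x) {
  s <- sign(x)
  shannon(prop.table(table(paste(head(s, -1), tail(s, -1)))))
}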
Marsh Feldman