similar to: discrete entropy is not rotation invariant?

Displaying 20 results from an estimated 10000 matches similar to: "discrete entropy is not rotation invariant?"

2017 May 28
2
Low random entropy
On 05/28/2017 04:24 AM, Tony Mountifield wrote: > In article <792718e8-f403-1dea-367d-977b157af82c at htt-consult.com>, > Robert Moskowitz <rgm at htt-consult.com> wrote: >> >> On 05/26/2017 08:35 PM, Leon Fauster wrote: >>>> Am 27.05.2017 um 01:09 schrieb Robert Moskowitz <rgm at htt-consult.com>: >>>> >>>> I am use to low
2012 Feb 13
1
entropy package: how to compute mutual information?
suppose I have two factor vectors:

  x <- as.factor(c("a","b","a","c","b","c"))
  y <- as.factor(c("b","a","a","c","c","b"))

I can compute their entropies:

  entropy(table(x))
  [1] 1.098612

using library(entropy), but it is not clear how to compute their mutual information
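Not part of the original post, but a minimal sketch of one way to get there with the same entropy() call, using the identity MI(X,Y) = H(X) + H(Y) - H(X,Y) on the joint contingency table:

  library(entropy)
  x <- as.factor(c("a","b","a","c","b","c"))
  y <- as.factor(c("b","a","a","c","c","b"))
  # plug-in entropies of the marginals and of the joint table
  H.x  <- entropy(table(x))
  H.y  <- entropy(table(y))
  H.xy <- entropy(table(x, y))
  # mutual information via MI = H(X) + H(Y) - H(X, Y)
  mi <- H.x + H.y - H.xy

The package also provides dedicated MI estimators (e.g. mi.plugin() on a two-way frequency table), if I recall its API correctly.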
2017 May 28
3
Low random entropy
On 05/26/2017 08:35 PM, Leon Fauster wrote: >> Am 27.05.2017 um 01:09 schrieb Robert Moskowitz <rgm at htt-consult.com>: >> >> I am use to low random entropy on my arm boards, not an intel. >> >> On my Lenovo x120e, >> >> cat /proc/sys/kernel/random/entropy_avail >> >> reports 3190 bits of entropy. >> >> On my armv7 with
2003 May 08
1
function to compute entropy
Maybe it's slightly off-topic, but can anybody help with computing entropy on a matrix of probabilities? Say we have a matrix of probabilities, A, 5x5, something like this:

     z
  x      0     1     2     3     4
  0  0.063 0.018 0.019 0.016 0.000
  1  0.011 0.162 0.040 0.042 0.003
  2  0.015 0.030 0.164 0.033 0.002
  3  0.012 0.035 0.036 0.159 0.002
  4  0.004 0.021 0.018 0.013 0.082

with sum(A) = 1. Can I
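The post is cut off here. As a sketch not taken from the thread, the Shannon entropy of such a matrix of probabilities is just -sum(p * log(p)) over its non-zero cells, e.g. in R:

  A <- matrix(c(0.063, 0.018, 0.019, 0.016, 0.000,
                0.011, 0.162, 0.040, 0.042, 0.003,
                0.015, 0.030, 0.164, 0.033, 0.002,
                0.012, 0.035, 0.036, 0.159, 0.002,
                0.004, 0.021, 0.018, 0.013, 0.082),
              nrow = 5, byrow = TRUE)
  # Shannon entropy H(A) = -sum p*log(p); zero cells contribute 0 by convention
  p <- A[A > 0]
  H <- -sum(p * log(p))    # natural log; use log2() for bits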
2017 May 26
3
Low random entropy
I am used to low random entropy on my arm boards, not an Intel. On my Lenovo x120e,

  cat /proc/sys/kernel/random/entropy_avail

reports 3190 bits of entropy. On my armv7 with CentOS 7 I would get 130 unless I installed rng-tools, and then I get ~1300. SSH into one and it drops back to 30! for a few minutes. Sigh. Anyway, on my new Zotac nano ad12 with an AMD E-1800 dual core, I am seeing 180.
2000 Apr 07
1
Question about compiled-in entropy gatherer
This oddity happened with test2:

  debug: Got 0.00 bytes of entropy from /usr/bin/who
  debug: Got 0.05 bytes of entropy from /usr/bin/last
  debug: Got 0.00 bytes of entropy from
  debug: Got 0.88 bytes of entropy from /usr/sbin/df
  debug: Got 0.00 bytes of entropy from /usr/sbin/df
  debug: Got 0.12 bytes of entropy from /usr/bin/vmstat
  debug: Got 0.00 bytes of entropy from /usr/bin/uptime

I've
2010 Mar 15
1
Help with calculating entropy of data
Hello All, My question is not directly related to R but rather about which statistical method I should look into for capturing the entropy of a data set as a number. In this figure http://www.twitpic.com/18sob5 are two data sets, blue and green (x-axis is time), that fluctuate between (-1,+1). Clearly, green has 4 jumps while blue has 1 (and some?). Intuitively, green has more entropy than blue. Is
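The post breaks off here. Not from the thread, but one common way to reduce such a series to a single entropy number is to discretize it into states and take the Shannon entropy of the empirical state (or state-change) distribution; a sketch with made-up data:

  # hypothetical series fluctuating between -1 and +1
  x <- c(rep(-1, 50), rep(1, 30), rep(-1, 20))
  # entropy of the empirical distribution of states (plug-in estimate)
  p <- table(x) / length(x)
  H.states <- -sum(p * log(p))
  # or: entropy of the jump/no-jump distribution, which distinguishes a
  # series with many transitions from one with few
  jumps <- diff(x) != 0
  q <- table(jumps) / length(jumps)
  H.jumps <- -sum(q * log(q))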
2008 Jan 10
1
Entropy/KL-Distance question
Dear R-Users, I have the CDF of a discrete probability distribution. I now observe a change in this CDF at one point. I would like to find a new CDF such that it has the shortest Kullback-Leibler distance to the original CDF and respects my new observation. Is there an existing package in R which will let me do this? Google searches based on entropy revealed nothing. Kind regards, Tolga
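Not from the thread, but the underlying math has a simple form: if the new observation pins the CDF at a single point k to a value c, the distribution minimizing KL(new || old) under that constraint just rescales the original probabilities on either side of k (the classic minimum-discrimination-information / I-projection result). A sketch with a made-up pmf q:

  q <- c(0.10, 0.20, 0.30, 0.25, 0.15)   # original pmf; original CDF is cumsum(q)
  k <- 2                                  # point at which the CDF changed
  c.new <- 0.45                           # newly observed CDF value at k
  Q.below <- sum(q[1:k])
  p <- q
  p[1:k] <- q[1:k] * c.new / Q.below                              # rescale mass up to k
  p[(k + 1):length(q)] <- q[(k + 1):length(q)] * (1 - c.new) / (1 - Q.below)
  F.new <- cumsum(p)                      # new CDF; F.new[k] equals c.new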
2002 Nov 14
3
[Bug 435] internal entropy gatherer
http://bugzilla.mindrot.org/show_bug.cgi?id=435

dtucker at zip.com.au changed:

           What            |Removed                    |Added
  ----------------------------------------------------------------------------
          Summary          |internal entropy gatherer  |internal entropy gatherer

  ------- Additional Comments From dtucker at zip.com.au 2002-11-15 00:21 -------

Which platform did you
2000 Jun 15
1
problem in entropy.c if no getrusage
entropy.c assumes RUSAGE_SELF and RUSAGE_CHILDREN

  *** entropy.c.orig    Thu Jun 15 13:57:28 2000
  --- entropy.c         Thu Jun 15 13:58:25 2000
  ***************
  *** 201,207 ****
  --- 201,209 ----
        total_entropy_estimate += stir_gettimeofday(1.0);
        total_entropy_estimate += stir_clock(0.2);
  + #ifdef HAVE_GETRUSAGE
        total_entropy_estimate += stir_rusage(RUSAGE_SELF, 2.0);
  + #endif
2001 Jun 07
2
Patch to enable multiple possible sources of entropy
I have a need to have the same OpenSSH binaries run on multiple machines which are administered by different people. That means on Solaris, for example, there will be some with /dev/random, some on which I can run prngd because they'll be installing my binaries as root, and some which will have neither because they will be only installed as non-root. Below is a patch to enable choosing all 3
2011 Jul 11
1
problem finding p-value for entropy in reldist package
Hi, I am using the reldist package and having problems determining the p-value for the entropy value from the reldist function. I am able to properly determine the entropy value, but cannot figure out what function to use to find the p-value. I have tried using rpy, rpluy (which provides p-values for the polarization values) and investigating the results from reldist(). Thus far, I cannot find the
2001 Feb 15
2
deviance vs entropy
Hello, The question looks simple. It's probably even stupid. But I spent several hours searching the Internet and downloaded tons of papers where deviance is mentioned and... haven't found an answer. Well, the use of entropy when I split some node of a classification tree is clear to me. The sense is clear, because entropy is a good old measure of how uniform a distribution is.
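The excerpt stops here. For context (not from the thread), the usual connection is that a node's multinomial deviance equals twice the node size times its entropy, so deviance-based and entropy-based splitting agree up to that scaling. A sketch with hypothetical class counts:

  counts <- c(30, 10, 5)                   # class counts in one tree node
  n <- sum(counts)
  p <- counts / n
  H <- -sum(p * log(p))                    # node entropy (natural log)
  D <- -2 * sum(counts * log(p))           # multinomial node deviance
  all.equal(D, 2 * n * H)                  # TRUE: deviance = 2 * n * entropy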
2017 May 29
2
Low random entropy
> Am 29.05.2017 um 05:46 schrieb Robert Moskowitz <rgm at htt-consult.com>: > > > > On 05/28/2017 06:57 PM, Rob Kampen wrote: >> On 28/05/17 23:56, Leon Fauster wrote: >>>> Am 28.05.2017 um 12:16 schrieb Robert Moskowitz <rgm at htt-consult.com>: >>>> >>>> >>>> >>>> On 05/28/2017 04:24 AM, Tony
2001 Nov 02
7
Entropy and DSA keys
I remember a discussion to the effect that using DSA keys in sshd increases the requirement for random bits available on the system... and that this requirement (was it a 128 bit random number per connection?) presents security problems on systems that don't have a decent source of entropy? Am I misinterpreting those discussions? We are having a problem deploying sshd (no prngd) where sshd
2008 Jul 08
1
calculation of entropy in R???
I want to calculate Shannon entropy, which is H1, H2, H3 ... up to H7. Is there any function or package in which I can find this entropy directly? If you have any information, please share it; I will be very thankful to you. Regards, ++++++++++++++++++++++++++++++++++++++++++++++ MUHAMMAD FAISAL Department of Statistics and Decision Support System, University of
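Not part of the original post, but assuming the question is about the plug-in Shannon entropy of observed counts, the entropy package that comes up elsewhere in these threads computes it directly, and it is also a one-liner in base R:

  library(entropy)
  counts <- c(12, 7, 3, 3, 1)   # hypothetical observed frequencies
  entropy(counts)               # plug-in Shannon entropy, natural log by default
  # the same by hand:
  p <- counts / sum(counts)
  -sum(p * log(p))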
2017 May 29
1
Low random entropy
On 29/05/17 15:46, Robert Moskowitz wrote: > > > On 05/28/2017 06:57 PM, Rob Kampen wrote: >> On 28/05/17 23:56, Leon Fauster wrote: >>>> Am 28.05.2017 um 12:16 schrieb Robert Moskowitz <rgm at htt-consult.com>: >>>> >>>> >>>> >>>> On 05/28/2017 04:24 AM, Tony Mountifield wrote: >>>>> In article
2023 Mar 03
1
EL9 says: pcp-pmie[2870]: Low random number entropy available 15.6%
Hi, I've discovered an issue which I don't understand. On a new test install of EL9 I saw this message in the logs:

  Mar 01 08:09:18 <hostname> pcp-pmie[2870]: Low random number entropy available 15.6%avail at beta.corp.invoca.ch

This is on a 64-core "AMD Opteron(tm) Processor 6282 SE" server, but I also got the same low entropy on an EL9 KVM guest running on a "AMD
2001 Sep 28
3
OpenSSH (portable) and entropy gathering
On Thu, 27 Sep 2001 20:41:05 EDT, Damien Miller writes: > On Thu, 27 Sep 2001, Dan Astoorian wrote: > > > > > It would (IMHO) be useful if there were a way to optionally configure > > that code to fall back to the internal entropy gathering routines in the > > event that EGD was not available; as it is, the routines simply fail if > > EGD is unavailable at the
2017 May 28
4
Low random entropy
On 28/05/17 23:56, Leon Fauster wrote: >> Am 28.05.2017 um 12:16 schrieb Robert Moskowitz <rgm at htt-consult.com>: >> >> >> >> On 05/28/2017 04:24 AM, Tony Mountifield wrote: >>> In article <792718e8-f403-1dea-367d-977b157af82c at htt-consult.com>, >>> Robert Moskowitz <rgm at htt-consult.com> wrote: >>>> On 05/26/2017