similar to: Entropy based feature selection in R

Displaying 20 results from an estimated 150 matches similar to: "Entropy based feature selection in R"

2011 Sep 02
2
Classifying large text corpora using R
Dear everyone, I am new to R, and I am looking at doing text classification on a huge collection of documents (>500,000) which are distributed among 300 classes (so basically, this is my training data). Would someone please be kind enough to let me know about the R packages to use and their scalability (time and space)? I am very new to R and do not know of the right packages to use. I
2011 Sep 07
1
Fwd: FSelector and RWeka problem
Hi all, Although I sent the mail to Piotr, the author of FSelector, it should be better to ask here to let others know. Yanwei Begin forwarded message: From: Yanwei Song <yanwei.song@gmail.com> Date: September 7, 2011 4:41:58 PM EDT To: p.romanski@stud.elka.pw.edu.pl Subject: FSelector and RWeka problem Dear Piotr, Thanks for developing the FSelector package for us. I'm a new
2015 Jun 02
2
information.gain from the FSelector package
Hi, I am trying to compute the information gain for a set of variables (time series of different lengths, e.g. blood pressure, heart rate, ...) with respect to a binary variable (0: patient does not die; 1: patient dies). For this I am going to use the information.gain function from the FSelector package. Do you know whether it is possible to compute the information gain for
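For later readers: the quantity that information.gain() reports can be approximated in base R by binning the numeric series and computing H(Y) - H(Y|X). The sketch below uses made-up data and a simple equal-width binning; FSelector's own discretization differs, so treat this as illustrative only.

```r
# Entropy of a probability vector in bits (0 * log 0 taken as 0).
entropy <- function(p) { p <- p[p > 0]; -sum(p * log2(p)) }

# Information gain of numeric x w.r.t. discrete y, after binning x.
info_gain <- function(x, y, bins = 5) {
  xb <- cut(x, breaks = bins)                  # discretize the numeric series
  h_y <- entropy(table(y) / length(y))         # H(Y)
  # Weighted conditional entropy H(Y | binned X).
  h_y_given_x <- sum(sapply(split(y, xb), function(ys) {
    if (length(ys) == 0) return(0)
    (length(ys) / length(y)) * entropy(table(ys) / length(ys))
  }))
  h_y - h_y_given_x
}

# Hypothetical predictor that separates the two outcome groups.
set.seed(1)
x <- c(rnorm(50, 0), rnorm(50, 3))
y <- rep(c(0, 1), each = 50)                   # binary outcome
info_gain(x, y)                                # between 0 and 1 bit
```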
2015 Jun 02
2
information.gain from the FSelector package
Hi Javier, I have a degree in Physics, but I also have some medical background (PhD in Neuroscience). Regards. On 2 June 2015 at 15:35, <javier.ruben.marcuzzi en gmail.com> wrote: > Dear María Luz Morales > > What university degree do you hold? This is to think about how best to help you, whether from the medical side or the R side > > Javier Rubén
2011 Dec 13
1
k-means cluster and plot labels
Hi, For my data, I followed the example of http://en.wikibooks.org/wiki/Data_Mining_Algorithms_In_R/Clustering/K-Means#Execution and got some very nice results. Despite the fact that I want to achieve a bit more by clustering my data (stratification beyond case-control), the actual data frame contains a column labeled "C" which holds a case-control indicator (here either "Z"
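A minimal sketch of the plotting step described above, on hypothetical data (the column names v1/v2 and the indicator values "Z"/"F" are invented for illustration): color the points by k-means cluster and label each one with the case-control column.

```r
# Hypothetical data frame: two numeric columns plus a case-control indicator "C".
set.seed(42)
df <- data.frame(
  v1 = c(rnorm(20, 0), rnorm(20, 4)),
  v2 = c(rnorm(20, 0), rnorm(20, 4)),
  C  = rep(c("Z", "F"), 20)
)

# Cluster only the numeric columns; the indicator is for labeling, not input.
km <- kmeans(df[, c("v1", "v2")], centers = 2)

plot(df$v1, df$v2, col = km$cluster, pch = 20, xlab = "v1", ylab = "v2")
text(df$v1, df$v2, labels = df$C, pos = 3, cex = 0.7)  # indicator as point label
```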
2011 Feb 08
1
FP growth in R?
Does anyone know of an R interface to Christian Borgelt's implementation of the FP growth algorithm? thanks a lot Rob Tibshirani -- I get so much email that I might not reply to an incoming email, just because it got lost. So don't hesitate to email me again. The probability of a reply should increase. Prof. Robert Tibshirani, Depts of Health Research and Policy, and Statistics
2013 Mar 13
1
Feature selection package for text mining
Hi, I am doing a project on authorship attribution, where my term-document matrix has around 10450 features. Can you please suggest a package with a feature selection function to reduce the dimensionality. Regards, Venkata Satish Basva
2012 Aug 09
2
Analyzing Poor Performance Using naiveBayes()
My data is 50,000 instances of about 200 predictor values, and for all 50,000 examples I have the actual class labels (binary). The data is quite unbalanced with about 10% or less of the examples having a positive outcome and the remainder, of course, negative. Nothing suggests the data has any order, and it doesn't appear to have any, so I've pulled the first 30,000 examples to use as
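To see where the imbalance described above bites, here is a toy base-R Gaussian Naive Bayes (the same model family e1071::naiveBayes() fits for numeric predictors; the data below is simulated, not the poster's). The class prior multiplies the likelihood, so a roughly 90/10 prior pushes predictions toward the majority class; lowering the decision cutoff trades precision for recall on the rare positives.

```r
set.seed(7)
n <- 1000
y <- rbinom(n, 1, 0.1)                      # ~10% positives, as in the post
x <- rnorm(n, mean = ifelse(y == 1, 2, 0))  # one predictor, for brevity

prior <- table(y) / n                       # class priors
mu    <- tapply(x, y, mean)                 # per-class Gaussian parameters
sdv   <- tapply(x, y, sd)

# Posterior probability of the positive class via Bayes' rule.
posterior_pos <- function(x0) {
  num <- prior["1"] * dnorm(x0, mu["1"], sdv["1"])
  den <- num + prior["0"] * dnorm(x0, mu["0"], sdv["0"])
  as.numeric(num / den)
}

p <- posterior_pos(x)
table(pred = p > 0.5, truth = y)  # default cutoff: few positives predicted
table(pred = p > 0.2, truth = y)  # lower cutoff: more recall, less precision
```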
2001 Nov 07
0
Entropy collection in sshd (was Re: Entropy and DSA key)
why don't you do some profiling instead of posting so many lines of email?
2005 May 11
0
entropy and conditional entropy for continous variables
Hi, this is not an R question per se, but since I'm on the lookout for an R solution I thought this was the best place: I would like to calculate the entropy for a variable and the conditional entropy between two variables, H(X|Y) for variables X & Y. I have coded the categorical case, but I'm having problems understanding how to do it for the continuous case.
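One common answer for the continuous case (hedged: this estimates the discrete entropy of binned values, not the differential entropy, and the result depends on the bin count) is to discretize both variables and use the identity H(X|Y) = H(X,Y) - H(Y):

```r
# Entropy in bits from a table of counts.
entropy <- function(counts) {
  p <- counts / sum(counts)
  p <- p[p > 0]
  -sum(p * log2(p))
}

# Conditional entropy H(X|Y) of binned continuous variables.
cond_entropy <- function(x, y, bins = 10) {
  xb <- cut(x, bins)
  yb <- cut(y, bins)
  entropy(table(xb, yb)) - entropy(table(yb))   # H(X,Y) - H(Y)
}

set.seed(3)
y <- rnorm(5000)
x <- y + rnorm(5000, sd = 0.2)   # x strongly dependent on y
cond_entropy(x, y)               # well below the marginal entropy of binned x
```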
2003 May 08
1
function to compute entropy
Maybe it's slightly off-topic, but can anybody help with computing entropy on a matrix of probabilities? Say we have a matrix of probabilities, A, something like this:

     x
z        0     1     2     3     4
   0 0.063 0.018 0.019 0.016 0.000
   1 0.011 0.162 0.040 0.042 0.003
   2 0.015 0.030 0.164 0.033 0.002
   3 0.012 0.035 0.036 0.159 0.002
   4 0.004 0.021 0.018 0.013 0.082

sum(A) = 1. Can I
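One way to compute this in base R, treating A as a joint distribution: sum -p*log2(p) over the nonzero cells, taking 0*log(0) as 0 (a sketch; the values are the ones quoted in the post).

```r
A <- matrix(c(0.063, 0.018, 0.019, 0.016, 0.000,
              0.011, 0.162, 0.040, 0.042, 0.003,
              0.015, 0.030, 0.164, 0.033, 0.002,
              0.012, 0.035, 0.036, 0.159, 0.002,
              0.004, 0.021, 0.018, 0.013, 0.082),
            nrow = 5, byrow = TRUE)

p <- A[A > 0]            # drop zero cells so log2() stays finite
H <- -sum(p * log2(p))   # joint entropy in bits
H
```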
2011 Oct 30
1
calculating joint entropy of many variables
Hello list. I need help (e.g., a reference, code, package, etc.) in calculating the joint entropy of many variables (some surely highly mutually informative and some not). Is there anyone here who knows a computationally efficient solution (such as an R package)? I appreciate your help. Best, Reza
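For discrete variables, a base-R sketch: count each observed combination of values once with table(), then apply -sum(p*log2(p)). This scales with the number of combinations actually observed in the data, not with the full product of the level counts, which is what makes it tractable for many variables.

```r
# Joint entropy (bits) of the columns of a data frame of discrete variables.
joint_entropy <- function(df) {
  counts <- table(do.call(paste, df))   # one cell per observed combination
  p <- counts / sum(counts)
  -sum(p * log2(p))
}

# Hypothetical data: three variables with three levels each.
set.seed(9)
d <- data.frame(
  a = sample(letters[1:3], 1000, replace = TRUE),
  b = sample(letters[1:3], 1000, replace = TRUE),
  c = sample(letters[1:3], 1000, replace = TRUE)
)
joint_entropy(d)    # at most log2(27), about 4.75 bits
```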
2013 Jun 11
0
Rao's quadratic entropy with fuzzy coded trait data
An embedded text with an undefined character set was scrubbed. Name: not available URL: <https://stat.ethz.ch/pipermail/r-help/attachments/20130611/09dcb017/attachment.pl>
2006 Oct 05
1
randomness entropy in DomU
Hello. I was just digging around in the net and found the thread http://lkml.org/lkml/2006/5/12/103 So my questions: Are there any (good) news concerning the implementation of /dev/random in the kernel? I just cat /proc/sys/kernel/random/entropy_avail and saw I have about 250 in the DomUs and 3500 on Dom0. I haven't even started to implement encryption for the different kind
2006 Apr 25
1
/usr/libexec/save-entropy, IPv4: not found
About an hour ago I started getting regular messages from cron running /usr/libexec/save-entropy which contain the single line "IPv4: not found" Anybody got any ideas ? The only thing I did at that point was to do an 'rm' of /usr/obj in preparation for compiling this morning's 61 code to test it. Removing /usr/obj should not affect a running system at all. The box is currently
2001 Jul 27
0
openssl version check in entropy.c
----- Original Message ----- From: "Markus Friedl" <Markus.Friedl at informatik.uni-erlangen.de> To: "NONAKA Akira" <anonaka at miraclelinux.com> Cc: <ssh at clinet.fi> Sent: Thursday, July 26, 2001 4:47 PM Subject: Re: openssl version check in entropy.c > On Thu, Jul 26, 2001 at 03:03:31PM +0900, NONAKA Akira wrote: > > OpenSSH checks OpenSSL
2000 Apr 07
1
Question about compiled-in entropy gatherer
This oddity happened with test2:

debug: Got 0.00 bytes of entropy from /usr/bin/who
debug: Got 0.05 bytes of entropy from /usr/bin/last
debug: Got 0.00 bytes of entropy from
debug: Got 0.88 bytes of entropy from /usr/sbin/df
debug: Got 0.00 bytes of entropy from /usr/sbin/df
debug: Got 0.12 bytes of entropy from /usr/bin/vmstat
debug: Got 0.00 bytes of entropy from /usr/bin/uptime

I've
2000 Jun 15
1
problem in entropy.c if no getrusage
entropy.c assumes RUSAGE_SELF and RUSAGE_CHILDREN

*** entropy.c.orig	Thu Jun 15 13:57:28 2000
--- entropy.c	Thu Jun 15 13:58:25 2000
***************
*** 201,207 ****
--- 201,209 ----
  total_entropy_estimate += stir_gettimeofday(1.0);
  total_entropy_estimate += stir_clock(0.2);
+ #ifdef HAVE_GETRUSAGE
  total_entropy_estimate += stir_rusage(RUSAGE_SELF, 2.0);
+ #endif
2001 Jan 05
1
PORTING to IBM OS/390: select() always returning 0 in entropy.c
Hello, I'm attempting to port OpenSSH to IBM's S/390 mainframe. Things have gone well but I have come a little unstuck with the internal PRNG. Although the commands in ssh_prng_cmds are being executed, the select() seems to be returning 0 and therefore the code is assuming that the forked process has timed out. This could be a difference in the way that select is implemented on OS/390.
2001 Jun 02
1
ssh-keygen(1) misinfo: English prose entropy 0.6 - 1.3 b/char!
Quoth manpage: otherwise easily guessable (English prose has only 1-2 bits of entropy per word, and provides very bad passphrases). The passphrase can be Whoever wrote that manpage is either possessed of some amazing human insight to which I am not privy, chose a very non-representative sample of English prose, or is just plain wrong. I know none of you would ever make such a glaring error,