Displaying 20 results from an estimated 4000 matches similar to: "Fishers exact test"
2007 May 22
1
Bug in Ferret::Search::SortField::SCORE ??
I have been trying to get this to work for a while now. My controller is
sort = [ Ferret::Search::SortField::SCORE_REV ]
@results = Record.multi_search(params[:search_terms], [ Link, Post,
Event ], {:limit => :all, :sort => sort })
and in my view I just render a conglomeration of the appropriate
partials for each model. It seems that no matter what I do, I can't get
the
2004 Nov 28
2
Am I banned or something?
Something goes wrong with this mailing list; I am getting something like this:
>Sorry. Your message could not be delivered to:
>Aster risk (Mailbox or Conference is full.)
??????????
Regards,
Corvin
2010 Feb 03
2
selecting a group of points from a scatterplot?
Hi everyone,
is there a way/package in R that would allow me to select a group of
points from a scatterplot by drawing a circle around them or some such?
I can use 'identify' to pick individual points, but that gets tedious
if one has more than 10-20 spots.
I can easily select spots within a rectangle defined by picking points
using identify... but a simple rectangle sometimes will
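One approach, as a minimal sketch (the sp package and its point.in.polygon() helper are assumptions here, not something the thread names): capture a free-form polygon with locator() and flag the points that fall inside it.
x <- rnorm(200); y <- rnorm(200)                     # stand-in data
plot(x, y)
poly <- locator(type = "l")                          # click polygon vertices; right-click/Esc to finish
inside <- sp::point.in.polygon(x, y, poly$x, poly$y) > 0
points(x[inside], y[inside], col = "red", pch = 19)  # highlight the selection
which(inside)                                        # indices of the selected points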
2009 Dec 17
4
Fisher's exact test at < 2.2e-16
In an effort to select the most appropriate number of clusters in a
mixture analysis I am comparing the expected and actual membership of
individuals in various clusters using Fisher's exact test. I aim
for the model with the lowest possible p-value, but I frequently get
p-values below 2.2e-16 and therefore do not get exact p-values with
standard Fisher's exact tests in R.
Does anybody know
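For reference, the printed "< 2.2e-16" is just format.pval() truncating at .Machine$double.eps; the exact numeric value is still stored in the result object. A minimal sketch with made-up counts:
tab <- matrix(c(500, 5, 5, 500), nrow = 2)  # hypothetical 2x2 table
ft <- fisher.test(tab)
print(ft)         # displays "p-value < 2.2e-16"
ft$p.value        # the stored numeric p-value (can underflow to 0 for extreme tables)
log(ft$p.value)   # the log scale is safer when ranking very small p-values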
2006 Oct 29
2
app_meetme not loading
I originally built my Asterisk server without installing the Zaptel package
as it was going to be a purely SIP-based system. However, when I went to
set up conferencing using meetme, I found out that app_meetme depends on
ztdummy for timing. I have now installed the zaptel package and I
believe the ztdummy module is loading OK
[root@astro asterisk-1.4.0-beta2]# lsmod
Module
1999 Nov 12
1
R-0.65.1 Startup
Dear R users,
I have noticed that my R startup is extremely slow. It takes almost 3
minutes from "double-click" to R prompt. I had been running R-0.64.1 until
recently and it took about 30 sec. I still have access to R-0.64.1; when I
started it up, it took about 25 sec. Can anyone tell me if this is a bug in
R or a problem with my machine?
Note: This is after bootup with R being the
2012 Apr 26
3
Git branch with compiling fixes for win32
From: Erik de Castro Lopo <mle+la at mega-nerd.com>
>To: flac-dev at xiph.org
>Cc: Josh Coalson <xflac at yahoo.com>
>Sent: Wednesday, April 25, 2012 4:42 PM
>Subject: Re: [flac-dev] [Flac-dev] Git branch with compiling fixes for win32
>
>Josh Coalson wrote:
>
>> But regardless of submitter, any patch that affects encoding must be
>> reviewed very
2000 Mar 02
1
Error handling
Dear R-Help,
I am trying to run the following:
apply(Data, 1, fit.non.linear.curve)
where fit.non.linear.curve() is a bootstrapping procedure that uses the
nls() function.
Unsurprisingly, there are some lines in the Data for which the nls()
procedure fails, probably due to bad starting values. How do I make R just
give up on that particular line of Data and carry on? I am using R 0.99 on
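One standard answer, as a minimal sketch (fit.non.linear.curve stands in for the poster's function; tryCatch() assumes a modern R, while very old versions offered try() instead):
safe.fit <- function(row) {
  tryCatch(fit.non.linear.curve(row), error = function(e) NA)
}
results <- apply(Data, 1, safe.fit)  # rows where nls() fails come back as NA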
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 Mb and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
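A hedged sketch of one defensive setup, assuming a Unix server (the sizes are placeholders, not recommendations): cap each session so a runaway job dies with an allocation error instead of hanging the machine.
ulimit -v 2097152                 # shell-level cap per process, in kB
R --max-vsize=2G --max-nsize=50M  # R-level ceilings inside that envelope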
1999 May 15
2
vsize and nsize
I am running R version ??? under Redhat 5.2. It seems as though the
--nsize option has no effect on the size of the allocated Ncells as
determined using gc(). Yes, I have that much data....
That is, if I invoke R with
R --vsize 100 --nsize 5000000
then type
gc()
I get
            free     total
Ncells     92202    200000
Vcells  12928414  13107200
Thanks
Tony Long
Ecology and Evolutionary Biology
Steinhaus
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers,
somehow it is not possible to increase nsize to more than
20000k. When I specify e.g.
> R --vsize=10M --nsize=21000K
the result is:
           free    total  (Mb)
Ncells    99658   350000   6.7
Vcells  1219173  1310720  10.0
Maybe I have overlooked something....
Marcus
--
+-------------------------------------------------------
| Marcus Eger
| E-Mail: eger.m at gmx.de (NEW)
|
1999 Apr 27
2
Memory management
Dear all,
I don't get it:
First of all, the help doesn't say what the memory limits of
R are. Say, what's the max heap size, for instance?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open a matrix 8000x8000 by issuing
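The arithmetic is worth making explicit: an 8000 x 8000 numeric matrix costs 8 bytes per element, far beyond any 30M vector heap, so --vsize is the binding constraint here, not --nsize. A quick check in R:
8000 * 8000 * 8 / 2^20  # about 488 Mb needed for the matrix alone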
2005 Mar 04
1
ext2online difficulty
Hi all
I am having some trouble using the ext2online utility. I have reduced
the problem down to its simplest form, and it goes something like this:
Start with a regular msdos labelled disk (I have tried lvm volumes):
Command (m for help): p
Disk /dev/sdb: 18.3 GB, 18351967232 bytes
64 heads, 32 sectors/track, 17501 cylinders
Units = cylinders of 2048 * 512 = 1048576 bytes
Device Boot
2006 Mar 29
1
calculating a trailing 12-column mean in a dataframe?
I have a dataframe of 25 columns and 100,000 rows
called 'testdf'.
I wish to build a new dataframe, with 14 columns and
100,000 rows.
I wish the new dataframe to have the 'trailing 12
column' mean. That is, I want column 1 of the new
dataframe to have something like:
mean(testdf[,1:12], na.rm=T)
What is the best way to accomplish this?
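A minimal sketch of one answer (testdf is the poster's 25-column data frame; rowMeans() over a sliding 12-column window yields the 14 trailing means):
starts <- 1:(ncol(testdf) - 11)  # 14 window start columns for 25 columns
trail12 <- sapply(starts, function(j)
  rowMeans(testdf[, j:(j + 11)], na.rm = TRUE))
trail12 <- as.data.frame(trail12)  # 100,000 x 14; column 1 = mean of cols 1:12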
1999 Apr 12
3
--nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")..
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger
I hit
2000 Oct 02
3
R vs S-PLUS with regard to memory usage
I am trying to translate code from S-PLUS to R and R really struggles!
After starting R with the following:
R --vsize 50M --nsize 6M --no-restore
on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2),
I run a function that essentially picks up an external dataset with 2121
rows and 30 columns and builds an lm() object and also runs step() ... the
step() takes forever to run... (takes very
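For context, a hedged sketch of the kind of call described (the file name and formula are placeholders, not the poster's code):
dat <- read.table("mydata.dat", header = TRUE)  # 2121 rows x 30 columns, hypothetical file
fit <- lm(y ~ ., data = dat)                    # full model on the remaining columns
fit2 <- step(fit, trace = 0)                    # stepwise selection refits many models, hence slow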
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may have already encountered, but I have
not found the solution yet.
As I use R to simulate several arrays of data, my workspace is now 35Mb big
and I cannot launch R with it.
An "xdr real data read error occurred" and R tells me to delete .RData or
increase memory. I WON'T delete this file, and changing the max-nsize to
40600k did not
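A hedged sketch of the usual workaround at the time (flag names per the R 1.x Memory help page; the values are placeholders): raise the vector heap cap as well as max-nsize before loading the workspace, since a 35Mb .RData mostly costs vector memory.
R --max-vsize=200M --max-nsize=40600k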
2009 May 07
1
increasing memory for R bg job
Hi,
Is the following the right command to increase memory when an R job is run in the background, or is there another one?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source:
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx
Carol
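Those flags from the linked Memory page go on the command line ahead of the script; a hedged sketch of a background batch run (script name and sizes are placeholders):
R CMD BATCH --max-vsize=1G --max-nsize=50M myscript.R myscript.Rout &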
2000 Aug 17
2
R on os390
G'day R friends,
I didn't get any replies on the main list so I thought I'd try with the
experts.
I was wondering if anyone's ported R to os390. If so, are the vsize and
nsize limits the same as other platforms?
I could really annoy those SAS guys then.
thanks,
John Strumila
john.strumila@team.telstra.com
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the subject
matter (there aren't more than about a dozen) but either did not find
solutions or found that they did not work.
Here is the issue. I am trying to run a spatial regression on a
medium-sized dataset. Some of the functions in the spdep package I use
require me to allocate a vector of 1.1 Gb (mine is not a spatial SIG
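Two hedged checks that applied to Windows builds of R of that era (memory.limit() was Windows-only and has since been removed in R 4.2):
memory.limit()             # current ceiling in Mb
memory.limit(size = 3000)  # request about 3 Gb; needs an OS that can grant it
1.1 * 2^30 / 8             # a 1.1 Gb double vector is ~148 million elements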