Displaying 20 results from an estimated 32898 matches for "larg".
2011 May 21
4
Looping through values in a data frame that are >zero
...it is in
2 The entry of column y that is in the same row
3 The entry of column z that is in the same row
It'd be good to save this info in a data frame somehow - so that I
could loop through rows of this data frame.
To explain what I need it for eventually: I have a different data
frame "large.df" that has the same columns (variables) - but with many
more entries than "x". Something like:
large.df<-expand.grid(y,z)
names(large.df)<-c("y","z")
set.seed(123)
large.df$a<-sample(0:5,75,replace=T)
set.seed(234)
large.df$b<-sample(0:5,75,replace=...
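The bookkeeping described above (for each positive entry, record the value plus the same-row y and z) can be done without an explicit search loop. A minimal sketch, using a made-up stand-in for the poster's data frame since the original "x" is not shown in full:

```r
# Hypothetical stand-in for the poster's data frame "x":
# a value column plus the same-row y and z columns.
x <- data.frame(val = c(0, 3, 0, 7),
                y   = c("a", "b", "c", "d"),
                z   = 1:4)

# Keep only the rows whose value is greater than zero,
# together with the matching y and z entries.
info <- x[x$val > 0, c("val", "y", "z")]

# The result is itself a data frame, so its rows can be looped over.
for (i in seq_len(nrow(info))) {
  row <- info[i, ]
  # ... use row$val, row$y, row$z here ...
}
```

Logical subsetting returns all the positive rows at once, so the saved data frame is ready to loop over (or, often better, to process with vectorised operations).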
2008 Jul 14
3
Dovecot Crash
Jul 13 08:19:15 hera dovecot: fstat 75 : Value too large for defined data
type
Jul 13 08:19:18 hera dovecot: fstat 75 : Value too large for defined data
type
Jul 13 08:19:20 hera dovecot: fstat 75 : Value too large for defined data
type
Jul 13 08:19:27 hera dovecot: fstat 75 : Value too large for defined data
type
Jul 13 08:19:27 hera dovecot: fstat 75...
2008 Dec 24
3
filling values in a vector using smaller vector
Dear list members:
I am looking for an elegant (or efficient) way to accomplish the following:
take a large boolean vector and fill its TRUE positions with the values from a
smaller boolean vector whose length equals the number of TRUE values in the
large vector.
Example:
large<- c(FALSE, FALSE, FALSE, TRUE, FALSE, FALSE, TRUE, FALSE, FALSE, FALSE,
TRUE, FALSE)
small<- c(TRUE, FALSE, TR...
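One idiomatic way to do this is logical-subscript assignment, which writes the short vector into exactly the TRUE slots. A sketch only: the poster's small vector is truncated above, so an illustrative one is used here.

```r
large <- c(FALSE, FALSE, FALSE, TRUE, FALSE, FALSE, TRUE, FALSE, FALSE, FALSE,
           TRUE, FALSE)
small <- c(TRUE, FALSE, TRUE)  # illustrative; must have sum(large) elements

# Logical indexing on the left-hand side targets only the TRUE positions,
# which are then overwritten, in order, by the elements of 'small'.
large[large] <- small
```

With this illustrative small, the TRUE positions 4, 7, and 11 receive TRUE, FALSE, and TRUE, so only positions 4 and 11 remain TRUE afterwards.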
2017 Sep 21
0
List of occurring values
unique(x) will give you the distinct values in x. table(x) will give you
the distinct values and their frequencies as an array with dimnames.
data.frame(table(x)) will give you a 2-column data.frame with the distinct
values and their frequencies.
> values <- c("Small", "Large", "Large", "Large")
> unique(values)
[1] "Small" "Large"
> tblValues <- table(values)
> tblValues
values
Large Small
3 1
> tblValues[tblValues > 2, drop=FALSE]
values
Large
3
>
> dfValues <- data.frame(tblValues)
...
2012 Nov 07
1
[LLVMdev] using large structures in registers/returns
I can't find a lot of information about using structures directly as
parameters, returns, and in registers. Is this fully supported on all
platforms? Does it always convert to creating a hidden parameter when
too large?
For example (assume very.large is too big to fit in the target machine
registers):
define %very.large @get_struct() {
%m1 = insertvalue %very.large undef, i32 10, 0
...
%m10 = insertvalue %very.large %m9, i32 25, 9
ret %very.large %m10
}
define void @use_struct( %very.large %m ) { ... }
d...
2009 Apr 24
2
"Old method" bootloader failing with large ramdisk
I'm trying to boot a PV guest using the "old method"
of passing kernel= and ramdisk= and it appears to
work fine with a "small" initrd but not with a "large"
one. (Small is 4MB, large is 154MB.) I'm sure both
of the initrds are properly gzipped etc. Unpacked,
the large one approaches 400M.
By doing some kernel startup debugging, it appears
that the large initrd never finds its way into memory.
Or at least not comple...
2017 Sep 21
4
List of occurring values
Dear all,
ftable produces a list of the frequencies of all occurring values.
But what about the occurring values themselves?
How can I retrieve a list of occurring values?
How can I retrieve a table with both the list of occurring values and their respective frequencies?
Thank you in advance,
Yours, Ferri
2009 Apr 17
2
E2fsck and large file
How big must a file be for e2fsck to consider it a large file?
814611 blocks used (42.79%)
0 bad blocks
1 large file <----- that
Thanks
John Nelson
2002 Oct 27
3
rsync with large gzip files.
Hi,
I tried performing a complete copy of 17GB of filesystems over the WAN
(0.8GB/hr) at a speed of 16Mbps. The filesystem consists of several
large gzipped files, which were actually zipped out of other
sub-filesystems and directories. I noticed that while
transferring a list of large gzipped files, rsync tends to take much
longer to transfer those files and at times even hangs.
I invoked rsync...
2009 May 04
2
normality test for a large dataset?
Hello,
Do you know an R-implemented normality test, like the Shapiro test but more suitable for large data sets?
Thanks,
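For context: shapiro.test() in base R refuses samples larger than 5000 observations. A hedged sketch of one common stand-in, a one-sample Kolmogorov-Smirnov test against a fitted normal; note that estimating the mean and sd from the same data makes the p-value only approximate (the Lilliefors correction, e.g. in the nortest package, addresses this):

```r
set.seed(1)
x <- rnorm(1e5)  # a sample far beyond shapiro.test()'s 5000-point limit

# One-sample KS test against a normal with parameters estimated from x.
# The p-value is approximate because the parameters were fitted to the data.
res <- ks.test(x, "pnorm", mean(x), sd(x))
```

For data this large even tiny departures from normality become "significant", so a QQ-plot (qqnorm(x); qqline(x)) is often more informative than any formal test.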
2007 Apr 04
1
fsck.ext3 reporting large file I cannot find
I am checking a file system (ext3) as shown below. It is actually a fresh file
system, as I had deleted all partitions and created an ext3 file system. But
when I run the check with the verbose option, it says I have one large file.
Am I missing something here, or is it odd that I cannot find this large file it
is reporting? I reviewed some documentation on ext3 file systems, and
experimented with different sizes and formats, but finally decided to post
(after reviewing archives). Is this "large file" just...
2002 Feb 12
1
error in rsync protocol on large file
...ata/GenBank/htg : Error 0
rsync error: error in file IO (code 11) at receiver.c(243)
Received signal 16. (no core)
rsync: connection unexpectedly closed (22672537 bytes read so far)
rsync error: error in rsync protocol data stream (code 12) at io.c(140)
The file it is getting the error on is very large:
-rw-rw-r-- 1 leema staff 4633417879 Apr 30 2001 htg
# rsync --version
rsync version 2.5.2 protocol version 26
Copyright (C) 1996-2002 by Andrew Tridgell and others
<http://rsync.samba.org/>
Capabilities: 64-bit files, socketpairs, hard links, symlinks, batchfiles,
no IPv6,...
2001 Oct 29
2
Large data sets in R
As a new user of R, I'm wondering: what is the maximum matrix size in R?
I have a large data set consisting of 9000 people and want to be able to
create large matrices from it. Is R suitable for large data
sets?
Thanks
Laura
2009 Dec 23
3
OT:Which filesystem to use with large files
Hi all,
Recently I have installed a centOS 5.4 server to use as a home NAS server. I need
to use large files (8GB minimum) inside of it to serve via iSCSI services. Which
filesystem do you recommends me to reach maximum performance: xfs, ext3, ext4, gfs2
....??
Thanks.
--
CL Martinez
carlopmart {at} gmail {d0t} com
2009 Mar 16
3
Asterisk is not designed for University with large user base?
Hello,
I just had a meeting about a pilot project going on at our University. The
project manager has done some research in the past year and concluded that
Asterisk cannot scale well to a large user base, like 10,000 users, and thus
Asterisk is not fit for a large University environment.
The project manager instead chose sipX and said it scales well for a large user base.
I have an Asterisk running in my office for a small user base, but I don't
have experience with large-scale Asterisk impleme...
2002 Jan 31
3
Error when compile rsync
Hi,
I tried to compile rsync-2.5.2 on a Solaris 5.7 Ultra-2 machine
(5.7 on the Ultra-2 is running 64-bit), but when I ran configure,
it said no for largefile. I thought 2.5.2 would support large
files? Is that true?
Thanks,
Jennifer
2004 Jul 30
2
Large File Copy to Large ext3 RAID5 Array Often Stalls
...6x Maxtor 250GB IDE drives (one drive per cable)
RAID level 5, 128Kb chunk size, EXT3: "mkfs -t ext3 -b 4096 -m 0 -R stride=16 /dev/md2"
I'm running Samba 3 and I first noticed this problem when 3 out of my 5 Windows clients (2 XP machines and 1 Server 2K3 machine) failed to copy any large files (~1GB) to a subdirectory on the server containing about 220 other such large files. Two XP machines on my network have no problems whatsoever copying large files to the very same subdirectory on the server.
A failing file transfer begins at a reasonable data rate (~6 MB / sec) but grind...
2016 Feb 24
2
[PATCH 1/5] fat: fix minfatsize for large FAT32
When trying to install Syslinux on a FAT32 drive formatted using
Ridgecrop's Large FAT32 formatting tool [1], the installer will bail due
to the minfatsize check, as there is an extra sector being used. This
fix addresses that.
[1] http://www.ridgecrop.demon.co.uk/index.htm?fat32format.htm
2002 Feb 13
2
large file error is now SIGUSR1 or SIGINT error
...n this problem... I get the error below (and the error I
> reported previously) when running rsync 2.5.2 compiled from
> source. I saw
> different behavior when I used the rsync 2.5.2 binary
> compiled on Solaris
> 2.5.1 by Dave Dykstra. That binary complained of "Value too large for
> defined data type" whenever it encountered a large file (over
> 2GB), but did
> not exit. The impression I got was that the Solaris 2.5.1
> binary did not
> support or even try to support files over 2 GB, where the
> binary compiled on
> Solaris 7 or 8 *thinks*...
2008 Aug 21
2
Large data sets with R (binding to hadoop available?)
Dear R community,
I find R fantastic and use R whenever I can for my data analytic
needs. Certain data sets, however, are so large that other tools
seem to be needed to pre-process data such that it can be brought
into R for further analysis.
Questions I have for the many expert contributors on this list are:
1. How do others handle situations of large data sets (gigabytes,
terabytes) for analysis in R ?
2. Are there...
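On question 1, one low-tech approach that keeps memory use bounded is to stream the file in fixed-size chunks and retain only aggregates. A sketch only, with a small temporary file standing in for a multi-gigabyte one:

```r
# Sketch: process a file too big for memory in fixed-size chunks.
# A small temporary CSV stands in for a multi-gigabyte one.
tmp <- tempfile(fileext = ".csv")
write.csv(data.frame(v = 1:1000), tmp, row.names = FALSE)

con <- file(tmp, "r")
header <- readLines(con, n = 1)     # consume the header line once
total <- 0
repeat {
  lines <- readLines(con, n = 250)  # read one chunk of data rows
  if (length(lines) == 0) break
  tc <- textConnection(c(header, lines))
  chunk <- read.csv(tc)             # parse the chunk as a small data frame
  close(tc)
  total <- total + sum(chunk$v)     # keep an aggregate, not the rows
}
close(con)
unlink(tmp)
```

Each pass holds only one chunk in memory; only the running aggregate survives between passes, which is the same shape of computation that map-reduce tools such as Hadoop distribute across machines.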