Displaying 20 results from an estimated 10636 matches for "huge".
2010 Aug 24
2
Extract rows from a list object
Dear list members,
I need to create tables from a huge list object; this list consists of
matrices of the same size (but with different content).
The resulting n tables should each contain the same rows taken from all
matrices.
For example:
n <- 23
x <- array(1:20, dim = c(n, 6))
huge.list <- list()
for (i in 1:1000) {
  huge.list[[i]] <- x
}
# On...
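A minimal sketch of one way to build those n tables, assuming the goal
is to collect row i from every matrix in the list (base R only; object
names follow the example above):

row.tables <- lapply(seq_len(n), function(i)
  t(sapply(huge.list, function(m) m[i, ])))
# row.tables[[i]] is a 1000 x 6 matrix: row i taken from each of the
# 1000 list elements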
2006 Apr 04
2
Return function from function with minimal environment
Hi,
this relates to the question "How to set a former environment?" asked
yesterday. What is the best way to return a function with a
minimal environment from a function? Here is a dummy example:
foo <- function(huge) {
  scale <- mean(huge)
  function(x) { scale * x }
}
fcn <- foo(1:10e5)
The problem with this approach is that the environment of 'fcn' holds
not only 'scale' but also the memory-consuming object 'huge',
i.e.
env <- environment(fcn)
ll(envir=env) # ll() fr...
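A minimal sketch of the usual workaround, assuming it is acceptable to
replace the closure's environment with a small purpose-built one that
holds only 'scale':

foo <- function(huge) {
  scale <- mean(huge)
  f <- function(x) { scale * x }
  env <- new.env(parent = globalenv())
  env$scale <- scale
  environment(f) <- env  # 'huge' is no longer reachable from f
  f
}
fcn <- foo(1:10e5)
ls(environment(fcn))  # "scale" only; the huge vector can be collected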
2007 Dec 11
0
3 commits - libswfdec/swfdec_as_context.c libswfdec/swfdec_movie.c test/trace
...|binary
test/trace/crash-0.5.4-13491-stack-overflow-7.swf.trace | 1 +
test/trace/crash-0.5.4-13491-stack-overflow-8.swf |binary
test/trace/crash-0.5.4-13491-stack-overflow-8.swf.trace | 1 +
test/trace/crash-0.5.4-13491-stack-overflow.as | 6 ++++++
test/trace/crash-0.5.4-huge-image-7.swf |binary
test/trace/crash-0.5.4-huge-image-8.swf |binary
test/trace/crash-0.5.4-huge-image.as | 8 ++++++++
test/trace/swfdec-huge.jpg |binary
16 files changed, 35 insertions(+), 2 deletions(-)
New com...
2006 Sep 20
3
Spliting a huge vector
Dear R users,
I have a huge vector that I would like to split into
unequal slices. However, the only way I can do this is
to create another huge vector to define the groups
that are used to split the original vector, e.g.
# my vector is this
a.vector <- seq(2, by=5, length=100)
# indices where I would like to slice my ve...
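A minimal sketch of one way to avoid the second full-length vector,
assuming the slices are defined by their start indices (the cut points
below are made up for illustration):

a.vector <- seq(2, by = 5, length = 100)
starts <- c(1, 26, 51, 91)                  # hypothetical slice starts
ends   <- c(starts[-1] - 1, length(a.vector))
slices <- Map(function(s, e) a.vector[s:e], starts, ends)
# only the short 'starts' and 'ends' vectors are allocated, not a
# grouping vector as long as a.vector itself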
2008 Jun 25
1
huge data?
Hi Jay Emerson,
Our intention is primarily to optimize R to utilize the parallel
processing capabilities of the Cell BE processor (has any work been done
in this area?).
We have huge pages (of size 1MB and 16MB) available in the system and, as
you pointed out, our data is also in the GB range. So the idea is that if
vectors of this huge size are allocated from huge pages, performance will
naturally increase. How do we implement this?
So how can we proceed in this case?
Also the Upper Limit...
2010 Sep 29
1
cor() alternative for huge data set
Hi,
I have a data set of around 43000 probes (rows) and have to calculate a
correlation matrix. When I run the cor() function in R, it throws an
error message about a RAM shortage, which is unsurprising for such a huge
number of rows. I don't see a logical way to cut down this huge number of
entities. Is there an alternative to Pearson correlation, or another
dist() method (e.g. Euclidean), that can be run on such a huge data set?
Any help will be appreciated.
Regards
..
JG
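A minimal sketch of a block-wise workaround, assuming the 43000-row
matrix itself fits in RAM and only the full 43000 x 43000 result (about
14 GB of doubles) does not; block size and file names are placeholders:

block_cor <- function(m, block = 5000, dir = "cor_blocks") {
  dir.create(dir, showWarnings = FALSE)
  idx <- split(seq_len(nrow(m)), ceiling(seq_len(nrow(m)) / block))
  for (i in seq_along(idx)) {
    for (j in i:length(idx)) {  # upper triangle only; cor is symmetric
      cb <- cor(t(m[idx[[i]], , drop = FALSE]),
                t(m[idx[[j]], , drop = FALSE]))
      saveRDS(cb, file.path(dir, sprintf("block_%02d_%02d.rds", i, j)))
    }
  }
}
# each saved block is at most 5000 x 5000; no single object approaches
# the size of the full correlation matrix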
2005 Jun 06
3
Reading huge chunks of data from MySQL into Windows R
Dear List,
I'm trying to use R under Windows on a huge database in MySQL via ODBC
(technical reasons for this...). Now I want to read tables with some
160,000,000 entries into R. I would be glad if anyone out there has
some good hints on what to consider concerning memory management. I'm
not sure about the best methods for reading such huge files into R....
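A minimal sketch of chunked reading with the RODBC package, assuming an
ODBC DSN is already configured (the DSN, table name, and chunk size are
placeholders):

library(RODBC)
ch <- odbcConnect("mysql_dsn")
odbcQuery(ch, "SELECT * FROM big_table")
repeat {
  chunk <- sqlGetResults(ch, max = 100000)  # fetch 100,000 rows at a time
  if (!is.data.frame(chunk) || nrow(chunk) == 0) break
  # process 'chunk' here (aggregate, write out, ...) and discard it
}
odbcClose(ch)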
2005 Mar 25
2
Very HUGE binaries!?
Hi!
I've been compiling Samba 3.0.x on a Solaris 2.6 server using GCC
3.4.1 without any problem until recently... The problem started with
version 3.0.12 and is reproduced in 3.0.13. Running "configure" and then
"make" with these two versions produces VERY HUGE binaries! I'm talking
about binaries of more than 50 megabytes in some cases... With 3.0.11 and
before I had much smaller binaries, using exactly the same compile
procedure. I haven't changed anything on the compiler side...
I've seen that the huge binaries are due to the fact that d...
2006 Mar 17
2
> 1TB filesystems with ZFS and 32-bit Solaris?
Solaris in 32-bit mode has a 1TB device limit. UFS filesystems in 32-bit
mode also have a 1TB limit, even when using a logical volume manager to
span devices smaller than 1TB.
So, what kind of limit does ZFS have when running under 32-bit Solaris?
--
Erik Trimble
Java System Support
Mailstop: usca14-102
Phone: x17195
Santa Clara, CA
2012 Jan 10
0
ltp hugemmap02 fails on ocfs2
Hi Tiger,
ltp-20120104 hugemmap02 testcase fails on ocfs2 filesystem with both UEK
2.6.39-100.0.18 and RHEL 2.6.18-300.el5:
hugemmap02 1 TCONF : huge mmap failed to test the scenario
hugemmap02 1 TCONF : huge mmap failed to test the scenario
hugemmap02 0 TWARN : tst_rmdir:
rmobj(/mnt/ocfs2/ltp-mQdlAx5411/...
2011 Jun 04
2
Completely disable local keyboard input in Syslinux / Extlinux?
...previous, proprietary operating system. And there
is no way to get a BIOS with a less broken redirection.
I am trying to use Syslinux / Extlinux (first 3.84, then switched to
4.04) to load Linux. It works as long as Extlinux does not prompt, with this
extlinux.conf:
serial 0 9600 0
console 0
default huge.s
prompt 0
timeout 0
say Booting Linux ...
label huge.s
kernel bzImage
append initrd=initrd.img e100.eeprom_bad_csum_allow=1
earlyprintk=ttyS0,9600 console=ttyS0,9600 load_ramdisk=1
prompt_ramdisk=0 rw printk.time=0 SLACK_KERNEL=huge.s
Note "console 0" to prevent writing to the e...
2020 Jul 07
3
hex editor for huge files
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512
hexpeek: a hex editor for huge files
Occasionally I need to work with huge binary files. Over the years I've
tried many different tools and never found one that was exactly what I
wanted. In my experience most hex editors either (1) do not work well
with 4GB+ files or (2) require the user to learn a curses interface and
are...
2023 Feb 10
1
syncing huge files/devices: bmapfs
Hi,
recently I had to sync really huge files (VM images), represented
either as block devices (on one or both sides) or as regular files.
It *seemed* that Rsync didn't work well with block devices (the
--copy-devices option didn't work for me; maybe I'm stupid, or it's
broken in the Rsync that ships with Debian bullseye)....
2006 Mar 16
3
Converting huge mbox to Dovecot mbox + indexes
Migrating from UW IMAP/Qpopper mbox to Dovecot mbox.
I have ~500 users, some with HUGE mboxes (500MB-1GB).
Is there a script to create the Dovecot indexes at night
to help speed up the migration process?
Any ideas?
Thanks
Bertrand Leboeuf
leboeuf at emt.inrs.ca
2005 Feb 04
2
rsync huge tar files
Hi folks,
Are there any tricks known to let rsync operate on huge tar
files?
I've got a local tar file (e.g. 2GByte uncompressed) that is
rebuilt each night (with just some tiny changes, of course),
and I would like to update the remote copies of this file
without extracting the tar files into temporary directories.
Any ideas?
Regards
Harri
2003 Sep 16
1
how to identify huge downloads ?
hello ...
how can I identify huge downloads on a link, to automatically move them to a low-priority queue? Something like a combination of rate and duration of session.
Thanks
2010 Jan 23
8
The directory that I am trying to clean up is huge
The directory that I am trying to clean up is huge. Every time I get this
error msg:
-bash: /usr/bin/find: Argument list too long
Please advise
Anas
2017 Mar 10
4
[PATCH v7 kernel 3/5] virtio-balloon: implementation of VIRTIO_BALLOON_F_CHUNK_TRANSFER
...t, chunk_ext may be rarely used, thanks. I will remove chunk_ext if
> there is no objection from others.
>
> Best,
> Wei
I don't think we can drop this; this isn't an optimization.
One of the issues of the current balloon is the 4k page size
assumption. For example, if you free a huge page you
have to split it up and pass 4k chunks to the host.
Quite often the host can't free these 4k chunks at all (e.g.
when it's using hugetlbfs).
It's even sillier for architectures with base page size >4k.
So as long as we are changing things, let's not hard-code
the 12 shift thi...
2005 Jun 06
1
AW: Reading huge chunks of data from MySQL into Windows R
...k the other way round will serve best: Do everything in R and avoid using SQL on the database...
-----Original Message-----
From: bogdan romocea [mailto:br44114 at yahoo.com]
Sent: Monday, 6 June 2005 16:27
To: Dubravko Dolic
Cc: r-help at stat.math.ethz.ch
Subject: RE: [R] Reading huge chunks of data from MySQL into Windows R
You don't say what you want to do with the data, how many columns you
have etc. However, I would suggest proceeding in this order:
1. Avoid R; do everything in MySQL.
2. Use random samples.
3. If for some reason you need to process all 160 million rows...