search for: 350mb

Displaying 20 results from an estimated 37 matches for "350mb".

2006 Jul 18
2
FW: Large datasets in R
...ons (not least of which is an aversion for proprietary software), I am thinking of shifting to R. At the current juncture my concern is the following: would I be able to work on relatively large data-sets using R? For instance, I am currently working on a data-set which is about 350MB in size. Would it be possible to work with data-sets of such sizes using R? The answer depends on a lot of things, but most importantly 1) What you are going to do with the data 2) Whether you have a 32-bit or 64-bit version of R 3) How much memory your computer has. In a 32-bit version of R (where...
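A minimal sketch, not from the thread, of how one might check point 3 in practice; the file name is hypothetical, and memory.limit() applies only to Windows builds of R from that era:

    dat <- read.table("mydata.txt", header = TRUE)   # hypothetical ~350MB data-set
    print(object.size(dat), units = "MB")            # in-memory footprint, usually larger than the file
    memory.limit()                                   # Windows-only: the session's memory cap in MB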
2007 Jul 16
3
Errors in data frames from read.table
Hello, all. I am working on a project with a large (~350Mb, about 5800 rows) insurance claims dataset. It was supplied in a tilde (~) delimited format. I imported it into a data frame in R by setting memory.limit to its maximum (4Gb) for my computer and using read.table. The resulting data frame had 10 bad rows. The errors appear to be due to read.table missing del...
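A hedged sketch, not taken from the thread, of one common way to locate rows whose field count is off before importing a tilde-delimited file; the file name is hypothetical, while count.fields() and read.table() are standard base R:

    n <- count.fields("claims.txt", sep = "~", quote = "", comment.char = "")
    counts <- table(n)
    expected <- as.integer(names(counts)[which.max(counts)])   # the usual field count
    which(n != expected)                                       # candidate bad rows to inspect by hand
    dat <- read.table("claims.txt", sep = "~", quote = "", comment.char = "",
                      header = TRUE, stringsAsFactors = FALSE)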
2005 Dec 20
0
2005 a send_file odyssey (or Rails and Apache don't always play well)...
...tream => true, :buffer_size => 4096) With appropriate values for my_path_to_file, my_file_name, my_file_mime_type, which are logged so I can check them in the logs. During development I'm downloading two test files, one around 10MB and another around 350MB. Using WEBrick as the server everything seems to work okay: Firefox (OS/X) : both downloads successful Safari (OS/X) : both downloads successful IE 6 (Win XP) : both downloads successful Flushed with success, I switch to using the deployment web server Apache 1.3.x and retry the dow...
2009 Oct 23
2
Memory Problems with CSV and Survey Objects
I'm working with a 350MB CSV file on a server that has 3GB of RAM, yet I'm hitting a memory error when I try to store the data frame into a survey design object, the R object that stores data for complex sample survey data. When I launch R, I execute the following line from Windows: "C:\Program Files\R\R-2.9.1\bi...
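The thread does not show the failing code; as a rough sketch under assumed column names (psu, strata, and wt are hypothetical), the usual pattern with the survey package is to read the CSV into a data frame and then wrap it with svydesign():

    library(survey)
    dat <- read.csv("survey.csv", stringsAsFactors = FALSE)   # hypothetical file name
    des <- svydesign(ids = ~psu, strata = ~strata, weights = ~wt, data = dat)
    # svydesign() keeps its own copy of the variables, so peak memory use is
    # roughly twice the size of the data frame, which matters on a 3GB machine.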
2005 Oct 07
3
Performance issues
...Samba share. Seemingly at random, my video stream will halt due to an inability to receive data from the server. If I pause for a few seconds and resume, everything is usually fine. This generally happens only once or twice per hour, but it's annoying. The video is not huge. We're talking ~350MB xvid files, 45 minutes each (compressed network TV shows). The Samba server used to be a Windows 2000 Server and the same video files worked perfectly from there. Network is gigabit on the server side, 100mbit on the client side - though even wireless should be able to stream these files. Virtually...
2004 Jul 02
1
reading large data
Hello, I have trouble using read.table for flat files larger than about 300MB on Windows 2000. Any ideas on how to file a bug report? Is it a known issue? I have three cuts of data, a 1%, 10% and 100% sample in flat text files. The 100% sample is about 350MB. When I read the 1% and 10% files, besides being slow, everything works. The RAM footprint appears to be approximately 2x the text file size when loaded. I have 1.5GB of RAM on my machine. The 10% file takes < 1.5 minutes to load. So the 100% file I would think would load in < 15 minute...
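A commonly suggested workaround, sketched here rather than quoted from the thread: telling read.table the column classes and a generous row count up front avoids repeated type-guessing and buffer growth, which is one source of the roughly 2x memory overhead described above (the file name and row estimate are hypothetical):

    probe   <- read.table("full.txt", header = TRUE, nrows = 1000)       # small probe read
    classes <- sapply(probe, class)                                      # per-column types
    full    <- read.table("full.txt", header = TRUE, colClasses = classes,
                          nrows = 3500000, comment.char = "")            # generous row estimate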
1999 Jul 05
1
smbtar/smbclient and backups of NT
...kb/s) \usr\mail\Vanja\Attach\att11.eml 2732 ( 70.2 kb/s) \usr\mail\Vanja\Attach\att12.eml Error reading file \usr\mail\Vanja\Attach\att13.eml. Got 0 bytes Didn't get entire file. size=2305, nread=0 2305 ( 57.7 kb/s) \usr\mail\Vanja\Attach\att13.eml ... And it goes on and on. On 350Mb of data, I get around 250 errors. Not to mention that 100 of those 250 are *critical* (for me) files :) I tried to modify the blocksize (using -Tcb <value> <filename>) but the only difference is that the errors start happening earlier or later. And always at the same place, when using the same bloc...
2019 Feb 26
1
IO rate of tar-in, what can we expect on a qcow2 image?
...e done several speed tests on a qcow2 Linux image to test how fast tar-in with a big tarball can be. Virtio seems to be active, and we get transfers in a range from 100-160MB/sec, independent of the disk speed on the host. For example, we had a 20-core host system with 900MB/sec for serial writing and 350MB/sec for mixed read/write on the native filesystem. We expected a faster tar-in for qcow2 there than on a small test system with some slow disks and only two cores, but the rates were nearly the same. We tried really hard to do some optimizing, and used big files inside the tar (for emulating se...
2006 Mar 03
3
memory once again
...with a data frame with more than 820,000 observations and 80 variables. The Stata file is 150Mb. With my Pentium IV 2GHz and 1GB RAM, Windows XP, I couldn't do the import using the read.dta() function from package foreign. With Stat Transfer I managed to convert the Stata file to an S file of 350Mb, but my machine still didn't manage to import it using read.S(). I even tried to "increase" my memory with memory.limit(4000), but it still didn't work. Regardless of the answer to my question, I'd appreciate hearing about your experience/suggestions in working with big fil...
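A back-of-the-envelope check (my own arithmetic, not from the thread): 820,000 observations of 80 variables stored as doubles already occupy roughly half a gigabyte before read.dta() or read.S() makes any working copies, which is why a 1GB machine struggles here.

    rows <- 820000
    cols <- 80
    rows * cols * 8 / 2^20   # 8 bytes per double value, so roughly 500 MB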
2020 Apr 07
1
Re: [PATCH virt-v2v] v2v: Allow temporary directory to be set on a global basis.
...not see any alternative other than to use it, with all the caveats associated. > Also, if we take as a possible scenario the situation where /var/tmp is not that big, then we need to consider that it may not be big enough to even store the cached supermin appliance (which is more than 300/350MB). In another message in this thread, Rich indicates the appliance is being built at the time the container image itself is built. As such the appliance will already be present in /var/tmp when the container starts, so there's no disk space issue for it. The issue of lack of space only applies...
2007 Aug 24
1
Has anyone experience with rsync out of memory
...own problem. So I need to find a workaround for this. Please help me! We try to sync many (nnnn) thumbnails of size 2k between AIX and Linux with rsync -a --delete --rsh=ssh <source dir> <user>@<server>:<target dir> I calculated 100 bytes per file, which led me to about 350MB needed for the process. However, the process already seems to stop after getting 16MB. Is there any restriction on processes on a Unix system that might influence this? Will setting ulimit help? ulimit -a: ------------------------- time(seconds) unlimited file(blocks) 2097151 data(kbytes)...
2004 Sep 10
3
FLAC status
Hi, How's the testing going? I compressed 194 individual .wav files (totaling 8.54GB) which contained tracks ripped from many varied albums. I unflacced them and compared their md5 signature with the same from the original .wav. They were all perfect. I didn't use the -V option just in case of any chance of mis-reporting. I hope to test it with the complete collection of ~41GB
2004 Jan 28
3
Server crashed using rsync
I'm trying to make a backup using this command: rsync -auvH /home/ /bak --delete --bwlimit=1000 --status The server load increased so much that the server crashed and also ran out of memory. My server is a dual Xeon 2.0 GHz with 2GB of memory + 1GB of swap. Could it be that there are too many files, about 5,000,000, to be backed up? The way the files are structured makes very
2003 Dec 03
1
R and Memory
...ur PhD students. Our unit will be the HQ for developing R throughout Thailand. I would like some help with a problem we are having. We have one sample of data that is quite large in fact - over 2 million records (ok ok it's more like a population!). The data is stored in SPSS. The file is over 350Mb but SPSS happily stores this much data. Now when I try to read it into R it grunts and groans for a few seconds and then reports that there is not enough memory (the computer has 250MB RAM). I have tried setting the memory in the command line (--max-vsize and --max-mem-size) but all to no avail. A...
2007 May 14
7
Help a newb with 0.3.1
Hi, first off thanks Ezra for Merb - it's certainly interesting and I'm keen to have a play. However, I'm having difficulties in getting started. I've followed the docs for setting up mrblog and everything seems to be installed correctly, and merb appears to start fine: $ merb you must install the markaby gem to use .mab templates you must install the
2020 Apr 07
5
Re: [PATCH virt-v2v] v2v: Allow temporary directory to be set on a global basis.
On Tue, Apr 07, 2020 at 01:25:02PM +0200, Pino Toscano wrote: > The important thing is still that you need to have space for the temporary files somewhere: be it /var/tmp, /mnt/scratch, whatever. Because of this, and the fact that usually containers are created fresh, the cache of the supermin appliance starts to make little sense, and then a very simple solution is to
2020 Apr 07
2
Re: [PATCH virt-v2v] v2v: Allow temporary directory to be set on a global basis.
...s the best you can do is let the user know about those (documentation?). > Also, if we take as a possible scenario the situation where /var/tmp is not that big, then we need to consider that it may not be big enough to even store the cached supermin appliance (which is more than 300/350MB). > > - The external large space may be shared with other containers, and I'm not convinced that our locking in supermin will be safe if multiple parallel instances start up at the same time. We certainly never tested it, and don't curren...
2004 Sep 10
0
Possible bug
... - 44 It's lost the fact that the last 176 bytes are not actually part of the audio. If it's not possible to carry this comment into the flac format then it should be dropped rather than encoded as audio? Cheers. BTW, it compressed the 350MB wav file perfectly at level 8.
2005 Oct 01
1
help with loading National Comorbidity Survey
I downloaded data from http://www.hcp.med.harvard.edu/ncs/ which provides data in DTA (STATA), XPT (SAS), and POR (SPSS) formats, all of which I have tried to read with the foreign package, but I am not able to load any of them. I have 2 GB of RAM, but R crashes when the memory gets just over 1 GB. I am using the Windows version of R 2.1.1. The size of the DTA file is 48 MB; the xpt file is 188
2005 Jul 19
1
mac os x crashes with bioconductor microarray code (PR#8013)
Full_Name: Eric Libby Version: 2.1.1 OS: OS X Tiger Submission from: (NULL) (65.93.158.117) I am trying to analyze microarray data of 42 human arrays. I typed in the following instructions: library(affy) Data <- ReadAffy() eset <- expresso(Data, normalize.method="invariantset", bg.correct=FALSE, pmcorrect.method="pmonly", summary.method="liwong") And I get some