similar to: Huge differences in RAM Consumption with different versions of R on the same scripts

Displaying 20 results from an estimated 5000 matches similar to: "Huge differences in RAM Consumption with different versions of R on the same scripts"

2014 Jan 13 (3 replies): apache - upload files bigger than 2GB
Hi, I need to upload files larger than 4.4GB (a DVD ISO) to a CentOS (5.5 x64) HTTP server (httpd-2.2.3-43.el5.centos). On the Apache server I set in my /etc/php.ini: upload_max_filesize = 4900M and post_max_size = 5000M. In my httpd.conf I set: LimitRequestBody 0. I'm using a Firefox and/or Chrome client to upload a 4.2GB file to the server, but it doesn't work.
2011 Aug 30 (1 reply): Why does loading saved/cached objects add significantly to RAM consumption?
Dear list, I make extensive use of cached objects for time-consuming computations, and yesterday I noticed some very strange behavior in that respect: when I execute a given computation whose result I'd like to cache (I tried both saving it as '.Rdata' and using package 'R.cache', which uses its own file type '.Rcache'), my R session consumes about 200 MB of
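A minimal sketch (not the poster's code; the object below is a made-up stand-in) of one way to reproduce the comparison: memory reported by gc() right after computing an object versus after save()-ing it, dropping it, and load()-ing it back.

# Stand-in for a cached computation result (hypothetical data).
x <- matrix(rnorm(5e6), ncol = 100)
print(object.size(x), units = "MB")
gc(reset = TRUE)                     # baseline memory snapshot

save(x, file = "x.Rdata")            # cache to disk
rm(x); invisible(gc())               # drop the in-memory copy

load("x.Rdata")                      # reload the cached object
gc()                                 # compare "max used" against the baseline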
2010 Apr 13 (3 replies): ClamAV "clamscan" command using huge amount of RAM
We have a Perl CGI script that accepts uploaded files and runs clamscan on them. While observing system performance I noticed that each clamscan process consumes up to 250MB of RAM. Is this normal for ClamAV? It seems like an enormous amount of RAM for simply scanning one file for viruses.
2008 Oct 05 (1 reply): io writes very slow when using vmware server
We are struggling with a strange problem. When we have some VMware guests running (mostly MS Windows clients), the I/O write performance on the host becomes very bad. The guest OSes do not do anything; just having them started, sitting at the login prompt, is enough to trigger the problem. The host has a plentiful 4GB of RAM, and all clients fit easily into that space. The disk system is a
2010 Apr 19 (2 replies): Huge data sets and RAM problems
Dear all, this is the first time I am sending mail to the list, so I hope I do not make a mistake... For the last few months I have been working on my MSc thesis project, applying data mining techniques to the user logs of a software-as-a-service application. The main problem I am experiencing is how to process the huge amount of data. More specifically: I am using R 2.10.1 on a laptop with
2008 Mar 04 (2 replies): memory constraints in ubuntu gutsy
Hello all, I have a very large data set (1.1GB) that I am trying to read into R. The file is tab-delimited and contains headers; there are over 800 columns and almost 700,000 rows. I am using the Ubuntu 7.10 Gutsy Gibbon version of R on Linux kernel 2.6.22-14-generic, with 3.1GB of RAM and an AMD Athlon(tm) 64 3200+ processor. I downloaded R using the instructions from CRAN under
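A minimal sketch of the usual memory-saving approach for a file of this shape (the file name and column type below are assumptions, not details from the post): telling read.table the column classes and row count up front avoids the type-guessing pass and the extra copies it causes.

# Hypothetical file name; adjust colClasses to match the real column types.
dat <- read.table("bigdata.txt", header = TRUE, sep = "\t",
                  colClasses = "numeric",   # recycled across all columns
                  nrows = 700000,           # known row count from the post
                  comment.char = "")
dim(dat)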
2014 Mar 18 (1 reply): Samba 4.1.5 memory consumption - again
Hi all, first a big thanks to the Samba team for many years of great work, and sorry for my English. A week ago I deployed Samba 4.1.5 in production on CentOS 6.5. I installed Samba with the default built-in (internal) DNS server and the rfc2307 extensions. Everything works great, but after a while I noticed that the memory on my server is largely occupied. At this time there are about 10 domain users -
2012 Nov 01 (2 replies): Multiple incremental DVD backup program?
I'm running C5.8 and want to back up a directory that is 6GB in size. Is there any Linux program for CentOS to make this backup over 2 x 4.4GB DVD+R disks, please? Something with a GUI like K3b would do nicely. Kind Regards, Keith. Websites: http://www.karsites.net http://www.php-debuggers.net
2010 Jun 16 (2 replies): cpuspeed settings??
Hey, folks. Sometimes my workstation bogs down and slows to a crawl. Using gkrellm, it's obvious the CPU is the laggard, and the top utility confirms it: the load average gets up over 4 at times. But this occurs when CPU frequency stepping pegs the speed at 600MHz. This processor is capable of 1.5GHz, and when it's allowed to run at that speed the load average is under 2, which is fine. So the
2010 Feb 06 (2 replies): question about bigmemory: releasing RAM from a big.matrix that isn't used anymore
Hi all, I'm on a Linux server with 48GB RAM. I did the following: x <- big.matrix(nrow=20000, ncol=500000, type='short', init=0, dimnames=list(1:20000, 1:500000)) # gets around the 2^31 issue - yeah! In Unix, when I run the "top" command, I see R taking up about 18GB of RAM, even though the object x is 0 bytes in R. That's fine: that's how bigmemory is supposed to
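A minimal sketch of the usual way to give that memory back (general bigmemory behaviour, not a confirmed answer from this thread), assuming the matrix is RAM-backed and no other references to it remain: the data live outside R's heap, so the RAM is only returned once the R-side handle is garbage collected.

library(bigmemory)

# Small stand-in matrix; the original used nrow=20000, ncol=500000.
x <- big.matrix(nrow = 1000, ncol = 1000, type = "short", init = 0)

rm(x)   # drop the R-side handle to the external data
gc()    # force collection so the backing memory can be released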
2013 Jul 24 (3 replies): memory consumption with treesize pro and cifs shares
Hi everyone. I'm looking to solve an issue with Samba on a NAS being accessed with TreeSize Pro. Using that program to scan through millions of files eats up memory and then swap, eventually crashing the system. It's scanning mounted CIFS shares on the NAS, which runs TrueNAS with Samba version 3.6.9. We have a test case and have been able to replicate the issue on another machine. The
2004 Aug 06 (1 reply): Sacrilege, but...
On Tue, 24 Jul 2001, Brendan Cully wrote: | which patch exactly is that? I looked into Mark's patch and it seems | to be unsafe (try setting up a client which reads much too slowly and | see if it bogs down all the other clients). I may have misapplied it | or applied it to the wrong version of icecast though, or you could be | talking about a different patch. This one... it probably is the
2006 Apr 11 (1 reply): Mixins?
As is often the case when I tackle a new platform/language, I get the big picture very quickly (because frameworks are frameworks are frameworks), but it's the nitty-gritty of the language that bogs me down... So I have some similar methods on a few of my model classes that I wanted to push into a helper. Now I reckoned that the Ruby way was to create a module and mix it in with include. However, I
2011 Apr 13 (1 reply): Overcoming warning in package zoo
Dear R users, I have a long program that I am trying to run, using RStudio as my interface to R. The pieces of the program run well individually, but when I try to run everything in sequence it bogs down because of a warning after using rollmax from package zoo. Here is the warning: "In rollmax.zoo(zoo(Pmat), 7, na.pad = FALSE, align = "right") : na.pad is deprecated. Use
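A minimal sketch of how the call can be rewritten to avoid the deprecation warning (Pmat below is a made-up stand-in series): zoo replaced na.pad with the fill argument, and since na.pad = FALSE matches the default behaviour it can simply be dropped (na.pad = TRUE would become fill = NA).

library(zoo)

Pmat <- rnorm(30)                                 # hypothetical stand-in series
result <- rollmax(zoo(Pmat), 7, align = "right")  # no deprecated argument, no warning
head(result)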
2010 Jun 24 (1 reply): help, bifurcation diagram efficiency
Hello all - this code will run, but it bogs down my computer when I run it with finer and finer time increments and more generations. I was wondering if there is a better way to write my loops so that this doesn't happen. Thanks! -Tyler ################# # Bifurcation diagram # Using Braaksma system of equations # We have however used a Fourier analysis # to get a forcing function
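The poster's loop code is not shown here, so the sketch below is only the generic fix that usually applies when R loops slow down as the iteration count grows: preallocate the output instead of growing it inside the loop.

n_steps <- 1e5

# Growing a vector one element at a time copies it repeatedly (slow):
# out <- numeric(0); for (i in 1:n_steps) out <- c(out, sin(i))

# Preallocating once keeps the loop linear in n_steps:
out <- numeric(n_steps)
for (i in seq_len(n_steps)) out[i] <- sin(i)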
2011 Oct 17 (1 reply): Need help with optimizing GlusterFS for Apache
Our webserver is configured as follows: the actual website files (PHP, HTML, CSS and so on) are on a dedicated non-GlusterFS ext4 partition. However, the website accesses videos and especially image files in a Gluster-mounted directory. The write performance of our backend Gluster storage is not that important, since it only comes into play when someone uploads a video or image. However, the files
2012 Feb 08 (4 replies): String position character replacement
Hi, Is there a way to efficiently replace specified indices in a string with another character? For example, if I had a vector of strings such as [1] "hellohowareyoudoing" [2] "imgoodhowareyou" [3] "goodandyou" [4] "yesimgoodijusttoldyou" [5] "ohyesthatsright" and had a list of positions that I want to replace with the character "-"
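A minimal sketch of one way to do this (the string vector is from the post, but the position list and helper name are made up for illustration): split each string into characters, overwrite the requested indices, and paste it back together.

x <- c("hellohowareyoudoing", "imgoodhowareyou", "goodandyou",
       "yesimgoodijusttoldyou", "ohyesthatsright")
pos <- list(c(1, 5), c(2, 3, 7), 4, c(1, 2), c(3, 8))   # hypothetical positions

mask_positions <- function(s, p, ch = "-") {
  chars <- strsplit(s, "")[[1]]
  chars[p] <- ch                      # overwrite the requested indices
  paste(chars, collapse = "")
}

mapply(mask_positions, x, pos, USE.NAMES = FALSE)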
2004 Mar 24 (0 replies): High/low level: Plot 2 time series with different axis (left and right)
Sun, 14 Mar 2004, Jan Verbesselt wrote: > Dear R specialists, > > I have two time series in a data.frame and want to plot them in the same > plot(), with the left axis scaled to time series 1 (-700,0) and the > right axis scaled to time series 2 (-0.2, 0.4). > > plot(timeserie1) > lines(timeserie2, col=c(2)) => this one should be scaled differently > with a new
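A minimal sketch of the usual base-graphics answer (not taken from the thread; timeserie1 and timeserie2 are re-created here with stand-in values): draw the first series, overlay the second with par(new = TRUE), and put its scale on the right-hand axis.

timeserie1 <- ts(runif(100, -700, 0))    # stand-in for the poster's data
timeserie2 <- ts(runif(100, -0.2, 0.4))

par(mar = c(5, 4, 4, 4) + 0.1)           # leave room for the right axis
plot(timeserie1, ylim = c(-700, 0), ylab = "time series 1")
par(new = TRUE)                          # overlay a second plot in the same region
plot(timeserie2, ylim = c(-0.2, 0.4), axes = FALSE, xlab = "", ylab = "", col = 2)
axis(4)                                  # right-hand axis scaled to series 2
mtext("time series 2", side = 4, line = 3)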
2009 Dec 07 (5 replies): CentOS 5.4 x86_64 only detects 32GB RAM while Fedora x86_64 correctly lists 128GB
Hi, we have a new 24-core Dell PowerEdge R905 server with 128GB of RAM. The 64-bit version of Fedora 12 lists the correct amount of 128GB; CentOS only finds 32GB (and so does Scientific Linux). I would much prefer to use CentOS (most of the software we use is specifically designed for CentOS). Does anyone know what is causing this or how to fix it? Many thanks, Diederick
2017 May 26 (1 reply): CentOS 6 dhcpd custom log issues
Hi all, I've got an issue with C6's dhcpd custom logging that I cannot figure out. Hopefully someone has an idea, or has seen a similar issue. We have dhcpd logging a custom header (DHCPUSER:) with MAC, IP, and Circuit-ID to /var/log/messages. I'll not bore you with the guts, so here's the beginning of that line in dhcpd.conf: if exists agent.circuit-id { log (info,