Displaying 20 results from an estimated 85 matches for "600mb".
2007 Mar 19
2
phpMyAdmin high Memory usage
I installed phpMyAdmin on CentOS 4.4.
When accessing phpMyAdmin the memory usage spikes: httpd jumps to
700MB, spawning 4-5 httpd processes (about 3GB of RAM), and MySQL
also climbs to close to 600MB in some cases.
This is a dual-core Xeon 3GHz machine with 4 GB of 667MHz RAM.
Should I be seeing this spike in memory usage? The spikes only last about
5 seconds, but my manager doesn't want to see those jumps.
The databases appear to be about 450MB-600MB.
On a side note I had a require...
2009 Jun 29
2
Large Stata file Import in R
Hi
I am using Stata 10 and I need to import a dataset from Stata 10 into R. I
have saved the dataset in lower versions of Stata as well, using the saveold
command in Stata.
My RAM is 4GB and the Stata file is 600MB. I am getting an error message
which says:
"Error: cannot allocate vector of size 3.4 Mb
In addition: There were 50 or more warnings (use warnings() to see the first
50)"
Thus far I have already tried the following
1. By right clicking on the R icon I have used --max-mem-size=1000M in...
2009 Jun 30
1
Stata file and R Interaction :File Size Problem in Import
Hi
I am using Stata 10 and I need to import a dataset from Stata 10 into R. I
have saved the dataset in lower versions of Stata as well, using the saveold
command in Stata.
My RAM is 4GB and the Stata file is 600MB. I am getting an error message
which says:
"Error: cannot allocate vector of size 3.4 Mb
In addition: There were 50 or more warnings (use warnings() to see the first
50)"
Thus far I have already tried the following
1. By right clicking on the R icon I have used --max-mem-size=10...
2003 Dec 03
2
memory leak (PR#5476)
...00 Pro
Submission from: (NULL) (24.229.106.55)
There appears to be a memory leak with the =RApply("qnorm",K56)*-1 function
from within Excel 2000. I have a spreadsheet with between 100 and 500 of the
above references. When the spreadsheet is first opened the system uses about
200MB, but it very quickly grows to over 600MB and appears to be unlimited.
Is there another way to run qnorm from within Excel with less memory?
2006 Sep 28
4
Trimming the fat out of a Centos 4.4 Installation
Hi, just to avoid re-inventing the wheel: is there any document that
can help me reduce a "minimum" installation of CentOS 4.4 even further
(BTW, can you really call 600MB minimal)?
I am in the process of creating a small CentOS-4.4-based Asterisk box
and I need to boot it from a CF card. Deleting useless packages will
help me do what I want.
Example: even a minimum install of CentOS 4.4 (or Red Hat EL, for what
it's worth) includes CUPS, among other things.
Any doc...
2005 Jul 29
10
Rails Wiki down
Howdy -
Someone's probably already reported this, but anyhow, going here:
http://wiki.rubyonrails.com/rails/show/RailsOnFedora
results in:
====================
> Bad Gateway
> The proxy server received an invalid response from an upstream server.
>
====================
Yours,
Tom
2007 Dec 19
2
speed and connection problems after samba upgrade - RH 5 -> RH 5.1, samba 3.0.23c -> 3.0.25b
...XP PC.
This problem occurs only from time to time, so it may also be a problem
on the client side or the network switch, so I did a test download from
an ftp server (ftp-stud.fht-esslingen.de) and I can download files at
up to 6MBytes(!) - that's o.k.
Copying files from the server (e.g. a 600MB ISO) takes about 60 seconds
- that's also o.k.
But opening small files on the server sometimes takes that long ...
My question is: could it be that the update includes some changes in
timeouts or locking functions? Which options should I check? Or are there
some cache files to be checked?
The log...
2006 Sep 14
2
Possible bug? indexWriter#doc_count counts deleted docs after #commit
...s show that #commit doesn't affect #doc_count,
even across ruby sessions.
On a different note, I'd like to request a variation of #add_document
which returns the doc_id of the document added, as opposed to self.
I'm trying to track down an issue with a large test index [600MB, 500k
docs] in which I need to update a document. The old document is deleted,
then added again, but doesn't show up in my searches.
A #doc_count on the writer before and after #add_document shows that the
index is 1 document larger, but I still can't #search for the updated
doc.
What do y...
2004 Feb 19
4
1024MB max memory on R for Windows XP?
I have 2GB installed on my Windows XP box running R 1.9.0, and after
performing a prune.tree(intree, newdata) I get an out-of-memory error within
R, but it says the maximum allowed is 1024MB (1/2 of what I have!). Can R
not use more than 1GB on an XP box? I noticed I had ~600MB left over after
R conked out, so clearly I had more memory... What about virtual memory?
--j
--
Jonathan Greenberg
Graduate Group in Ecology, U.C. Davis
http://www.cstars.ucdavis.edu/~jongreen
http://www.cstars.ucdavis.edu
AIM: jgrn307 or jgrn3007
MSN: jgrn307 at msn.com or jgrn3007 at msn.com
2011 Sep 12
2
Duration
Is it normal for rsync to take 3 hours on this transfer?
Number of files: 27419348
Number of files transferred: 19501
Total file size: 185.39G bytes
Total transferred file size: 195.92M bytes
Literal data: 195.68M bytes
Matched data: 241.09K bytes
File list size: 402.01M
File list generation time: 0.561 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 600.61M
Total bytes received:
2006 Apr 26
2
Memory usage and limit
...ched?)".
Three questions:
1) Why was so much memory and CPU consumed to read 300MB of data? Since
almost all of the variables are character, I expected almost a 1-1 mapping
between the file size on disk and that in memory.
2) Since this is a 64-bit build, I expected it could handle more than the
600MB of data I used. What does the error message mean? I don't believe the
vector length exceeded the theoretical limit of about 1 billion.
3) The original file was compressed and I had to uncompress it before the
experiment. Is there a way to read compressed files directly in R?
Thanks so much for y...
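On the last question, a compressed file need not be unpacked on disk first:
streaming decompression can feed the reader directly (R itself can do the
same with a gzfile() connection). A minimal shell sketch, with an invented
file name standing in for the real data:

```shell
# Build a small gzipped CSV as a stand-in for the compressed data file.
printf 'x,y\n1,2\n' | gzip > data.csv.gz
# Stream-decompress it without ever writing the uncompressed copy to disk.
gunzip -c data.csv.gz | head -n 2
```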
2017 Nov 06
2
samba 4.x slow ...
On Mon, Nov 06, 2017 at 09:35:06AM +0100, Dr. Peer-Joachim Koch via samba wrote:
> Setup: Server 2x10Gb NIC, client 1x1Gb:
>
> Using smbclient with -mSMB3 I get an average of 52000 KiloBytes/sec (before
> 45000),
>
> using NFS (v3) from the same server I get 108000 KiloBytes/sec and more
> (mount ; cp ; umount ; ...)
So I don't think it's the client. The smbclient
2006 Oct 05
1
Best way to compress *many* .spx files
...tized voice with a vocabulary of some 350,000 words.
Converting them from WAV to Speex obviously cut down the size a lot,
but I find that some more compression could be done. Specifically I
collect the resulting files in one big container file, and that one
compresses quite well (from about 600MB to about 130MB). I expect
that is due to redundant data in the headers and other OGG container
artifacts (framing etc.).
Any suggestions on how to proceed?
Sincerely,
Anders S. Johansen
2009 Oct 28
1
Regex matching that gives byte offset?
Hi,
Is there any way of doing 'grep' or something like it on the content of a
text file and extracting the byte position of the match in the file? I'm
facing the need to access rather largish (>600MB) XML files and would like
to be able to index them ...
Thanks for any help or flogging,
Joh
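For the byte-offset question above, GNU grep can report offsets itself: -b
prefixes each result with its byte offset, and -o narrows the output to each
individual match. A small sketch on a throwaway file (file name invented):

```shell
# Build a sample file; "alpha\n" occupies bytes 0-5, so the second line
# starts at byte offset 6 and the third at offset 17.
printf 'alpha\nbeta gamma\nbeta\n' > sample.txt
# -b: print the byte offset of each match; -o: print only the match itself.
grep -bo 'beta' sample.txt
# 6:beta
# 17:beta
```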
2000 Oct 21
2
scp and restarting transfers
Hi,
I have one question: is someone working on the ability to restart transfers in scp?
This would be a nice feature, especially when you want to download a huge file and
you lose the connection (at 90% of a 600MB file, as I had) :-(
--
Arkadiusz Miśkiewicz http://www.misiek.eu.org/ipv6/
PLD GNU/Linux [IPv6 enabled] http://www.pld.org.pl/
2005 Oct 25
1
Building wine-20050930 uses *a lot* of hd space
Yesterday, I noticed that I had lost about a gig of hd space without
knowing what had used that much. It turned out that it was the
wine-20050930 build directory that was using 850MB of hd space. I
pinned the problem down to the wine-20050930/dlls directory, which was
using about 600MB of hd space. For example, advapi32.dll.so was 1155k,
while advapi.c was only 8518 bytes and advapi.o was 149k. I tried
bzipping advapi32.dll.so, and its size went down to 279k. Then I tried
making the wine-20050930 directory into one big bzipped tar, and it
became 174MB.
Is it just me that finds this...
2008 Jul 05
1
Wanted: minimal install ks.cfg
Greetings, all.
I'm in need of a minimal ks.cfg file for the smallest possible install
with yum. I've got the scripting for yum to install the apps I need; I
just want to ensure all the cruft is not on the system as well. Using
the s-c-ks app, the smallest I have gotten is 600MB. This is for a
server appliance VM that I need to deploy quickly and dynamically.
Thanks
Dnaiel
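For reference, the usual way to shrink a kickstart install is the %packages
section of ks.cfg; a minimal sketch (group and option names are era-dependent
and should be checked against the target release):

```
%packages --nobase --excludedocs
@core
-cups
```

--nobase skips the Base group so essentially only Core lands on disk, and
individual unwanted packages can be subtracted with a leading dash.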
2016 Sep 01
3
2.2.25 dumps core with "Panic: file imap-client.c: line 837 (client_check_command_hangs): assertion failed: (client->io != NULL)"
...1 11:50:13 surz113 dovecot: [ID 583609 mail.crit] imap(user):
> Fatal: master: service(imap): child 11227 killed with signal 6 (core not
> dumped - set service imap { drop_priv_before_exec=yes })
>
> This happens with different users, the last one with a relatively small
> mailbox of 600MB.
>
> doveconf -n is attached.
>
> Dovecot 2.2.25
> OS: Solaris 11 (SunOS 5.11 11.3 i86pc i386 i86pc)
> Virtualization: VMware
> Filesystem: ZFS
> active users: ~4000
>
> The system was transferred at the beginning of last week from an old
> SPARC-station with Solaris 10...
2007 Dec 17
4
Torrent: reminder to use it folks!
Well, there are so few going right now that I'm showing 38 days to get the
DVD. My normal download from a mirror runs at approx. 600Mb/sec.
I'll wait until most of the U.S. goes home before I give up and use the
normal download though.
Here's hoping...
--
Bill