Displaying 20 results from an estimated 92 matches for "gbs".
2006 May 02
1
pairwise.t.test: empty p-table
Hi list-members
Can anybody tell me why
> pairwise.t.test(val, fac)
produces an empty p-table, as shown below:
Pairwise comparisons using t tests with pooled SD
data: val and fac
AS AT Fhh Fm Fmk Fmu GBS Gf HFS Hn jAL Kol R_Fill
AT - - - - - - - - - - - - -
Fhh - - - - - - - - - - - - -
Fm - - - - - - - - - - - - -
Fmk - - - - - - - - - - - - -
Fmu - - - - - - - - - - - -...
2007 May 11
1
Writing files > 2GB from Windows
...Samba shares, when OTHER Windows applications on the same machine do not
have difficulty writing large files to the same Samba share? And when
the underlying Linux filesystem supports very large files?
I have sometimes even found that a SINGLE Windows application can write
files larger than 4 GBs while performing SOME operations, but while
performing OTHER operations, when a file gets to 2GB or 4GB, you get
back a message saying "reached file size limit" or something similar.
And those same operations don't cause any trouble when writing > 4GB
files to a local hard driv...
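One way to check whether a given path actually supports files past the 2 GB / 4 GB boundaries is to seek beyond the mark and write a single byte; the file stays sparse, so almost no disk is consumed. A minimal Python sketch (the directory argument is hypothetical, and for the Samba case the probe would point at the mounted share, not a local path):

```python
import os
import tempfile

def supports_large_files(directory="."):
    """Probe whether `directory` can hold a file past the 32-bit (4 GiB) mark.

    Seeks past 2**32 and writes one byte; the file is sparse, so nearly no
    disk space is actually used. Returns False on any OS-level refusal.
    """
    fd, path = tempfile.mkstemp(dir=directory)
    try:
        os.lseek(fd, 2**32, os.SEEK_SET)  # jump past the 4 GiB boundary
        os.write(fd, b"\0")
        return os.fstat(fd).st_size == 2**32 + 1
    except OSError:
        return False
    finally:
        os.close(fd)
        os.unlink(path)
```

If this returns True for the share but an application still hits a "file size limit" at 2 GB or 4 GB, the limit is likely in that application's own I/O calls rather than in Samba or the filesystem.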
2007 Jan 21
2
using rsync 3.0.0 CVS version
...king with this CVS version because the new
"incremental-recursion algorithm" is just what I need. But I ran into a
problem...
When I start the rsync, either with the rsync protocol or rsh, I found that
it'll start doing the rsync and just halt after a few hundred MBs or even up
to a couple of GBs.
I was never able to finish an rsync.
Anyone else had this problem?
Shai
2007 Feb 12
3
processing a large matrix
...have 10,000 columns, the loops (10,000 * 10,000) take forever even if
there is no formula inside.
Then, I attempted to vectorize my code:
> cor(matrix)^2
With 10,000 columns, this works great. With 30,000, R tells me it cannot
allocate a vector of that length even if the memory limit is set to 4 GBs.
Is there anything else I can do to resolve this issue?
Thanks.
--
View this message in context: http://www.nabble.com/processing-a-large-matrix-tf3216447.html#a8932591
Sent from the R help mailing list archive at Nabble.com.
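For scale: cor() on an n-column matrix must allocate an n x n matrix of doubles just for the result. A back-of-envelope check (an editorial sketch, assuming 8 bytes per double and ignoring R's intermediate copies):

```python
def cor_matrix_bytes(n_cols):
    """Bytes needed for the n x n double-precision matrix that cor() returns."""
    return n_cols * n_cols * 8

for n in (10_000, 30_000):
    gib = cor_matrix_bytes(n) / 2**30
    print(f"{n:>6} columns -> {gib:.1f} GiB for the result alone")
```

At 30,000 columns the result alone is roughly 6.7 GiB, so the allocation failure under a 4 GB memory limit is expected before squaring even enters the picture, while the 10,000-column case fits comfortably.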
2012 Mar 22
3
Memory Utilization on R
Hello,
I have a 32 GB RAM Mac Pro with a 2*2.4 GHz quad core processor and 2TB
storage. Despite having so much memory, I am not able to get R to
utilize much more than 3 GBs. Some of my scripts take hours to run but I
would think they would be much faster if more memory is utilized. How do I
optimize the memory usage on R by my Mac Pro?
Thank you!
Kurinji
2016 Jul 28
2
ElasticSearch Logrotate not working
...y /etc/logrotate.d folder
/var/log/elasticsearch/*.log {
daily
rotate 100
size 50M
copytruncate
compress
delaycompress
missingok
notifempty
create 644 elasticsearch elasticsearch
}
And I notice that log files are still being generated that are upwards of 7
or 8 GBs. Can anyone point out to me where the script is going wrong, and
why log files for ES are growing so incredibly big? I would think that
having that logrotate script in place should solve that problem.
Thanks,
Tim
--
GPG me!!
gpg --keyserver pool.sks-keyservers.net --recv-keys F186197B
2004 Sep 20
2
CallerID in Queue
How can I bring up the Caller ID when calls enter the call queue and are
answered by X-Lite or kphone?
I've tried many configurations, but no luck; it only shows the AgentLogin's
exten..
Thanks!
R Wong
2019 Apr 24
2
Systemd, PHP-FPM, and /cgi-bin scripts
...t
> that runs via php-fpm and not via "standard" CGI?
Because "normal" PHP processes all of the POST data in memory and is thereby
constrained to the limit of available memory, typically in the range of a few
MB. This makes it impossible to upload LARGE files, e.g. 100s of MB or GBs in
size.
The cgi-bin workaround works because the CGI script has direct access to stdin
and thus can process the input in chunks without using a large amount of
memory.
But... if it can't maintain session state, then I cannot get the uploaded data
in the right place, nor validate the us...
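The chunk-at-a-time stdin reading that the cgi-bin workaround depends on can be sketched like this (an illustration, not the poster's actual script; the function name and chunk size are invented):

```python
import hashlib
import sys

CHUNK = 64 * 1024  # fixed-size reads keep memory use constant

def spool_stdin(out_path, stream=None):
    """Copy a request body from stdin to a file in chunks.

    Returns (total_bytes, sha256_hexdigest) so the caller can validate the
    upload without ever holding the whole body in memory.
    """
    if stream is None:
        stream = sys.stdin.buffer
    digest = hashlib.sha256()
    total = 0
    with open(out_path, "wb") as out:
        while True:
            chunk = stream.read(CHUNK)
            if not chunk:
                break
            out.write(chunk)
            digest.update(chunk)
            total += len(chunk)
    return total, digest.hexdigest()
```

Peak memory is one chunk (64 KiB here) regardless of whether the body is 100 MB or several GBs, which is exactly why direct stdin access sidesteps the in-memory limit.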
2009 Mar 10
3
CentOS 5.2: I'm having a problem with GNOME
Hello to all,
I am running a web server, mail server, MySQL, and name server
on a single machine. My OS is CentOS 5.2 with a GNOME desktop.
Everything was working fine until a couple of days back, when I was
using a text editor, Firefox, and GIMP at the same time and my mouse
started to act funny and became very slow; even my keyboard cursor
moves very slowly. I feel it is some sort of memory problem, but I have
noticed the
2009 Dec 11
4
[LLVMdev] Old DOUT
...splays the
output at program termination if requested. By default output
gets generated immediately, just like errs().
I will add a flag -debug-buffer-size=N to set the buffer and turn
on delayed output. This is super useful when trying to debug
very large codes. I have had debug output consume GBs of disk space.
This avoids that problem but it only works if all current debug
output goes to the new stream.
As I said, by default there is no change in behavior. dbgs() works
very similarly to the formatted_raw_ostream in that it uses errs()
underneath to do the actual output and only does the...
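The "buffer the most recent debug output, dump it at termination" idea is easy to sketch outside LLVM. A line-based Python analogue (not LLVM's actual implementation; the class name and default buffer size are invented):

```python
import atexit
import sys
from collections import deque

class BufferedDebug:
    """Keep only the most recent `maxlen` debug lines; flush them at exit.

    A rough analogue of a delayed debug stream: instead of every message
    hitting the terminal or disk immediately, a bounded buffer retains just
    the tail, so debugging very large runs cannot consume GBs of space.
    """
    def __init__(self, maxlen=1000, out=sys.stderr):
        self.lines = deque(maxlen=maxlen)  # old lines fall off the front
        self.out = out
        atexit.register(self.flush)

    def write(self, line):
        self.lines.append(line)

    def flush(self):
        for line in self.lines:
            self.out.write(line + "\n")
        self.lines.clear()
```

With the buffer disabled (maxlen effectively unbounded, flushed eagerly) you get the default immediate-output behavior; with it enabled, disk usage is capped at the buffer size no matter how chatty the debug output is.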
2012 Jun 26
1
[LLVMdev] reducing llc's memory consumption
We are processing some fairly large, e.g. 10s of MB, bitcode files with llc,
which result in peak memory use of several GBs.
We would like to ameliorate this somewhat.
On one end of the spectrum we could look into reducing the size of common
data structures and local space optimization.
On the other end we could try to switch the MCAssembler from a model
where it processes the entire Module at once, to something more li...
2005 Dec 02
2
/var partition and recovery
greetings
bryan, did you say that in your experience the /var partition should _not_
be on the extended partition table for "recovery" purposes?
- rh
2008 Feb 19
1
--o-direct option
...orks because VMware saves disks as a series of 2GB files and
creating a snapshot creates a new set of files that contain
modifications from the originals.
The problem is that during the rsync process the user's machine is
barely usable. The reason is because rsync reads these 2GB files... many
GBs of them. This causes the user's machine to repeatedly trash the page
cache. This really is Linux's fault. It should realize the relative
priority of the two apps and prevent rsync from trashing the cache. But
it doesn't.
Allowing rsync to specify O_DIRECT would circumvent the page cac...
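Short of O_DIRECT support in rsync itself, a bulk reader can at least tell the kernel to evict the pages it has just consumed. A sketch using posix_fadvise with POSIX_FADV_DONTNEED (an editorial workaround sketch, not something rsync does; Linux/POSIX only, and the file names are placeholders):

```python
import os

CHUNK = 1 << 20  # read 1 MiB at a time

def cache_friendly_copy(src, dst):
    """Copy src to dst, advising the kernel to drop src's pages as we go,
    so a bulk copy does not push everything else out of the page cache."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        offset = 0
        while True:
            data = fin.read(CHUNK)
            if not data:
                break
            fout.write(data)
            if hasattr(os, "posix_fadvise"):
                # Evict the range we just read (no-op on platforms without it).
                os.posix_fadvise(fin.fileno(), offset, len(data),
                                 os.POSIX_FADV_DONTNEED)
            offset += len(data)
```

The advice is a hint, not a guarantee, but on Linux it is usually enough to keep a multi-GB copy from flushing the working set of interactive applications.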
2005 Mar 15
0
Documentation on Displaying Quotas
...ch member of the group had his/her own directory.
All files that went into either the group's directory or the user's
sub-directories were set (via a sticky GID) to always belong to the particular Linux
group.
So, on the Linux side, the quotas worked perfectly. Set the quota to 200 GBs,
and when the total of the files stored in the Group's directory, including the
user's subdirectories, reached 200 GBs, no more files could be written to the
Group or User directories.
The question was, how to make My Computer or Explorer show how much space
the group had left when access...
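What the group quota counts can be approximated by walking the tree and summing the sizes of files owned by the group. A rough Python sketch (the path and quota figure are illustrative; real quota accounting is done in filesystem blocks, so this slightly under-counts):

```python
import os

QUOTA_BYTES = 200 * 10**9  # the 200 GB group quota described above

def group_usage(root, gid):
    """Sum the sizes of all files under `root` owned by group `gid`,
    roughly what the quota system charges against that group."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                st = os.lstat(os.path.join(dirpath, name))
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if st.st_gid == gid:
                total += st.st_size
    return total
```

Remaining space for the group is then approximately `QUOTA_BYTES - group_usage("/path/to/group", gid)`; exposing that number to Windows clients so My Computer shows it is a separate (Samba-side) problem.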
2013 Oct 31
1
Re: libvirt-lxc driver on armv7l
Oh, sorry. I am trying to build libvirt on Tizen with gbs.
Unfortunately Tizen doesn't have the packages from the "snip list of rpms":)
I tried to install dependencies from http://ftp.pbone.net/pub/fedora/linux/development/20/armhfp/os/Packages/ but that just doesn't work...
Thanks!
Jan
-----Original Message-----
From: Daniel P. Berr...
2004 Jun 15
1
How do you properly use "--partial"?
[background: we rsync GBs of data over our WAN, so we want to run rsync as
efficiently as possible. We have Linux "rsync servers" that mount local
Windows file servers - i.e. we use Linux rsync to replicate data between
Windows file servers. (Why? We found the Linux IP stack to be superior over
our WAN.)]
I know that "...
2006 Sep 13
1
(no subject)
...set by peer)
[2006/09/12 09:19:47, 1] smbd/service.c:make_connection_snum(642)
2)
From a Win 2000 Pro PC I cannot log in as the same user that works on the
XP. It says something like there is not enough space on the server and it
cannot create the profile, but that's impossible; I have more than GBs free on my
server.
Pleeeease, help me! :-)
Thanks a lot in advance
Stefano
2012 Oct 10
1
glmmPQL and spatial correlation
...makes me think that I
need a larger dataset than 10,000 points.
Does someone have a suggestion on how to improve/run this code with the
spatial correlation for a larger dataset than 10,000 which wouldn't take
weeks to run?
I'm working with an Intel Core i7, 12 GB RAM (plus a couple of hundred GBs in
virtual memory), in Windows 7 64-bit.
Thanks for your help,
----------------------------------------
Luis A. Huckstadt, Ph.D.
Department of Ecology and Evolutionary Biology
University of California Santa Cruz
Long Marine Lab
100 Shaffer Road
Santa Cruz, CA 95060
2016 Sep 09
2
Extracting files from OVA is bad
...and processed
afterwards. In a normal situation the user can have up to three copies of
the VM on his drive at the end of import:
* the original OVA,
* temporary extracted files (will be deleted when virt-v2v terminates),
* the converted VM.
This is not a good idea for large VMs that are hundreds of GBs or even
TBs in size. The requirements on the necessary storage space can be
lessened with proper partitioning. I.e. source OVA and converted VM
don't end up on the same drive and TMPDIR is set to put even temporary
files somewhere else. But this is not a general solution. And sometimes
the nece...
2009 Jan 01
20
Large server, Xen limitations
Hi,
we're contemplating getting a large new server, where we will run a number of
virtual servers. Are there any things we need to keep in mind in that case?
Are there limitations on what a Xen system can manage?
We're talking about a 4 x quad-core CPU server with 64 GBs of RAM and a
couple of terabytes of RAIDed SATA storage.
-Morten
(Re-sending this, as the first message didn't seem to go through.)