Displaying 20 results from an estimated 77 matches for "24gb".
2010 Mar 04
6
XCP 64 bits ?
Hello,
I am just installing XCP 0.1.1 on a server with 24GB RAM.
XCP is normally a 64-bit version, but:
* *uname -a*
Linux node012 2.6.27.42-0.1.1.xs0.1.1.737.1065xen #1 SMP Fri
Jan 15 16:20:16 EST 2010 i686 i686 i386 GNU/Linux
(not an x86_64 version!)
* *cat /proc/meminfo*
MemTotal: 746496 kB
MemFree:...
2003 May 20
1
Allocation problem
Hi,
I have a problem when using R 1.7.0. I have a very large dataset, but I
also have access to 24GB RAM on a super-computer. Nevertheless, when
using the "lme" function I get the error message
Error: cannot allocate vector of size 224295 Kb
As I understand it, the default is that there is no memory limit in R
other than machine resources, but somehow R does not get access to the
24...
2010 May 21
2
fsck.ocfs2 using huge amount of memory?
...with ocfs2 1.2.9; the new servers have ocfs2 1.4.3 installed. Part of the refresh process is to run fsck.ocfs2 on the volume being recovered, but right now, as I am trying to run it on our 700GB volume, it shows a virtual memory size of 21.9GB and a resident size of 10GB, and it is killing the machine with swapping (24GB of physical memory).
Can anyone enlighten what is going on?
Ulf.
2012 Dec 01
3
6Tb Database with ZFS
Hello,
I'm about to migrate a 6Tb database from Veritas Volume Manager to ZFS. I
want to set the arc_max parameter so ZFS can't use all my system's memory, but I
don't know how much I should set. Do you think 24Gb will be enough for a 6Tb
database? Obviously the more the better, but I can't set too much memory.
Has anyone successfully implemented something similar?
We ran some tests and the memory usage was as follows:
(with arc_max at 30Gb)
Kernel = 18Gb
ZFS DATA = 55Gb
Anon = 90Gb
Page Cache = 10Gb
Free...
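On Solaris-derived systems, the cap discussed above is usually set in /etc/system; a minimal sketch, assuming an Illumos/Solaris kernel, using the 24Gb figure from the question:

```
* Cap the ZFS ARC at 24GB: 24 * 1024^3 = 25769803776 bytes.
* A reboot is required for /etc/system changes to take effect.
set zfs:zfs_arc_max = 25769803776
```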
2012 Jun 26
6
Universal server hardware platform - which to choose?
...U;
- hot swap for disks.
We are looking for a solution in which we would be able to deploy a
basic server with, for example, 2 SATA disks, 8GB of RAM, 1 NIC, and 4 cores,
and to be able to use the same enclosure and motherboard and extend it
to deploy a more heavy-duty server with, for example, 6 SATA disks, 24GB
of RAM, 2 NICs, and 8 cores.
Which manufacturer can you recommend, and why? We are looking for
something inexpensive but reliable, with good support.
All servers will be based on CentOS5/6 :)
Best regards,
Rafal Radecki.
2012 Aug 05
3
Memory limit for Windows 64bit build of R
Dear all
I have a Windows Server 2008 R2 Enterprise machine, with 64bit R installed
running on 2 x quad-core Intel Xeon 5500 processors with 24GB of DDR3 1066 MHz
RAM. I am seeking to analyse very large data sets (perhaps as much as
10GB), without the additional coding overhead of a package such as
bigmemory.
My question is this - if we were to increase the RAM on the machine to
(say) 128GB, would this become a possibility? I have read...
2012 Jun 26
4
increase the usage of CPU and Memory
...online for help making my R code run more efficiently
for almost a whole day; however, there is no solution to my case. So if
anyone could give any clue to solve my problem, I would very much
appreciate your help. Thanks in advance.
Here is my issue:
My desktop has an i7-950 quad-core CPU with 24Gb of memory and an NVIDIA GTX
480 graphics card, and I am using a 64-bit version of R under 64-bit
Windows.
I am running a "for" loop to generate a 461*5 matrix of data, which comes
from the coefficients of 5 models. The loop produces 5 values at a
time, and it will run 461 times in to...
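The loop pattern described above can be sketched outside R; a minimal Python illustration, where `fit_models` is a hypothetical stand-in for whatever fits the 5 models at each step. The point is to preallocate the 461*5 result once rather than growing it inside the loop, which is a common cause of slow loops in both R and Python:

```python
def fit_models(i):
    # Hypothetical stand-in: returns the 5 coefficients for iteration i.
    return [i * k for k in range(1, 6)]

def collect_coefficients(n_iter=461, n_models=5):
    # Preallocate the full result matrix up front instead of appending
    # rows one at a time; each iteration just fills in its own row.
    result = [[0.0] * n_models for _ in range(n_iter)]
    for i in range(n_iter):
        result[i] = fit_models(i)
    return result

matrix = collect_coefficients()
```

In R the analogous move is to allocate the matrix before the loop and assign into rows, rather than rbind-ing each iteration's result.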
2017 Jun 02
4
sum() returns NA on a long *logical* vector when nb of TRUE values exceeds 2^31
...unt
is greater than 2^31. For example:
> xx <- runif(3e9)
> sum(xx < 0.9)
[1] NA
Warning message:
In sum(xx < 0.9) : integer overflow - use sum(as.numeric(.))
This already takes a long time and doing sum(as.numeric(.)) would
take even longer and require allocation of 24Gb of memory just to
store an intermediate numeric vector made of 0s and 1s. Plus, having
to do sum(as.numeric(.)) every time I need to count things is not
convenient and is easy to forget.
It seems that sum() on a logical vector could be modified to return
the count as a double when it cannot be rep...
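The 24Gb intermediate the poster wants to avoid can be sidestepped by counting in chunks; a rough Python sketch of the idea (not the R internals under discussion), accumulating into a plain integer so no large 0/1 vector is ever materialized and there is no 2^31 overflow:

```python
def count_below(values, threshold, chunk=1_000_000):
    # Walk the data in fixed-size chunks and keep a running count in a
    # Python int (arbitrary precision), so memory use stays bounded by
    # the chunk size regardless of how large the input is.
    total = 0
    for start in range(0, len(values), chunk):
        total += sum(1 for v in values[start:start + chunk] if v < threshold)
    return total

counts = count_below([0.1, 0.5, 0.95, 0.2], 0.9)
```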
2012 Jan 30
2
ode() tries to allocate an absurd amount of memory
...in vode(y, times, func, parms, ...) :
cannot allocate memory block of size 137438953456.0 Gb
In addition: Warning message:
In vode(y, times, func, parms, ...) : NAs introduced by coercion
This appears to be the case regardless of the computer I use; that is, whether
it's a laptop or a server with 24Gb of RAM. Why is ode() trying to allocate
137 billion gigabytes of memory?! (I receive exactly the same error message
whether I have, for example, 34000 or 80000 state variables: the amount of
memory trying to be allocated is exactly the same.) I have included a
trivial example below that uses a func...
2010 May 03
1
xentrace
...I've made some SAP benchmarks on my Xen system and discovered a huge difference in the performance of a "Xened" SAP system compared to a native SAP system. Hence, I tried to figure out what might cause this 'overhead' and ran a xentrace (listening to all events). Xentrace produced 24GB of data, and I converted it to 27GB of human-readable data. After I gathered the human-readable data, I filtered it and counted the appearance of each event. So far, so good.
Now comes the odd part: although I used paravirt guests, the xentrace tool reported HVM events in the trace data. Moreover, from my po...
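For reference, a trace like the one described is typically captured and decoded along these lines; the exact flags and the formats-file path are assumptions, so check xentrace(8) and xentrace_format(1) on the system in question:

```
# Capture events to a binary trace file (event mask flag assumed;
# consult xentrace(8) for the exact syntax on your Xen version):
xentrace -e all trace.bin
# Decode to human-readable text with the formats file shipped with Xen
# (path varies by distribution):
xentrace_format /usr/share/xen/formats < trace.bin > trace.txt
```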
2010 Dec 27
1
director in large(ish) environments
...I just switched our whole production setup to the director and am quite pleased with the result. We're doing a peak of about 25000 to 30000 concurrent sessions on 3 servers. But I shut 1 server down a couple of days ago to see what would happen, and 2 servers carried the load easily (16 CPU, 24GB memory servers). If others are using the director on larger setups, maybe we can all post when things do or don't work well.
Now to see if the solution is better than the problem :)
Cor
2011 Jan 03
1
changed datadir
...y double checked my rights and
found them to be correct, then I restarted mysqld and this has fixed it for
now.
Has anyone else run into this problem? I am concerned that it will happen
again if the actual cause is not corrected. I am running the latest Fedora
on new quad core Dell hardware with 24GB RAM and 1.7TB of disk. Any help
and/or suggestions much appreciated. Thanks.
Nick
2006 Oct 31
11
Dell 2900 with Xen
Hi,
Is someone using the Dell 2900 and PERC5 controller with Xen?
I want to test this server with 24GB RAM, 7 15000 RPM SCSI disks, and 2
processors.
If someone could tell me whether this hardware works well with Xen, that
would be nice.
Thanks.
Lucas
2016 Jan 27
2
Maildir to mdbox conversion size issue
Hello,
I have an issue with dovecot 2.2.13.
I want to change all Maildirs to mdbox format, and I do it piece by piece: I
change mail_location for the user (in sql), then I use doveadm sync -u
user path_to_maildir, and everything works mostly fine.
But now I found that when I migrate a quite huge Maildir (about 24G of
mails) to mdbox, after the sync the final mdbox is twice as big as the source
Maildir (48GB). I
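One thing worth trying when mdbox ends up larger than the source: mdbox keeps superseded and expunged message copies until a purge runs, so the doubled size may simply be unreclaimed space. A sketch, where the user name and path are placeholders (see doveadm-purge(1)):

```
# Reclaim space held by expunged/duplicate messages in the user's mdbox:
doveadm purge -u user@example.com
# Then re-check the on-disk size of the mdbox storage directory:
du -sh /path/to/user/mdbox
```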
2012 Dec 07
3
(no subject)
Hi All,
I have recently installed CentOS 6.3 with QEMU+KVM for Virtualization.
I have successfully created a Windows 2003 VM with 4GB of RAM. The host
server is an HP ML350 G8 with 24GB RAM and 24 cores. Details of one of
the cores are shown below:
processor : 23
vendor_id : GenuineIntel
cpu family : 6
model : 45
model name : Intel(R) Xeon(R) CPU E5-2620 0 @ 2.00GHz
stepping : 7
cpu MHz : 1200.000
cache size : 15360 KB
physical...
2011 Dec 20
1
Convert ragged list to structured matrix efficiently
Hi All,
I want to convert a ragged list of values into a structured matrix for
further analysis later on. I have a solution to this problem (below), but
I'm dealing with datasets up to 1GB in size (I have 24GB of memory, so I can
load it), and it takes a LONG time to run the code on a large dataset. I
was wondering if anyone had any tips or tricks that may make this run
faster?
Below is some sample code of what I've been doing (in the full version I
use snowfall to spread the work via sfSapply):
bhvs <-...
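For comparison, the ragged-to-matrix step itself can be done in a single pass; a minimal Python sketch of the approach (the thread is about R, so this only shows the shape of the algorithm): compute the maximum row length once, then pad every row up front rather than growing structures repeatedly.

```python
def ragged_to_matrix(rows, fill=None):
    # Pad every row to the width of the longest row, producing a
    # rectangular matrix in one pass over the data.
    width = max((len(r) for r in rows), default=0)
    return [list(r) + [fill] * (width - len(r)) for r in rows]

matrix = ragged_to_matrix([[1, 2, 3], [4], [5, 6]])
```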
2010 Jul 21
6
[Fwd: XCP - extreme high load on pool master]
Good day.
We are trying to test an XCP cloud under a production-like load (4 hosts, each with 24Gb of memory and 8 cores).
But with just about 30-40 virtual machines I get an extreme load on dom0
on the pool master host: the load average is about 3.5-6, and most of the time is
used by the xapi and stunnel processes.
It really bothers me: what happens at higher load, with a few thousand
VMs and about 10-16 hosts in p...
2009 Feb 05
1
Questions regarding journal replay
Today, I had to uncleanly shut down one of our machines due to an error
in 2.6.28.3. During the boot sequence, the ext4 partition /home
experienced a journal replay. /home looks like this:
/dev/mapper/volg1-logv1 on /home type ext4 (rw,noexec,nodev,noatime,errors=remount-ro)
Filesystem Size Used Avail Use% Mounted on
/dev/mapper/volg1-logv1 2,4T 1,4T 1022G 58% /home
Filesystem
2007 Mar 15
20
C'mon ARC, stay small...
...max = 0t1070318720
. . .
"size" is at 3GB, with c_max at 1GB.
What gives? I'm looking at the code now, but was under the impression
c_max would limit ARC growth. Granted, it's not a factor of 10, and
it's certainly much better than the out-of-the-box growth to 24GB
(this is a 32GB x4500), so clearly ARC growth is being limited, but it
still grew to 3X c_max.
Thanks,
/jim
2010 Nov 24
3
Boot 32GB Multi-partition Flash as USB-ZIP
I have a Pentium 4 machine that does not boot from my 32GB SanDisk
Cruzer. Its first partition is 24GB and FAT32, to serve as
cross-platform storage. There is a second partition of 7GB in EXT2
which is bootable and contains a Linux system armed with syslinux
(extlinux). This works fine booting off of recent laptops and desktops
alike.
This particular desktop has in its BIOS everything related to US...
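For BIOSes that only boot sticks in USB-ZIP mode, syslinux's usbkey documentation describes re-creating the stick with ZIP-drive geometry (64 heads, 32 sectors per track) and the filesystem on partition 4, as real ZIP drives have. A sketch based on that documentation; the device name is a placeholder, and this wipes the stick, including the 24GB FAT32 partition described above:

```
# Re-create the stick with ZIP geometry, filesystem on partition 4
# (mkdiskimage ships with syslinux):
mkdiskimage -4 /dev/sdX 0 64 32
# Install syslinux into the new fourth partition:
syslinux /dev/sdX4
```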