Displaying 5 results from an estimated 5 matches for "68gb".
2020 Feb 06
3
committing SAM database
...s. For the migration we used the 4.11 SerNet packages. We started
with 32GB of RAM and noticed that this was not enough. We raised the RAM
to 128GB. After reading all objects the system needed 28GB of RAM, but
then the "committing SAM database" process started and the RAM usage
increased to 68GB. Because we have to get the new domain running, we
stopped the migration after 28 hours. Because the original domain gets
all of its objects via an OpenLDAP directory (hip hip hooray), we
started with a new domain (I think that's the best way after 18 years)
and now we are pushing all objects into the new doma...
2011 Jul 25
1
CRAN mirror size mushrooming; consider archiving some?
...r and filled up the disk space the
server allotted me. I asked for more, then filled that up. Now the
system administrators want me to buy an $800 Fibre Channel card and a
storage device. I'm going to do that, but it does make me want to
suggest to you that this is a problem.
CRAN now is about 68GB, and about 3/4 of that is in the bin folder,
where one finds copies of compiled packages for Mac OS X and Windows.
If the administrators of CRAN moved the packages for R versions before,
say, 2.12, to long-term storage, then mirror management would be a bit
more, well, manageable.
Moving the R for windo...
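In the meantime, a partial mirror is one workaround. Below is a hedged
sketch rather than the poster's actual setup: it pulls from the
cran.r-project.org::CRAN rsync module used for mirroring, skips the
binary tree entirely, and the local path /srv/cran is a placeholder.

  # Hedged sketch of a source-only partial mirror: exclude the compiled
  # binary tree (roughly 3/4 of the ~68GB). /srv/cran is a placeholder.
  rsync -rtlzv --delete --exclude='/bin/' \
        cran.r-project.org::CRAN /srv/cran/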
2003 Aug 05
0
long time taken in building file-list
Hi,
I have a 68GB filesystem to be synchronized from one filesystem to another. However, it is taking more than 4 hours just to build the file-list, and it is still in the process of building the file-list, as shown below, together with my script. I broke down the rsync process by transferring the sub-files...
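One way to break the job down as the poster describes, sketched here
with hypothetical /data/src and /data/dst paths standing in for the real
filesystems, is to run one rsync per top-level subdirectory so no single
invocation has to build the full 68GB file-list:

  # Hedged sketch: one rsync per top-level directory keeps each file-list
  # small. /data/src and /data/dst are hypothetical stand-in paths, and
  # files sitting directly in /data/src would still need a separate pass.
  for d in /data/src/*/ ; do
      rsync -aH --delete "$d" "/data/dst/$(basename "$d")/"
  done

Newer rsync releases (3.0 and later) also start transferring while the
file-list is still being built, which shortens the long silent phase
described here.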
2020 Feb 03
2
committing SAM database
Can anyone give a hint how long it will take to commit the SAM database?
We are joining a Samba 4.11 server into a Windows domain:
Partition[DC=example,DC=de] objects[129301/57641] linked_values[40647/42351]
Done with always replicated NC (base, config, schema)
Replicating DC=DomainDnsZones,DC=example,DC=de
Partition[DC=DomainDnsZones,DC=example,DC=de] objects[83/79] linked_values[0/0]
Replicating
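Progress lines like the ones above are printed while a new DC joins and
replicates the directory partitions. A minimal, hedged sketch of such a
join follows; the domain, account, and DNS backend are placeholders, not
details taken from the post.

  # Hedged sketch of the kind of DC join that produces the replication
  # progress shown above; realm, credentials and backend are placeholders.
  samba-tool domain join example.de DC \
      -U 'EXAMPLE\administrator' \
      --dns-backend=SAMBA_INTERNAL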
2009 Jul 06
3
How to make big MySQL database more diffable/rsyncable? (aka rsyncing big files)
Hello group,
I'm having a very hard time efficiently rsyncing a MySQL database which
contains very large binary blobs.
(Actually, it's the database of the Mantis bug tracker
[http://www.mantisbt.org/], with file attachments stored directly in the
table rows. I know it's a bad idea for many other reasons, but let's
say it was given to me as such.)
First, I was dumping the
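A common approach for this situation, sketched here with placeholder
paths and a placeholder database name rather than anything from the
post, is to dump one INSERT per row so unchanged rows stay byte-identical
between dumps, and then let rsync update the previous copy:

  # Hedged sketch: row-per-INSERT dumps keep unchanged rows identical
  # between runs, giving rsync's delta transfer stable blocks to match.
  # 'mantis' and the /backup paths are placeholders.
  mysqldump --skip-extended-insert --order-by-primary mantis \
      > /backup/mantis.sql

  # --inplace updates the existing remote file instead of rebuilding it
  # in a temporary copy on the receiver.
  rsync -av --inplace /backup/mantis.sql remote:/backup/mantis.sql

If the dump has to be compressed as well, gzip builds that support the
--rsyncable option keep the compressed output delta-friendly too.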