Displaying 20 results from an estimated 24 matches for "2.3gb".
2003 Dec 19
3
partial transfer
I am attempting to use rsync to backup a Win98 laptop to a FreeBSD 4.8
backup server. I have experienced the same problem at roughly the same
point in the process on two occasions. The laptop contains ~2.7GB of data.
On the first attempt we received this error at 2.3GB and on the second at
2.4GB.
rsync error: partial transfer (code 23) at main.c(575)
Would love to have a full backup of the
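Exit code 23 is rsync's "partial transfer due to error" status. A common
workaround is to keep partial files and simply re-run rsync until it
completes, so each retry resumes rather than restarts. A minimal sketch in
Python, with hypothetical paths and illustrative flags (not taken from
this thread):

    # Re-run rsync until it completes; --partial keeps partially
    # transferred files so each retry resumes instead of restarting.
    import subprocess
    import time

    CMD = [
        "rsync", "-av", "--partial", "--timeout=60",
        "laptop:/data/", "/backups/laptop/",  # hypothetical paths
    ]

    for attempt in range(5):
        if subprocess.run(CMD).returncode == 0:
            break  # full transfer completed
        print(f"partial transfer, retrying ({attempt + 1}/5)")
        time.sleep(30)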
2018 Jan 10
4
Panic: file mail-index-util.c: line 37 (mail_index_uint32_to_offset): assertion failed: (offset < 0x40000000)
Hello,
I have a problem with a very large mail account.
The INBOX of this account holds roughly 1.1 to 1.7 million mails, and
from time to time the index cache causes problems and I get the
following in the logs:
--- Snip ---
Jan 10 20:43:04 XXXXX dovecot: imap(XXXXX): Panic: file
mail-index-util.c: line 37 (mail_index_uint32_to_offset): assertion
failed: (offset < 0x40000000)
--- Snap ---
I
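The asserted bound 0x40000000 is 2^30 bytes, i.e. 1 GiB. My reading (not
confirmed in this thread) is that Dovecot packs log-file offsets into a
marked 32-bit word, which caps usable offsets at 1 GiB, so an index or
cache file on a mailbox this large can cross the limit. The arithmetic,
as a quick Python check:

    # The asserted bound on offsets in mail_index_uint32_to_offset().
    LIMIT = 0x40000000
    assert LIMIT == 2**30
    print(LIMIT, "bytes =", LIMIT / 2**30, "GiB")  # 1073741824 bytes = 1.0 GiB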
2016 Feb 29
4
Possible Memory Savings for tools emitting large amounts of existing data through MC
On Mon, Feb 29, 2016 at 3:36 PM, Adrian Prantl <aprantl at apple.com> wrote:
>
> On Feb 29, 2016, at 3:18 PM, David Blaikie <dblaikie at gmail.com> wrote:
>
> Just in case it interests anyone else, I'm playing around with trying to
> broaden the MCStreamer API to allow for emission of bytes without copying
> the contents into a local buffer first (either because
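The thread is about avoiding that intermediate copy. A toy Python
illustration of the distinction (the concept only, not LLVM's MCStreamer
API):

    # Copying emitter vs. reference-keeping emitter for pre-existing data.
    data = bytes(range(256)) * 4096  # stand-in for existing section contents

    # Copying: duplicates every byte into a local buffer first.
    buf = bytearray()
    buf += data

    # Zero-copy: keep views over the original bytes and write them out
    # only at the end; nothing is duplicated in the meantime.
    chunks = [memoryview(data)]
    print(sum(len(c) for c in chunks) == len(data))  # True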
2016 Mar 01
0
Possible Memory Savings for tools emitting large amounts of existing data through MC
> On Feb 29, 2016, at 3:46 PM, David Blaikie <dblaikie at gmail.com> wrote:
>
>
>
> On Mon, Feb 29, 2016 at 3:36 PM, Adrian Prantl <aprantl at apple.com> wrote:
>
>> On Feb 29, 2016, at 3:18 PM, David Blaikie <dblaikie at gmail.com> wrote:
>>
>> Just in case it
2016 Sep 17
2
(Thin)LTO llvm build
So, when I embark on the next ThinLTO try build, probably this Sunday,
should I append -Wl,-plugin-opt,jobs=NUM_PHYS_CORES to LDFLAGS
and run ninja without -j or -jNUM_PHYS_CORES?
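One detail worth noting: "NUM_PHYS_CORES" means physical cores, which
most tools report as logical CPUs (hyperthreads included). A sketch of
getting the physical count in Python, assuming the third-party psutil
package is available:

    import os
    import psutil  # third-party: pip install psutil

    logical = os.cpu_count()                    # logical CPUs
    physical = psutil.cpu_count(logical=False)  # physical cores
    print(f"-Wl,-plugin-opt,jobs={physical} (of {logical} logical CPUs)")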
2007 Oct 09
1
problem mapping large files?
Hi all,
I'm trying to use Microsoft's ImageX with Wine as part of a small Linux
image to install Windows disk images.
It works great for a 120MB file; however, when I try it on a 2.3GB file,
it fails with "Invalid data".
The tool is designed for dealing with large files, so it looks like a
problem in Wine.
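One plausible reading (mine, not established in the thread) is a signed
32-bit size or offset in the mapping path: 120MB sits well below 2 GiB,
while 2.3GB crosses the 2^31 boundary, where a signed 32-bit value wraps
negative. The arithmetic:

    # 2.3 GB crosses the signed 32-bit boundary; 120 MB does not.
    import ctypes

    for size in (120 * 1024**2, int(2.3 * 1024**3)):
        as_i32 = ctypes.c_int32(size & 0xFFFFFFFF).value
        print(size, "->", as_i32)  # the 2.3GB size wraps to a negative value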
I reran it with +server,+relay which included the following in the
2007 Jan 18
1
NFS 64-bit client crash with 32-bit server
I have an unusual scenario where a new 64-bit multi-core Opteron
CentOS 4.4 client crashes when writing a large file on a 32-bit NFS
server. The
problem happens on the same client with either CentOS 4.4 or SuSE 9.3 32-bit
NFS servers. Additionally, the problem is reproducible with another
identical 64-bit CentOS 4.4 client. I'm not certain that the problem is
tied to a 32-bit (as opposed to
2016 Sep 17
5
(Thin)LTO llvm build
On Sun, Sep 18, 2016 at 12:32 AM, Mehdi Amini <mehdi.amini at apple.com> wrote:
>
>> On Sep 17, 2016, at 3:19 PM, Carsten Mattner <carstenmattner at gmail.com> wrote:
>>
>> So, when I embark on the next ThinLTO try build, probably this Sunday,
>> should I append -Wl,-plugin-opt,jobs=NUM_PHYS_CORES to LDFLAGS
>> and run ninja without -j or
2017 Sep 02
5
readLines() segfaults on large file & question on how to work around
Hi:
I have a 2.1GB JSON file. Typically I use readLines() and
jsonlite::fromJSON() to extract data from a JSON file.
When I try to read in this file using readLines(), R segfaults.
I believe the two salient issues with this file are
1). Its size
2). It is a single line (no line breaks)
I can reproduce this issue as follows
#Generate a big file with no line breaks
# In R
>
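Because the file is a single line, readLines() has to materialize one
2.1GB string. The usual escape is to read in fixed-size chunks and feed
an incremental parser, so no single string that large ever exists. A
sketch of the chunking idea in Python (filename hypothetical; the
thread's actual workaround, readr::read_file, appears below):

    # Stream a huge single-line file in fixed-size chunks instead of
    # asking for "one line" (which would be the entire file).
    CHUNK = 64 * 1024**2  # 64 MiB per read; tune to available memory

    total = 0
    with open("big.json", "rb") as f:  # hypothetical filename
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            total += len(chunk)  # feed an incremental JSON parser here
    print(total, "bytes processed without one giant string")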
2016 Mar 01
2
Possible Memory Savings for tools emitting large amounts of existing data through MC
On Mon, Feb 29, 2016 at 3:51 PM, Adrian Prantl <aprantl at apple.com> wrote:
>
> On Feb 29, 2016, at 3:46 PM, David Blaikie <dblaikie at gmail.com> wrote:
>
>
>
> On Mon, Feb 29, 2016 at 3:36 PM, Adrian Prantl <aprantl at apple.com> wrote:
>
>>
>> On Feb 29, 2016, at 3:18 PM, David Blaikie <dblaikie at gmail.com> wrote:
>>
>>
2017 Sep 02
1
readLines() segfaults on large file & question on how to work around
Thank you for your suggestion. Unfortunately, while R doesn't segfault
calling readr::read_file() on the test file I described, I get the error
message:
Error in read_file_(ds, locale) : negative length vectors are not allowed
Jen
On Sat, Sep 2, 2017 at 1:38 PM, Ista Zahn <istazahn at gmail.com> wrote:
> As a work-around I suggest readr::read_file.
>
> --Ista
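This error usually means a length was computed in a signed 32-bit
integer and overflowed: R's .Machine$integer.max is 2^31 - 1 =
2,147,483,647, and a "2.1GB" file measured in GiB is just past it. A
quick check of that interpretation (mine, not confirmed in the thread),
with a hypothetical byte count:

    # A length just past 2^31 - 1 wraps negative in a signed 32-bit int.
    import ctypes

    INT32_MAX = 2**31 - 1          # R's .Machine$integer.max
    file_bytes = 2_254_857_830     # hypothetical: 2.1 GiB in bytes

    print(file_bytes > INT32_MAX)                         # True
    print(ctypes.c_int32(file_bytes & 0xFFFFFFFF).value)  # negative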
2018 Jun 13
4
[lldb-dev] Adding DWARF5 accelerator table support to llvm
Hello again,
It's been nearly six months since my first email, so it's a good time
to recap what has been done here so far. I am happy to report that
stages 1-3 (i.e. producer/consumer in llvm and integration with lldb)
of my original plan are now complete with one caveat.
The caveat is that the .debug_names section is presently not a full
drop-in replacement for the .apple_*** sections.
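Both the .apple_* sections and DWARF5's .debug_names serve the same
purpose: a hashed index from a name to the offsets of its debug-info
entries, so a consumer can resolve names without scanning every DIE. A
toy model of that contract in Python (the real sections are hashed
on-disk structures, not dicts):

    # Toy accelerator table: name -> list of DIE offsets.
    from collections import defaultdict

    accel: dict[str, list[int]] = defaultdict(list)

    # Producer side: record each name with its DIE offset.
    accel["main"].append(0x2A)
    accel["main"].append(0x1337)  # a name may map to several DIEs

    # Consumer side (a debugger): direct lookup instead of a full scan.
    print([hex(off) for off in accel["main"]])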
2016 Feb 29
0
Possible Memory Savings for tools emitting large amounts of existing data through MC
> On Feb 29, 2016, at 3:46 PM, David Blaikie <dblaikie at gmail.com> wrote:
>
>
>
> On Mon, Feb 29, 2016 at 3:36 PM, Adrian Prantl <aprantl at apple.com> wrote:
>
>> On Feb 29, 2016, at 3:18 PM, David Blaikie <dblaikie at gmail.com> wrote:
>>
>> Just in case it
2005 May 21
5
copying large files over NFS locks up machine on -testing from Thursday
I've locked up my dom0 a couple of times this morning copying a 3GB
file from local disk to an NFS mount (neither xend nor guests running).
I don't encounter this problem on the stock CentOS 4 kernel. The
machine is a PowerEdge 2850 with 2 e1000 cards - the one in use is
connected to a PowerConnect 2216 10/100 switch and has negotiated
100Mbit. I'll check if the stock
2016 Mar 01
0
Possible Memory Savings for tools emitting large amounts of existing data through MC
> On Feb 29, 2016, at 4:10 PM, David Blaikie <dblaikie at gmail.com> wrote:
>
>
>
> On Mon, Feb 29, 2016 at 3:51 PM, Adrian Prantl <aprantl at apple.com> wrote:
>
>> On Feb 29, 2016, at 3:46 PM, David Blaikie <dblaikie at gmail.com> wrote:
>>
>>
>>
>> On
2006 Jan 05
4
Q: R 2.2.1: Memory Management Issues?
Dear Developers:
I have a question about memory management in R 2.2.1 and am wondering if you would be kind enough to help me understand what is going on.
(It has been a few years since I have done software development on Windows, so I apologize in advance if these are easy questions.)
-------------
MY SYSTEM
-------------
I am currently using R (version 2.2.1) on a PC running Windows 2000
2018 Jan 10
0
Panic: file mail-index-util.c: line 37 (mail_index_uint32_to_offset): assertion failed: (offset < 0x40000000)
Sorry, I forgot to mention that the storage format is mdbox and the
version is 2.2.27-3~bpo8+1 from jessie backports.
Kind regards
Stefan
2018-01-10 21:00 GMT+01:00 Stefan Neben <stefan.neben at gmail.com>:
> Hello,
>
> I have a problem with a very large mail account.
>
> The INBOX of this account holds roughly 1.1 to 1.7 million mails, and
> from time to time the index
2018 Jan 11
0
Panic: file mail-index-util.c: line 37 (mail_index_uint32_to_offset): assertion failed: (offset < 0x40000000)
On 10.01.2018 22:00, Stefan Neben wrote:
> Hello,
>
> I have a problem with a very large mail account.
>
> The INBOX of this account holds roughly 1.1 to 1.7 million mails, and
> from time to time the index cache causes problems and I get the
> following in the logs:
>
> --- Snip ---
> Jan 10 20:43:04 XXXXX dovecot: imap(XXXXX): Panic: file
> mail-index-util.c: line 37
2014 May 06
0
Some Information about compression rates to expect using zlib/xz compression
Compression: xz, level 6
Here are some compression rates I experienced:
Disk space usage:
user1: uncompressed 2.3 GB --> compressed 1 GB
user2: uncompressed 6.2 GB --> compressed 3.9 GB
Just for your reference.
Ing. Robert Nowotny
Rotek GmbH
Vienna/Austria
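The figures above work out to roughly a 2.3x ratio (about 56% saved) for
user1 and 1.6x (about 37% saved) for user2:

    # Compression ratios implied by the reported sizes.
    for user, raw, packed in [("user1", 2.3, 1.0), ("user2", 6.2, 3.9)]:
        print(f"{user}: {raw / packed:.1f}x, {1 - packed / raw:.0%} saved")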
2017 Sep 02
0
readLines() segfaults on large file & question on how to work around
As a work-around I suggest readr::read_file.
--Ista
On Sep 2, 2017 2:58 PM, "Jennifer Lyon" <jennifer.s.lyon at gmail.com> wrote:
> Hi:
>
> I have a 2.1GB JSON file. Typically I use readLines() and
> jsonlite::fromJSON() to extract data from a JSON file.
>
> When I try to read in this file using readLines(), R segfaults.
>
> I believe the two salient