Displaying 20 results from an estimated 258 matches for "largefile".
2003 Mar 16
2
> 2GB files on solaris with largefiles enabled!
Hello,
I'm facing a problem syncing files that are over 2GB in size. Even though I searched the archives, where it has been mentioned that this is possible if the largefiles option is enabled, it is still giving me problems, i.e.:
building file list ... readlink dir1/oracle_data1.tar: Value too large for defined data type
readlink dir1/oracle_data2.tar: Value too large for defined data type
readlink dir1/oracle_data3.tar: Value too large for defined data type
# du -...
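"Value too large for defined data type" is the strerror() text for EOVERFLOW, which the 32-bit stat()/lstat() family returns when a file's size will not fit in a 32-bit off_t. A minimal sketch of the effect, assuming a plain stat() call and borrowing the path from the report above (this is not rsync's code); building with 64-bit offsets, which is what the largefile configure option arranges, makes the same call succeed:

    #define _FILE_OFFSET_BITS 64   /* remove this to reproduce the error on a 32-bit build */
    #include <stdio.h>
    #include <sys/stat.h>
    #include <sys/types.h>

    int main(void)
    {
        struct stat st;

        /* path taken from the report above, purely as an example */
        if (stat("dir1/oracle_data1.tar", &st) != 0) {
            perror("stat");   /* EOVERFLOW: "Value too large for defined data type" */
            return 1;
        }
        printf("size: %lld bytes\n", (long long)st.st_size);
        return 0;
    }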
2004 Aug 02
1
HP-UX 11i and largefiles on rsync 2.6.2 (fwd)
...e:
> Would anyone who is seeing this problem please try out the patch that is
> attached to this bugzilla bug:
>
> https://bugzilla.samba.org/show_bug.cgi?id=1536
I gave it a test and it seems to work fine on HPUX 11.0. It skips the
problematic mkstemp() call.
It also passed the largefiles test that I sent to the list recently.
-- Steve
2009 Mar 19
1
largefile question
Hello,
Currently we're using version 1.0.13 with 32-bit file offsets.
Is it safe to switch to a new version with largefile support enabled?
We want to reuse the existing index/cache, or do we have to
expect errors with that?
Regards.
Martin
---------------------------------------------------------------
Martin Preen, Universität Freiburg, Institut für Informatik
Georges-Koehler-Allee 52, Raum 00-006, 79110 Freiburg, Ge...
2002 Jan 31
3
Error when compile rsync
Hi,
I tried to compile rsync-2.5.2 on a Solaris 5.7 Ultra-2 machine.
Solaris 5.7 on the Ultra-2 is running 64-bit, but when I ran configure,
it said no for largefile. I thought 2.5.2 would support large
files? Is that true?
Thanks,
Jennifer
2006 Jun 08
3
[Bug 1195] sftp corrupts large files on new macbook pro
...-------------------------------------
<GMT27-Apr-2006 20:27:29GMT> Darrin Bodner:
Summary:
Large files (on the order of 1GB) are sometimes (~50%) corrupted when
transferred using sftp on a new MacBook Pro (Mac OS X / Intel arch).
Steps to Reproduce:
Enable sshd.
$ sftp localhost
sftp> get largefile largefile.copy
$ md5 largefile
$ md5 largefile.copy
The results usually differ. Using diff(1) confirms this is not a bug
with md5. Subsequent transfers can have the correct checksum or
different incorrect checksums, suggesting the corruption does not
always happen in the same place. sftp'ing a...
2006 Oct 31
0
6366222 zdb(1M) needs to use largefile primitives when reading label
Author: eschrock
Repository: /hg/zfs-crypto/gate
Revision: e5f70a6fc5010aa205f244a25a9cdb950e0dae89
Log message:
6366222 zdb(1M) needs to use largefile primitives when reading label
6366267 zfs broke non-BUILD64 compilation of libsec
Files:
update: usr/src/cmd/zdb/zdb.c
update: usr/src/lib/libsec/Makefile
2006 Oct 31
0
4775289 fsck reports wrong state in superblock if there once has existed a largefile
Author: jkennedy
Repository: /hg/zfs-crypto/gate
Revision: 931f9f0e5d2fb0b5da5bc6c0f77c10e668f842cf
Log message:
4775289 fsck reports wrong state in superblock if there once has existed a largefile
6302747 Performance regression when deleting large files from a logging ufs filesystem
6362734 df output corrupt after under heavy stress
Files:
update: usr/src/cmd/fs.d/ufs/fsck/utilities.c
update: usr/src/uts/common/fs/ufs/lufs_map.c
update: usr/src/uts/common/fs/ufs/ufs_thread.c
2003 Aug 16
2
Problem copying >2GB files on HP-UX 11i with rsync 2.5.6
I have downloaded the source for rsync 2.5.6 and compiled it without
adjusting any options, with no problems. I performed a test copy of a file
greater than 2 GB and it failed. In the file configure I changed the line to
"--enable-largefile", removed all the .o files, and performed a ./configure,
gmake, gmake install. I tried the test copy again and received the following
error message:
rsync: writefd_unbuffered failed to write 32768 bytes: phase "unknown":
Broken pipe
rsync error: error in rsync protocol data stream (cod...
2005 Aug 31
0
throughput differences
Hi all,
I notice a big throughput difference between a normal user and root on
Dom-0. I ran the command scp largefile root@xx.yyy.zzz.sss:~ and got the
following results (test is a normal user):
From Dom-0 to Dom-1
Login root --> largefile 100% 4096MB 8.5MB/s 08:01
Login test --> largefile 100% 4096MB 2.9MB/s 23:46 <--???
From Dom-1 to Dom-0
Login root --> largefile 100% 4096MB...
2001 Apr 20
2
scp with files > 2gb
A while back someone posted a patch for scp that updates it to deal with
files > 2GB by using 64-bit offsets as defined by LFS (Large File Summit).
I believe the patch was tested on Linux but maybe not on other systems
that support largefiles.
I've tried this under Solaris and scp fails with a broken pipe on only the
second write to the pipe between scp and ssh if the file is over 2gb.
If the file is under 2gb it works fine.
It fails the second time around the for loop that looks like this in
scp.c:source():
for (haderr = i = 0;...
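For illustration only, a rough sketch (not scp's actual source() loop) of the kind of read-and-write-to-the-pipe loop involved; the point is that both the open() and the byte accounting need a 64-bit off_t once files pass 2GB:

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/types.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s file\n", argv[0]);
            return 1;
        }
        int fd = open(argv[1], O_RDONLY);
        if (fd < 0) {
            perror("open");
            return 1;
        }

        char buf[16384];
        off_t total = 0;                 /* must be 64-bit for >2GB files */
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0) {
            if (write(STDOUT_FILENO, buf, (size_t)n) != n) {
                perror("write");         /* where a "broken pipe" would show up */
                close(fd);
                return 1;
            }
            total += n;
        }
        close(fd);
        fprintf(stderr, "sent %lld bytes\n", (long long)total);
        return 0;
    }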
2005 Apr 01
1
HP-UX 11i and largefiles on rsync
...temp() if HP-UX were ever
fixed (HA!). This may falsely fail if the current filesystem doesn't
allow large files (common on HP systems), and if the system doesn't
support sparse files this test could fill up the filesystem.
4) Implement HP's suggested workaround of a fcntl() to set O_LARGEFILE
on the file handle returned by mkstemp() inside syscall.c. This is
essentially non-portable and indeed won't even work on HP-UX if
building a 64-bit binary.
5) Document the workarounds for HP-UX in the INSTALL file (e.g.
comment out HAVE_SECURE_MKSTEMP in config.h after running "configure...
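A minimal sketch of workaround (4) above, assuming HP-UX's fcntl() accepts O_LARGEFILE as a status flag; as the message says, this is non-portable and is shown only as an illustration, and the temp-file template here is hypothetical:

    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(void)
    {
        char templ[] = "/tmp/rsyncXXXXXX";   /* hypothetical template */
        int fd = mkstemp(templ);
        if (fd < 0) {
            perror("mkstemp");
            return 1;
        }
    #ifdef O_LARGEFILE
        int flags = fcntl(fd, F_GETFL);
        if (flags < 0 || fcntl(fd, F_SETFL, flags | O_LARGEFILE) < 0)
            perror("fcntl(O_LARGEFILE)");    /* HP's suggested, non-portable step */
    #endif
        unlink(templ);
        close(fd);
        return 0;
    }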
2006 Dec 05
4
flac-1.1.3 fails to decode where flac-1.1.2 works
I'm attempting to decode part of a largefile flac whose seektable is
broken or missing and the file is shorter than it's supposed to be;
when I use 1.1.2 it decodes fine, when I use 1.1.3 it fails:
flac --decode --skip=719:58.0 --until=1024:58.0 -o
/home/sbh/work/hs/out.wav
/mnt/ss/sdb/Sound/Recording/2006-11-29/in.flac
out...
2007 Aug 27
1
fix for broken largefile seek() on 32-bit linux (PR#9883)
Full_Name: John Brzustowski
Version: R-devel-trunk, R-2.4.0
OS: linux
Submission from: (NULL) (206.248.132.197)
DESCRIPTION
seek() on files larger than 2 gigabytes fails for large values of "where" on
i386 linux 2.6.13 (and presumably other 32-bit unix-like platforms).
e.g.:
> f<-file("3gigabytefile.dat", "rb")
> seek(f, 3e9, "start",
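As a general illustration of the underlying C issue (not R's actual patch): fseek() takes a long offset, which cannot hold 3e9 on 32-bit Linux, while fseeko() with a 64-bit off_t can. The file name is just the one from the report:

    #define _FILE_OFFSET_BITS 64
    #include <stdio.h>
    #include <sys/types.h>

    int main(void)
    {
        FILE *f = fopen("3gigabytefile.dat", "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }
        off_t where = 3000000000LL;              /* ~3e9, beyond a 32-bit long */
        if (fseeko(f, where, SEEK_SET) != 0)
            perror("fseeko");
        else
            printf("now at offset %lld\n", (long long)ftello(f));
        fclose(f);
        return 0;
    }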
2010 Jan 22
0
Removing large holey file does not free space 6792701 (still)
...NAME USED AVAIL REFER MOUNTPOINT
zpool01 123K 5.33T 42.0K /zpool01
filer01a:/$ df -h /zpool01
Filesystem Size Used Avail Use% Mounted on
zpool01 5.4T 42K 5.4T 1% /zpool01
filer01a:/$ mkfile 1024G /zpool01/largefile
^C
filer01a:/$ zfs list zpool01
NAME USED AVAIL REFER MOUNTPOINT
zpool01 160G 5.17T 160G /zpool01
filer01a:/$ ls -hl /zpool01/largefile
-rw-...
2002 Apr 02
1
ext3 fs unable to stat
Hi,
I am running RH 7.2 (Professional release, 2.4 kernel). I have an ext3 file
system which stores large Oracle "dump files" over 4GB. We write to it via
NFS, which is working great. The problem is that when we issue a "file
largefile.dmp" from the Linux box, it fails on anything over 4GB, stating
can't stat 'largefile.dmp' (value too large for defined data type). We can
run the same command on the NFS mount on an HP-UX 11.0 server and it works fine.
We have "journaling" turned on, what are we missing on t...
2006 Dec 11
1
flac-1.1.3 fails to decode where flac-1.1.2 works
On 12/11/06, Josh Coalson <xflac@yahoo.com> wrote:
> --- Avuton Olrich <avuton@gmail.com> wrote:
> > I'm attempting to decode part of a largefile flac whose seektable is
> > broken or missing and the file is shorter then it's supposed to be,
> > when I use 1.1.2 it decodes fine, when I use 1.1.3 it fails
> >
> > flac --decode --skip=719:58.0 --until=1024:58.0 -o
> > \/home\/sbh\/work\/hs\/out.wav
> > \...
2003 Mar 26
3
Transfer files bigger than 1383275520 bytes - Rsync compiled in cygwin
...d:
building file list ... done
wrote 114 bytes read 20 bytes 3.01 bytes/sec
total size is 1383275520 speedup is 10322951.64
I've tried with other big files and rsync truncates them to the same size, so I
recompiled rsync to enable large file support, but I failed again:
sh ./configure --enable-largefile --prefix=/usr
In the output you can see:
checking for broken largefile support... no
checking for special C compiler options needed for large files... no
checking for _FILE_OFFSET_BITS value needed for large files... no
checking for _LARGE_FILES value needed for large files... no
$ rsync --versi...
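Those "checking for ..." lines are configure probing whether some macro makes off_t 64 bits wide. A rough sketch of the idea behind such a probe (not the literal autoconf test program):

    /* Compiles only if the macro gives a 64-bit off_t. */
    #define _FILE_OFFSET_BITS 64
    #include <sys/types.h>

    /* Compilation fails here if off_t is still only 32 bits. */
    typedef char off_t_is_64_bits[(sizeof(off_t) >= 8) ? 1 : -1];

    int main(void)
    {
        return 0;
    }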
2004 Sep 06
1
Fixing libvorbisfile to handle largefiles
[I'm not online regularly, so don't include me in any replies.
Looks like the archives are working at present, so I'll catch
up from them when I get a chance. Thanks...]
(Trying out vorbis-dev@ instead of vorbis@ where I sent my
previous messages; if this is the wrong place, correct me)
Greetings.
Some weeks ago, I submitted several hacks which gave me the
ability to play Ogg
2001 Sep 10
4
scp doesn't work with large (>2GB) files
Hi,
A bug I've had reported is that scp doesn't work with large files
(Debian bug number 106809). The problem seems to be scp.c:504:
if ((fd = open(name, O_RDONLY, 0)) < 0)
Is there some reason why making that
if ((fd = open(name, O_RDONLY|O_LARGEFILE, 0)) < 0)
would break things? It seems a simple fix to me...
Thanks,
Matthew
--
"At least you know where you are with Microsoft."
"True. I just wish I'd brought a paddle."
http://www.debian.org
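For reference, a minimal standalone sketch of the change being asked about (hypothetical, not the actual scp.c patch): guard O_LARGEFILE so the same line still builds on platforms that don't define it. The other route is building with a 64-bit off_t so plain O_RDONLY already handles >2GB files.

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    #ifndef O_LARGEFILE
    #define O_LARGEFILE 0            /* platform is already large-file capable */
    #endif

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s file\n", argv[0]);
            return 1;
        }
        int fd = open(argv[1], O_RDONLY | O_LARGEFILE, 0);
        if (fd < 0) {
            perror("open");          /* EOVERFLOW on >2GB files without the flag */
            return 1;
        }
        printf("opened %s fine\n", argv[1]);
        close(fd);
        return 0;
    }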