search for: largefiles

Displaying 20 results from an estimated 258 matches for "largefiles".

2003 Mar 16
2
> 2GB files on solaris with largefiles enabled!
Hello, I'm facing a problem syncing files that are over 2GB in size. Even though I searched the archives and it's been mentioned that this is possible if the largefiles option is enabled, it's still giving me problems, i.e.: building file list ... readlink dir1/oracle_data1.tar: Value too large for defined data type readlink dir1/oracle_data2.tar: Value too large for defined data type readlink dir1/oracle_data3.tar: Value too large for defined data type # du -s...
2004 Aug 02
1
HP-UX 11i and largefiles on rsync 2.6.2 (fwd)
...e: > Would anyone who is seeing this problem please try out the patch that is > attached to this bugzilla bug: > > https://bugzilla.samba.org/show_bug.cgi?id=1536 I gave it a test and it seems to work fine on HPUX 11.0. It skips the problematic mkstemp() call. It also passed the largefiles test that I sent to the list recently. -- Steve
2009 Mar 19
1
largefile question
Hello, currently we're using version 1.0.13 with 32-bit file offsets. Is it safe to switch to a new version with largefile support enabled? We want to reuse the existing index/cache, or do we have to expect errors with that? Regards. Martin --------------------------------------------------------------- Martin Preen, Universität Freiburg, Institut für Informatik Georges-Koehler-Allee 52, Raum
2002 Jan 31
3
Error when compile rsync
Hi, I tried to compile rsync-2.5.2 on a Solaris 5.7 Ultra-2 machine; 5.7 on the Ultra-2 is running 64-bit. But when I ran configure, it said no for largefile. I thought 2.5.2 supported large files? Is that true? Thanks, Jennifer
2006 Jun 08
3
[Bug 1195] sftp corrupts large files on new macbook pro
http://bugzilla.mindrot.org/show_bug.cgi?id=1195 Summary: sftp corrupts large files on new macbook pro Product: Portable OpenSSH Version: 4.3p2 Platform: ix86 OS/Version: Mac OS X Status: NEW Severity: normal Priority: P2 Component: sftp AssignedTo: bitbucket at mindrot.org ReportedBy: ski
2006 Oct 31
0
6366222 zdb(1M) needs to use largefile primitives when reading label
Author: eschrock Repository: /hg/zfs-crypto/gate Revision: e5f70a6fc5010aa205f244a25a9cdb950e0dae89 Log message: 6366222 zdb(1M) needs to use largefile primitives when reading label 6366267 zfs broke non-BUILD64 compilation of libsec Files: update: usr/src/cmd/zdb/zdb.c update: usr/src/lib/libsec/Makefile
2006 Oct 31
0
4775289 fsck reports wrong state in superblock if there once has existed a largefile
Author: jkennedy Repository: /hg/zfs-crypto/gate Revision: 931f9f0e5d2fb0b5da5bc6c0f77c10e668f842cf Log message: 4775289 fsck reports wrong state in superblock if there once has existed a largefile 6302747 Performance regression when deleting large files from a logging ufs filesystem 6362734 df output corrupt after under heavy stress Files: update: usr/src/cmd/fs.d/ufs/fsck/utilities.c update:
2003 Aug 16
2
Problem copying >2GB files on HP-UX 11i with rsync 2.5.6
I have downloaded the source for rsync 2.5.6 and compiled it without adjusting any options, no problems. Performed a test copy of a file greater than 2 GB and it failed. In the file configure I changed the line to "--enable-largefile", removed all the .o files, performed a ./configure, gmake, gmake install. Tried the test copy and again received the following error message: rsync:
2005 Aug 31
0
throughput differences
Hi all, I notice a big throughput difference between a normal user and root on Dom-0. I did the command scp largefile root@xx.yyy.zzz.sss:~ and got the following result (test is a normal user): From Dom-0 to Dom-1 Login root --> largefile 100% 4096MB 8.5MB/s 08:01 Login test --> largefile 100% 4096MB 2.9MB/s 23:46 <--??? From Dom-1 to Dom-0 Login root -->
2001 Apr 20
2
scp with files > 2gb
A while back someone posted a patch for scp that updates it to deal with files > 2gb by using 64-bit offsets as defined by LFS (Large File Summit). I believe the patch was tested on Linux but maybe not on other systems that support largefiles. I've tried this under Solaris and scp fails with a broken pipe on only the second write to the pipe between scp and ssh if the file is over 2gb. If the file is under 2gb it works fine. It fails the second time around the for loop that looks like this in scp.c:source() for (haderr = i = 0;...
2005 Apr 01
1
HP-UX 11i and largefiles on rsync
For all you folks out there using rsync on HP-UX, be warned that HP-UX is still broken with regard to large files. I opened a bug report with HP last year, as described here: http://lists.samba.org/archive/rsync/2004-July/010226.html I've been periodically checking on the status and today I was told that it's been officially filed as a "we won't fix this", citing that the
2006 Dec 05
4
flac-1.1.3 fails to decode where flac-1.1.2 works
I'm attempting to decode part of a largefile flac whose seektable is broken or missing and the file is shorter than it's supposed to be. When I use 1.1.2 it decodes fine; when I use 1.1.3 it fails: flac --decode --skip=719:58.0 --until=1024:58.0 -o /home/sbh/work/hs/out.wav /mnt/ss/sdb/Sound/Recording/2006-11-29/in.flac out.wav: ERROR seeking while skipping bytes
2007 Aug 27
1
fix for broken largefile seek() on 32-bit linux (PR#9883)
Full_Name: John Brzustowski Version: R-devel-trunk, R-2.4.0 OS: linux Submission from: (NULL) (206.248.132.197) DESCRIPTION seek() on files larger than 2 gigabytes fails for large values of "where" on i386 linux 2.6.13 (and presumably other 32-bit unix-like platforms). e.g.: > f<-file("3gigabytefile.dat", "rb") > seek(f, 3e9, "start",
2010 Jan 22
0
Removing large holey file does not free space 6792701 (still)
Hello, I mentioned this problem a year ago here and filed 6792701 and I know it has been discussed since. It should have been fixed in snv_118, but I can still trigger the same problem. This is only triggered if the creation of a large file is aborted, for example by loss of power, crash or SIGINT to mkfile(1M). The bug should probably be reopened but I post it here since some people where
2002 Apr 02
1
ext3 fs unable to stat
Hi, I am running RH 7.2 (professional release, 2.4 kernel). I have an ext3 file system which stores large Oracle "dump files" over 4GB. We write to it via NFS, which is working great; the problem is when we issue "file largefile.dmp" from the Linux box it fails on anything over 4GB, stating that it can't stat 'largefile.dmp' (value too large for defined data type). We can do
2006 Dec 11
1
flac-1.1.3 fails to decode where flac-1.1.2 works
...sure it's a problem with the seektable? flac-1.1.3 has > a new seeking algorithm and you may have hit a bug with it. > > does it happen on any large file? if so, what is the size threshold? > if not, can you host an example for me to debug with? It's definitely happening with largefiles, around 3-4GB. Also, like I was saying, I normally like to cut parts of the file before the file has been 'completed' using --skip and --until. -- avuton -- Anyone who quotes me in their sig is an idiot. -- Rusty Russell.
2003 Mar 26
3
Transfer files bigger than 1383275520 bytes - Rsync compiled in cygwin
Hello, I can't transfer big files. In the output below, you can see that rsync only transfers 1383275520 of a 5448046592-byte file: F:\shells>rsync -e ssh -avz ./backup chris@myhost.com:backups chris@myhost.com's password: building file list ... done wrote 114 bytes read 20 bytes 3.01 bytes/sec total size is 1383275520 speedup is 10322951.64 I've tried with other big files and
2004 Sep 06
1
Fixing libvorbisfile to handle largefiles
[I'm not online regularly, so don't include me in any replies. Looks like the archives are working at present, so I'll catch up from them when I get a chance. Thanks...] (Trying out vorbis-dev@ instead of vorbis@ where I sent my previous messages; if this is the wrong place, correct me) Greetings. Some weeks ago, I submitted several hacks which gave me the ability to play Ogg
2001 Sep 10
4
scp doesn't work with large (>2GB) files
Hi, A bug I've had reported is that scp doesn't work with large files (Debian bug number 106809). The problem seems to be scp.c:504: if ((fd = open(name, O_RDONLY, 0)) < 0) Is there some reason why making that if ((fd = open(name, O_RDONLY|O_LARGEFILE, 0)) < 0) would break things? It seems a simple fix to me... Thanks, Matthew -- "At least you know where you are