I'm reposting this problem (perhaps a bug) now that I've got more information on it. This is another point of view of the situation, and I hope someone has run into the same trouble before (and solved it :-))

This is it:

* With ntbackup 2000 I create a 22 GB .bkf file on the Windows machine.

* I can copy that file over a samba share and get correct info from the file in Windows Explorer.

* ls -l also returns correct info, *WHILE* stat, mc, and other programs fail with an error about a value too large for the defined data type.

* If I try to create the file with ntbackup directly over the share, it starts at 0 bytes and grows slowly, and ntbackup dies when the file crosses the 4 GB (exactly) mark.

* I have compiled samba 2.2.2 myself and, surprisingly, the 4 GB "limit" described above dropped to exactly 2 GB.

Any idea on what's happening, please?

Thanks all for reading (and many more thanks if someone responds!)
I am not sure, but you might be running into the limitations of the ext2 file system. I believe it has a 4 GB limitation on a single file. You might have to consider another file system for your large files.

--
Joseph Loo
jloo@acm.org
Is there a filesystem for Linux that will get around this problem?

Thanks
P

---
Petter T. Olsson
Consultant/Advisor II
Cornell University
Veterinary College CPPS/DCS
Ithaca, NY 14853-6401
(607) 253-3411

-----Original Message-----
From: Andrew V. Samoilov [mailto:kai@cmail.ru]
Sent: Wednesday, October 17, 2001 9:51 AM
To: Joseph Loo
Cc: samba list
Subject: Re: large files

Joseph Loo wrote:
> I am not sure but you might be running into the limitations of the
> ext2 file system. I believe it has a 4 GB limitation on a single
> file. You might have to consider another file system for your large
> files.

ext2's limitation is 2G blocks, and a block is at least 1024 bytes long.

> Ivan Fernandez wrote:
>> ls -l also returns correct info, *WHILE* stat, mc, and other programs
>> fail with an error about a value too large for the defined data type.

Some programs were compiled with large file support, some without. MC
4.5.55 can be configured with --enable-largefile. This also fixes some
mc problems over ntfs on 2.4 kernels.

>> If I try to create the file with ntbackup directly over the share, it
>> starts at 0 bytes and grows slowly, and ntbackup dies when the file
>> crosses the 4 GB (exactly) mark.
>>
>> I have compiled samba 2.2.2 myself and, surprisingly, the 4 GB
>> "limit" described above dropped to exactly 2 GB.

Please mail me your kernel and glibc versions.

Regards,
Andrew.
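For reference, what --enable-largefile (and "large file support" generally) boils down to on 32-bit Linux/glibc of that era is building with _FILE_OFFSET_BITS set to 64, which widens off_t from 32 to 64 bits so the same stat()/lseek() calls can describe files past 2 GB. A minimal sketch of the effect (my own illustration, not mc or samba source):

#define _FILE_OFFSET_BITS 64   /* what --enable-largefile arranges; remove this line to see the 32-bit case */
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    /* Prints 8 with the define, 4 without it on 32-bit x86 of that era.
       A 4-byte off_t caps every file offset at 2^31 - 1, i.e. the 2 GB wall. */
    printf("sizeof(off_t) = %u bytes\n", (unsigned)sizeof(off_t));
    return 0;
}

Compiling it once as-is and once with the define removed (or passed on the command line with gcc -D_FILE_OFFSET_BITS=64) shows the two cases.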
Hi,

Try XFS, ReiserFS, Ext3, and so on.

cheers
Gustavo

> -----Original Message-----
> From: Petter T. Olsson [mailto:po26@ulfhild.cornell.edu]
> Sent: Wednesday, October 17, 2001 11:57
> To: kai@cmail.ru; Joseph Loo
> Cc: samba list
> Subject: RE: large files
>
> Is there a filesystem for Linux that will get around this problem?
Excuse my former question; the Mandrake manual has a big table describing which filesystem to use in different situations. JFS will be my choice.

Thanks
P

---
Petter T. Olsson
Consultant/Advisor II
Cornell University
Veterinary College CPPS/DCS
Ithaca, NY 14853-6401
(607) 253-3411

-----Original Message-----
From: Michels, Gustavo [EES/BR] [mailto:gustavo.michels@emersonenergy.com]
Sent: Wednesday, October 17, 2001 11:42 AM
To: Petter T. Olsson; kai@cmail.ru; Joseph Loo
Cc: samba list
Subject: RE: large files

Hi,

Try XFS, ReiserFS, Ext3, and so on.

cheers
Gustavo
Trond,

Please explain further. According to the Mandrake Reference Manual, Chapter 9, page 50, there is a table describing features and limitations of the different filesystems.

Thanks
P

---
Petter T. Olsson
Consultant/Advisor II
Cornell University
Veterinary College CPPS/DCS
Ithaca, NY 14853-6401
(607) 253-3411

-----Original Message-----
From: Trond Eivind Glomsrød [mailto:teg@redhat.com]
Sent: Wednesday, October 17, 2001 11:40 AM
To: Joseph Loo
Cc: samba list
Subject: Re: large files

Joseph Loo <jloo@acm.org> writes:
> I am not sure but you might be running into the limitations of the
> ext2 file system. I believe it has a 4 GB limitation on a single
> file. You might have to consider another file system for your large
> files.

You're wrong. Ext2 doesn't have such a limitation - the limits are
elsewhere in the kernel. If you have a patched 2.2 kernel (like the
enterprise edition of Red Hat Linux 6.2) or a 2.4 kernel, in addition to
a glibc with the necessary support, you can make applications support
it. Red Hat Linux 7.1, with glibc 2.2 and a 2.4 kernel, has such
support - but the applications also need to be compiled with special
defines and not be coded in broken ways (don't use "int" where you
should have used "off_t", etc).

--
Trond Eivind Glomsrød, Red Hat, Inc.
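To illustrate the "broken ways" Trond mentions, here is a small sketch (purely illustrative, not code from any real application): even when built with the special defines, a program that stores a file size in an int silently truncates anything past 2 GB.

/* Build with: gcc -D_FILE_OFFSET_BITS=64 demo.c   (the file name is just an example) */
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    off_t size = (off_t)22 * 1024 * 1024 * 1024;  /* roughly the 22 GB .bkf file */
    int broken = (int)size;                       /* wrong: int is 32 bits, the value wraps */

    printf("as off_t: %lld bytes\n", (long long)size);
    printf("as int:   %d bytes\n", broken);
    return 0;
}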
Hi Joseph,

please read the message entirely. As I said, I can copy the 22 GB file onto the samba server with no problems. (I'm using ReiserFS 3.6.)

Cheers!

-----Original Message-----
From: samba-admin@lists.samba.org [mailto:samba-admin@lists.samba.org] On behalf of Joseph Loo
Sent: Wednesday, October 17, 2001 15:30
To: samba list
Subject: Re: large files

I am not sure but you might be running into the limitations of the ext2
file system. I believe it has a 4 GB limitation on a single file. You
might have to consider another file system for your large files.
Hi Petter,

the problem is not the file system. ReiserFS 3.6 (the one I'm using) supports files up to 1 EB (yes, 1 exabyte = 1,024 PB; 1 PB = 1,024 TB; 1 TB = 1,024 GB; 1 GB...). As I say in the message, I can copy a 22 GB file from Windows over the samba share with no problems. The problem arises when I run stat, mc, updatedb, etc. on the directory, because Linux returns inconsistent values. Also, ntbackup 2000 cannot directly generate that 22 GB file over the samba share, due to the same problem.

cheers!
Ivan

> -----Original Message-----
> From: samba-admin@lists.samba.org [mailto:samba-admin@lists.samba.org] On behalf of Petter T. Olsson
> Sent: Wednesday, October 17, 2001 16:57
> To: kai@cmail.ru; Joseph Loo
> Cc: samba list
> Subject: RE: large files
>
> Is there a filesystem for Linux that will get around this problem?
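The error Ivan quotes, "value too large for defined data type", is the text for the EOVERFLOW errno: a program built without large file support calls stat() on the 22 GB file and glibc refuses to squeeze the 64-bit size into a 32-bit off_t. A sketch of that failure mode (my own illustration, with a made-up file name, not samba or mc source):

#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    struct stat st;
    const char *path = (argc > 1) ? argv[1] : "backup.bkf";  /* made-up default name */

    if (stat(path, &st) != 0) {
        if (errno == EOVERFLOW)
            /* glibc's text for EOVERFLOW is "Value too large for defined data type" */
            fprintf(stderr, "%s: %s\n", path, strerror(errno));
        else
            perror(path);
        return 1;
    }
    printf("%s: %lld bytes\n", path, (long long)st.st_size);
    return 0;
}

Rebuilding the same program with -D_FILE_OFFSET_BITS=64 makes the call succeed, which presumably is why ls (built with large file support) shows the right size while stat, mc, and updatedb do not.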
Ok, I don't have any time to look at this at the moment, but it sure does sound like an internal storage issue from the data you supply.

Since you can copy the file ONTO the fs that samba is sharing out, obviously your fs can handle a file that large. But the concern is that some of your OS utilities, like stat, are NOT reporting the size right. This suggests to me that a system call Samba uses to return file/filesystem information back to the PC in response to an SMB is not able to do the job, and thus the program on NT thinks it can't create a file any bigger than 4 or 2 gigabytes.

If you turned on debug during the ntbackup run and saved off the debug file as soon as the ntbackup program died, we might be able to see exactly which SMB was being processed when it failed. That would tell us what system call samba was making to return that info, and we would have a better idea. BUT the fact that stat, etc. is not working properly on your system is IMHO the most likely culprit.

You might also want to check the patches and fixes in later revisions of the OS to see whether a problem with fstat, stat, or other system calls has been reported and fixed.

Hope this helps,
Don

-----Original Message-----
From: Ivan Fernandez [mailto:ivan@vyb.com]
Sent: Wednesday, October 17, 2001 2:49 PM
To: samba@lists.samba.org
Subject: Re: large files

Hi Joseph,

please read the message entirely. I say I can copy the 22 GB file onto
the samba server with no problems. (I'm using ReiserFS 3.6.)
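For the debug capture Don suggests, something along these lines in smb.conf would do on Samba 2.2 (the parameter names are standard; the level and path are only examples, and level 3 is usually already enough to see each SMB as it is processed):

[global]
    # verbose logging for the duration of the failing ntbackup run
    log level = 10
    # one log file per client machine (%m); the path is an example
    log file = /var/log/samba/log.%m
    # 0 = do not rotate, so the whole run is captured
    max log size = 0

Remember to drop the level back down afterwards; level 10 logs are huge and slow the server noticeably.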
Hi. I have just noticed a weirdness of Samba and was wondering if anyone else has experienced this or not... I don't want to add it to the bug list until after I have verified that they don't already know of it.

Basically, I have a bunch of large files that I want to dump onto my server, which is running Samba (2.2) and Linux (kernel 2.4.17), and what I want to do is dump the files off my Windows machine onto my Linux server. The problem in detail is that the files are over 2 GB in size and Samba won't write those files to the server. I can pass the files back and forth between my Windows 98 machine and my Windows ME system, but samba seems incapable of receiving them.

However, as a weirdness of the whole situation, I can get the files onto the Linux server if I run smbclient, connect to the Windows machine, and retrieve the files that way. I am trying to put the files into a ReiserFS partition, so I know I am not hitting a limit of the filesystem itself.

Basically the files are for video editing, and unless I can send them from the Windows machine, using an open source application for my file storage server is not going to be viable, so any help would be appreciated. I have read a lot of documentation on this, and I still cannot understand why this would not be working, unless there is some file size limit imposed by the samba daemon.

Regards,
Cassandra