Andreas Haumer
2006-Jul-10 14:09 UTC
[Samba] ArcView + Samba: Performance nightmare under Linux, ok under Solaris or HP-UX
Hi!

For some months now I have been hunting a Samba performance problem without finding a solution. Now I'm hoping someone on this list has an idea. (In fact, I already reported the problem to the mailing list but got only one reply, which did not help.) So here is another try.

Here's the situation: Some of our users run Windows XP with ArcView GIS 3.3 by ESRI. In this application, ArcView is used to render scientific data which is stored in files on a Samba server.

If the Samba server is running under Solaris or HP-UX, a typical run of ArcView takes about 30 seconds. This is ok. If the Samba server is running under Linux, a run with the same ArcView setup (same data files, same control file, same Windows XP client) takes more than 4 minutes! This is NOT ok! I am able to reproduce this at any time. For all other applications the Linux Samba server works just fine; it's a very fast machine on Gigabit LAN, and apart from the ArcView problem users are quite happy with it.
As the setup is in production with more than 100 users, I did a test installation using VMware virtual machines for servers and client, and I can reproduce the problem there, too (execution times in a virtual machine are a little longer, but I see basically the same runtime difference between the Solaris and Linux servers).

Client:

* Windows XP Professional SP1, ESRI ArcView GIS 3.3

Servers:

* Solaris 10 64bit, Samba 3.0.11 (provided by Sun) -> execution time: 30 seconds
* Solaris 10 32bit, Samba 3.0.11 (provided by Sun) -> execution time: 30 seconds
* Solaris 10 32bit, Samba 3.0.22 (self-compiled) -> execution time: 30 seconds
* SuSE Linux 9.3, Samba 3.0.12 (provided by SuSE) -> execution time: 250 seconds
* xS+S BLD-5.2, Linux kernel 2.4.31, Samba 3.0.20b (everything self-compiled) -> execution time: 250 seconds
* xS+S BLD-5.3, Linux kernel 2.4.32, Samba 3.0.22 (everything self-compiled) -> execution time: 250 seconds

For this test, all servers were run in a VMware virtual machine on the same VMware host, one after another. On "real" hardware I get similar results, only the absolute execution times are a little better.

I have Samba logfiles at log level 10 (about 30MB on the Solaris system, about 1900MB on the Linux servers), as well as Samba process trace files (taken with strace under Linux and truss under Solaris). I found that under Solaris, Samba executes 4866 pread64(2) system calls for the whole run, while under Linux more than 325000(!) pread64(2) system calls are executed, for the same client application!

Looking at the Samba logfiles, the first 190000 lines or so are almost identical between the Solaris and Linux systems. Here the application opens its control files and some data files. At some specific point the logfiles begin to differ: with the Solaris Samba server, the ArcView application reads the data files in 4k blocks in a sequential manner, like this:

[...]
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 0, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 4096, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 8192, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 12288, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 16384, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 20480, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 24576, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 28672, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 32768, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 36864, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 40960, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 45056, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 49152, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 53248, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 57344, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 61440, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 65536, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 69632, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 73728, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 77824, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 81920, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 86016, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 90112, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 94208, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 98304, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 102400, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 106496, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 110592, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 114688, size = 4096, returned 4096
[...]

With the Linux samba server, it looks like this:

[...]
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 0, size = 4096, returned 4096
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 0, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 0, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 0, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 0, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 0, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 0, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1536, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 512, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1536, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2048, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1536, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2048, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1536, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2048, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2048, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2048, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1536, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2048, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2560, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2048, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1024, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 1536, size = 512, returned 512
read_file (daten/covers/dhm_offset/o1000c/arc.adf): pos = 2560, size = 512, returned 512
[...]

You can also see this in the system trace files. Here's an example from the trace files, reading the "arc.adf" file from the example above.

Solaris:

pread64(27, "\0\0 '\n\0\0\001\0\0\0\0".., 4096, 0) = 4096
pread64(27, "\0\0\002 J W83E0 JAE97 P".., 4096, 4096) = 4096
pread64(27, " JAE87B0 J W 5C0 JAE8F80".., 4096, 8192) = 4096
pread64(27, "\0\0\002 J X ^A0 JAE7FE0".., 4096, 12288) = 4096
pread64(27, "\0\0\0AA\0\0\091\0\0\002".., 4096, 16384) = 4096
pread64(27, " JAE h p\0\001A0\0\0\014".., 4096, 20480) = 4096
pread64(27, "\0\0\014\0\0 \ &\0\00114".., 4096, 24576) = 4096
pread64(27, "\0\0\014\0\0 [E4\0\001 @".., 4096, 28672) = 4096
pread64(27, "\0\001 m\0\001 l\0\001 H".., 4096, 32768) = 4096
pread64(27, " JAE 1C0 J WD2\0 JAE 1C0".., 4096, 36864) = 4096
pread64(27, "\0\0\002 J WC2 ` JAE )F0".., 4096, 40960) = 4096
pread64(27, "\0\001F9\0\001CA\0\001A0".., 4096, 45056) = 4096
pread64(27, "\0\0\002 J W t @ JAE1A P".., 4096, 49152) = 4096
pread64(27, "\0\002 Q\0\002 "\0\001F4".., 4096, 53248) = 4096
pread64(27, "\0\0\002 J VB8C0 JAE\nB0".., 4096, 57344) = 4096
pread64(27, "\0\002 c\0\002 C\0\0\002".., 4096, 61440) = 4096
pread64(27, "\0\0\014\0\0 YC8\0\002D7".., 4096, 65536) = 4096
pread64(27, "\0\00302\0\00301\0\002C5".., 4096, 69632) = 4096
pread64(27, " JADF3 @ J W9380 JADFB10".., 4096, 73728) = 4096
pread64(27, "\0\002E6\0\002E7\0\0\002".., 4096, 77824) = 4096
pread64(27, "\0\00384\0\00383\0\003 C".., 4096, 81920) = 4096
pread64(27, "\0\0\014\0\0 XA9\0\003B2".., 4096, 86016) = 4096
pread64(27, "\0\0\001\0\003 `\0\0\003".., 4096, 90112) = 4096
pread64(27, "\0\0\014\0\0 XBC\0\004\b".., 4096, 94208) = 4096
pread64(27, "\0\0\014\0\0 X Y\0\004 9".., 4096, 98304) = 4096
pread64(27, "\0\00415\0\003CC\0\0\002".., 4096, 102400) = 4096
pread64(27, "\0\00493\0\004 =\0\003EF".., 4096, 106496) = 4096
pread64(27, "\0\004 g\0\00419\0\0\002".., 4096, 110592) = 4096
pread64(27, " JADC4 `\0\0\t ,\0\0\014".., 4096, 114688) = 4096
pread64(27, "\0\00519\0\004BC\0\004 f".., 4096, 118784) = 4096
pread64(27, "\0\004E1\0\00490\0\0\002".., 4096, 122880) = 4096
pread64(27, " JADB4C0 J WD2\0 JADBC90".., 4096, 126976) = 4096
pread64(27, "\0\004E6\0\004E7\0\0\002".., 4096, 131072) = 4096
pread64(27, " JADAFA7 J XAA1D JADACF0".., 4096, 135168) = 4096
[...]

Linux:

pread64(52, "\0\0\'\n\0\0\0\1\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\20q"..., 4096, 0) = 4096
pread64(52, "\0\0\'\n\0\0\0\1\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\20q"..., 512, 0) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\'\n\0\0\0\1\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\20q"..., 512, 0) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\'\n\0\0\0\1\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\20q"..., 512, 0) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\'\n\0\0\0\1\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\20q"..., 512, 0) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\'\n\0\0\0\1\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\20q"..., 512, 0) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\'\n\0\0\0\1\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\20q"..., 512, 0) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "\0\0\0\4\0\0\0\3\0\0\0\2JW\243 J\256\256\300JW\243 J\256"..., 512, 512) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "J\256\237 JW\203\340J\256\237 \0\0\0(\0\0\0\24\0\0]5\0"..., 512, 2048) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "J\256\237 JW\203\340J\256\237 \0\0\0(\0\0\0\24\0\0]5\0"..., 512, 2048) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "J\256\237 JW\203\340J\256\237 \0\0\0(\0\0\0\24\0\0]5\0"..., 512, 2048) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "J\256\237 JW\203\340J\256\237 \0\0\0(\0\0\0\24\0\0]5\0"..., 512, 2048) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "J\256\237 JW\203\340J\256\237 \0\0\0(\0\0\0\24\0\0]5\0"..., 512, 2048) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "J\256\237 JW\203\340J\256\237 \0\0\0(\0\0\0\24\0\0]5\0"..., 512, 2048) = 512
pread64(52, "\0\0\0\2JX\0\340J\256\237 JX\0\340J\256\240\31\0\0\000"..., 512, 2560) = 512
pread64(52, "J\256\237 JW\203\340J\256\237 \0\0\0(\0\0\0\24\0\0]5\0"..., 512, 2048) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "\0\0\0\2JX\0\340J\256\237 JX\0\340J\256\240\31\0\0\000"..., 512, 2560) = 512
pread64(52, "\0\0\0\16\0\0\0\r\0\0\0\f\0\0\0\7\0\0\0\2JW\203\340J\256"..., 512, 1024) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "\0\0\0\2JX\0\340J\256\237 JX\0\340J\256\240\31\0\0\000"..., 512, 2560) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "J\256\237 JW\203\340J\256\237 \0\0\0(\0\0\0\24\0\0]5\0"..., 512, 2048) = 512
pread64(52, "\0\0\0\2JX\0\340J\256\237 JX\0\340J\256\240\31\0\0\000"..., 512, 2560) = 512
pread64(52, "\0\0\0000\0\0\0/\0\0\0\1\0\0\0*\0\0\0\3JX\20\200J\256\231"..., 512, 3584) = 512
pread64(52, "J\256\237 \0\0\0<\0\0\0\30\0\0\3\7\0\0\0\31\0\0\0)\0\0"..., 512, 3072) = 512
pread64(52, "J\256\243 JXO\337J\256\242\334JXO\0J\256\241\272\0\0\0"..., 512, 1536) = 512
pread64(52, "\0\0\0000\0\0\0/\0\0\0\1\0\0\0*\0\0\0\3JX\20\200J\256\231"..., 512, 3584) = 512
pread64(52, "\0\0\0\2JW\203\340J\256\227PJW\203\340J\256\237 \0\0\0"..., 512, 4096) = 512
[...]

Under Linux, not only is the blocksize smaller, the client also requests the same data blocks again and again, several times. This is the reason why there are almost 70 times more pread64() calls under Linux, and this is where all the execution time goes!
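The duplicate reads can be quantified straight from the strace output. Here is a rough sketch of how I count pread64 calls per (fd, offset, size) triple; the regular expression is an assumption about the strace line format shown above and may need adjusting for other strace versions:

```python
import re
from collections import Counter

# Matches strace lines like:
#   pread64(52, "\0\0"..., 512, 2048) = 512
PREAD_RE = re.compile(
    r'pread64\((\d+),\s*".*?"(?:\.\.\.)?,\s*(\d+),\s*(\d+)\)\s*=\s*(\d+)')

def count_preads(lines):
    """Return (total pread64 calls, Counter keyed by (fd, offset, size))."""
    counts = Counter()
    for line in lines:
        m = PREAD_RE.search(line)
        if m:
            fd, size, offset = int(m.group(1)), int(m.group(2)), int(m.group(3))
            counts[(fd, offset, size)] += 1
    return sum(counts.values()), counts

# Tiny hand-made sample; in practice read the lines from the strace file.
sample = [
    'pread64(52, "\\0\\0"..., 512, 0) = 512',
    'pread64(52, "\\0\\0"..., 512, 512) = 512',
    'pread64(52, "\\0\\0"..., 512, 0) = 512',
]
total, counts = count_preads(sample)
print(total)                 # 3
print(counts[(52, 0, 512)])  # 2 -> the same block was read twice
```

Any counter value above 1 is a block the client re-requested, so this makes the redundancy visible without scrolling through megabytes of trace.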
But why? What is going on here? Why does the client behave differently when the server operating system (which should be hidden behind the Samba server interface) is different? It's still the same client, the same client software and even exactly the same data files, yet the behaviour differs! I now have more than 2GB of trace files and logfiles and still no idea what is going on... :-(

I managed to strip the Samba server configuration down to a minimum: a single share only (which holds the test data), security = user, our standard workgroup, and everything else set to Samba defaults. Result: no change in runtime behaviour.

I tried to compare Samba settings and made them identical on both the Solaris and Linux Samba servers (the only difference now is the printer command strings, but printing is disabled anyway). Result: no change in runtime behaviour.

I tried to play with Samba settings on the Linux server:

* kernel oplocks = no -> no change
* fake oplocks = yes -> no change
* locking = no -> no change
* oplocks = no -> performance got worse
* lock spin time = 15, lock spin count = 30 -> no change
* use sendfile = yes -> no change

If anyone has an idea why the Linux and Solaris Samba servers behave differently, how to do more debugging, or even how to solve the problem, I'd very much appreciate it!

- andreas

--
Andreas Haumer         | mailto:andreas@xss.co.at
*x Software + Systeme  | http://www.xss.co.at/
Karmarschgasse 51/2/20 | Tel: +43-1-6060114-0
A-1100 Vienna, Austria | Fax: +43-1-6060114-71
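For reference, the stripped-down test configuration described above might look roughly like this; the workgroup, share name and path are placeholders, not the actual production values, which were not posted:

```
[global]
    workgroup = WORKGROUP      ; placeholder
    security = user
    log level = 10             ; only for debugging; extremely verbose

[testdata]
    path = /export/testdata    ; placeholder path holding the ArcView data
    read only = no
```

Everything not listed stays at the Samba defaults, matching the "single share, security = user" minimum described in the report.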
Paul Gienger
2006-Jul-10 19:11 UTC
[Samba] ArcView + Samba: Performance nightmare under Linux, ok under Solaris or HP-UX
> I tried to compare samba settings and made them identical
> on both Solaris and Linux samba servers (the only difference
> now is printer command strings, but printing is disabled anyway).
> Result: no change in runtime behaviour.
>
> I tried to play with Samba settings on the Linux server:
>
> * kernel oplocks = no -> no change
>
> * Fake oplocks = yes -> no change
>
> * Locking = no -> no change
>
> * Oplocks = no -> performance got worse
>
> * lock spin time = 15
>   lock spin count = 30 -> no change
>
> * use sendfile = yes -> no change

I don't suppose you've tried messing around with the socket options? E.g.

socket options = TCP_NODELAY SO_SNDBUF=8192

Just a thought.

Paul
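For what it's worth, TCP_NODELAY just disables Nagle's algorithm on the TCP connection, which matters most for many small request/response round trips like the 512-byte reads in the traces. A minimal Python sketch of the socket-level effect of that smb.conf line (the socket here is a stand-in, not Samba's actual connection handling):

```python
import socket

def make_nodelay_socket():
    """Create a TCP socket configured like
    'socket options = TCP_NODELAY SO_SNDBUF=8192' would configure smbd's."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Disable Nagle's algorithm: send small writes immediately
    # instead of coalescing them.
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    # Request an 8 KB send buffer (the kernel may round this up).
    s.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 8192)
    return s

s = make_nodelay_socket()
print(s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) != 0)  # True
s.close()
```

If the Linux server's many small reads are each paying a Nagle/delayed-ACK round trip that the Solaris build avoids, this option could plausibly change the timing, so it is a cheap thing to rule out.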
Tobias Bluhm
2006-Jul-10 19:46 UTC
[Samba] ArcView + Samba: Performance nightmare under Linux, ok under Solaris or HP-UX
Another shot in the dark... I don't believe you've stated what fs type you're using, but have you tried storing the data using other fs types or playing with the mount options?

-----------------------------------------------------
toby bluhm
philips medical systems, cleveland ohio
tobias.bluhm@philips.com
440-483-5323
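To follow up on that: on the Linux side the filesystem type and mount options behind the share can be read straight from /proc/mounts. A small Linux-only sketch (the share path in the comment is a placeholder):

```python
import os

def mount_info(path):
    """Return (device, mountpoint, fstype, options) for the mount
    holding *path*, using /proc/mounts (Linux only)."""
    path = os.path.realpath(path)
    best = None
    with open("/proc/mounts") as f:
        for line in f:
            dev, mnt, fstype, opts = line.split()[:4]
            # Match this mount point, keeping the longest (most specific) one.
            if path == mnt or path.startswith(mnt.rstrip("/") + "/"):
                if best is None or len(mnt) > len(best[1]):
                    best = (dev, mnt, fstype, opts)
    return best

# e.g. mount_info("/export/testdata")  # placeholder share path
print(mount_info("/"))
```

Knowing whether the share sits on ext3, reiserfs, etc., and with which options (noatime, data=... and so on), would make the "try other fs types" experiment concrete.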