similar to: OpenSSH and non-blocking mode

Displaying 20 results from an estimated 200 matches similar to: "OpenSSH and non-blocking mode"

2003 Jul 17
1
2 GB Limit when writing to smbfs filesystems
I'm running RedHat 8.0 with samba-2.2.7-5.8.0 (installed from the RedHat distribution). When I use cpio to write a backup (> 2 GB) to an smbfs filesystem, I get the error: File size limit exceeded. I get the same error when I copy (cp) a file (> 2 GB) under Linux from an ext3 filesystem to the smbfs filesystem. The smbfs filesystem is mounted from a Windows 2000 Professional workstation. After
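On old Linux smbfs clients, the usual fix for the 2 GB ceiling was the lfs ("large file support") mount option; a minimal sketch, with a made-up share, user, and mount point:

# mount -t smbfs -o lfs,username=backup //w2kbox/backup /mnt/backup

Without lfs (or on a kernel whose smbfs lacks it), writes past 2 GB fail with exactly this error.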
2009 Jan 28
2
ZFS+NFS+refquota: full filesystems still return EDQUOT for unlink()
We have been using ZFS for user home directories for a good while now. When we discovered the problem with full filesystems not allowing deletes over NFS, we became very anxious to fix this; our users fill their quotas on a fairly regular basis, so it's important that they have a simple recourse to fix this (e.g., rm). I played around with this on my OpenSolaris box at home, read around
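The recourse often suggested for a full filesystem is to free blocks before the unlink, since truncation needs no new space; a sketch, assuming a file the user owns:

$ cp /dev/null bigfile
$ rm bigfile

Whether this helps over NFS depends on the client issuing the truncate before the remove, and copy-on-write filesystems may still refuse the truncate when completely full.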
2003 Dec 02
1
rdiff
Is there any chance for rdiff? I need to frequently synchronize a big text file (60 MB+) that undergoes small changes, and I am interested in the differences between subsequent versions [DNS RBL data in dnsbl format, 1E6+ lines of text, a new version every 20m, on average 50 new entries (lines) per synchronization]. I would like to get a (small) diff file as the result of an rsync session and apply it to
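rdiff from librsync already does roughly this as a three-step pipeline; a minimal sketch with placeholder file names:

$ rdiff signature old.txt old.sig
$ rdiff delta old.sig new.txt changes.delta
$ rdiff patch old.txt changes.delta rebuilt.txt

The delta file stays small when the versions differ by only a few dozen lines, which matches the 50-entries-per-cycle workload described.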
2012 Oct 20
2
can't find the error in if function... maybe i'm blind?
Hi everybody, the following always gives me the error "Error in if (File$X.Frame.Number[a] + 1 == File$X.Frame.Number[a + 1]) (File$FishNr[a] <- File$FishNr[a - : missing value where TRUE/FALSE needed". Maybe it's stupid, but I'm not getting why... Maybe someone can help me. Thanks a lot! for (i in unique(BigFile$TrackAll)) { File <-
2016 Oct 26
3
NFS help
On Tue, Oct 25, 2016 at 12:48 PM, Matt Garman <matthew.garman at gmail.com> wrote: > On Mon, Oct 24, 2016 at 6:09 PM, Larry Martell <larry.martell at gmail.com> wrote: >> The machines are on a local network. I access them with putty from a >> windows machine, but I have to be at the site to do that. > > So that means when you are offsite there is no way to access
2007 Nov 27
1
Syncing to multiple servers
Hello everyone, Let's say we have 3 servers, 2 of them running the latest (stable) version of rsyncd (2.6.9): <Server1> ==> I N T E R N E T ==> <Server2 (rsyncd running)> ==> LAN ==> <Server3 (rsyncd running)> Suppose I want to send a big file (bigfile.big) from Server1 to both Server2 and Server3. It would be a good idea to send first from Server1
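One plausible arrangement, given that only the Server1-to-Server2 hop crosses the internet: push the file once over the WAN, then fan out over the LAN. Module and path names here are hypothetical:

$ rsync -av bigfile.big server2::data/incoming/                  # on Server1, over the internet
$ rsync -av /srv/incoming/bigfile.big server3::data/incoming/    # then on Server2, over the LAN

This sends the bytes across the slow link only once.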
2007 Mar 02
1
--delete --force Won't Remove Directories With Dotnames
--delete --force Won't Remove Directories With Dotnames rsync 2.6.9 Personally, I reckon this to be an irritant ... but perhaps (having thought about this a bit, I decided there's a good chance) this is an intentional and useful behaviour. But it's a nuisance if you call your --partial-dir .partial, as I happen to do, since now if you remove a directory which was aborted in
2010 Nov 09
1
make quicktest failed
I had 444 errors; I didn't want to put them all up here. Maybe I missed a step, maybe it's an easy "oops, you forgot to do this". I'm on Ubuntu 10.10 amd64 server, completely up to date as of today. I followed http://wiki.samba.org/index.php/Samba4/HOWTO My git is also from today. == samba4.rpc.echo on ncacn_ip_tcp with validate and --option=socket:testnonblock=True
2015 Sep 11
2
Cannot open: No space left on device
On Fri, Sep 11, 2015 at 3:19 PM, Dario Lesca <d.lesca at solinos.it> wrote: > the result.
# du -sc /* /.??* --exclude /proc | sort -n
0       /.autofsck
0       /.autorelabel
0       /misc
0       /net
0       /sys
4       /cgroup
4       /media
4       /mnt
4       /selinux
4       /srv
8       /opt
16      /home
16      /lost+found
16      /tmp
112     /root
188     /dev
7956    /bin
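When du reports far less usage than df says is gone, one hedged guess is a deleted-but-still-open file pinning the space; lsof can list such files:

# lsof +L1

That shows open files whose link count is below one, i.e. unlinked but still held by some running process.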
2009 Apr 22
2
purge-empty-dirs and max-file-size confusion
I want to use --min-size to copy just large files (and their necessary parent directories), but everything I've tried copies *all* the source directories, and creates them empty on the destination even if they don't have any big files in them. I only want the minimal directory hierarchies that contain the big files. This doesn't work:
$ rm -rf /tmp/foo
$ rsync -ai --min-size
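A workaround that sidesteps the filter interaction entirely: let find select the big files and feed rsync via --files-from, which creates only the parent directories the listed files actually need; the 1M threshold is illustrative:

$ cd /tmp/foo && find . -type f -size +1M | rsync -ai --files-from=- . /tmp/bar/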
2002 Mar 27
2
Linux 2.4.18 on RH 7.2 - odd failures
Hi there, I'm using RH7.2 (with the 2.4.9-30 kernel and its required components) as a base for a server system running kernel 2.4.18. I've gone to this version to get around non-performing aic7xxx drivers in the stock 7.2 kernels, and updated gigabit ethernet drivers. I have a raid unit (Medea) attached to an Adaptec 3916, coming up as sdb. It has 2kb blocks, but the fault
2005 Sep 23
2
17G File size limit?
Hi everyone, This is a strange problem I have been having. I'm not sure where the problem is, so I figured I'd start here. I was having problems with Bacula stopping on 17Gig volume sizes, so I decided to try to just dd a 50 gig file. Sure enough, once the file hit 17 gigs, dd stopped and spit out an error:
(pandora bacula)# dd if=/dev/zero of=bigfile bs=1M count=50000
File size
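One thing worth ruling out first (a guess, not a diagnosis): a per-process file-size resource limit, which kills writers with exactly "File size limit exceeded" (SIGXFSZ) when they cross it:

$ ulimit -f               # prints "unlimited" if no limit is set
$ ulimit -f unlimited     # lift it for this shell and its children

The units of the printed number vary by shell, so compare it against the 17 GB mark before blaming the filesystem.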
2007 Nov 08
3
skip non-sequential lines using scan?
Hi all, Is there a way to skip non-sequential lines using the "skip" argument in the scan function? E.g., I have a matrix with 100 rows and 1e7 columns. I open a connection and want to read only lines 5, 7, 9, etc [i.e., seq(5,99,2)] It might seem that the syntax to do this would be something like this (if only the "skip" allowed vectors in the same way colClasses does in
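If pre-filtering outside R is acceptable, GNU sed's first~step address prints exactly such an arithmetic sequence of lines; a sketch for lines 5, 7, ..., 99 (bigmatrix.dat is a placeholder name):

$ sed -n '5~2p' bigmatrix.dat > oddlines.dat

The trimmed file can then be read with a single scan() call, with no skip gymnastics.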
2016 Oct 27
2
NFS help
On Wed, Oct 26, 2016 at 9:35 AM, Matt Garman <matthew.garman at gmail.com> wrote: > On Tue, Oct 25, 2016 at 7:22 PM, Larry Martell <larry.martell at gmail.com> wrote: >> Again, no machine on the internal network that my 2 CentOS hosts are >> on is connected to the internet. I have no way to download anything. There is an onerous and protracted process to get
2010 Apr 30
3
need help: about remove space
Hi all, I have a big file as below and would like to know how many lines it has, e.g. wc -l file, but can't figure out how. If I type wc -l file, I get 1023, but that includes the blank space. When I use cat file | tr -d "\r \n" it gives me "adrian alice......". I need it as fileB and then wc -l fileB. Thank you so much. file ==== adrian alice Patrick file B ======
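Assuming the goal is to count only non-blank lines (so stray spaces, blank lines, and CRs don't inflate the count), grep can do it in one step, without building fileB:

$ grep -c '[^[:space:]]' file

This counts lines containing at least one non-whitespace character; on the three-name sample above it would print 3.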
2012 Mar 08
4
Reading in 9.6GB .DAT File - OK with 64-bit R?
Hi there, I wish to read a 9.6GB .DAT file into R (64-bit R on a 64-bit Windows machine), then delete a substantial number of rows and convert the result to a .csv file. Upon the first attempt the computer crashed (at some point last night). I'm rerunning this now and am closely monitoring Processor/CPU/Memory. Apart from this crash being a computer issue alone (possibly), is R equipped to
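If many rows are going to be deleted anyway, one hedged alternative is to filter the file as a stream before R ever sees it, so memory only ever holds the survivors; the field separator and keep-condition below are hypothetical:

$ awk -F',' '$3 != "DROP"' input.DAT > kept.csv

R then only has to read the much smaller kept.csv.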
2007 Aug 30
15
ZFS, XFS, and EXT4 compared
I have a lot of people whispering "zfs" in my virtual ear these days, and at the same time I have an irrational attachment to xfs based entirely on its lack of the 32000 subdirectory limit. I'm not afraid of ext4's newness, since really a lot of that stuff has been in Lustre for years. So a-benchmarking I went. Results at the bottom:
2002 Oct 10
2
multiple sessions to same destination
Hi All, I had a look in the archives but no joy. I just want to know before I deploy - is there any problem with having multiple rsync sessions from many source locations all going to the same destination server (all copying to different file systems, obviously)? It's just that I see on the destination machine an rsync --server process being spawned whenever a client connects. Thanks! Laurence
2004 Feb 29
6
Samba Gigabit very very slow?
Hi! I'm having trouble with the speed of my samba server. I just upgraded to gigabit (Realtek 8169 NIC) at home; when I copy stuff from the samba server I get around 5-6Mb/sec, and if I use ftp to access the same file on the same server I get almost 30Mb/sec. Does anyone have a clue what causes this problem? When I used my old 3com 905c on the local net I got normal 100Mbit speed (around
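A tuning knob often suggested for Samba of that era (a guess at the cause, not a confirmed fix) is the socket options parameter in smb.conf's [global] section:

   socket options = TCP_NODELAY SO_RCVBUF=65536 SO_SNDBUF=65536

Since ftp on the same link is fast, the NIC and cabling are presumably fine, which points at Samba's socket handling or smb.conf defaults rather than the hardware.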
2008 Sep 27
3
make a system call and proceed without waiting for result?
This is probably a general ruby question: in one of my models I need to make loads (up to 600 or so) of system calls with the curl command. It's a fire-and-forget kind of deal - I don't care, at that particular moment, whether the calls were successful or not, and I certainly don't want to keep the user waiting for the html responses to come back. Is it possible, with a
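In shell terms, fire-and-forget is just backgrounding the command and discarding its output; Ruby's system() can be handed such a string, and it returns as soon as the shell does (the URL is a placeholder):

$ nohup curl -s 'http://example.com/ping' >/dev/null 2>&1 &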