Hello,

We have a scenario where many clients are connected to a server over SSH. Each client sets up a reverse port forward, which is then used to access an HTTP service running on the client. When files are downloaded through the tunnelled connections, the memory usage per SSH connection on the server grows continuously. The memory is released only after the client closes the connection.

Memory usage seems to correlate with the file size and the number of simultaneous (parallel) downloads, e.g.:

* file size 0.5 MB, 50 clients, 500 downloads (30 simultaneous) -> ~2.6 MB memory usage per SSH connection on the server
* file size 10 MB, 50 clients, 1000 downloads (60 simultaneous) -> ~6.1 MB memory usage per SSH connection on the server; the peak was 9.5 MB per connection, i.e. some of the memory is freed again when the server load decreases

My question is whether it is possible to limit the amount of memory available to any single connection of the SSH server daemon. Further, why is the memory not freed once no more data is tunnelled through the connection, i.e. when the connection has been idle for hours?

Regards,
Todor Dimitrov
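
P.S. In case it helps to reproduce: each client establishes the reverse forward roughly as follows (a sketch; the port numbers and host names are placeholders for our actual configuration):

    # server listens on port 8080 and forwards connections through the
    # tunnel to the HTTP service on the client's port 80
    ssh -N -R 8080:localhost:80 user@server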
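
For completeness, per-connection memory on the server can be observed via the resident set size of the per-connection sshd child processes, e.g. (assuming one sshd child per client connection; ps reports RSS in KiB):

    # list pid, resident set size and command line of all sshd processes
    ps -o pid,rss,args -C sshd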