You haven't even told us your OS!
The simple solution is to use a local copy of R. Probably only a few of
those 1500 packages are used at all, and certainly only a few are used in
any one R session: you may also want to install locally any that are
heavily used.
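To illustrate (the path and the mechanism shown are my illustration, not
Claudio's actual set-up), a per-user version of this might look like:

```shell
# A sketch, not a prescription: create a package library on local disk
# and tell R to search it before the NFS-mounted installation.
# The path is illustrative.
LOCAL_LIB="$HOME/R-local-library"
mkdir -p "$LOCAL_LIB"

# ~/.Renviron is read when R starts; libraries named in R_LIBS are
# searched before the default (here, the NFS-served) library tree.
grep -qs "^R_LIBS=" "$HOME/.Renviron" || \
    echo "R_LIBS=$LOCAL_LIB" >> "$HOME/.Renviron"

# From within R one would then install the heavily used packages into
# the local library, e.g.
#   install.packages("MASS", lib = Sys.getenv("R_LIBS"))
# leaving the long tail of rarely used packages on the server.
echo "local library: $LOCAL_LIB"
```

A site-wide equivalent would set R_LIBS_SITE in the installation's
Renviron.site instead of each user's ~/.Renviron.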
We found it advantageous to do this for Windows users of R (where the
remote discs are mounted by SMB) quite a while ago, and more recently
moved to local installation of R on Linux machines. The issue was not
network load but latency for interactive users: although R is heavily used
here it is not a major component of our network load and we are rather
protecting R users against other applications that are much more
demanding.
On Mon, 23 Oct 2006, Claudio Lottaz wrote:
> Dear all,
>
> I wonder if anybody has experienced similar problems and whether there
> are any simple solutions. We observe that R causes a lot of
> network traffic and thus slows down the performance of the whole
> network. When tracing the network traffic on the machine which serves
> the R installation via NFS, we see thousands of requests at
> initialization of R processes and regular calls, probably to shared
> libraries. Is there a way to compile or run R such that it causes less
> load on the network?
>
> Here is some information on our installation:
> - We use a single installation of R (version 2.3.1) loaded over NFS
> - There are approximately 1500 packages installed using ~8GB of disk
> - We use R on a queuing system running up to 50 processes in parallel
>
> The load on the machine which serves the R installation frequently
> rises up to 5 or so although it is a dedicated machine.
> Any hints towards measures against network load are highly appreciated.
>
> Thanks,
> Claudio
>
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595