similar to: How to use mpi.allreduce() in Rmpi?

Displaying 20 results from an estimated 1000 matches similar to: "How to use mpi.allreduce() in Rmpi?"

2008 Nov 07
1
Rmpi task-pull
Hi, I'm testing how efficiently the Rmpi package parallelizes work on a cluster. I've found and tried the task-pull programming method, but even though it is described as the best approach, it seems to cause deadlock. Could anyone help me get this method working? Here is the code I've found and tried: # Initialize MPI library("Rmpi") # Notice we just say "give us
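For reference, a minimal task-pull sketch built from the standard Rmpi point-to-point calls (this is not the poster's code; the tag numbers, the slave.fn name and the sum(task) workload are placeholders, and the master's dispatch loop is only outlined in comments):

library(Rmpi)
mpi.spawn.Rslaves(nslaves = mpi.universe.size() - 1)

# Each slave repeatedly asks the master (rank 0) for work until it is told to stop.
slave.fn <- function() {
  repeat {
    mpi.send.Robj(NULL, dest = 0, tag = 1)                 # "ready for a task"
    task <- mpi.recv.Robj(mpi.any.source(), mpi.any.tag())
    tag  <- mpi.get.sourcetag()[2]
    if (tag == 2) break                                     # tag 2: no more tasks
    mpi.send.Robj(sum(task), dest = 0, tag = 3)             # tag 3: here is a result
  }
}
mpi.bcast.Robj2slave(slave.fn)
mpi.bcast.cmd(slave.fn())
# ... master loop: answer tag-1 requests with tasks, collect tag-3 results,
# ... then reply with tag 2 to every slave before closing down.
mpi.close.Rslaves()
mpi.quit()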
2008 Sep 27
1
Problem with R on dual core under Linux - can not execute mpi.spawn.Rslaves()
Hi, I am trying to utilize my dual-core processor (and later a high-performance cluster (HPC)) by using the Rmpi, snow, snowfall, ... packages, but I am struggling right at the beginning, i.e. initialising the "cluster" on my dual-core computer. Whenever I try to initialize it (via sfInit(parallel=TRUE, cpus=2) or mpi.spawn.Rslaves(nslaves=2)), I get an error message: >
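For what it's worth, a minimal snowfall initialisation sketch (assuming snowfall is installed; the MPI variant additionally needs a working Rmpi/Open MPI setup, while the socket variant needs no MPI at all):

library(snowfall)

# Socket-based cluster on the local machine: no MPI required.
sfInit(parallel = TRUE, cpus = 2, type = "SOCK")
sfLapply(1:4, function(i) i^2)    # trivial test job on the two workers
sfStop()

# MPI-backed cluster (only once Rmpi itself spawns slaves cleanly).
sfInit(parallel = TRUE, cpus = 2, type = "MPI")
sfLapply(1:4, function(i) i^2)
sfStop()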
2008 Jul 16
1
Problem with mpi.close.Rslaves()
I am running R 2.7.0 on a SuSE 9.1 Linux cluster with a job scheduler dispatching jobs and openmpi-1.0.1. I have tried running one of the examples at http://ace.acadiau.ca/math/ACMMaC/Rmpi/examples.html in Rmpi and it seems to work, except that mpi.close.Rslaves() hangs. The slaves are closed, but the master doesn't finish its script. Below is the example script and the call to R. The job is
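For context, the shutdown sequence those tutorial scripts expect looks roughly like the sketch below (standard Rmpi calls, not the poster's actual script); the master should reach mpi.quit() only after mpi.close.Rslaves() has returned:

library(Rmpi)
mpi.spawn.Rslaves(nslaves = 2)
mpi.remote.exec(paste("I am", mpi.comm.rank(), "of", mpi.comm.size()))

# Tear-down: close the spawned slaves first, then shut MPI down on the master.
mpi.close.Rslaves()
mpi.quit(save = "no")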
2008 Apr 08
1
Rmpi 0.5-6 : error spawning process
Hi, I am using a cluster with LAM 7.1.3/MPI 2 and R 2.6.0. Rmpi version 0.5-5 works very well. Now I have tested "Rmpi 0.5-6". While spawning the Rslaves I get an error: MPI_Error_string: error spawning process > sessionInfo() R version 2.6.0 (2007-10-03) x86_64-unknown-linux-gnu locale:
2007 Jul 05
0
Question on Rmpi looping
Dear R list, In the course of learning to work with Rmpi, we are confused about a few points. The following simple program is based on some examples we retrieved from the web. Each slave writes the same output line multiple times (as many times as there are slaves); in other words, the write statements are executed a number of times equal to the number of slaves. I am
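That behaviour is what mpi.remote.exec() does by design: the expression runs on every slave. A sketch of keying the work on each slave's own rank, so each line is written exactly once (standard Rmpi calls; the per-slave file name is made up for illustration):

library(Rmpi)
mpi.spawn.Rslaves(nslaves = 4)

# Each slave writes one line, keyed on its own MPI rank.
write.one <- function() {
  r <- mpi.comm.rank()
  writeLines(sprintf("output from slave %d", r),
             con = sprintf("slave_%d.txt", r))    # hypothetical per-slave output file
  r
}
mpi.bcast.Robj2slave(write.one)
mpi.remote.exec(write.one())

mpi.close.Rslaves()
mpi.quit()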
2008 Oct 22
1
Problem about spawn nodes with Rmpi
Hi all, I'm now testing R in a "virtual cluster" built with VirtualBox. It has 3 nodes running CentOS 5 and OpenMPI 1.2.8, and the principal node (called "server") exports /home to the other nodes. I have installed R and OpenMPI in /home, and it seems to work OK. After editing the openmpi-default-hostfile and running "mpirun -np 3 hostname" I can see the
2007 Sep 03
1
Snow on Windows Cluster
Hello, the package snow is not working on a Windows cluster with MPICH2 and Rmpi. There is an error in makeCluster: launch failed: CreateProcess(/usr/bin/env "RPROG="C:\Programme\R\R-2.5.1\bin\R" "OUT=/dev/null" "R_LIBS=" C:/Programme/R/R-2.5.1/library/snow/RMPInode.sh) on 'cl1' failed, error 3 - The system cannot find the path specified. I
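One way to sidestep this class of launcher/path problems on Windows is snow's socket transport, which starts the workers without MPI at all; a minimal sketch (the host names are placeholders, and each host still needs R, snow, and a way to start R remotely):

library(snow)
# Socket cluster: snow launches an R worker on each named host, no MPI involved.
cl <- makeCluster(c("cl1", "cl2"), type = "SOCK")   # "cl1", "cl2" are placeholder host names
clusterCall(cl, function() Sys.info()[["nodename"]])
stopCluster(cl)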
2010 Oct 04
0
Syntax for Rmpi cf multicore
I'm aiming to compare the workings of Rmpi and multicore on a dual-processor quad-core machine with 64-bit R 2.11.1 on Kubuntu 10.04. It's impossible for me to get a small reproducible code segment to show what I mean, but if I show what works for mclapply, I hope someone can show me the equivalent with mpi.apply. The function lr.gbm has variables trees,
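A side-by-side sketch of the two idioms, with a stand-in function in place of the poster's lr.gbm (mpi.applyLB, the load-balanced variant, is usually the closer match to mclapply):

library(multicore)
library(Rmpi)

f <- function(i) i^2        # stand-in for the real lr.gbm call

# multicore: forked workers on the local machine.
res.mc <- mclapply(1:8, f, mc.cores = 8)

# Rmpi: spawn slaves once, ship the function over, then apply across them.
mpi.spawn.Rslaves(nslaves = 8)
mpi.bcast.Robj2slave(f)
res.mpi <- mpi.applyLB(1:8, f)   # or mpi.apply(), which pairs one element per slave
mpi.close.Rslaves()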
2012 Apr 04
1
npRmpi trouble - mpi.comm.spawn causes segfault
Dear all, I have a large dataset of randomly generated weighted samples for which I wish to compute a kernel density estimate. I have used the "np" package successfully for smaller datasets, but for the larger ones it takes too long when using the cross-validation options for bandwidth selection ("cv.ls" or "cv.ml"). Of course, they are much quicker with
2008 Jul 10
0
Rmpi unknown input format error
I have just installed Rmpi on a SuSE 9.1 Linux cluster with openmpi-1.0.1. I am trying the example included below from the tutorial website. However, I keep getting the following error: > # Load the R MPI package if it is not already loaded. > if (!is.loaded("mpi_initialize")) { + library("Rmpi") + } > > # Spawn as many slaves as possible >
2006 Feb 13
1
Turning control back over to the terminal
I'm invoking R from within a shell script like this: R --no-save --no-restore --gui=none > `hostname` 2>&1 <<BYE # various commands here BYE I would like to turn control back over to the invoking terminal at some point. I tried source(stdin()) but got a syntax error; presumably stdin is the shell here-document snippet (the part between <<BYE and BYE). Is there some way to
2011 Feb 01
1
Rmpi; sample code not running, the slaves won't execute commands
Hi All, I'm trying to parallelize some code using Rmpi and I've started with a sample 'hello world' program that's available at http://math.acadiau.ca/ACMMaC/Rmpi/sample.html. The code is as follows: # Load the R MPI package if it is not already loaded. if (!is.loaded("mpi_initialize")) { library("Rmpi") } # Spawn as many slaves as possible
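For readers without the link handy, that hello-world pattern typically continues along these lines (a sketch of the standard Rmpi idiom, not a verbatim copy of the Acadia sample):

mpi.spawn.Rslaves()

# Ask every slave to report its rank and the size of the communicator.
mpi.remote.exec(paste("I am", mpi.comm.rank(), "of", mpi.comm.size()))

mpi.close.Rslaves()
mpi.quit()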
2008 Mar 20
1
Rmpi and C Code, where to get the communicator
Hello, I am trying to write parts of my code in C to speed up the for-loops, but I want to do the basic operations in R (e.g. starting the cluster). My R code looks something like this: library(Rmpi) mpi.spawn.Rslaves() mpi.remote.exec(....) dyn.load("test.so") erg <- .Call("test", ....) .... mpi.close.Rslaves() mpi.quit() And my C function looks something like this: #include
2007 Dec 20
2
Multicore computation in Windows network: How to set up Rmpi
R-users, my question is related to earlier posts about the benefits of quad-core over dual-core computers; I am trying to set up a cluster of Windows XP computers so that eventually I could make use of 10-20 CPUs, but to learn how to do this I am playing around with two laptops. I thought that the package snow would come in handy in this situation, but to use snow, I would probably need to install
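On Windows, a socket cluster is usually much easier to get going than an MPI stack; a sketch assuming the two laptops can reach each other by host name, both have R and snow installed, and there is a way to start R remotely (the host names are placeholders):

library(snow)
cl <- makeCluster(c("laptop1", "laptop2"), type = "SOCK")   # placeholder host names
clusterApply(cl, 1:2,
             function(i) paste("task", i, "ran on", Sys.info()[["nodename"]]))
stopCluster(cl)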
2009 Mar 25
0
Rmpi - send/receive multiple objects to slaves
I've written a function that uses Rmpi to perform a calculation in parallel. It works fine, but I'm trying to improve efficiency in terms of memory usage and the amount of data being passed back and forth between master and slaves. Calculations are performed on a symmetric matrix in order to zero out some of the cells. In the parallel
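Since the point-to-point calls accept arbitrary R objects, one common way to cut down on round trips is to bundle several objects into a single list per message; a minimal sketch using the standard Rmpi API (the object names and the tag are placeholders):

# Master side: send a bundle of objects to slave 1 in one message.
payload <- list(mat = matrix(0, 3, 3), idx = 1:3, tol = 1e-8)
mpi.send.Robj(payload, dest = 1, tag = 10)

# Slave side: receive the bundle and unpack it.
got <- mpi.recv.Robj(source = 0, tag = 10)
mat <- got$mat; idx <- got$idx; tol <- got$tol

# Objects that every slave needs can instead be broadcast once, by name.
mpi.bcast.Robj2slave(payload)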
2009 Jun 11
1
help installing Rmpi
Hello R users and developers, I would like to install Rmpi so that I can take advantage of all of the CPUs in my computer, but I cannot get it to install, and I am not very good with Linux, which adds to the headache. I have looked through the help archive, but I have not been successful at getting Rmpi to work. I am not sure if I am even installing Open MPI correctly on Linux. I would really
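When Rmpi's configure script cannot find the MPI headers and libraries on its own, the paths can be passed explicitly; a sketch (the directories shown are placeholders for wherever Open MPI actually lives on the machine):

# Point Rmpi's configure script at an existing Open MPI installation.
install.packages("Rmpi",
                 configure.args = c("--with-Rmpi-type=OPENMPI",
                                    "--with-Rmpi-include=/usr/lib/openmpi/include",  # placeholder path
                                    "--with-Rmpi-libpath=/usr/lib/openmpi/lib"))     # placeholder path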
2007 Nov 28
0
Rmpi : openmpi and mpi.spawn.Rslaves
Hello, I'm using R on a 10-blade dual quad-core Rocks cluster, and trying to use Rmpi and snow. I basically wondered whether, at the moment, I ought to build Rmpi against a different MPI implementation (not Open MPI), and whether anyone could pass on any experience. I'm mainly worried about (a) the R server taking up 100% CPU time (I think this is a known issue with Rmpi and Open MPI) and (b)
2007 Jun 07
1
Ubu edgy + latest CRAN R + Rmpi = no go
I'm just curious whether anyone else has had problems with this configuration. I added the CRAN repository to apt and installed R 2.5.0 with apt-get. I then did an install.packages("Rmpi") on the cluster nodes. Rmpi loads and lamhosts() shows the nodes, but mpi.spawn.Rslaves() fails (something to do with temp files?). Rmpi works fine with the Edgy-native version of R (2.3.x) and installing
2003 Dec 30
1
Rmpi and PBS
Hello: does anybody know how to run Rmpi through PBS (Portable Batch System) on a cluster? I'm using a supercomputer which requires submitting jobs to a PBS queue for dispatching. I tried using mpirun in my PBS script, but all my Rslaves are spawned on the same node, which is not what I want. Any suggestions are welcome! Thanks in advance. Shengqiao Li
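The usual pattern under a batch scheduler is to let mpirun start a single R master and have it spawn one slave per allocated slot; a sketch of the R side, assuming the PBS allocation is visible to Rmpi through the MPI universe:

library(Rmpi)
# mpi.universe.size() reflects the slots granted to mpirun by the scheduler;
# spawn one slave per slot, keeping one slot for the master itself.
ns <- mpi.universe.size() - 1
mpi.spawn.Rslaves(nslaves = ns)
mpi.remote.exec(Sys.info()[["nodename"]])   # check that the slaves really landed on different nodes
mpi.close.Rslaves()
mpi.quit()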
2007 Mar 28
4
Rmpi and OpenMPI ?
Has anybody tried to use Rmpi with the OpenMPI library instead of LAM/MPI? LAM appears to be somewhat hardcoded in the Rmpi setup. Before I start to experiment with changing this, has anybody else tried Rmpi with non-LAM MPI implementations? Dirk -- Hell, there are no rules here - we're trying to accomplish something. -- Thomas A. Edison