Displaying 20 results from an estimated 38 matches for "mpirun".
2011 Nov 07
1
Strange behaviour of ssh
Hello everyone!
I'm running Debian 6.0.3 with OpenSSH_5.5p1 and have a problem with the execution of remote commands via ssh.
It seems as if the first command isn't looked up in all "$PATH" dirs.
Normally I should get the version information of mpirun twice here, but the first call fails:
$ ssh cluster2 mpirun --version ; mpirun --version
bash: mpirun: command not found
mpirun (Open MPI) 1.4.3
Here I should get the place of "mpirun" twice but the first "which" doesn't find an "mpirun" and prints nothing:
$ ssh cluster...
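The behaviour above is consistent with `ssh host command` running the command in a non-interactive shell, so profile scripts that extend PATH are never sourced and lookup is a plain walk of the (minimal) remote PATH. A local sketch of that lookup mechanism (the paths here are illustrative, not from the original post):

```shell
# Command lookup is just a PATH walk; with a pruned PATH, lookup fails
# exactly like the remote mpirun case.
OLDPATH=$PATH
PATH=/usr/bin:/bin
command -v ls                              # found via PATH search
PATH=/nonexistent
command -v ls || echo "command not found"  # lookup fails, like the ssh case
PATH=$OLDPATH                              # restore
# On the cluster, the usual fixes are calling the binary by absolute path
# (ssh cluster2 /full/path/to/mpirun --version) or extending PATH in a
# file the non-interactive shell does read, e.g. ~/.bashrc before its
# interactivity guard.
```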
2013 Dec 09
3
compat-openmpi issues after upgrade to CentOS 6.5
...yone can shed some light on an issue we are having
with compat-openmpi after upgrading CentOS to version 6.5.
Some of our cluster applications are dependent on an older version of
OpenMPI, so we are using compat-openmpi. Up to CentOS 6.4 this was
version 1.4.3:
% /usr/lib64/compat-openmpi/bin/mpirun -V
mpirun (Open MPI) 1.4.3
but after the upgrade to CentOS 6.5 it suddenly reports version 1.5.3:
% /usr/lib64/compat-openmpi/bin/mpirun -V
mpirun (Open MPI) 1.5.3
But rpm/yum still reports that version 1.4.3 is installed:
% rpm -qv compat-openmpi
compat-openmpi-1.4.3-1.1.el6.x86_64
Is this del...
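One way to reconcile the mismatch is to ask the rpm database which package actually owns the binary on disk and whether the installed files still match it. A diagnostic sketch (these commands only make sense on the affected CentOS box):

```shell
# Which package owns the mpirun that now reports 1.5.3?
rpm -qf /usr/lib64/compat-openmpi/bin/mpirun
# Verify the installed files of compat-openmpi against the rpm database;
# altered or replaced files are listed with change flags.
rpm -V compat-openmpi
```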
2006 Apr 04
1
Mpirun with R CMD scripts
...machine
"compute-0-12.local" "x86_64"
[[3]]
nodename machine
"compute-0-13.local" "x86_64"
> stopCluster(c1)
[1] 1
> q()
--------------------------------------
But on running the same script with mpirun I get the following error.
*****************************************************************
[srividya at cheaha ~]$ mpirun -np 3 R -slave R CMD BATCH TestSnow.R
/home/srividya/R/library/snow/RMPInode.sh: line 9: 19431 Segmentation
fault ${RPROG:-R} --vanilla >${OUT:-/dev/null} 2>&...
2009 May 25
1
lam vs. openmpi
Dear R Debian Users:
I wrote a quick C program (eventually to become R code) and compiled it as:
mpicc -o greet greet.c
So far so good. Now when I run mpirun, this happens:
erin at erin-laptop:~$ mpirun -np 2 greet
-----------------------------------------------------------------------------
It seems that there is no lamd running on the host erin-laptop.
This indicates that the LAM/MPI runtime environment is not operating.
The LAM/MPI runtime environ...
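The error suggests `mpirun` is resolving to LAM/MPI while `greet` was compiled against a different stack's `mpicc`. A diagnostic sketch (paths and output are illustrative):

```shell
# Check that mpicc and mpirun come from the same MPI installation.
which mpicc mpirun
mpirun --version   # LAM identifies itself as "LAM/MPI"; Open MPI as "(Open MPI)"
# If mpirun really is LAM's, either start its daemon first with lamboot,
# or put the matching MPI bin directory first in PATH so both tools agree.
```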
2008 Jul 11
1
mpirun question with Rmpi
Dear R People:
I'm running Rmpi on a single machine and I have the following
statement from the command line:
mpirun -np 3 ./R --no-save < eek1.in >stuff4.out
The stuff4.out file only contains the third result. Is there a way to
fix this so that it shows all 3 sets, please?
Thanks in advance,
Erin
--
Erin Hodgess
Associate Professor
Department of Computer and Mathematical Sciences
University of Housto...
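With a single `>stuff4.out` redirect, all three ranks share one stdout and later ranks can overwrite or interleave earlier output. A sketch of per-rank capture with Open MPI's `--output-filename` option (the exact per-rank file naming varies between Open MPI versions):

```shell
# Give each of the 3 ranks its own output file instead of one shared redirect.
mpirun -np 3 --output-filename stuff4 ./R --no-save < eek1.in
# Typically produces one file per rank, e.g. stuff4.0, stuff4.1, stuff4.2,
# each holding that rank's complete output.
```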
2011 Feb 03
0
R: mpirun .C and R
...ode
----------------------------------------------------------------------------------------------------
#MSUB -l nodes=1:ppn=1
#MSUB -l walltime=100:00:00
#MSUB -m be
#MSUB -V
#MSUB -o /export/home/example/Runs/eg.out
#MSUB -e /export/home/example/Runs/eg.err
#MSUB -d /export/home/example/Runs
mpirun --mca mpi_warn_on_fork 0 -np 1 /export/home/R-2.12.1/bin/R
--slave -f /export/home/example/Runs/eg.r
echo "DONE multiple run!"
----------------------------------------------------------------------------------------------------
###
UNIVERSITY OF CAPE TOWN
This e-mail is subj...
2010 Dec 17
1
[R-sig-hpc] Error in makeMPIcluster(spec, ...): how to get a minimal example for parallel computing with doSNOW to run?
...le for parallel computing via "foreach" + "doSNOW" to run on a computer cluster (Brutus from ETH Zurich). The minimal example is given below. It runs perfectly fine on my MacBook but when I submit it as a batch job via ...
> bsub -n 3 -R "select[model==Opteron8380]" mpirun R --no-save -q -f doSNOW_minimal.R
> ... it does not work. The output is also given below. The error is "Error in makeMPIcluster(spec, ...) : a cluster already exists 1". The only similar thing I found on the web is http://www.mail-archive.com/r-help at stat.math.ethz.ch/msg35501.html...
2008 May 30
1
R and Openmpi
...ocesses running as I think I should.
Rmpi version 0.5.5
Openmpi version 1.1
Viglen HPC with (effectively) 9 blades and 8 nodes on each blade.
myhosts file contains details of the 9 blades, but specifies that there are 4 slots on each blade (to make sure I leave room for other users).
When running mpirun -bynode -np 2 -hostfile myhosts R --slave --vanilla task_pull.R
1. I get as many R slaves as there are slots defined in my myhosts file (there are 36 slots defined, and I get 36 slaves), regardless of the setting of -np; the master goes on the first machine in the myhosts file.
2. The .Rout file co...
2013 Jun 16
2
Problem in linking a library in R package
...nt to use in my R package. My R package has a src folder in which there
is a makevars.in file:
### Setup R source code and objects.
PKG_CPPFLAGS = @PKG_CPPFLAGS@
PKG_LIBS = -L/home/g/Desktop/Project -fpmpip
### For user configuration.
USER_CONF = Makeconf
### Start making here.
all: $(SHLIB)
	@echo "MPIRUN = @MPIRUN@" > $(USER_CONF)
	@echo "MPIEXEC = @MPIEXEC@" >> $(USER_CONF)
	@echo "ORTERUN = @ORTERUN@" >> $(USER_CONF)
	@echo "TMP_INC = @TMP_INC@" >> $(USER_CONF)
	@echo "TMP_LIB = @TMP_LIB@" >> $(USER_CONF)
	@echo "MPI_ROOT = @MPI_R...
2008 Jul 01
2
problem with mpiexec and Rmpi
Dear R People:
I'm having some trouble with mpiexec and Rmpi.
I would like to be able to pass in the number of "children" via the
mpiexec command (from the command line).
this is in SUSE10.1, with R-2.7.1
Here are my files:
cat eb.R
library(Rmpi)
mpi.remote.exec(paste("i am",mpi.comm.rank(),"of",mpi.comm.size()))
mpi.quit()
hodgesse at
2009 Sep 22
0
snowfall: missing MPI node
Hello,
I don't know if the question pertains to Rmpi, snow or snowfall.
I run my job by:
mpirun -np N -hostfile $PBS_NODEFILE RMPISNOW -f my-script.r --slave
In the snowfall sfInit call I have to specify one CPU fewer than in
the mpirun call
sfInit(parallel=TRUE, cpus=N-1, type="MPI")
otherwise I receive an error similar to: "cluster size N-1 already
running" (s...
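The arithmetic behind the "one less CPU" rule: the master R process started by mpirun occupies one of the N MPI slots, so only N-1 remain for snowfall workers. A sketch with an assumed N=8 (the value of N is not in the original post):

```shell
# mpirun -np $N starts N processes; rank 0 becomes the snowfall master,
# leaving N-1 slots for workers, which is what sfInit's cpus must request.
N=8
WORKERS=$((N - 1))
echo "sfInit(parallel=TRUE, cpus=$WORKERS, type=\"MPI\")"
```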
2013 Jul 18
2
001: RELIABILITY FIX: March 15, 2013
Can someone please provide me with a little more information about
this? It could be the source of some issues I am seeing with
mpirun/mpiexec.hydra/ssh (post earlier today), and information about
what it leads to (and any signatures) would be helpful. Thanks.
--
Professor Laurence Marks
Department of Materials Science and Engineering
Northwestern University
www.numis.northwestern.edu 1-847-491-3996
"Research is to see what...
2008 Apr 07
2
problem with Rmpi 0.5-5 and openmpi
...5 ::ffff:10.30.1.15 22
PKG_CONFIG_PATH=/opt/gnome/lib64/pkgconfig
LESSOPEN=lessopen.sh %s
INFOPATH=/usr/local/info:/usr/share/info:/usr/info:/opt/gnome/share/info
LESSCLOSE=lessclose.sh %s %s
G_BROKEN_FILENAMES=1
JAVA_ROOT=/usr/lib/jvm/java
COLORTERM=1
_=/usr/bin/env
pearman at master:~> which mpirun
/opt/openmpi/bin/mpirun
Also note: mpich is also installed and is also in the PATH, after openmpi
BTW. I have seen a number of posts concerning the pt2pt error message.
Still, I was unable to understand how they might apply to fixing the
current problem.
Help would be greatly appreciated....
2007 Sep 03
1
Snow on Windows Cluster
Hello,
the package snow is not working on a windows cluster with MPICH2 and
Rmpi. There is an error in makeCluster:
launch failed: CreateProcess(/usr/bin/env
"RPROG="C:\Programme\R\R-2.5.1\bin\R" "OUT=/dev/null" "R_LIBS="
C:/Programme/R/R-2.5.1/library/snow/RMPInode.sh) on 'cl1' failed, error
3 - The system cannot find the specified path.
I
2013 Jun 07
1
cannot load pbdMPI package after compilation
...suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether icc -std=gnu99 accepts -g... yes
checking for icc -std=gnu99 option to accept ISO C89... none needed
checking for mpirun... mpirun
checking for mpiexec... mpiexec
checking for orterun... orterun
checking for sed... /bin/sed
checking for mpicc... mpicc
checking for ompi_info... ompi_info
checking for mpich2version... F
found sed, mpicc, and ompi_info ...
>> TMP_INC_DIRS = /opt/openmpi/1.6.4-1/intel-13.1.1/includ...
2003 Dec 30
1
Rmpi and PBS
Hello:
Does anybody know how to run Rmpi through PBS (Portable Batch System) on a
cluster computer? I'm using a supercomputer which requires submitting jobs
to a PBS queue for dispatching. I tried using mpirun in my PBS script, but all
my Rslaves are spawned on the same node. This is not desired.
Any suggestions are welcome!
Thanks in advance.
========================================
Shengqiao Li
Research Associate
The Department of Statistics
PO Box 6330
West Virginia University
Morgantown, WV 26506...
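Slaves landing on one node usually means mpirun was never told which nodes PBS allocated. Since Rmpi spawns its slaves itself, a common pattern is to start a single master and hand mpirun the PBS node list. A sketch (the script name is hypothetical, and the flag spelling differs between MPI implementations):

```shell
# Inside the PBS script: start one master R process; $PBS_NODEFILE lists
# the allocated nodes, so spawned Rmpi slaves can land on distinct hosts.
mpirun -np 1 -machinefile $PBS_NODEFILE R --no-save -f rmpi_job.R
```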
2009 Jul 14
1
Snow/openmpi
I'm running R/snow on a small cluster with openSUSE, openmpi, and openssh. I
start up R with "mpirun -n 1 R --no-save". That works, but it strikes me how
easily I get kicked out of R whenever I run into syntax errors. Is there a
way to avoid this, for instance, by starting up a regular R session and
invoking/activating(?) openmpi within R, e.g. by passing on extra arguments
to the makeCluster...
2011 Jan 14
0
Fwd: Re: [R-sig-hpc] Working doSNOW foreach openMPI example
What was missing was the R in the command line:
"mpirun -n --hostfile /home/hostfile R --no-save -f rtest.R"
Hope this helps
mario
On 13-Jan-11 22:08, Justin Moriarty wrote:
> Hi,
> Just wanted to share a working example of doSNOW and foreach for an openMPI cluster. The function eddcmp() is just an examp...
2007 May 25
1
trouble with snow and Rmpi
Dear R People:
I am having some trouble with the snow package.
It requires MPICH2 and Rmpi.
Rmpi is fine. However, I downloaded the MPICH2 package, and installed.
There is no mpicc, mpirun, etc.
Does anyone have any suggestions, please?
Thanks in advance!
Sincerely,
Erin Hodgess
Associate Professor
Department of Computer and Mathematical Sciences
University of Houston - Downtown
mailto: hodgess at gator.uhd.edu
2011 Jun 22
2
Queries regarding Lustre Throughput Numbers with mdtest benchmark
Hi,
I have a query regarding Lustre throughput numbers with the mdtest benchmark. I
am running the mdtest benchmark with the following options:
/home/meshram/mpich2-new/mpich2-1.4/mpich2-install/bin/mpirun -np 256
-hostfile ./hostfile ./mdtest -z 3 -b 10 -I 5 -v -d /tmp/l66
where ,
mdtest - is the standard benchmark to test metadata operations. [
https://computing.llnl.gov/?set=code&page=sio_downloads ]
/tmp/l66 is my Lustre mount.
I am using 1Gige Network with TCP transport.
hostfile has 8 hos...