Dear Group,

I am using the deal package for modeling signal transduction nets. This
process is dead slow on a SunFire server with 8 GB of RAM. We have a grid
that can process much faster than one individual server. However, to start
the process on the grid, I have to give a command or submit a batch job.

Is there any way I can run R as a batch process? I tried the following:

  R CMD | library(deal) | data <- data.frame(read.table('file1', header=TRUE, row.names=1))

Here is what I do not know:

1. How can I point my data files to a function?
2. How can I create a function outside the R environment?

Could anyone help me? Thank you in advance.

Cheers,
Peri
See comments below.

On Thu, 2004-10-28 at 18:49, S Peri wrote:

> Is there any way I can run R as a batch process? I tried the following:
>
>   R CMD | library(deal) | data <- data.frame(read.table('file1', header=TRUE, row.names=1))

Something like this works on *NIX:

  echo " print(mean(rnorm(10))) " | R --no-save

HOWEVER, you will soon run into nightmares trying to backslash all the
quotes and other special characters, and trying to recall a long sequence
of commands you typed a few days ago. It is better to put all your code
into a file, say script.R, and do

  R --no-save < script.R > log_script.R &

See help("BATCH").

> 1. How can I point my data files to a function?

Two ways:

1) Hard-code the path inside script.R.

2) Take advantage of commandArgs(). For example, if your script.R contains

     data.path <- as.character( commandArgs()[3] )
     print(data.path)
     load(data.path)    # or read.delim or whatever
     ...

   then you can pass the path to test.rda via the command line:

     R --no-save < script.R /home/speri/data/test.rda > log_script.R &

   One thing to keep in mind is to pass absolute paths rather than relative
   paths (e.g. ../../data/test.rda); a relative path may not always work.

> 2. How can I create a function outside the R environment?

Huh? You can put all your functions into a file, say functions.R, and
source("/path/to/functions.R") in your script.R.
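Putting the pieces of that reply together, a minimal script.R might look like
the sketch below. It follows the commandArgs()[3] convention described above;
the deal call, the helper-file path and the output file name are illustrative
assumptions, not something taken from the original thread.

  ## script.R -- a minimal sketch, assuming the commandArgs()[3] convention above;
  ## the deal usage, helper path and output file name are illustrative assumptions
  library(deal)

  ## the third element of commandArgs() is the extra path appended after
  ## "R --no-save < script.R <path>"
  data.path <- as.character(commandArgs()[3])
  cat("Reading data from:", data.path, "\n")

  ## read the data table (an absolute path is safest, as noted above)
  dat <- data.frame(read.table(data.path, header = TRUE, row.names = 1))

  ## helper functions kept in a separate file can be pulled in here, e.g.
  ## source("/home/speri/R/functions.R")

  ## ... the deal network-modelling steps would go here ...

  ## save whatever the batch job produced so it survives the session
  save(dat, file = "script_results.rda")

It would then be submitted exactly as shown above, e.g.

  R --no-save < script.R /home/speri/data/file1 > log_script.R &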
Does the grid process faster because there is more than one machine, or
because each machine (or at least one machine) is faster? If the former, you
are asking about splitting an R process into several processes (to take
advantage of the grid), and you will want to look at the various R packages
that facilitate that: snow, Rmpi, rpvm, taskPR and perhaps others (browse
cran.r-project.org).

Reid Huntsinger

-----Original Message-----
From: Adaikalavan Ramasamy
Sent: Thursday, October 28, 2004 2:41 PM
To: S Peri
Cc: R-help
Subject: Re: [R] Running R on a grid engine
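If the "more than one machine" route is the right one, a minimal sketch of
what the snow package's socket interface looks like is given below. The
cluster size and the toy workload are made-up examples; on a real grid the
worker host names would come from your grid configuration rather than from
local socket workers.

  ## a minimal snow sketch, assuming a small socket cluster for illustration
  library(snow)

  ## start four socket workers; on a real grid you would pass the node
  ## host names instead of a worker count
  cl <- makeCluster(4, type = "SOCK")

  ## run an (illustrative) expensive task once per worker and collect the results
  res <- parLapply(cl, 1:4, function(i) mean(rnorm(1e6)))
  print(unlist(res))

  ## shut the workers down when done
  stopCluster(cl)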