Colleagues,
Several days ago, I wrote to the list about a lengthy delay in the startup
of a script. I will start with a brief summary of that email. I
have a 10,000-line script of which the final 3000 lines constitute a
function. The script contains time markers (cat(date())) so that I can
determine how fast it is read. When I invoke the script from the OS
("R --slave < Script.R"; similar performance with R 2.6.1 or 2.7.0 on
a Mac / Linux / Windows), the first 7000 lines are read in 5 seconds,
but it then takes 2 minutes to read the remaining 3000 lines. I inquired
as to the cause of the lengthy reading of the final 3000 lines.
Subsequently, I whittled the 3000 lines down to ~ 1000 (moving 2000 lines
into smaller functions). Now the first 9000 lines still read in ~ 6
seconds and the final 1000 lines in ~ 15 seconds. Better, but not ideal.
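For reference, a minimal sketch of the timing-marker approach described above (the marker text and function name here are hypothetical; the actual 10,000-line script is not shown):

```r
## Each marker prints a timestamp, so the elapsed parse/read time
## between any two markers can be measured from the output.
cat(date(), "-- top of script\n")

## ... thousands of lines of ordinary code ...

cat(date(), "-- before the large function\n")

## bigFunction stands in for the ~1000-line function at the end
## of the script; its body is what reads slowly.
bigFunction <- function(x) {
  ## ... ~1000 lines in the real script ...
  x
}

cat(date(), "-- end of script\n")
```

The gap between the last two timestamps is where the delay appears when the script is sourced from the OS.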
However, I have just encountered a new situation that I don't understand.
The R code is now embedded in a graphical interface built with Real
Basic. When I invoke the script in that environment, the first 9000
lines take the usual 6 seconds. But, to my surprise, the final 1000
lines take only 2 seconds!
There is one major difference between the implementations. With the GUI,
the commands are "pushed", i.e., the GUI opens R and then sends it a
continuous stream of code.
Does anyone have any idea as to why the delay should be so different
in the two settings?
Dennis
Dennis Fisher MD
P < (The "P Less Than" Company)
Phone: 1-866-PLessThan (1-866-753-7784)
Fax: 1-415-564-2220
www.PLessThan.com