2019 Nov 27
2
error in parallel:::sendMaster
Hi Andreas,
The error is reported when a child process cannot send results to the
master process; it originates from an error returned by write(), i.e.
when write() returns -1 or 0. The logic around the writing has not
changed since R 3.5.2. It should not be related to the printing in the
child, only to returning the value. The problem may be originating from
the execution environment,
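For context, here is a minimal sketch (Unix-only) of the path described above; the value the child returns, not what it prints, is what travels back through the pipe that sendMaster() writes to:

library(parallel)

job <- mcparallel({
  print("visible on the child's stdout, not sent through the pipe")
  sum(1:10)  # this return value is serialized and sent via sendMaster()
})
mccollect(job)  # the master reads and unserializes the value from the pipe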
2019 Nov 28
1
error in parallel:::sendMaster
Hi Andreas,
thank you very much, and good job finding that it was EBADF. Now the question is
why the pipe was closed prematurely; it could be closed accidentally by R
(a race condition in the cleanup code in fork.c) or possibly by some
other code running in the same process (maybe the R program itself or
some other code it runs). Maybe we can take this off the list and come
back when we know the cause
2019 Nov 27
0
error in parallel:::sendMaster
Hi again,
One important correction to my first message: I misinterpreted the output. In that R session, two input files were actually processed one after the other in a loop. The first (with 88 parts) went fine. The second (with 85 parts) produced the sendMaster errors and failed. If (in a new session via Rscript) I process only the second input file, it works. The other observations on R vs
2019 Nov 28
0
error in parallel:::sendMaster
Hi Tomas,
Thanks for your prompt reply and your offer to help. I might need to get back to this since I am not too experienced in debugging these kinds of issues. Anyway, I gave it a try and I think I have found the immediate cause:
I installed the debug symbols (r-base-core-dbg), placed https://github.com/wch/r-source/blob/tags/R-3-5-2/src/library/parallel/src/fork.c in cwd and changed the
2019 Dec 04
0
error in parallel:::sendMaster
Hi all,
With the help of Tomas, I was able to track the issue down: prior to R v3.6.0, the parallel package passed an uninitialized variable as the file descriptor argument to the close system call.
In my particular R session, this uninitialized variable (reproducibly) held the value 7, which corresponded to the file descriptor of the write end of the pipe the second child would use to
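As a hedged illustration (run_parallel is a hypothetical helper, not from the thread), one way to sidestep the affected fork code path on pre-3.6.0 versions is to fall back to a PSOCK cluster, which launches separate R processes instead of forking:

run_parallel <- function(X, FUN, cores = 2L) {
  if (getRversion() < "3.6.0") {
    cl <- parallel::makePSOCKcluster(cores)  # fresh R processes, no fork()
    on.exit(parallel::stopCluster(cl))
    parallel::parLapply(cl, X, FUN)
  } else {
    parallel::mclapply(X, FUN, mc.cores = cores)
  }
}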
2012 Apr 10
1
multicore/mcparallel error
Hello everyone,
I'm trying to parallelize an R script I have written. To do this, I am
first trying to use the multicore package, because I've had some previous
success with that.
The function I'm trying to parallelize is illumqc. I'd like to create a
separate process for each of 8 files, contained in the vector "files".
Below is my code:
for(i in
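Since the loop is cut off, here is a hypothetical reconstruction using the parallel package (the successor to multicore), assuming illumqc() takes a single file name from the "files" vector:

library(parallel)

# one forked worker per file, results collected at the end
jobs <- lapply(files, function(f) mcparallel(illumqc(f)))
results <- mccollect(jobs)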
2018 Sep 19
5
segfault issue with parallel::mclapply and download.file() on Mac OS X
I have an lapply function call that I want to parallelize. Below is a very
simplified version of the code:
url_base <- "https://cloud.r-project.org/src/contrib/"
files <- c("A3_1.0.0.tar.gz", "ABC.RAP_0.9.0.tar.gz")
res <- parallel::mclapply(files, function(s) download.file(paste0(url_base, s), s))
Instead of downloading a couple of files in parallel, I get a
2012 Feb 23
1
segfault when using data.table package in conjunction with foreach
Hi all,
I'm trying to use the data.table package within a foreach loop. I'm
grabbing 500M rows of data at a time from two different files and then
doing an aggregate/tapply-like operation in data.table after that. I
had planned on running the foreach loop 39 times at once for the 39 files
I have, but obviously that won't work until I figure out why the
segfault is occurring. The
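A hedged sketch of what the described workflow might look like (files, value, and id are stand-ins for the poster's actual data, and this does not by itself address the segfault):

library(foreach)
library(doParallel)
library(data.table)

registerDoParallel(cores = 4)

res <- foreach(f = files) %dopar% {
  dt <- fread(f)                        # read one large file
  dt[, .(total = sum(value)), by = id]  # aggregate/tapply-like step
}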
2020 Jan 10
2
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
I'd like to pick up this thread started on 2019-04-11
(https://hypatia.math.ethz.ch/pipermail/r-devel/2019-April/077632.html).
Modulo all the other suggestions in this thread, would my proposal of
being able to disable forked processing via an option or an
environment variable make sense? I've prototyped a working patch that
works like:
> options(fork.allowed = FALSE)
>
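Until such a setting exists in R itself, a user-space approximation is possible; the wrapper below is hypothetical and merely reuses the fork.allowed option name from the prototype:

safe_mclapply <- function(X, FUN, ...) {
  if (isFALSE(getOption("fork.allowed", TRUE))) {
    lapply(X, FUN, ...)              # sequential fallback, no fork()
  } else {
    parallel::mclapply(X, FUN, ...)
  }
}

options(fork.allowed = FALSE)
safe_mclapply(1:3, function(x) x^2)  # runs sequentially, without forking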
2013 May 31
1
R 3.0.1 : parallel collection triggers "long memory not supported yet"
Dear R developers:
...
7: lapply(seq_len(cores), inner.do)
8: FUN(1:3[[3]], ...)
9: sendMaster(try(lapply(X = S, FUN = FUN, ...), silent = TRUE))
Selection: .....................
Error in sendMaster(try(lapply(X = S, FUN = FUN, ...), silent = TRUE)) :
  long vectors not supported yet: memory.c:3100
Admittedly, my outcome will be a very big list, with 30,000 elements, each
containing data frames
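One hedged workaround when results are too large to pass back through sendMaster(): have each worker save its piece to disk and return only a short file path, so the pipe never carries a long vector:

res_files <- parallel::mclapply(seq_len(30000), function(i) {
  out <- data.frame(x = rnorm(10))  # stand-in for the real per-element result
  path <- file.path(tempdir(), sprintf("part-%05d.rds", i))
  saveRDS(out, path)                # the big object stays out of the pipe
  path                              # only this short string goes via sendMaster()
}, mc.cores = 4)
big_list <- lapply(res_files, readRDS)  # reassemble in the master if needed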
2018 Jun 21
1
DOCUMENTATION(?): parallel::mcparallel() gives various types of "Error in unserialize(r) : ..." errors if value is of type raw
I stumbled upon the following:
f <- parallel::mcparallel(raw(0L))
parallel::mccollect(f)
# $`77083`
# NULL
but
f <- parallel::mcparallel(raw(1L))
parallel::mccollect(f)
# Error in unserialize(r) : read error
traceback()
# 2: unserialize(r)
# 1: parallel::mccollect(f)
(restarting because the above appears to corrupt the R session)
f <- parallel::mcparallel(raw(2L))
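The symptoms are consistent with sendMaster() passing raw vectors through as-is (unserialized), so that the unserialize() in mccollect() fails on them. If that is the cause, a simple workaround is to wrap the raw value so that a list is serialized instead:

f <- parallel::mcparallel(list(raw(1L)))  # wrap the raw result in a list
r <- parallel::mccollect(f)
r[[1]][[1]]  # the original raw(1L)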
2012 Mar 23
1
serialization regression in 2.15.0 beta
Hi,
I am experiencing a problem related to serialization behavior in
2.15.0 beta (binary installed from Debian unstable) and 2.16.0 (from
svn) that is not present in 2.14.2 (binary from Debian testing).
I don't fully understand the problem. Also, I tried but have not yet
been able to create a small, self-contained example that reproduces
the problem. However, I do have a large, not
2020 Jan 10
2
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
If I understand the thread correctly, this is an RStudio issue, and I would suggest that the developers consider using pthread_atfork() so RStudio can handle forking as they deem fit (bail out with an error or make RStudio work). Note that in principle the functionality requested here can be easily implemented in a package, so R doesn't need to be modified.
Cheers,
Simon
2018 Oct 04
0
segfault issue with parallel::mclapply and download.file() on Mac OS X
Thanks for the report, but unfortunately I cannot reproduce this on my system
(neither macOS nor Linux, from the command line) to debug it. Did you run
this in the command-line version of R?
I would not be surprised to see such a crash if executed from a
multi-threaded application, say from some GUI or frontend that runs
multiple threads, or from some other R session where a third party
library
2018 Sep 20
0
segfault issue with parallel::mclapply and download.file() on Mac OS X
This code actually happens to work for me on macOS, but I think in
general you cannot rely on performing HTTP requests in fork clusters,
i.e. with mclapply().
Fork clusters create worker processes by forking the R process and
then _not_ executing another R binary. (Which is often convenient,
because the new processes will inherit the memory image of the parent
process.)
Fork without exec is not
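Following that advice, a hedged alternative for the download example is a PSOCK cluster, which launches fresh R processes rather than forking and so avoids the fork-without-exec pitfalls with macOS system libraries:

url_base <- "https://cloud.r-project.org/src/contrib/"
files <- c("A3_1.0.0.tar.gz", "ABC.RAP_0.9.0.tar.gz")

cl <- parallel::makePSOCKcluster(2)  # fresh worker processes, no fork()
res <- parallel::parLapply(cl, files,
                           function(s, base) download.file(paste0(base, s), s),
                           base = url_base)  # pass url_base explicitly to the workers
parallel::stopCluster(cl)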
2013 Aug 02
1
segfault and RunSnowWorker: not found
Hi,
While I suspect that this is an issue peculiar to my machine (Debian squeeze amd64, R version 3.0.1, up-to-date packages), I'm hoping that somebody on this list may be able to give me suggestions on how to troubleshoot and fix the following:
> library (snow)
> cl <- makeSOCKcluster(c("localhost","localhost"))
sh: 1: RunSnowWorker: not found
I presume/hope
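One hedged workaround is to use parallel::makePSOCKcluster() instead, which launches its workers via Rscript directly and so does not depend on snow's RunSnowWorker script being on the PATH:

cl <- parallel::makePSOCKcluster(c("localhost", "localhost"))
parallel::clusterEvalQ(cl, Sys.getpid())  # confirm both workers respond
parallel::stopCluster(cl)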
2020 Jan 11
1
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
> On Jan 10, 2020, at 3:10 PM, Gábor Csárdi <csardi.gabor at gmail.com> wrote:
>
> On Fri, Jan 10, 2020 at 7:23 PM Simon Urbanek
> <simon.urbanek at r-project.org> wrote:
>>
>> Henrik,
>>
>> the example from the post works just fine in CRAN R for me - the post was about a Homebrew build, so it's conceivably a bug in their libraries.
>
> I
2020 Jan 10
6
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
Henrik,
the example from the post works just fine in CRAN R for me - the post was about a Homebrew build, so it's conceivably a bug in their libraries. That's exactly why I was proposing a more general solution where you can simply define a function in user-space that will issue a warning or stop on fork; it doesn't have to be part of core R, and there are other packages that use fork() as
2019 May 19
2
Race condition on parallel package's mcexit and rmChild
I've been hacking with the parallel package for some time and built a
parallel processing framework with it. However, although very rarely,
I did notice an "ignoring SIGPIPE signal" error every now and then.
After a deep dig into the source code, I think I found something worth
noticing.
In short, writing to the pipe in the C function mc_exit(SEXP sRes) may cause
a SIGPIPE. Code from