similar to: Bug in mclapply?

Displaying 20 results from an estimated 30000 matches similar to: "Bug in mclapply?"

2012 Nov 16
0
Bug in parallel / mclapply
Hi, there seem to be some (small) bugs in the mclapply function in parallel. I discovered this in the current R release version and checked that it is still present in R-devel. I think it only occurs in the code path used when mc.preschedule = FALSE. Here are two examples: a) library(parallel); mclapply(list(), identity, mc.preschedule=FALSE) Error in
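The excerpt reports that the empty-input case errors when mc.preschedule = FALSE, whereas lapply(list(), identity) simply returns an empty list. Below is a minimal sketch of the reported reproduction plus a caller-side guard; the safe_mclapply wrapper is illustrative, not part of the report, and assumes a Unix-alike where forking is available.
```
library(parallel)

## Reported reproduction (errors rather than returning list()):
# mclapply(list(), identity, mc.preschedule = FALSE)

## Illustrative caller-side guard for the empty-input case:
safe_mclapply <- function(X, FUN, ..., mc.preschedule = FALSE) {
  if (length(X) == 0L) return(list())
  mclapply(X, FUN, ..., mc.preschedule = mc.preschedule)
}

safe_mclapply(list(), identity)    # list()
safe_mclapply(as.list(1:4), sqrt)  # same result as lapply(1:4, sqrt)
```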
2013 Apr 11
1
parallel::mclapply does not return try-error objects with mc.preschedule=TRUE
Hello, Consider this: 1) library(parallel) res <- mclapply(1:2, stop) #Warning message: #In mclapply(1:2, stop) : # all scheduled cores encountered errors in user code is(res[[1]], 'try-error') #[1] FALSE 2) library(parallel) res <- mclapply(1:2, stop, mc.preschedule=FALSE) #Warning message: #In mclapply(1:2, stop, mc.preschedule = FALSE) : # 2 function calls resulted in an
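As a companion to the excerpt, here is a sketch of how a caller can scan an mclapply result for failures; per the report, the per-element "try-error" objects show up with mc.preschedule = FALSE, so that mode is used here, and the error messages are illustrative.
```
library(parallel)

res <- mclapply(1:2, function(i) stop("boom ", i), mc.preschedule = FALSE)

## Detect which elements came back as errors and pull out their messages.
failed <- vapply(res, inherits, logical(1), what = "try-error")
which(failed)
messages <- lapply(res[failed],
                   function(e) conditionMessage(attr(e, "condition")))
```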
2023 Jun 09
2
inconsistency in mclapply.....
Dear members, I am using pbmcapply to parallelise my code, but the following code doesn't work: > LYG <- pbmclapply(LYGH, FUN = arfima, mc.cores = 2, mc.preschedule = FALSE) | | 0%, ETA NA^ It just hangs. But the
2023 Jun 09
1
inconsistency in mclapply.....
On Fri, 9 Jun 2023 18:01:44 +0000 akshay kulkarni <akshay_e4 at hotmail.com> wrote: > > LYG <- pbmclapply(LYGH,FUN = arfima,mc.cores = 2,mc.preschedule = > > FALSE) > | > | > 0%, ETA NA^ > > It just hangs. My questions from the last time still stand: 0) What is your
2020 Oct 08
2
exiting mclapply early on error
Hey folks, Is there any way to exit an mclapply early on error? For example, in the following mclapply loop, I have to wait for all the processes to finish before the error is returned. ``` mclapply(X = 1:12, FUN = function(x) {Sys.sleep(0.1); if(x == 4) stop()}, mc.cores = 4, mc.preschedule = F) ``` When there are many calculations in FUN, it takes a long time before the error is returned.
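mclapply() itself offers no early exit, so one workaround (a sketch under the assumption of a Unix-alike where forking is available) is to drop down to mcparallel()/mccollect(), poll for finished jobs, and kill the rest as soon as an error shows up. The polling loop and kill-on-error policy below are illustrative, not an established API.
```
library(parallel)
library(tools)   # pskill(), SIGTERM

## One forked job per input; job 4 fails. (This forks all jobs at once,
## unlike mclapply's mc.cores throttling.)
jobs <- lapply(1:12, function(x)
  mcparallel({ Sys.sleep(0.1); if (x == 4) stop("failed at ", x); x }))

results <- list()
while (length(jobs) > 0) {
  got <- mccollect(jobs, wait = FALSE, timeout = 1)   # poll for up to 1 second
  if (is.null(got)) next
  err <- vapply(got, inherits, logical(1), what = "try-error")
  if (any(err)) {
    for (j in jobs) pskill(j$pid, SIGTERM)            # stop remaining children
    mccollect(jobs, wait = FALSE)                     # reap what is left
    stop("stopping early: ",
         conditionMessage(attr(got[[which(err)[1]]], "condition")))
  }
  results <- c(results, got)
  jobs <- Filter(function(j) !as.character(j$pid) %in% names(got), jobs)
}
```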
2015 Jul 24
1
Memory limitations for parallel::mclapply
Hello, I have been having issues using parallel::mclapply in a memory-efficient way and would like some guidance. I am using a 40-core machine with 96 GB of RAM. I've tried to run mclapply with 20, 30, and 40 mc.cores, and each time it has practically brought the machine to a standstill, to the point where I have to do a hard reset. When running mclapply with 10 mc.cores, I can see that each process
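No resolution appears in the excerpt, but the usual advice (offered here as a hedged sketch, not as the thread's answer) is that each fork's return value and every memory page it touches get copied, so with 20-40 workers the copies can exceed RAM: cap mc.cores well below the core count, keep mc.preschedule = TRUE so only one child is forked per core, and return small summaries rather than large objects from FUN.
```
library(parallel)

big <- matrix(rnorm(1e7), ncol = 100)    # stand-in for a large input

res <- mclapply(seq_len(ncol(big)),
                function(j) mean(big[, j]),  # small per-task return value
                mc.cores = 8,                # deliberately below the 40 cores
                mc.preschedule = TRUE)       # one fork per core, not per task
```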
2023 May 16
1
mclapply enters into an infinite loop....
Dear members, I am using arfima in an mclapply construction (from the parallel package): Browse[2]> LYG <- mclapply(LYGH, FUN = arfima, mc.cores = detectCores()) ^C Browse[2]> LYG <- mclapply(LYGH[1:10], FUN = arfima, mc.cores = detectCores()) ^C Browse[2]> LYG <- mclapply(LYGH[1:2], FUN = arfima, mc.cores = detectCores()) ^C You can see that I am
2013 Nov 11
2
problem using rJava with parallel::mclapply
Dear all, I ran into an issue trying to parse Excel files in parallel using XLConnect: the process hangs forever. Martin Studer, the maintainer of XLConnect, kindly investigated the issue and identified rJava as a possible cause of the problem: This does not work (hangs): library(parallel) require(rJava) .jinit() res <- mclapply(1:2, function(i) {
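A workaround often suggested for this class of problem (offered here as a sketch, not as the thread's conclusion) is to avoid starting the JVM in the parent before forking, because the JVM's threads do not survive a fork; instead, load and initialise rJava inside each child, accepting the per-child JVM startup cost.
```
library(parallel)

## Pattern reported to hang: .jinit() in the parent, then fork.
# library(rJava); .jinit()
# mclapply(1:2, function(i) .jcall("java/lang/System", "J", "currentTimeMillis"))

## Sketch of the workaround: each forked child starts its own JVM.
res <- mclapply(1:2, function(i) {
  library(rJava)
  .jinit()
  .jcall("java/lang/System", "J", "currentTimeMillis")
}, mc.cores = 2)
```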
2020 Jan 10
2
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
I'd like to pick up this thread started on 2019-04-11 (https://hypatia.math.ethz.ch/pipermail/r-devel/2019-April/077632.html). Modulo all the other suggestions in this thread, would my proposal of being able to disable forked processing via an option or an environment variable make sense? I've prototyped a patch that works like: > options(fork.allowed = FALSE) >
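The patch itself is not shown in the excerpt; as a rough user-level approximation of the idea (my own sketch, not the proposed change to parallel), a wrapper can consult the option and fall back to sequential lapply() when forking is disallowed.
```
library(parallel)

lapply_maybe_fork <- function(X, FUN, ..., mc.cores = 2L) {
  if (isFALSE(getOption("fork.allowed", TRUE))) {
    lapply(X, FUN, ...)                        # forking disabled: run serially
  } else {
    mclapply(X, FUN, ..., mc.cores = mc.cores)
  }
}

options(fork.allowed = FALSE)
lapply_maybe_fork(1:4, sqrt)                   # runs via lapply()
```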
2012 Feb 23
1
segfault when using data.table package in conjunction with foreach
Hi all, I'm trying to use the data.table package within a foreach loop. I'm grabbing 500M rows of data at a time from two different files and then doing an aggregate/tapply-like operation in data.table after that. I had planned on running the foreach loop 39 times at once, one run per file for the 39 files I have, but obviously that won't work until I figure out why the segfault is occurring. The
2020 Jan 11
1
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
> On Jan 10, 2020, at 3:10 PM, Gábor Csárdi <csardi.gabor at gmail.com> wrote: > > On Fri, Jan 10, 2020 at 7:23 PM Simon Urbanek > <simon.urbanek at r-project.org> wrote: >> >> Henrik, >> >> the example from the post works just fine in CRAN R for me - the post was about homebrew build so it's conceivably a bug in their libraries. > > I
2020 Jan 11
2
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
Henrik, the whole point and only purpose of mc* functions is to fork. That's what the multicore package was about, so if you don't want to fork, don't use mc* functions - they don't have any other purpose. I really fail to see the point - if you use mc* functions you're very explicitly asking for forking - so your argument is like saying that print() should have an option to
2020 Apr 28
2
mclapply returns NULLs on MacOS when running GAM
Dear R-devel, I am experiencing issues running GAM models using mclapply: it fails to return any values if the input data becomes large. For example, the code runs fine with a df of 100 rows but fails at 1000. library(mgcv) library(parallel) > df <- data.frame( + x = 1:100, + y = 1:100 + ) > > mclapply(1:2, function(i, df) { + fit <- gam(y ~ s(x, bs =
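The follow-up later in this list points at an RStudio/fork interaction. One way to sidestep forking altogether (a sketch, not the thread's resolution; the s(x, bs = "cs") basis is filled in for illustration since the excerpt is cut off) is a PSOCK cluster, whose workers are fresh R processes, at the cost of exporting the data to them.
```
library(mgcv)
library(parallel)

df <- data.frame(x = 1:1000, y = 1:1000)

cl <- makeCluster(2)                  # fork-free PSOCK workers
clusterEvalQ(cl, library(mgcv))       # load mgcv on each worker
res <- parLapply(cl, 1:2, function(i, df) {
  fit <- gam(y ~ s(x, bs = "cs"), data = df)
  predict(fit)
}, df = df)
stopCluster(cl)
```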
2010 Apr 13
0
Multicore mapply
Quick question regarding multicore versions of mapply. Package 'multicore' provides a parallelized version of 'lapply', called 'mclapply'. I haven't found any parallelized version of 'mapply', however (although one can use the lower-level function 'parallel', it becomes harder to control the number of spawned processes etc.). Is anyone aware of a
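For completeness: the parallel package that later absorbed multicore provides mcmapply(), and the same effect can be had by running mclapply() over an index, which is the usual workaround with the older multicore package. A short sketch:
```
library(parallel)

x <- 1:5
y <- 11:15

## Direct analogue of mapply() in the parallel package:
mcmapply(function(a, b) a + b, x, y, mc.cores = 2)

## Same effect via mclapply() over an index (the multicore-era workaround):
unlist(mclapply(seq_along(x), function(i) x[i] + y[i], mc.cores = 2))
```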
2020 Jan 10
2
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
If I understand the thread correctly, this is an RStudio issue, and I would suggest that the developers consider using pthread_atfork() so RStudio can handle forking as they deem fit (bail out with an error or make RStudio work). Note that in principle the functionality requested here can be easily implemented in a package, so R doesn't need to be modified. Cheers, Simon
2012 Dec 31
3
weird bug with parallel, RSQlite and tcltk
Hello, I spent a lot of time on a weird bug, and I just managed to narrow it down. In parallel code (here with parallel::mclapply, but I got it with doMC/multicore too), if the tcltk library is loaded, R hangs when trying to open a DB connection. I got the same behaviour on two different computers, one dual-core and one with two quad-core Xeons. Here's the code: library(parallel) library(RSQLite)
2020 Apr 28
2
mclapply returns NULLs on MacOS when running GAM
Yes, I am running on RStudio 1.2.5033. I was also running this code without error on Ubuntu in RStudio. Checking again in the terminal, it does indeed work fine even with large data.frames. Any idea as to what interaction between RStudio and mclapply causes this? Thanks, Shian On 28 Apr 2020, at 7:29 pm, Simon Urbanek <simon.urbanek at R-project.org>
2010 Aug 12
1
multicore mclapply error
I'm running R 2. on a Mac running 10.6.4, a dual-core MacBook Pro. I'm having a funny time with multicore. When I run mclapply with 2 cores, R borks with the following error. The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec(). Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__()
2011 Mar 22
2
Problem with mclapply -- losing output/data
Hello, I am running large simulations, which unfortunately I can't really replicate here because the code is so extensive. I rely heavily on mclapply, but I realize that I'm losing data somewhere. There are two worrisome symptoms: 1) I am getting 'NULL' as a return value for some (but not all) elements of the output when I use mclapply, but not if I use lapply > tmp2[1:3]
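The cause is not shown in the excerpt; a defensive pattern to consider (sketched below with an illustrative run_one task, not taken from the thread itself) is to detect NULL or try-error slots after the parallel pass and recompute just those sequentially, so nothing is silently dropped.
```
library(parallel)

run_one <- function(i) { Sys.sleep(0.01); i^2 }   # stand-in for the real task

res <- mclapply(1:100, run_one, mc.cores = 4)

## Recompute any slots that came back NULL or as errors.
bad <- vapply(res, function(r) is.null(r) || inherits(r, "try-error"),
              logical(1))
if (any(bad)) res[bad] <- lapply(which(bad), run_one)
```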
2019 Apr 30
2
mccollect with NULL in R 3.6
Dear All, I'm running into issues with calling mccollect on a list containing NULL using R 3.6 (this used to work in 3.5.3): jobs <- lapply( list(NULL, 'foobar'), function(x) mcparallel(identity(x))) mccollect(jobs, wait = FALSE, timeout = 0) #> Error in names(res) <- pnames[match(s, pids)] : #> 'names' attribute [2] must be the same length as the vector
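Assuming the problem is specifically that a job's value is NULL, one caller-side workaround to try (a sketch, not a confirmed fix for the R 3.6 regression) is to have each job return its value wrapped in a length-one list, so no bare NULL reaches mccollect(), and unwrap afterwards.
```
library(parallel)

jobs <- lapply(list(NULL, "foobar"),
               function(x) mcparallel(list(identity(x))))  # wrap in list()
raw <- mccollect(jobs)            # wait for both jobs
res <- lapply(raw, `[[`, 1L)      # unwrap: list(NULL, "foobar"), named by pid
```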