similar to: Problem with mclapply -- losing output/data

Displaying 20 results from an estimated 300 matches similar to: "Problem with mclapply -- losing output/data"

2011 Jan 21
1
Reading gz compressed csv file - 'incomplete line found'
Hi all, I am trying to download, decompress and read a csv file. My code: myurl <- "ftp://ftp.ncbi.nih.gov/pub/geo/DATA/supplementary/series/GSE24729/GSE24729_MitoNuclear_suppl_male_stats.csv.gz" # myfile <- "GSE24729_MitoNuclear_suppl_male_stats.csv.gz" # download.file(myurl, destfile=myfile, mode="w") # mycon <- gzcon(gzfile(myfile,
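The preview above is cut off mid-call, so here is a hedged sketch of the download-then-read pattern being attempted; the binary download mode and the read.csv() step are assumptions on my part, not the thread's resolution:

    myurl  <- "ftp://ftp.ncbi.nih.gov/pub/geo/DATA/supplementary/series/GSE24729/GSE24729_MitoNuclear_suppl_male_stats.csv.gz"
    myfile <- "GSE24729_MitoNuclear_suppl_male_stats.csv.gz"
    download.file(myurl, destfile = myfile, mode = "wb")  # binary mode avoids corrupting the gz stream
    dat <- read.csv(gzfile(myfile))                       # gzfile() decompresses transparently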
2010 Nov 16
2
Debugging segfault in foreach
Hi, I'm using R 2.12 on a 64-bit Linux machine. When I run a chunk of code inside a foreach() %do% { ...} or %dopar% {...} (with the doMC backend) I keep getting a segfault. Running the *same* code within lapply(something, function(x) ... ) doesn't result in any segfaults. I'll paste the output below, but I'm not sure it would be helpful. I'm more curious how to go about smoking
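For reference, a minimal sketch of the foreach/doMC setup described, useful for checking whether the backend itself is at fault (the actual chunk of code from the post is not shown here):

    library(foreach)
    library(doMC)
    registerDoMC(cores = 2)                       # fork-based backend, Unix only
    res_fe <- foreach(x = 1:4) %dopar% sqrt(x)    # the same work run through foreach
    res_la <- lapply(1:4, sqrt)                   # ... and through plain lapply
    identical(unlist(res_fe), unlist(res_la))     # TRUE if both paths agree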
2011 Feb 04
2
Strange behaviour of read and writeBin
To me it seems like writeBin() writes one char/byte more than expected. > con <- file("testbin", "wb") > writeBin("ttccggaa", con) > close(con) > con <- file("testbin", "rb") > readBin(con, what="character") [1] "ttccggaa" > seek(con, what=NA) [1] 9 > close(con) > con <-
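The extra byte is the terminating NUL that writeBin() appends when writing a character string, which is why seek() reports 9 after 8 characters; a minimal check:

    con <- file("testbin", "wb")
    writeBin("ttccggaa", con)                 # 8 characters ...
    close(con)
    file.info("testbin")$size                 # ... occupy 9 bytes on disk
    readBin("testbin", what = "raw", n = 9)   # the ninth byte is 00, the NUL terminator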
2011 Feb 09
3
Problem with xlsx package
I am trying to read an xlsx spreadsheet (1506 rows, 501 columns, all populated) but I am getting the following error. Please advise as to how to get around this issue. > res <- read.xlsx("c:\\BSE_v2.xlsx",1) Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, : java.lang.OutOfMemoryError: Java heap space Here is the session info:
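A commonly suggested workaround for this error, offered here as an assumption rather than the thread's resolution, is to raise the JVM heap limit before rJava is initialized:

    options(java.parameters = "-Xmx4g")   # must be set before library(xlsx) loads rJava
    library(xlsx)
    res <- read.xlsx("c:\\BSE_v2.xlsx", sheetIndex = 1)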
2011 Nov 18
2
libpng warning: Application built with libpng-1.2.26 but running with 1.5.2
Hi, I have a problem on my Mac when trying to produce PNG images in R. I am getting these warnings with the ArrayQualityMetrics package: > arrayQualityMetrics(rma_fatBody, outdir="normData", force =T) The report will be written into directory 'normData'. KernSmooth 2.23 loaded Copyright M. P. Wand 1997-2009 (loaded the KernSmooth namespace) libpng warning: Application built
2010 Dec 31
3
survexp - example produces error
Dear All, reposting because I did not find a solution; maybe someone could check the example below. It's taken from the help page of survdiff. Executing it gives the error "Error in floor(temp) : Non-numeric argument to mathematical function". Best regards, Heinz library(survival) ## Example from help page of survdiff ## Expected survival for heart transplant patients based
2012 Dec 11
1
Bug in mclapply?
I've been using mclapply and have encountered situations where it gives errors or returns incorrect results. Here's a minimal example, which gives the error on R 2.15.2 on Mac and Linux: library(parallel) f <- function(x) NULL mclapply(1, f, mc.preschedule = FALSE, mc.cores = 1) # Error in sum(sapply(res, inherits, "try-error")) : # invalid 'type' (list) of argument
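Hedged workarounds (assumptions, not the eventual fix in R itself) are to leave prescheduling on or to avoid returning a bare NULL:

    library(parallel)
    f <- function(x) NULL
    mclapply(1, f, mc.preschedule = TRUE, mc.cores = 1)   # keep the default prescheduling, or
    mclapply(1, function(x) list(f(x)), mc.cores = 1)     # wrap results so they are not bare NULL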
2015 Feb 24
2
iterated lapply
> On Feb 24, 2015, at 10:50 AM, <luke-tierney at uiowa.edu> wrote: > > The documentation is not specific enough on the intended semantics in > this situation to consider this a bug. The original R-level > implementation of lapply was > > lapply <- function(X, FUN, ...) { > FUN <- match.fun(FUN) > if (!is.list(X)) > X <-
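The behaviour under discussion is lazy capture of the loop index by closures that never force their argument; a small illustration (the commented outcomes are those discussed in the thread and differ across R versions):

    g  <- function(i) function() i                 # closure that never forces `i`
    fs <- lapply(1:4, g)
    sapply(fs, function(f) f())                    # under lazy capture every closure can see the last index
    fs2 <- lapply(1:4, function(i) { force(i); function() i })
    sapply(fs2, function(f) f())                   # 1 2 3 4: forcing pins each value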
2015 Sep 02
4
mclapply memory leak?
Dear R-devel, I am running mclapply with many iterations over a function that modifies nothing and makes no copies of anything. It is taking up a lot of memory, so it seems to me like this is a bug. Should I post this to bugs.r-project.org? A minimal reproducible example can be obtained by first starting a memory monitoring program such as htop, and then executing the following code while
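The reproducible example in this preview is truncated; purely as a hypothetical stand-in for the kind of test described (a function that modifies nothing, iterated many times while an external monitor such as htop watches the processes):

    library(parallel)
    res <- mclapply(seq_len(1e5), function(i) NULL, mc.cores = 4)  # no-op work, many iterations
    gc()                                                           # R-side memory report after the run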
2019 Apr 05
2
Deep Replicable Bug With AMD Threadripper MultiCore
The following program is whittled down from a much larger program that always works on Intel, and always works on AMD's Threadripper with lapply but not mclapply. With mclapply on AMD, all processes go into "suspend" mode and the program then hangs. This bug is replicable on an AMD Ryzen Threadripper 2950X 16-Core Processor (128GB RAM), running the latest Ubuntu 18.04. The R version
2019 Apr 13
3
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
Hi Inaki, > "Performant"... in terms of what. If the cost of copying the data > predominates over the computation time, maybe you didn't need > parallelization in the first place. Performant in terms of speed. There's no copying in that example using `mclapply` and so it is significantly faster than other alternatives. It is a very simple and contrived example, but
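The no-copy point can be made concrete: forked workers read the parent's objects via copy-on-write, whereas a socket cluster must have the data shipped to it first. A generic sketch (not the example from the thread):

    library(parallel)
    x <- rnorm(1e7)                                                # large object in the parent process
    res_fork <- mclapply(1:4, function(i) sum(x), mc.cores = 2)    # no export, no up-front copy
    cl <- makeCluster(2)                                           # PSOCK cluster: separate R processes
    clusterExport(cl, "x")                                         # data serialized to every worker
    res_sock <- parLapply(cl, 1:4, function(i) sum(x))
    stopCluster(cl)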
2020 Apr 28
2
mclapply returns NULLs on MacOS when running GAM
Yes, I am running on RStudio 1.2.5033. I was also running this code without error on Ubuntu in RStudio. Checking again on the terminal, it does indeed work fine even with large data.frames. Any idea as to what interaction between RStudio and mclapply causes this? Thanks, Shian On 28 Apr 2020, at 7:29 pm, Simon Urbanek <simon.urbanek at R-project.org>
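One commonly suggested precaution, offered here as an assumption rather than the thread's conclusion, is to avoid forking inside the RStudio GUI and fall back to a socket cluster there; the helper name run_par is made up for illustration:

    library(parallel)
    run_par <- function(X, FUN, cores = 2L) {
      if (nzchar(Sys.getenv("RSTUDIO"))) {      # running under the RStudio IDE
        cl <- makeCluster(cores)
        on.exit(stopCluster(cl))
        parLapply(cl, X, FUN)
      } else {
        mclapply(X, FUN, mc.cores = cores)
      }
    }
    run_par(1:10, function(i) i^2)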
2011 Jun 10
3
CRAN package with dependencies on Bioconductor
Dear all, for a CRAN package that depends on a Bioconductor package I find two things annoying and would like to know whether there are workarounds: 1) Is there some unavoidable reason why install.packages does not also install missing dependencies from Bioconductor (in the correct version)? 2) In my understanding (please correct me if
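One way around point 1), offered as an assumption rather than what the thread settled on, is to make the Bioconductor repositories visible to install.packages() so that dependencies resolve from there as well; the package name below is hypothetical:

    setRepositories(ind = 1:2)            # CRAN plus BioC software (indices can differ by R version)
    install.packages("yourCRANpackage")   # Bioconductor dependencies now resolve too
    # With current tooling the same is handled by BiocManager::install("yourCRANpackage")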
2013 Feb 02
1
best practice for packages using mclapply to avoid tcltk
Dear R-devel friends: I'm back to bother you again about the conflict between mclapply and tcltk. I've been monitoring several packages that want to use mclapply to parallelize computations and need to figure out what should be done. It appears tcltk cannot be safely unloaded, so the best we can do is check for the presence of tcltk and stop if it is found before mclapply() is used. I
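A minimal sketch of the guard being described; the wrapper name is illustrative:

    safe_mclapply <- function(X, FUN, mc.cores = 2L, ...) {
      if ("tcltk" %in% loadedNamespaces())
        stop("tcltk is loaded and cannot be safely unloaded; refusing to fork with mclapply().")
      parallel::mclapply(X, FUN, mc.cores = mc.cores, ...)
    }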
2018 Mar 04
3
Change Function based on ifelse() condtion
Below is my full implementation (I have tried to keep it simple for demonstration): Lapply_me = function(X = X, FUN = FUN, Apply_MC = FALSE, ...) { if (Apply_MC) { return(mclapply(X, FUN, ...)) } else { if (any(names(list(...)) == 'mc.cores')) { myList = list(...)[!names(list(...)) %in% 'mc.cores'] } return(lapply(X, FUN, myList)) } } Lapply_me(as.list(1:4), function(xx) { if (xx ==
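The fallback branch above passes the filtered argument list to lapply() as a single extra argument; a hedged rewrite that drops mc.cores and splices the remaining arguments with do.call() instead:

    library(parallel)
    Lapply_me <- function(X, FUN, Apply_MC = FALSE, ...) {
      if (Apply_MC) return(mclapply(X, FUN, ...))
      dots <- list(...)
      dots[["mc.cores"]] <- NULL                        # drop the mclapply-only argument if present
      do.call(lapply, c(list(X = X, FUN = FUN), dots))  # splice the rest as real ... arguments
    }
    Lapply_me(as.list(1:4), function(xx) xx^2, Apply_MC = FALSE, mc.cores = 2)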
2019 Apr 12
2
SUGGESTION: Settings to disable forked processing in R, e.g. parallel::mclapply()
Just throwing my two cents in: I think removing/deprecating fork would be a bad idea for two reasons: 1) There are no performant alternatives 2) Removing fork would break existing workflows Even if replaced with something using the same interface (e.g., a function that automatically detects variables to export as in the amazing `future` package), the lack of copy-on-write functionality would
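Since the future package is named as the interface-compatible alternative, a tiny sketch of what that looks like without forking (an illustration, not part of the original post):

    library(future)
    library(future.apply)
    plan(multisession, workers = 2)                  # PSOCK-style workers, no fork
    x <- rnorm(1e6)
    res <- future_lapply(1:4, function(i) sum(x))    # `x` is detected and copied to the workers
    plan(sequential)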
2015 Feb 26
1
iterated lapply
> On Feb 25, 2015, at 5:35 PM, Benjamin Tyner <btyner at gmail.com> wrote: > > Actually, it depends on the number of cores: Under current semantics, yes. Each 'stream' of function calls is lazily capturing the last value of `i` on that core. Under Luke's proposed semantics (IIUC), the result would be the same (2,4,6,8) for both parallel and serial execution. This is
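A sketch of the core-dependence being described; the closure is a stand-in consistent with the numbers reported in the thread, and the commented outcomes reflect the then-current semantics (they differ on newer R versions, and forking is unavailable on Windows):

    library(parallel)
    g   <- function(i) function() 2 * i     # closure that never forces `i`
    fs1 <- mclapply(1:4, g, mc.cores = 1)
    fs4 <- mclapply(1:4, g, mc.cores = 4)
    sapply(fs1, function(f) f())            # one stream: every closure saw the final index (8 8 8 8)
    sapply(fs4, function(f) f())            # four streams: each saw its own core's last index (2 4 6 8)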
2011 Aug 22
3
Ignoring loadNamespace errors when loading a file
On a Unix machine I ran caret::rfe using the multicore package, and I saved the resulting object using save(lm2, file = "lm2.RData"). [Reproducible example below.] When I try to load("lm2.RData") on my Windows laptop, I get Error in loadNamespace(name) : there is no package called 'multicore' I completely understand the error and I would like to ignore it and
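A hedged sketch of simply trapping the error; note that the saved object may still be unusable until the missing package is installed, so this is not a full solution:

    res <- tryCatch(load("lm2.RData"),
                    error = function(e) {
                      message("load() failed: ", conditionMessage(e))
                      NULL
                    })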
2020 Apr 29
2
mclapply returns NULLs on MacOS when running GAM
Thanks Simon, I will take note of the sensible default for core usage. I'm trying to achieve small-scale parallelism, where tasks take 1-5 seconds, and to make fuller use of consumer hardware. It's not an HPC-worthy computation, but even laptops these days come with 4 cores and I don't see a reason not to make use of them. The goal for the current piece of code I'm working on is to bootstrap many
2011 Jul 20
4
R on Multicore for Linux
Hi all, I have R installed on a box with 16 cores running Red Hat Linux. I am handling a huge dataset (about 5 GB). Let's assume that my data is in the form of multiple structured logs. I access the data by using all.files(). Since the basic version of R uses only a single core by default, the processing of my analysis code is taking too much time. I got to
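A generic sketch of the parallel per-file processing pattern being asked about; the directory, the reader and the per-file work are placeholders:

    library(parallel)
    files <- list.files("/path/to/logs", full.names = TRUE)   # placeholder path
    process_one <- function(f) nrow(read.csv(f))              # placeholder per-file work
    res <- mclapply(files, process_one, mc.cores = 16)        # spread the files across 16 forked workers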