search for: n_max

Displaying 7 results from an estimated 7 matches for "n_max".

2017 Sep 14
1
Print All Warnings that Occur in All Parallel Nodes
..."Es_Ultima" = "c", "Comentarios" = "c"), locale = default_locale(), na = c("", " "), quoted_na = TRUE, quote = "\"", comment = "", trim_ws = TRUE, skip = 0, n_max = Inf, guess_max = min(1000, n_max), progress = FALSE)) } # C.2) parallel Package: Environment Settings no_cores <- detectCores() c1 <- makeCluster(no_cores) invisible(clusterEvalQ(c1, library(readr))) setDefaultCluster(c1) # C.3) parRapply Function Application...
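The snippet above calls readr::read_delim with n_max and guess_max inside a parallel cluster. A minimal standalone sketch of just the read_delim part (the file and its contents are hypothetical, written here only so the example is self-contained):

```r
library(readr)

# Hypothetical example file standing in for the real data
tmp <- tempfile(fileext = ".csv")
writeLines(c("x,y", "1,a", "2,b", "3,c"), tmp)

# n_max caps how many rows are read; guess_max caps how many rows
# are used for column-type guessing, as in the snippet above
df <- read_delim(tmp, delim = ",",
                 na = c("", " "), trim_ws = TRUE,
                 n_max = 2, guess_max = min(1000, 2))
```

With n_max = 2 only the first two data rows are read, regardless of how long the file is.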
2024 Apr 08
2
Exceptional slowness with read.csv
Hi Dave, That's rather frustrating. I've found vroom (from the package vroom) to be helpful with large files like this. Does the following give you any better luck? vroom(file_name, delim = ",", skip = 2459465, n_max = 5) Of course, when you know you've got errors & the files are big like that it can take a bit of work resolving things. The command line tools awk & sed might even be a good plan for finding lines that have errors & figuring out a fix, but I certainly don't envy you. All the...
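The skip/n_max combination suggested above lets you jump past rows already known to be good and inspect only a small window around the suspected error. A self-contained sketch (a tiny temp file stands in for the 4-million-row CSV, and the skip value is illustrative, not the 2459465 from the thread):

```r
library(vroom)

# Hypothetical stand-in for the large problem file
tmp <- tempfile(fileext = ".csv")
writeLines(c("a,b", "1,x", "2,y", "3,z", "4,w"), tmp)

# Skip past the header plus one known-good row, then read only the
# next two rows around the suspected error
window <- vroom(tmp, delim = ",", skip = 2, n_max = 2,
                col_names = FALSE)
```

Because skip consumes the header line too, col_names = FALSE is used so the first row in the window is treated as data, not as column names.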
2024 Apr 08
1
Exceptional slowness with read.csv
...on.au at gmail.com> wrote: > Hi Dave, > > That's rather frustrating. I've found vroom (from the package vroom) to be > helpful with large files like this. > > Does the following give you any better luck? > > vroom(file_name, delim = ",", skip = 2459465, n_max = 5) > > Of course, when you know you've got errors & the files are big like that it > can take a bit of work resolving things. The command line tools awk & sed > might even be a good plan for finding lines that have errors & figuring out > a fix, but I certainly don't...
2024 Apr 08
2
Exceptional slowness with read.csv
...09:18, Stevie Pederson wrote: > Hi Dave, > > That's rather frustrating. I've found vroom (from the package vroom) > to be helpful with large files like this. > > Does the following give you any better luck? > > vroom(file_name, delim = ",", skip = 2459465, n_max = 5) > > Of course, when you know you've got errors & the files are big like > that it can take a bit of work resolving things. The command line > tools awk & sed might even be a good plan for finding lines that have > errors & figuring out a fix, but I certainly d...
2024 Apr 08
4
Exceptional slowness with read.csv
Greetings, I have a csv file of 76 fields and about 4 million records. I know that some of the records have errors - unmatched quotes, specifically. Reading the file with readLines and parsing the lines with read.csv(text = ...) is really slow. I know that the first 2459465 records are good. So I try this: > startTime <- Sys.time() > first_records <- read.csv(file_name, nrows
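The approach started in the message above (read the known-good prefix with nrows, then probe the remainder in small pieces) can be sketched as follows. The file, row counts, and chunk size here are illustrative, not the poster's actual 2459465-row prefix:

```r
# Hypothetical small file standing in for the 4-million-row CSV
tmp <- tempfile(fileext = ".csv")
writeLines(c("a,b", "1,x", "2,y", "3,z"), tmp)

# Read the prefix known to parse cleanly
good <- read.csv(tmp, nrows = 2)

# Then read a small chunk from just past the good prefix;
# header = FALSE because the header line was already skipped
rest <- read.csv(tmp, skip = 3, header = FALSE, nrows = 5)
```

Shrinking the chunk read by the second call (or bisecting on skip) narrows down which line has the unmatched quote without re-parsing the whole file each time.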
2007 Apr 18
31
[PATCH 00/28] Updates for firstfloor paravirt-ops patches
Hi Andi, This is a set of updates for the firstfloor patch queue. Quick rundown: revert-mm-x86_64-mm-account-for-module-percpu-space-separately-from-kernel-percpu.patch separate-module-percpu-space.patch Update the module percpu accounting patch fix-ff-allow-percpu-variables-to-be-page-aligned.patch Make sure the percpu memory allocation is page-aligned