Displaying 20 results from an estimated 12000 matches similar to: "[linux] connection never times out"
2009 Mar 21
1
unlink fails to remove symbolic links
unlink fails to remove symbolic links. This is more prominent now --
when a package creates symbolic links during installation, the 00LOCK
directory is not removed.
Martin
> setwd(tempdir())
> fl <- tempfile(); file.create(fl)
[1] TRUE
> lnFile <- tempfile(); system(paste("ln -s", fl, lnFile))
> list.files()
[1] "file19495cff" "file74b0dc51"
> unlink(fl);
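A hedged sketch of how the truncated transcript above presumably continues
(file names are per-session; behaviour as reported, on a platform where
"ln -s" is available):

setwd(tempdir())
fl <- tempfile(); file.create(fl)
lnFile <- tempfile(); system(paste("ln -s", fl, lnFile))
unlink(fl)       # removes the regular file
unlink(lnFile)   # reportedly leaves the (now dangling) symbolic link in place
list.files()     # the link's file name was still listed in the report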
2013 Mar 19
1
source, sys.source and error line numbers
Hi,
is there a way to retrieve the line number where an error occurred when
sourcing a file in a tryCatch statement? Is it stored somewhere accessible?
It is not found in the error object.
Consider the following code/output and note the difference in the traceback
between source (has line number) and sys.source (has no line number).
Thank you,
Renaud
########
# code
########
codefile <-
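A hedged sketch of one way to recover source locations (not from the original
post; the file contents and the sys.calls()/srcref walk are illustrative
assumptions):

codefile <- tempfile(fileext = ".R")
writeLines(c("f <- function() stop('boom')", "f()"), codefile)

locs <- NULL
tryCatch(
  withCallingHandlers(
    source(codefile, keep.source = TRUE),
    error = function(e) {
      # while the stack is still intact, collect srcrefs attached to the calls
      refs <- lapply(sys.calls(), attr, "srcref")
      locs <<- Filter(Negate(is.null), refs)
    }
  ),
  error = function(e) e   # the error object itself carries no line number
)
# each srcref knows its file and line:
lapply(locs, function(r)
  paste0(utils::getSrcFilename(r), "#", utils::getSrcLocation(r, "line")))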
2017 Jul 15
3
readLines without skipNul=TRUE causes crash
I am not able to reproduce this on a Linux platform:
#######################
fn1 <- "/home/jdnewmil/Downloads/Microdados ENEM 2009/Dados Enem 2009/DADOS_ENEM_2009.txt"
sessionInfo()
## R version 3.4.1 (2017-06-30)
## Platform: x86_64-pc-linux-gnu (64-bit)
## Running under: Ubuntu 14.04.5 LTS
##
## Matrix products: default
## BLAS: /usr/lib/libblas/libblas.so.3.0
## LAPACK:
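A hedged sketch of the readLines() comparison the truncated excerpt presumably
goes on to run (fn1 as defined above; the 2.4GB data file is not reproduced
here):

system.time(lines_skip <- readLines(fn1, skipNul = TRUE))
system.time(lines_all  <- readLines(fn1))   # the call reported to crash R on Windows
length(lines_all); length(lines_skip)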
2011 Jul 11
4
Save generic plot to file (before rendering to device)
I am looking for a way to save a plot (graphics contents) to a file after the
plot has been calculated but before it has been rendered. More specifically,
assume that I made a plot that took a very long time to produce; I would
like to save this plot to a generic file that I can later, on a different
machine, render to PDF, PNG, or SVG using the usual R graphics devices,
without recalculating
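A minimal sketch using recordPlot()/replayPlot() from grDevices (one possible
approach, not necessarily what the thread settled on; note that recorded plots
are only guaranteed to replay under the same R version):

pdf(tempfile(fileext = ".pdf"))          # throwaway device to draw on once
dev.control(displaylist = "enable")      # required so the plot can be recorded
plot(rnorm(1e3))                         # stand-in for the expensive plot
p <- recordPlot()
dev.off()
saveRDS(p, "myplot.rds")                 # hypothetical file name

# later, possibly on another machine with the same R version:
p <- readRDS("myplot.rds")
png("myplot.png"); replayPlot(p); dev.off()
pdf("myplot.pdf"); replayPlot(p); dev.off()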
2010 Nov 12
1
issue with ... in write.fwf in gdata
Dear R-list
This is just a message to report that there is an issue with write.fwf in the gdata package (from version 2.5.0 on). It does not seem to accept further arguments to write.table, such as "eol", as the help file indicates; it stops when executing tmp <- lapply(x, format.info, ...).
Great package though - I use it a lot except for this function :)
See example below.
>
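A hedged sketch of the kind of call that reportedly fails (the data frame and
file name are illustrative, not the poster's example; gdata >= 2.5.0 assumed):

library(gdata)
df <- data.frame(a = 1:3, b = letters[1:3])
# passing a write.table argument such as eol through ... reportedly errors,
# because ... is also forwarded to format.info() inside write.fwf():
write.fwf(df, file = "out.txt", eol = "\r\n")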
2017 Jul 16
3
readLines without skipNul=TRUE causes crash
hi, thank you for attempting this. it looks like your unix machine unzipped
the txt file without corruption -- if you copied over the same txt file to
windows 7, i don't think that would reproduce the problem? i think it
needs to be the corrupted text file where R.utils::countLines( txtfile
) gives 809367. i am able to reproduce on two distinct windows machines
but no guarantee i'm
2017 Jul 16
2
readLines without skipNul=TRUE causes crash
hi, yep, there are two problems -- but i think only the segfault is within
the scope of a base R issue? i need to look closer at the corrupted
decompression and figure out whether i should talk to the brazilian
government agency that creates that .rar file or open an issue with the
archive package maintainer. my goal in this thread is only to figure out
how to replicate the goofy text file so
2017 Jul 15
4
readLines without skipNul=TRUE causes crash
hi, thanks Dr. Murdoch
i'd appreciate it if anyone on r-help could help me narrow this down? i
believe the segfault occurs because there's a single line of about 4GB with
embedded nuls, but i am not sure how to artificially construct that?
the lodown package can be removed from my example -- it is just for file
download caching, so `lodown::cachaca` can be replaced with
`download.file`
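a hedged sketch (not from the thread) of one way to build a single long line
with nul bytes embedded in it; sizes here are scaled far down from the ~4GB
case described above:

tf <- tempfile()
con <- file(tf, "wb")
block <- rep(as.raw(c(65:90, 0L)), length.out = 2^16)  # letters with nuls mixed in, no newline
for (i in 1:(2^10)) writeBin(block, con)               # ~64MB; scale the loop up towards 4GB
close(con)
# readLines(tf) without skipNul = TRUE warns about the embedded nuls here;
# the reported crash involved the full-size file on Windows.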
2017 Jul 15
0
readLines without skipNul=TRUE causes crash
hi, i realized that the segfault happens on the text file in a new R
session. so, creating the segfault-generating text file requires a
contributed package, but prompting the actual segfault does not -- pretty
sure that means this is a base R bug? submitted here:
https://bugs.r-project.org/bugzilla3/show_bug.cgi?id=17311 hopefully i am
not doing something remarkably stupid. the text file
2017 Jul 15
2
readLines without skipNul=TRUE causes crash
I see the problem on Windows 10, R-3.4.0, R.exe. It is not compiled for
debugging but gdb gives some information when I attach the debugger after
the 'R..has stopped working' popup appears. I don't know how reliable it
is:
(gdb) info threads
Id Target Id Frame
* 4 Thread 11848.0x1500 0x00007ffe38dc8861 in ntdll!DbgBreakPoint ()
from
2008 Jul 25
1
serialize() via temporary file is heaps faster than doing it directly (on Windows)
Hi,
FYI, I just noticed that on Windows (but not Linux) it is orders of
magnitude (in the example below it's 50x) faster to serialize() an object
to a temporary file and then read it back than to serialize the object
directly to a raw vector (connection = NULL). This has, for instance, an
impact on how fast digest::digest() can provide a checksum.
Example:
x <- 1:1e7;
t1 <- system.time(raw1 <- serialize(x, connection=NULL));
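A hedged completion of the truncated comparison (timings are machine- and
version-dependent; the file-based variant mirrors the description above):

x <- 1:1e7
t1 <- system.time(raw1 <- serialize(x, connection = NULL))   # direct, to a raw vector

t2 <- system.time({                                          # via a temporary file
  tf <- tempfile()
  con <- file(tf, "wb")
  serialize(x, con)
  close(con)
  raw2 <- readBin(tf, what = "raw", n = file.info(tf)$size)
  unlink(tf)
})

identical(raw1, raw2)            # check that both routes produce the same bytes
t1["elapsed"] / t2["elapsed"]    # the reported ratio was roughly 50 on Windows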
2017 Jul 15
0
readLines without skipNul=TRUE causes crash
I am not able to reproduce your segfault on a Windows 7 platform either:
##########################
fn1 <- "d:/DADOS_ENEM_2009.txt"
sessionInfo()
## R version 3.4.1 (2017-06-30)
## Platform: x86_64-w64-mingw32/x64 (64-bit)
## Running under: Windows 7 x64 (build 7601) Service Pack 1
##
## Matrix products: default
##
## locale:
## [1] LC_COLLATE=English_United States.1252
## [2]
2017 Jul 17
1
readLines without skipNul=TRUE causes crash
hi, thanks again for taking the time. since corrupted compression prompted
the segfault for me in the first place, i've just posted the text file
as-is. it's a 2.4GB file so to be avoided on a metered internet
connection. i've updated the bugzilla report at
https://bugs.r-project.org/bugzilla3/show_bug.cgi?id=17311 with more
relevant info. these lines of code crash both windows R
2017 Jul 16
0
readLines without skipNul=TRUE causes crash
I am stuck. The archive package won't compile for me on Ubuntu, and the CRANextra repo seems to be down so I cannot install packages on Windows right now. Perhaps you can zip the corrupt text file and put it online somewhere? Don't use the archive package to pack it since there seem to be issues with that tool on your machine.
I would discourage you from harassing the Brazilian
2017 Jul 17
0
readLines without skipNul=TRUE causes crash
The original file had a lot of trailing null bytes so I tried making a
similar file with:
tf <- tempfile(); con <- file(tf, "wb")
# ~2GB of printable ASCII with no newline, i.e. one extremely long line ...
for (i in 1:(2^15 - 1)) writeBin(rep(as.raw(32:127), length.out = 2^16), con)
# ... followed by ~2GB of trailing nul bytes
for (i in 1:(2^15 - 1)) writeBin(rep(as.raw(0L), length.out = 2^16), con)
close(con)
log2(file.size(tf))
#[1] 31.99996
Reading this with readLines() caused R-3.4.0 to segfault in Rf_con_pushback
2008 Jan 04
1
Addendum: nls (with SSlogis model and upper limit) never returns (PR#10548)
Peter Dalgaard reminded me to be more specific about my computing platform; it's Debian 4.1.1-19 on a 32-bit Pentium 4 machine (Dell Optiplex GX620).
The problem I described (nls not returning) also occurs with different data at other values of the scal parameter.
Regards
Hendrik Weisser
--
2016 Feb 16
2
iconv to UTF-16 encoding produces error due to embedded nulls (write.table with fileEncoding param)
If I execute the code from the "?write.table" examples section
x <- data.frame(a = I("a \" quote"), b = pi)
# (omitted code)
write.csv(x, file = "foo.csv", fileEncoding = "UTF-16LE")
the resulting CSV file has a size of 6 bytes, which is too short
(truncated):
""",3
The problem seems to be the iconv function:
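A hedged illustration of the underlying behaviour (not the poster's code):
converting to UTF-16LE yields bytes with embedded nuls, which R character
strings cannot hold, so the conversion misbehaves unless the raw bytes are
requested:

iconv("a \" quote", from = "", to = "UTF-16LE")                 # reported to fail / truncate
iconv("a \" quote", from = "", to = "UTF-16LE", toRaw = TRUE)   # raw bytes, including 0x00s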
2017 Jul 16
0
readLines without skipNul=TRUE causes crash
So you are saying there are two problems... one that produces a corrupt file from a valid compressed file, and one that segfaults when presented with that corrupt file? Can you please confirm the file name and run md5sum on it and share the result so we can tell when the file problem has been reproduced?
--
Sent from my phone. Please excuse my brevity.
On July 16, 2017 3:21:21 AM PDT, Anthony
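For reference, a hedged sketch of the requested check from within R (the file
name is the one used earlier in the thread and is only illustrative here):

tools::md5sum("DADOS_ENEM_2009.txt")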
2011 Aug 29
1
Out-of-date manual or small bug in R CMD check?
Hey all,
I get a warning about an unsupported file type in the data directory during
R CMD check (for R 2.13.1) if I use the save function to create a .Rdata
file, but if I save the same object to a .rda file, there is no warning.
Section 1.1.5 (pg 11 of the pdf) of the Writing R Extensions manual (2.13.1)
appears to say that .Rdata files should be fine:
" Data files can have one of three types as