similar to: scan() crash in Windows 98 (PR#3234)

Displaying 20 results from an estimated 1000 matches similar to: "scan() crash in Windows 98 (PR#3234)"

2003 Nov 18
3
plot, plot, methods, crash (PR#5173)
(If this only happens in Win 98, I'm sure I could live with it; it just may be helpful to report it.) Start up the R GUI, then > plot(1:4,1:4) # then close manually by clicking X > plot(1:4,1:4) # ditto > methods(plot) sometimes produces normal output and even the next prompt before crashing, but more often crashes immediately with no output. I can do any
2003 Aug 12
8
capturing output from Win 98 shell
How can I best achieve the following (works in S-PLUS): filenames <- dos("dir *.sas7bdat /b") What I am asking, more generically, is: how can I capture the output of a DOS command in R? I have tried using system("COMMAND.COM /c dir /b", intern=T, show.output.on.console=T) where intern: a logical, indicates whether to make the output of the command an R
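For illustration, a minimal sketch of the equivalent call in R for Windows (assuming the files are SAS datasets with the usual .sas7bdat extension); shell() wraps the command interpreter, and intern = TRUE returns the command's output as a character vector instead of echoing it:

# capture the output of a DOS command as a character vector (R for Windows)
filenames <- shell("dir /b *.sas7bdat", intern = TRUE)
# or name the command interpreter explicitly, as in the message above
filenames <- system("command.com /c dir /b *.sas7bdat", intern = TRUE)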
2003 Dec 22
2
Problems with read.table()
R version 1.8.1, OS Windows 98 Dear colleagues, if I import vegetation data (first row with column labels and first column with row labels) like 7MYRGERM;7AGRGIGA;7DRYOCTO;5MYRGERM;7SALELEA;7CHOCHON;7SALNIG?;....... t401;5;2;2;3;4;2;2;2;1;2;1;2;2;1;2;2;2;1;2;1;0;0;...... t403;3;0;0;6;4;0;3;0;0;3;0;0;0;0;3;0;0;0;2;0;2;0;..... with read.table("data.file", header=TRUE,
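The read.table() call is cut off above; a hedged guess at arguments that match the sample lines (semicolon separator, species codes as column labels, plot codes in the first column):

# sep and row.names are assumptions based on the sample lines above
veg <- read.table("data.file", header = TRUE, sep = ";", row.names = 1)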
2003 Nov 18
4
address for bug reports? (PR#5171)
bug.report() tells me to email to r-bugs@r-project.org, whereas the Web site http://www.r-project.org/ points me to r-bugs@biostat.ku.dk. Which should I believe? Simon Fear Senior Statistician Syne qua non Ltd Tel: +44 (0) 1379 644449 Fax: +44 (0) 1379 644445 email: Simon.Fear@synequanon.com web: http://www.synequanon.com
2003 May 28
2
Numbers that look equal, should be equal, but if() doesn'tsee as equal (repost with code included)
Try the following function (the name is supposed to be a joke, by the way), which will also do the right thing with NAs and characters. Use it as if(equal.enough(x,y)) rather than if(x==y), e.g. > equal.enough(0.1+0.2, 0.3) [1] TRUE My default of 15 significant figures may be overkill in many applications; be prepared to reduce this. Simon Fear "equal.enough" <- function(x, y,
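The definition is truncated above; a minimal sketch along the lines described, comparing numerics after rounding to 15 significant figures (the poster's version also handles NAs and characters, which this sketch omits):

equal.enough <- function(x, y, digits = 15) {
  # treat values as equal if they agree to `digits` significant figures
  signif(x, digits) == signif(y, digits)
}
equal.enough(0.1 + 0.2, 0.3)   # TRUE
0.1 + 0.2 == 0.3               # FALSE, because of binary rounding error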
2004 Aug 31
2
I've forgotten, why is box("") the default?
I've searched on CRAN for axes, axis, and other terms I've already forgotten, without (re)discovering the reason for S using "non-joining" axes by default, instead of box("l"). MASS points me towards Cleveland (1993) but I don't have ready access to this any more. Could someone give me a one-liner to justify this choice to a sceptic? It's something to do
2003 Sep 17
1
Just don't do it, surely? (was RE: Retrieve ... argument values)
Tony, I don't understand what you mean. Could you give an example? > -----Original Message----- > From: Tony Plate [mailto:tplate at blackmesacapital.com] > > ... I'm not saying "never write functions that use ...", > >I'm just saying "never write functions that depend on a particular > >argument being passed via ...". > > Several
2003 Aug 12
3
grep and gsub on backslash and quotes
The following code works, to gsub single quotes to double quotes: line <- gsub("'", '"', line) (that's a single quote within doubles then a double within singles if your viewer's font is not good). But The R Language Manual tells me that Quotes and other special characters within strings are specified using escape sequences: \' single quote \"
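For reference, both spellings of the replacement give the same double-quote character, and a literal backslash in a regular expression must be escaped twice, once for the string parser and once for the regex engine:

line <- "it's here"
gsub("'", '"', line)        # double quote written inside single quotes
gsub("'", "\"", line)       # same thing, using the \" escape sequence
gsub("\\\\", "/", "a\\b")   # replace a literal backslash: "a\b" becomes "a/b"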
2003 Aug 27
3
seeking help with with()
I tried to define a function like: fnx <- function(x, by.vars=Month) print(by(x, by.vars, summary)) But this doesn't work (it does not find x$Month; unlike other functions such as subset(), the INDICES argument to "by" does not look for variables in the data set x. This is fully documented, but I forget it every time). So I tried using "with": fnxx <- function(x, by.vars=Month)
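One hedged workaround, not from the thread itself: pass the grouping variable as a character string and index it out of the data frame, so by() receives the actual vector:

fnx <- function(x, by.vars = "Month") by(x, x[[by.vars]], summary)
fnx(airquality)   # airquality has a Month column, so the default works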
2004 Feb 06
1
0.1 + 0.2 != 0.3 revisited
Prompted by Peter Dalgaard's recent elegant "intbin" function, I have been playing with the extension to converting reals to binary representation. The decimal part can be done like this: decbase <- function(x, n=52, base=2) { if(n) { x <- x*base; paste(trunc(x), decbase(x%%1, n-1, base), sep="") } } n=52 default because that's the number of bits in
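A quick check of what this is getting at (the decbase() call assumes the definition quoted above):

print(0.1 + 0.2, digits = 17)   # 0.30000000000000004
print(0.3,       digits = 17)   # 0.29999999999999999
decbase(0.3, n = 8)             # first 8 bits of the binary fraction of 0.3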
2003 Oct 15
1
is.na(v)<-b (was: Re: Beginner's query - segmentation fault)
I think the thread ended up with several people (not only me) feeling certain they didn't like `is.na<-` but with the developers defending it and me not really understanding why. Uwe Ligges was going to come up with an example of `<- NA` going wrong (sorry Brian R, I mean behaving unexpectedly), but never did, and I think the problem has been fixed. It was apparently a problem with
2004 Mar 05
1
row-echelon form (was no subject)
I think one needs an LU decomposition rather than QR. However, I couldn't find anything off the shelf to do an LU, other than learning that determinant() now uses LU instead of QR or SVD, so the code to do it must be in there for those that want it. You'll probably need to divide rows of U by the first entry if you insist on the unique reduced REF. However, I can't see any reason
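For anyone who just wants the row-echelon form directly, a hedged sketch (not from the thread) of plain Gauss-Jordan elimination, with no numerical refinement beyond partial pivoting:

rref <- function(A, tol = 1e-10) {
  A <- as.matrix(A)
  r <- 1
  for (j in seq_len(ncol(A))) {
    if (r > nrow(A)) break
    p <- which.max(abs(A[r:nrow(A), j])) + r - 1   # partial pivoting
    if (abs(A[p, j]) < tol) next                   # no pivot in this column
    A[c(r, p), ] <- A[c(p, r), ]                   # swap the pivot row up
    A[r, ] <- A[r, ] / A[r, j]                     # scale the pivot to 1
    for (i in seq_len(nrow(A))[-r])
      A[i, ] <- A[i, ] - A[i, j] * A[r, ]          # clear the pivot column
    r <- r + 1
  }
  A
}
rref(matrix(c(1, 2, 3, 4, 5, 6, 7, 8, 10), 3, byrow = TRUE))   # gives the identity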
2003 Jul 31
6
Problem with data.frames
Hi, I just encountered a problem in R that may easily be fixed: If one uses attach for a data.frame e.g. 10000 times and forgets detach, then R gets incredibly slow (less than 10% of the original speed). My system: platform powerpc-apple-darwin6.0 arch powerpc os darwin6.0 system powerpc, darwin6.0 status major 1 minor 6.1 year 2002
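A minimal illustration of what is going on: each attach() without a matching detach() leaves another copy of the data frame on the search path, and every symbol lookup then has to scan all of them:

d <- data.frame(x = 1:10)
length(search())                                    # baseline
for (i in 1:100) attach(d, warn.conflicts = FALSE)  # 100 stale copies
length(search())                                    # 100 entries longer
for (i in 1:100) detach(d)                          # clean up again
length(search())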
2003 Nov 24
0
apologies (was RE: [R] ISOdate() and strptime())
Dear Brian and other R-developers, I have to say that I don't understand why what I wrote should have caused any offence. A smile was what I was hoping for. You know I devote more time than I am supposed to, to support R and its users, in partial repayment of my immeasurable debt to all the Developers. It's not much, it's sometimes misguided (I later discover), and my resources
2003 Sep 17
0
Just don't do it, surely? (was RE: Retrieve ... argument values)
Thanks for the insight. > -----Original Message----- > From: Prof Brian Ripley [mailto:ripley at stats.ox.ac.uk] <snip> > dots <- list(...) > haveYlim <- "ylim" %in% names(dots) > > is the sort of thing we still understand 5 years later. > I didn't say "understand", I said "easily follow". Obviously how "easily" is
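A hedged sketch of the idiom under discussion: a function that inspects ... via list(...) and supplies a default ylim only when the caller has not passed one:

myplot <- function(x, y, ...) {
  dots <- list(...)
  if (!("ylim" %in% names(dots)))
    dots$ylim <- range(y, na.rm = TRUE)       # default only if not supplied
  do.call(plot, c(list(x = x, y = y), dots))
}
myplot(1:10, rnorm(10))                       # gets the default ylim
myplot(1:10, rnorm(10), ylim = c(-3, 3))      # caller's ylim wins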
2003 Oct 08
1
is.na(v)<-b (was: Re: Beginner's query - segmentation fault)
Note this behaviour: > a<-"a" > a<-NA > mode(a) [1] "logical" > a<-"a" > is.na(a) <- T > mode(a) [1] "character" However after either way of assigning NA to a, is.na(a) is true, and it prints as NA, so I can't see it's ever likely to matter. [Why do I say these things? Expect usual flood of examples where it does
2003 Oct 08
0
is.na(v)<-b (was: Re: Beginner's query - segmentation fault)
Well, that's a convincing argument, but maybe it's the name that's worrying some of us. Maybe it would be more intuitive if called set.na (sorry, I mean setNA). Also "is.na<-" cannot be used to create a new variable of NAs, so is not a universal method, which is a shame for its advocates. I note also that for a vector you can assign a new NA using either TRUE or
2003 Oct 09
1
is.na(v)<-b (was: Re: Beginner's query - segmentation fault)
> -----Original Message----- > From: Richard A. O'Keefe [mailto:ok at cs.otago.ac.nz] <snip> > The very existence of an "is.na<-" which accepts a logical > vector containing FALSE as well as TRUE ... And don't forget this is not the only usage of is.na<-. In fact it is designed to take any valid indexing value. For example: > a<-1:10 >
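The quoted example is cut off; a short hedged continuation of the same idea, showing that the right-hand side of is.na()<- can be any valid index into the vector:

a <- 1:10
is.na(a) <- c(2, 5)   # numeric index: elements 2 and 5 become NA
a
b <- c(-1, 3, -2, 7)
is.na(b) <- b < 0     # logical index: negative entries become NA
b                     # NA 3 NA 7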
2003 Nov 19
5
ISOdate returns incorrect date?
Dear all, I have found the following (for me) incomprehensible behaviour of ISOdate (POSIXct): > ISOdate(1900,6,16) [1] "1900-06-15 14:00:00 Westeuropäische Sommerzeit" > ISOdate(1950,6,16) [1] "1950-06-16 14:00:00 Westeuropäische Sommerzeit" Note that in the first case I get the 15th of June back, not the 16th as I would have expected! This happened under R-1.7.1 on
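For context, ISOdate() constructs the time at 12:00 GMT and the result is then printed in the session's local time zone; asking for the date, or formatting in GMT, sidesteps that conversion (a hedged illustration, not an explanation of the 1900 discrepancy itself):

ISOdate(1950, 6, 16)                      # printed in the local time zone
format(ISOdate(1950, 6, 16), tz = "GMT")  # "1950-06-16 12:00:00"
as.Date(ISOdate(1950, 6, 16))             # "1950-06-16"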
2003 Nov 18
3
Copula calculation in R?
Hello, does anyone know of a function in R that can calculate copulas? Or if anyone has any code available I would be more than interested. Thank you in advance /Thomas