2004 Jun 19 (0 replies): Charts and Graphs
Hi Roland:
I'd encourage you to take a look at the following page:
http://lilt.ilstu.edu/gmklass/pos138/datadisplay/badchart.htm
Best,
/Arin Basu
>Rau, Roland wrote:
> > It might be a bit off-topic, but can anyone suggest some online
> > material concerning good graph / bad graph examples?
> > I imagine something like:
> > a) These are the data and this is
2004 Jul 08 (0 replies): R cookbook (Re: omit complete cases)
Hi Ivo:
You might check out Paul Johnson's page:
http://www.ukans.edu/~pauljohn/R/Rtips.html
HTH,
Arin
On Thu, 08 Jul 2004, ivo_welch-rstat8783@mailblocks.com wrote:
>
>...I used to use perl for much work, and although there is much to like about it, R seems to be even better for most tasks---except that there is one perl resource that R cannot beat: the Perl Cookbook.
2004 Jul 27 (0 replies): Reading SPSS file
Hi Karl:
A possible solution:
require(foreign)
# value labels become factor levels; the result is returned as a data frame
mydata <- read.spss("somedata.sav", use.value.labels = TRUE, to.data.frame = TRUE)
-----
for more information, try
library(foreign)
?read.spss
HTH,
Arin
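A quick sanity check on the import (a minimal sketch, reusing the 'mydata' object and the placeholder file name from above):

str(mydata)      # variable names, types and, with use.value.labels = TRUE, the factor levels
summary(mydata)  # per-column summaries of the imported data frame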
Message: 21
Date: Mon, 26 Jul 2004 10:36:33 +0200 (CEST)
From: Karl Knoblick <karlknoblich@yahoo.de>
Subject: [R] Read SPSS data (*.sav) in R 1.8.0 (ok) and R1.9.1(error)
To:
2004 Jul 07 (7 replies): Importing an Excel file
Hello, R users,
I am a beginner with R and tried read.csv to import an Excel file after
saving it as csv. But it added alternating rows of fictitious NA values
after row number 16. When I applied read.delim, there were several trailing
commas at the end of each row after row number 16 instead of the NA values.
I'd appreciate your help.
Kyong
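One common workaround for rows and columns that come in entirely empty is to drop them after the import (a rough sketch; "exported.csv" is a placeholder file name, not from the thread):

x <- read.csv("exported.csv", header = TRUE)
filled <- !is.na(x) & x != ""                                   # TRUE where a cell holds a real value
x <- x[rowSums(filled) > 0, colSums(filled) > 0, drop = FALSE]  # keep only non-empty rows and columns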
2004 Jul 18 (2 replies): a problem: factors, names, tables ..
Hi all,
I am *completely* lost in trying to solve a relatively simple task.
I want to compute the relative number of occurrences of an event, the data
for which sits in a large table (read from a file).
I have the occurrences of the events in a table 'tt':
 0  2 10 11 13 14 15
15  6  1  3  8 15 10
.. meaning that an event of type '0' occurs 15 times, type '2' occurs 6 times
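A small sketch of the computation being asked for, rebuilding a table like 'tt' by hand from the counts shown above:

counts <- c(15, 6, 1, 3, 8, 15, 10)
names(counts) <- c("0", "2", "10", "11", "13", "14", "15")
tt <- as.table(counts)
prop.table(tt)     # relative number of occurrences of each event type
# equivalently: tt / sum(tt)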