Displaying 20 results from an estimated 300000 matches similar to: "problem"
2018 Apr 18
1
Problem with regression line
Hi Anne,
I would suggest changing the linear model to lm(BloodPressure ~ Age), as
this model makes more sense in biological terms (you would assume that
age influences blood pressure, not vice versa) and also obeys the statistical
assumption of weak exogeneity, i.e. that age can be measured without error,
at least compared to the error-prone blood pressure measurements.
Cheers
On 18.04.2018 at 16:07, Gerrit Eichner wrote:
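A minimal sketch of the suggested orientation, assuming a made-up data frame with Age and BloodPressure columns (the original data are not shown):

bp <- data.frame(Age = c(25, 34, 43, 52, 61, 70),                 # made-up values
                 BloodPressure = c(118, 121, 127, 133, 140, 148))
fit <- lm(BloodPressure ~ Age, data = bp)   # age as predictor, pressure as response
plot(BloodPressure ~ Age, data = bp)
abline(fit)
summary(fit)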
2005 Jan 25
2
"disregarded projections" warning when fitting lm model
Hi all,
I'm fitting a linear model (using lm) to some 2500 data points. The
model consists of 4 single terms and two combined terms. I get the
following warning message:
"Extra arguments projections are just disregarded. in: lm.fit(x, y,
offset = offset, singular.ok = singular.ok, ...) "
Can anybody clarify this? I don't seem to find any pointer to what
this might
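A hedged guess at how such a warning can arise: 'projections' is not an argument of lm(), so it is passed through '...' to lm.fit(), which merely warns that it will be ignored. The data below are made up.

set.seed(1)
x <- rnorm(100)
y <- 2 * x + rnorm(100)
fit <- lm(y ~ x, projections = TRUE)   # warning: extra argument 'projections' is disregarded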
2004 Jan 14
1
Trellis graph and two colors display
Hello,
I would like to display two groups of dots in different colors or styles, and additionally a linear regression for all the dots, in
the panels of a Trellis graph. Actually, to get in each panel the equivalent of:
plot(x[mois==3],y[mois==3],col="blue")
points(x[mois==9],y[mois==9],col="red")
abline(lm(y~x), col="green")
"mois" being a grouping
2009 Oct 08
0
predict.lm() out-of-sample predictions - problem with data classes
Hello!
I'm still working on my problem, which also occurs with the predict.lm()
function. When I provide newdata, a data.frame in which all variables
are "numeric" according to str(), R tells me the following:
ar1.xpred.test.pred <- predict(ar1.xpred.fitted, regdata.test, se.fit =
FALSE)
Error: variable 'lag(ret1)' was fitted with type "numeric" but type
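A minimal, hedged sketch of out-of-sample prediction with predict.lm(); the data and the ar1-style model below are made up.

set.seed(1)
train <- data.frame(ret1 = rnorm(100))
train$ret1_lag <- c(NA, head(train$ret1, -1))      # explicit lag column
fit <- lm(ret1 ~ ret1_lag, data = train)
newdat <- data.frame(ret1_lag = rnorm(10))         # same name and class as at fit time
predict(fit, newdata = newdat, se.fit = FALSE)
# If a variable such as lag(ret1) ends up with a different class in newdata
# than it had at fit time (e.g. a matrix column instead of a plain numeric
# vector), predict() stops with the "was fitted with type ... but type ..."
# error quoted above.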
2003 Jun 05
2
ridge regression
Hello R-user
I want to compute a multiple regression, but I would like to include a check for
collinearity of the variables. Therefore I would like to use a ridge
regression.
I tried lm.ridge() but I don't know yet how to get p-values (single Pr() and p
of the whole model) out of this model. Can anybody tell me how to get
output similar to the summary(lm(...)) output? Or if there is
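A hedged sketch with MASS::lm.ridge(); the data frame below is made up. Note that lm.ridge() does not report p-values, so there is no direct equivalent of the summary(lm(...)) table.

library(MASS)
set.seed(1)
dat <- data.frame(y = rnorm(50), x1 = rnorm(50), x2 = rnorm(50))
rr <- lm.ridge(y ~ x1 + x2, data = dat, lambda = seq(0, 10, by = 0.1))
select(rr)                       # HKB, L-W and GCV suggestions for lambda
coef(rr)[which.min(rr$GCV), ]    # coefficients at the GCV-optimal lambda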
2004 May 12
1
Problem installing SparseM on Debian stable
I have trouble installing the "SparseM" package on my Debian stable
Linux system.
Debian's version of R is:
platform i386-pc-linux-gnu
arch i386
os linux-gnu
system i386, linux-gnu
status
major 1
minor 5.1
year 2002
month 06
day 17
language R
This is the installation output:
> R CMD INSTALL -l /usr/lib/R/ SparseM_0.36.tar.gz
* Installing
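Hedged note: from within R the same package can usually be installed with install.packages(); the repository URL below is only an example.

install.packages("SparseM", repos = "https://cran.r-project.org")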
2004 Mar 02
2
Some timings for 64 bit Opteron (ATLAS, GOTO, std)
Hi Martin,
When I attended the LinuxWorld Expo in NYC back in January, I chatted with
some folks at the AMD booth, as well as guys from Penguin Computing (where
we bought our Opteron box). I was told that the Opteron has this somewhat
strange setup in which the memory is controlled by one CPU. The net effect of
this is that when both CPUs are running, one might only be running at
around 90%
2009 Apr 28
1
plot.lm cex.caption
Hello dear R users,
My objective is to change the size of this graphic: plot(lm(a~b), 4)
(Cook's distance).
I found help on the internet saying that, to change the size of the title
on the graphic plot(lm(a ~ b), 4), I should use the graphical parameter
cex.caption
http://stat.ethz.ch/R-manual/R-patched/library/stats/html/plot.lm.html
But in my installation of R, it replies that cex.caption isn't a
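A hedged sketch, with made-up vectors a and b; cex.caption is documented for plot.lm() in current versions of R and scales the caption of, e.g., the Cook's distance plot (which = 4).

a <- rnorm(30)
b <- rnorm(30)
fit <- lm(a ~ b)
plot(fit, which = 4, cex.caption = 1.5)   # larger "Cook's distance" caption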
2010 Nov 16
3
plot linear model problem
Hi all,
Say I fit a linear model and saved it as 'test.lm'.
Then if I use plot(test.lm),
it gives me 4 graphs.
How do I ask for a 'subset' of them?
Say I just want the 1st graph,
the residuals vs fitted values,
or the 1st, 3rd and 4th graphs?
I think I could use plot(test.lm[c(1,3,4)]) before,
but now it's not working...
Every time, it goes to the end, the only thing I can click is
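A hedged sketch using the 'which' argument of plot.lm(), which selects individual diagnostic plots; the fit below uses the built-in cars data for illustration.

test.lm <- lm(dist ~ speed, data = cars)
plot(test.lm, which = 1)            # residuals vs fitted only
plot(test.lm, which = c(1, 3, 4))   # plots 1, 3 and 4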
2009 May 12
2
[Fwd: Re: ubuntu problem with 'r-cran-robustbase' [FWD Agustin Lobo]]
Subject: Re: [R-sig-Debian] ubuntu problem with 'r-cran-robustbase' [FWD
Agustin Lobo]
Date: Tue, 12 May 2009 13:30:49 +0200
From: Agustin Lobo <aloboaleu at gmail.com>
Reply-To: aloboaleu at gmail.com
To: Dirk Eddelbuettel <edd at debian.org>
CC: Martin Maechler <maechler at stat.math.ethz.ch>,
R-SIG-Debian at stat.math.ethz.ch
References: <18953.17704.527898.355877
2009 Jan 15
2
Interface to open source Reporting tools
Hi,
I am a new user of R 2.8.1. I use Tinn-R for code editing. I use a Windows
2003 system with 1 GB RAM.
I am interested in generating dashboards and reports based on data from MS
Access. These reports need to be posted to the web on a weekly basis. The
reporting interface should provide facilities for "what if" scenarios.
Is it possible to interface R analysis results to good open
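A hedged sketch of reading MS Access data on Windows with the RODBC package; the file path and table name are hypothetical, and a matching (32-bit) Access ODBC driver is assumed.

library(RODBC)
con <- odbcConnectAccess("C:/data/reports.mdb")   # hypothetical .mdb file
dat <- sqlFetch(con, "WeeklySales")               # hypothetical table name
odbcClose(con)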
2003 Jun 27
1
R-help Digest, Vol 4, Issue 27 ( -Reply)
Hi,
I am out of town and will get back to you on the 13th of July.
Leo
>>> "r-help at stat.math.ethz.ch" 06/27/03 00:32 >>>
2004 Jul 01
2
Inflection Points
Hi!
Some weeks ago I discovered R. Now, I have a somewhat complicated task
and am not sure whether R is the right tool to solve it.
I got data from several series of measurements in which I have to find the
two inflection points. I did a linear regression (with ^2 and ^3
terms); the problem there was that I had to look only at a very
narrow band of measurements in order to get the
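A hedged sketch of locating an inflection point from a cubic fit; the data below are made up. A single cubic has exactly one inflection point, so finding two would need a higher-order polynomial or a smoother (e.g. smooth.spline) whose second derivative is inspected numerically.

set.seed(1)
x <- seq(0, 10, length.out = 200)
y <- 0.5 * x^3 - 7 * x^2 + 20 * x + rnorm(200, sd = 2)
fit <- lm(y ~ x + I(x^2) + I(x^3))
b <- coef(fit)                 # (Intercept), x, x^2, x^3 coefficients
x_infl <- -b[3] / (3 * b[4])   # solve f''(x) = 2*b3 + 6*b4*x = 0
x_infl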
2017 Nov 07
0
New vcov(*, complete=TRUE) etc -- coef(<lm>) vs coef(<aov>)
Dear Martin,
I think that your plan makes sense. It's too bad that aov() behaved differently in this respect from lm(), and thus created more work, but it's not a bad thing that the difference is now explicit and documented.
I expect that other problems like this will surface, particularly with contributed packages (and I know that you're aware that this has already happened
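A hedged illustration of the lm/aov difference discussed here (behaviour as of R >= 3.5.0): with a perfectly collinear term, coef(<lm>) keeps the NA entry, coef(<aov>) drops it, and the complete= argument of vcov() controls whether the NA row/column appears. The data are made up.

set.seed(1)
d <- data.frame(y = rnorm(20), x1 = rnorm(20))
d$x2 <- d$x1                          # x2 is aliased with x1
fit_lm  <- lm(y ~ x1 + x2, data = d)
fit_aov <- aov(y ~ x1 + x2, data = d)
coef(fit_lm)                          # includes NA for x2
coef(fit_aov)                         # aliased coefficient dropped
dim(vcov(fit_lm, complete = TRUE))    # 3 x 3, with NA row/column for x2
dim(vcov(fit_lm, complete = FALSE))   # 2 x 2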
2011 Sep 20
3
Problem with legend
Hi,
This code is part of a script I used to do a linear regression:
>points(var1~var2,data=Regress,pch=21,bg="grey")
>reg11<-lm(var1~var2,data=Regress)
>abline(lm(var1~var2,data=Regress),lty=2,lwd=2,col="grey")
>legend("topleft",legend=
>c("NDII from composite",
>"y= 0.0007x - 0.1156",expression(paste(r^2 ==
2018 Apr 18
0
Problem with regression line
Hi, Anne,
assign Age and BloodPressure in the correct order
to the axes in your call to plot as in:
plot(y = Age, x = BloodPressure)
abline(SimpleLinearReg1)
Hth -- Gerrit
---------------------------------------------------------------------
Dr. Gerrit Eichner Mathematical Institute, Room 212
gerrit.eichner at math.uni-giessen.de Justus-Liebig-University Giessen
Tel:
2017 Sep 14
0
vcov and survival
Dear Martin,
I made three points which likely got lost because of the way I presented them:
(1) Singularity is an unusual situation and should be made more prominent. It typically reflects a problem with the data or the specification of the model. That's not to say that it *never* makes sense to allow singular fits (as in the situations you mention).
I'd favour setting
2009 Sep 14
3
Eliminate cases in a subset of a dataframe
Hi folks,
I created a subset of a dataframe (i.e., selected only men):
subdata <- subset(data,data$gender==1)
After a residual diagnostic of a regression analysis, I detected three
outliers:
linmod <- lm(y ~ x, data=subdata)
plot(linmod)
Say the cases 11, 22, and 33 were outliers.
Here comes the problem: When I want to exclude these three cases in a
further regression analysis,
- for
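A hedged sketch of excluding the flagged cases, following the names in the post; if 11, 22 and 33 are row labels from the diagnostic plot rather than row positions, match on rownames() instead.

out <- c(11, 22, 33)
linmod2 <- lm(y ~ x, data = subdata[-out, ])          # drop rows by position
linmod3 <- lm(y ~ x, data = subdata, subset = -out)   # same, via lm()'s subset argument
linmod4 <- lm(y ~ x,
              data = subdata[!rownames(subdata) %in% as.character(out), ])  # by row label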
2017 Nov 02
2
vcov and survival
>>>>> Fox, John <jfox at mcmaster.ca>
>>>>> on Thu, 14 Sep 2017 13:46:44 +0000 writes:
> Dear Martin, I made three points which likely got lost
> because of the way I presented them:
> (1) Singularity is an unusual situation and should be made
> more prominent. It typically reflects a problem with the
> data or the
2006 Jun 21
1
Extract information from the summary of 'lm'
Hi Everyone,
I just don't know how to extract the information I
want from the summary of a fitted linear regression
model.
For example, I fit the following simple linear
regression model:
results = lm(y_var ~ x_var)
summary(results) gives me:
Call:
lm(formula = y_var ~ x_var)
Residuals:
Min 1Q Median 3Q Max
-5.9859 -1.5849 0.4574 2.0163 4.6015
Coefficients:
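A hedged sketch of pulling pieces out of a summary.lm object; y_var and x_var are made up here.

set.seed(1)
x_var <- rnorm(50)
y_var <- 1 + 2 * x_var + rnorm(50)
results <- lm(y_var ~ x_var)
s <- summary(results)
coef(s)                          # coefficient table: estimates, SEs, t and p values
coef(s)["x_var", "Pr(>|t|)"]     # p value of the slope
s$r.squared                      # R^2
s$sigma                          # residual standard error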