Displaying 20 results from an estimated 120 matches similar to: "predict.ar() produces wrong SE's (PR#9614)"
2024 May 16
0
segmented 2.1-0 is released
Dear R users,
I am pleased to announce that segmented 2.1-0 is now available on CRAN.
segmented focuses on estimation of breakpoints/changepoints of
segmented, i.e. piecewise linear, relationships in (generalized) linear
models. Starting with version 2.0-0, it is also possible to model
stepmented, i.e. piecewise constant, effects.
In the last release both models may be fitted via a formula
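For readers new to the package, here is a minimal sketch of the classic two-step workflow; the simulated data and the starting value psi = 4 are illustrative, not from the announcement:

library(segmented)

set.seed(1)
x <- (1:100) / 10
y <- 2 + 1.5 * pmax(x - 5, 0) + rnorm(100, sd = 0.5)  # true break at x = 5

fit0 <- lm(y ~ x)                             # ordinary linear fit
fit1 <- segmented(fit0, seg.Z = ~x, psi = 4)  # estimate the breakpoint
summary(fit1)                                 # slopes and breakpoint estimate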
2003 Dec 02
2
rsync: overhead?
Hello,
I am syncing two directories with rsync.
There is nothing to do (I didn't change anything).
Here is the output:
building file list ... done
wrote 371 bytes read 20 bytes 782.00 bytes/sec
total size is 5062161 speedup is 12946.70
Why did rsync write 371 bytes?
This output says rsync didn't change anything!
regards,
hampel
2011 Sep 28
1
removing outliers in non-normal distributions
Hello,
I'm seeking ideas on how to remove outliers from a non-normal distribution
predictor variable. We wish to reset points deemed outliers to a truncated
value that is less extreme. (I've seen many posts requesting outlier removal
systems. It seems like most of the replies center around "why do you want to
remove them", "you shouldn't remove them", "it
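One common hedged answer to this kind of request is winsorizing: resetting values beyond chosen percentiles to those percentile limits. A minimal sketch; the 1st/99th percentile cut-offs are an assumption, not from the post:

winsorize <- function(x, probs = c(0.01, 0.99)) {
  q <- quantile(x, probs = probs, na.rm = TRUE)
  x[x < q[1]] <- q[1]   # pull low outliers up to the lower limit
  x[x > q[2]] <- q[2]   # pull high outliers down to the upper limit
  x
}

set.seed(1)
z <- c(rexp(100), 50)   # skewed data with one extreme point
summary(winsorize(z))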
2005 Aug 23
1
Robust M-Estimator Comparison
Hello,
I'm learning about robust M-estimators right now and had settled on the
"Huber Proposal 2" as implemented in MASS, but further reading made clear,
that at least 2 further weighting functions (Hampel, Tukey bisquare) exist.
In a post from B.D. Ripley going back to 1999 I found the following quote:
>> 2) Would huber() give me results that are similar (i.e., close
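All three weighting functions mentioned are available in MASS::rlm via the psi argument; a sketch comparing them on the built-in stackloss data (the dataset choice is illustrative):

library(MASS)

fit.huber    <- rlm(stack.loss ~ ., data = stackloss, psi = psi.huber)
fit.hampel   <- rlm(stack.loss ~ ., data = stackloss, psi = psi.hampel)
fit.bisquare <- rlm(stack.loss ~ ., data = stackloss, psi = psi.bisquare)

cbind(huber    = coef(fit.huber),
      hampel   = coef(fit.hampel),
      bisquare = coef(fit.bisquare))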
2006 Jul 05
2
p-values
Dear All,
When I run rlm to obtain robust standard errors, my output does not include
p-values. Is there any reason p-values should not be used in this case? Is
there an argument I could use in rlm so that the output does
include p-values?
Thanks in advance,
Celso
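A common hedged workaround, given that rlm() itself reports no p-values (the asymptotics behind them are only approximate): treat the reported t values as approximately normal and compute two-sided p-values by hand.

library(MASS)

fit <- rlm(stack.loss ~ ., data = stackloss)
ct  <- summary(fit)$coefficients            # Value, Std. Error, t value
pvals <- 2 * pnorm(-abs(ct[, "t value"]))   # normal approximation
cbind(ct, "p value" = pvals)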
2003 Feb 20
3
outliers/interval data extraction
Dear R-users,
I have two outlier-related questions.
I.
I have a vector consisting of 69 values.
mean = 0.00086
SD = 0.02152
The shape of the EDA graphics (boxplots, density plots) is heavily distorted
due to outliers. How should I define an interval for excluding outliers? Is
the interval <mean - 2SD, mean + 2SD> a correct approach?
Or should I define 95% (or 99%) limits of agreement for the data interval,
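A sketch of the interval rules being asked about, on illustrative data; with heavy-tailed data a robust rule such as median +/- 2*MAD is often preferred over mean +/- 2*SD:

set.seed(1)
x <- c(rnorm(67, 0.00086, 0.02), 0.15, -0.12)   # 69 values, two extreme

lims.classic <- mean(x)   + c(-2, 2) * sd(x)    # mean +/- 2 SD
lims.robust  <- median(x) + c(-2, 2) * mad(x)   # median +/- 2 MAD

x[x < lims.classic[1] | x > lims.classic[2]]    # flagged by the classic rule
x[x < lims.robust[1]  | x > lims.robust[2]]     # flagged by the robust rule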
2008 Sep 15
0
RobASt-Packages
-----------------------------------------------------------------------------------------
Packages for the computation of optimally robust estimators
-----------------------------------------------------------------------------------------
We would like to announce the availability on CRAN (with possibly a
minor delay until on every mirror) of new versions of our packages for
the computation of
2012 Nov 22
1
help in M-estimator by R
Hi all, how are you?
I have to do some robust regression in R, and I have some
problems, as follows:
*The first:*
Suppose
w(r) = 1/(1 + r^2) and r <- c(7.01,2.07,7.061,5.607,8.502,54.909,12.222),
and I want to exclude some values from r so that abs(r) > 4.9 ...
Afterwards, I want to use w to obtain the coefficients beta0 and beta1 (B1 <-
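A hedged reading of the question, shown as one reweighting step: exclude the values with |r| > 4.9, define the stated weight function, and use it as case weights in lm() to get beta0 and beta1. The x/y data are invented for illustration; only r and w() come from the post.

r <- c(7.01, 2.07, 7.061, 5.607, 8.502, 54.909, 12.222)
r.kept <- r[abs(r) <= 4.9]        # exclude values with |r| > 4.9
w <- function(r) 1 / (1 + r^2)    # the stated weight function

set.seed(1)
x <- 1:20
y <- 3 + 0.5 * x + rnorm(20)
fit0 <- lm(y ~ x)                            # initial OLS fit
fit1 <- lm(y ~ x, weights = w(resid(fit0)))  # one reweighting step
coef(fit1)                                   # beta0 and beta1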
2004 Oct 18
1
meta.summaries and se's of effect sizes
Hi All,
I would like to use meta.summaries from package rmeta to do a meta-analysis.
I have available effect sizes as r's (which could be easily transformed to
effect sizes in terms of d's).
My problem is that I'm not sure what the se's of these r's should be ...
The r-values are themselves computed from F-tests and t-tests for various
studies.
Are there R-functions that
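One standard hedged answer: convert each r to Fisher's z = atanh(r), whose standard error is 1/sqrt(n - 3), pool on the z scale with meta.summaries(), and back-transform. The r and n values below are invented.

library(rmeta)

r <- c(0.30, 0.45, 0.12, 0.52)   # illustrative correlations
n <- c(40, 25, 60, 33)           # per-study sample sizes
z  <- atanh(r)                   # Fisher z transform
se <- 1 / sqrt(n - 3)            # SE of z

ms <- meta.summaries(z, se, method = "random")
tanh(ms$summary)                 # pooled estimate back on the r scale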
2009 Sep 14
0
fastest OLS w/ NA's and need for SE's
Dear R wizards: apologies for two queries in one day. I have a long-form
data set, which identifies about 5,000 regressions, each with about 1,000
observations.
unit date y x
1 20060101 <two values>
1 20060102 <two values>
...
5000 20081230 <two values>
5000 20081231 <two values>
I need to run such regressions many many times, because they are part of an
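A hedged sketch of one fast approach: skip the formula machinery with lm.fit() and compute the SEs directly from (X'X)^{-1} * RSS/df; the column names unit, y, x are assumptions about the long-form data.

fast.ols <- function(y, x) {
  ok  <- complete.cases(y, x)              # drop rows with NA's
  X   <- cbind(1, x[ok]); yy <- y[ok]
  fit <- lm.fit(X, yy)
  rss <- sum(fit$residuals^2)
  df  <- length(yy) - 2
  se  <- sqrt(diag(chol2inv(chol(crossprod(X)))) * rss / df)
  c(b0 = fit$coefficients[1], b1 = fit$coefficients[2], se = se)
}

## applied per unit, assuming a data frame 'long' with columns unit, y, x:
## results <- t(sapply(split(long, long$unit), function(d) fast.ols(d$y, d$x)))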
2009 Nov 13
0
Aov: SE's for split plot
Hello,
Can anyone explain why the following message appears from the function model.tables with se = TRUE? In the V&R MASS text (p. 285), the se = TRUE option works for a split-plot example that seems similar to my setup, and the model.tables documentation, in the Arguments section for "se", simply states "should standard errors be computed?".
"Warning: Warning in
2012 Oct 10
2
se's and CI's for fitted lines in multivariate regression analysis
I'm entirely stumped on this particular issue and really hoping someone has
some advice for me.
I am running a covariate model in lm, and I would like to give the standard
errors or the confidence intervals for the fitted lines. I've been using the
dataset OrangeSprays, where I want lines for each level of treatment over the
covariate 'colpos'. I've been able to calculate intercepts and slopes for
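A hedged sketch of the standard route: fit the covariate-by-treatment model and ask predict() for confidence intervals (or se.fit = TRUE) on a grid over the covariate; all variable names and data below are invented stand-ins.

set.seed(1)
dat <- data.frame(colpos    = runif(60, 0, 10),
                  treatment = factor(rep(c("A", "B", "C"), each = 20)))
dat$response <- 1 + 0.5 * dat$colpos + as.numeric(dat$treatment) + rnorm(60)

fit  <- lm(response ~ colpos * treatment, data = dat)
grid <- expand.grid(colpos    = seq(0, 10, length.out = 25),
                    treatment = levels(dat$treatment))
ci   <- predict(fit, newdata = grid, interval = "confidence")  # fit, lwr, upr
head(cbind(grid, ci))
## predict(fit, newdata = grid, se.fit = TRUE)$se.fit gives the SEs instead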
2018 Mar 14
3
the same function returning different values when called differently..
Dear members,
I have a function ygrpc which acts on the daily price increments of a stock. It returns the following values:
ygrpc(PFC.NS,"h")
[1] 2.149997 1.875000 0.750000 0.349991 2.100006 0.199997 4.000000 2.574996 0.500000 0.349999 1.500000 0.700001
[13] 0.500000 1.300003 0.449997 2.800003 2.724998 66.150002 0.550003 0.050003 1.224991 4.899994 1.375000
2012 Sep 18
3
omitting rows
Hi, I have output data in Data.csv, in this style:
,"V1","V2","V3"
1,"-9552","9552","C"
2,"0","9653","0"
3,"9614","9614","V"
4,"0","9527","0"
5,"-9752","9752","C"
6,"0","9883","0"
2001 Aug 12
2
rpart 3.1.0 bug?
I just updated rpart to the latest version (3.1.0). There are a number of
changes between this and previous versions, and some of the code I've been
using with earlier versions (e.g. 3.0.2) no longer works.
Here is a simple illustration of a problem I'm having with xpred.rpart.
iris.test.rpart <- rpart(iris$Species ~ ., data = iris[, 1:4],
                         parms = list(prior = c(0.5, 0.25, 0.25)))
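A hedged cleanup of that call, keeping the response inside data so that the formula and the data frame agree, followed by cross-validated predictions via xpred.rpart():

library(rpart)
iris.rp <- rpart(Species ~ ., data = iris,
                 parms = list(prior = c(0.5, 0.25, 0.25)))
xp <- xpred.rpart(iris.rp)   # cross-validated predictions, one column per cp
head(xp)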
2012 May 23
1
numerical integration
Greetings,
Sorry, the last message was sent by mistake! Here it is again:
I encounter a strange problem computing some numerical integrals on $[0,\infty)$.
Define
$$
M_j(x) = \exp(-jax),
$$
where $a = 0.08$. We want to compute the $L^2([0,\infty))$-inner products
$$
A_{ij} := (M_i, M_j) = \int_0^\infty M_i(x) M_j(x)\,dx.
$$
Analytically we have
$$
A_{ij} = \frac{1}{a(i+j)}.
$$
In the code below we compute the matrix
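A sketch of the computation described above: build A with integrate() and compare against the analytic value 1/(a*(i+j)).

a <- 0.08
M <- function(x, j) exp(-j * a * x)

n <- 4
A.num <- outer(1:n, 1:n, Vectorize(function(i, j)
  integrate(function(x) M(x, i) * M(x, j), 0, Inf)$value))
A.ana <- outer(1:n, 1:n, function(i, j) 1 / (a * (i + j)))

max(abs(A.num - A.ana))   # should be tiny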
2007 Feb 18
3
chan_sip.c:1968 create_addr: No such host:
I have followed all the install notes for A2billing and have everything installed and configured, and my Asterisk works except for the calling-card application.
Added the following
[callingcard]
; CallingCard application
exten => 777,1,Answer
exten => 777,2,Wait,2
exten => 777,3,DeadAGI,a2billing.php
exten => 777,4,Wait,2
exten => 777,5,Hangup
I am using 777 as the calling card
2018 Mar 14
0
Fwd: the same function returning different values when called differently..
Hi Akshay,
(Please include r-help when replying)
You have learned that PFC.NS and snl[[159]] are not identical. Now you have
to figure out why they differ. This could also point to a bug or a logic
error in your program.
Figuring out how two objects differ can be a bit tricky, but with
experience it becomes easier. (Some others may even have some suggestions
for good ways to do it.)
Basically
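Some standard hedged starting points for seeing how two objects differ, shown on stand-in vectors (substitute PFC.NS and snl[[159]] for a and b):

a <- c(1, 2, 3); b <- c(1, 2, 3.0000001)   # stand-ins for the two objects
identical(a, b)    # FALSE: exact comparison, including attributes
all.equal(a, b)    # reports the mean relative difference it finds
str(a); str(b)     # inspect structure and attributes by eye
which(a != b)      # locate differing elements in atomic vectors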