Displaying 20 results from an estimated 900 matches similar to: "Subset of matrix"
2018 May 15
2
Systemfit
OK, Let's try this again! Here is the reproducible script; it is long because I had to copy the panel dataset here. My question is related to systemfit; I don't know how to get the result for the entire panel.
#Reproducible script
Empdata <- read.csv("/Users/ngwinuiazenui/Documents/UPLOADemp.csv")  # read in the panel data
View(Empdata)
install.packages("systemfit")  # installs the package; load it afterwards with library(systemfit)
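For reference, the basic systemfit call for a system of equations looks like the sketch below; the equations and column names are placeholders rather than the poster's actual variables, and how to handle the panel dimension depends on the equations, which the excerpt never reaches.
library(systemfit)
eqDemand <- q ~ p + income          # placeholder equations
eqSupply <- q ~ p + farmPrice
fit <- systemfit(list(demand = eqDemand, supply = eqSupply),
                 method = "SUR", data = Empdata)
summary(fit)                        # coefficients for the whole system of equations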
2018 May 16
0
Systemfit
Sadly you failed to set your email program to send plain text and the data is corrupted at my end.
I also think you need to reduce the size of the data set... the intent here is to increase your understanding, not debug your particular analysis.
I will say that I am having a very challenging time understanding what you are trying to accomplish though. What are the equations that you think need
2018 May 15
0
Systemfit
... and the mailing list is picky about attachments... whatever you attached did not conform to the stringent requirements mentioned in the Posting Guide. Pasting the code right into the email is usually safest, though you DO have to post using plain text (as the Posting Guide indicates) or your code may get mangled by the automatic html format removal.
On May 15, 2018 7:04:31 AM PDT, Bert Gunter
2008 Feb 19
4
[LLVMdev] 2008-01-25-ByValReadNone.c Failure
Hi all,
I'm seeing this failure on my PPC G4 box running TOT with llvm-gcc
4.2. Is anyone else seeing this? I'm sure it's related to the byval
stuff that's recently gone into LLVM. I'm attaching the output of
this command:
$ llvm-gcc -emit-llvm -O3 -S -o - -emit-llvm /Users/wendling/llvm/llvm.src/test/CFrontend/2008-01-25-ByValReadNone.c
As you can see in it, there
2018 May 15
1
Systemfit
Unless there is good reason not to, always cc the list -- there are lots of
smarter folks than I on it who can help.
I may or may not have time to look at this. Hopefully someone else will.
-- Bert
Bert Gunter
"The trouble with having an open mind is that people keep coming along and
sticking things into it."
-- Opus (aka Berkeley Breathed in his "Bloom County" comic strip)
2010 Feb 17
2
extract the data that match
Hi r-users,
I would like to extract the data that match. Attached is my data:
I'm interested in matching the values in column 'intg' with the values in column 'rand_no'
> cbind(z = z, intg = dd, rand_no = rr)
        z  intg rand_no
[1,] 0.00 0.000   0.001
[2,] 0.01 0.000   0.002
[3,] 0.02 0.000   0.002
[4,] 0.03 0.000   0.003
[5,] 0.04 0.000   0.003
[6,]
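A sketch of one way to pull out the matching rows, assuming dd, rr and z are the poster's vectors; the tolerance is an assumption, since the two columns are printed to different numbers of decimals.
m <- cbind(z = z, intg = dd, rand_no = rr)
hits <- which(abs(m[, "intg"] - m[, "rand_no"]) < 1e-3)   # rows where the two columns agree
m[hits, , drop = FALSE]
# or, for exact matches of values appearing anywhere in the other column:
m[m[, "intg"] %in% m[, "rand_no"], , drop = FALSE]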
2013 Mar 28
0
using cvlm to do cross-validation
Hello,
I did a cross-validation using cv.lm from the DAAG package but wasn't sure how to assess the result. Does this result mean my model is a good model?
I understand that the overall ms is the mean squared prediction error. But is 0.0987 a good number? The response (i.e. gailRel5yr) has min, 1st quartile, median, mean, 3rd quartile, and max of (0.462, 0.628, 0.806, 0.896, 1.000, 2.400).
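One rough way to judge the size of the cross-validated mean square is to put it on the scale of the response; a base-R sketch, where y stands for the response vector (gailRel5yr) and 0.0987 is the overall ms reported by cv.lm:
cv_ms <- 0.0987
1 - cv_ms / var(y)   # crude "predictive R^2": closer to 1 is better
sqrt(cv_ms)          # typical prediction error, on the scale of the response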
2024 Aug 02
2
grep
Good morning. Below I use a statement like
j <- grep(".r\\b", colnames(mydata), value = TRUE); j
with the \\b option, which I read about a long time ago and have found useful.
Are there more of these options, beyond what is in ?grep? Thanks.
dstat is just my own descriptive routine.
> x
?[1] "age"????????? "sleep"??????? "primary"????? "middle"
?[5]
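A few related ways to write such patterns, as a sketch using the same mydata object; note that an unescaped "." matches any single character, so "\\.r\\b" is usually what is meant when looking for names ending in ".r".
j1 <- grep("\\.r\\b", colnames(mydata), value = TRUE)               # word boundary
j2 <- grep("\\.r$",   colnames(mydata), value = TRUE)               # anchored at the end of the name
j3 <- grep("\\.r\\b", colnames(mydata), value = TRUE, perl = TRUE)  # PCRE engine
j4 <- grepl("\\.r\\b", colnames(mydata))                            # logical vector instead of matches
# the full list of escapes (\\b, \\d, \\w, ...) is in ?regex, in addition to ?grep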
2012 Jul 02
1
How to get prediction for a variable in WinBUGS?
Dear all, I am a new user of WinBUGS and need your help. After running the following code, I got the parameters beta0 through beta4 (stats, density), but I don't know how to get the prediction for the last value of h, the variable I set to NA and want to model with the code below. Can anyone give me a hint? Any advice would be greatly appreciated. Best
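The usual trick is to add the node that was set to NA to the list of monitored parameters, since WinBUGS treats missing data as stochastic nodes; a sketch via R2WinBUGS, where the data, inits and model-file names are placeholders.
library(R2WinBUGS)
fit <- bugs(data = bugs_data, inits = inits,
            parameters.to.save = c("beta0", "beta1", "beta2", "beta3", "beta4", "h"),
            model.file = "model.txt", n.chains = 3, n.iter = 10000)
fit$mean$h        # posterior means of h, including the value imputed for the NA entry
fit$sims.list$h   # full posterior draws, if an interval around the prediction is wanted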
2007 Aug 10
7
Help with matrices
Hello all,
I am working with a 1000x1000 matrix, and I would like to return a
1000x1000 matrix that tells me which value in the matrix is greater
than a threshold value (1 or 0 indicator).
I have tried
mat2 <- as.matrix(as.numeric(mat1 > 0.25))
but that returns a 1:100000 matrix.
I have also tried for loops, but they are grossly inefficient.
Thanks for all your help in advance.
Lanre
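The comparison itself already returns a matrix; it is the as.numeric() call that drops the dimensions and collapses the result to a single column. A loop-free sketch:
mat1 <- matrix(runif(1000 * 1000), nrow = 1000)   # example data
mat2 <- (mat1 > 0.25) * 1L                        # 1000 x 1000 matrix of 0/1 indicators
# equivalently, restoring the dimensions explicitly:
mat2 <- matrix(as.integer(mat1 > 0.25), nrow = nrow(mat1), ncol = ncol(mat1))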
2012 Aug 03
1
Multiple Comparisons-Kruskal-Wallis-Test: kruskal{agricolae} and kruskalmc{pgirmess} don't yield the same results although they should do (?)
Hi there,
I am doing multiple comparisons for data that is not normally distributed.
For this purpose I tried both functions kruskal{agricolae} and
kruskalmc{pgirmess}. It confuses me that these functions do not yield the
same results, although they should be doing the same thing, shouldn't they? Can anyone
tell me why this happens and which function I can trust?
kruskalmc() tells me that there are no
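The two packages implement different post-hoc procedures, so identical output is not guaranteed. As an independent cross-check that uses neither package, base R offers pairwise Wilcoxon tests with a p-value adjustment; y and g below are placeholders for the response and the grouping factor.
kruskal.test(y ~ g)                                    # overall Kruskal-Wallis test
pairwise.wilcox.test(y, g, p.adjust.method = "holm")   # all pairwise comparisons, Holm-adjusted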
2007 Aug 09
2
Systematically biased count data regression model
Dear all,
I am attempting to explain patterns of arthropod family richness
(count data) using a regression model. It seems to be able to do a
pretty good job as an explanatory model (i.e. demonstrating
relationships between dependent and independent variables), but it has
systematic problems as a predictive model: It is biased high at low
observed values of family richness and biased low at
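A quick way to see that kind of systematic bias is an observed-versus-predicted plot; a sketch in which fit is the fitted count model and dat$richness the observed counts (both names are placeholders).
plot(fitted(fit), dat$richness,
     xlab = "predicted family richness", ylab = "observed family richness")
abline(0, 1, lty = 2)                      # the line of perfect calibration
lines(lowess(fitted(fit), dat$richness))   # a smoother drifting off that line shows where the bias sits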
2024 Aug 02
1
grep
At 02:10 on 02/08/2024, Steven Yen wrote:
> Good morning. Below I use a statement like
>
> j <- grep(".r\\b", colnames(mydata), value = TRUE); j
>
> with the \\b option, which I read about a long time ago and have found useful.
>
> Are there more of these options, beyond what is in ?grep? Thanks.
>
> dstat is just my own descriptive routine.
>
> > x
>  [1]
2010 Feb 04
2
help needed using t.test with factors
I am trying to use t.test on the following data:
date        type  INTERVAL  nCASES  MTF    SDF    MTO    SDO    nFST  MF
2001-06-15  avn   GE1.00    4385    0.246  0.300  1.502  0.556  1367  1.373

nOBS  MO     MB     BIASCV  BIASEV  ME      MAE    RMSE   CRCF
4385  1.502  1.471  0.285   0.164   -1.256  1.266  1.399  0.056

2001-06-15  avn
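For the record, t.test with a factor normally goes through the formula interface with a two-level grouping factor; a sketch using the column names above, where the second level of type ("eta") is an assumption, since only "avn" is visible in the excerpt.
t.test(RMSE ~ type, data = dat)                                      # two-level factor 'type'
t.test(RMSE ~ type, data = subset(dat, type %in% c("avn", "eta")))   # subset first if there are more levels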
2008 Jan 28
0
(no subject)
Hi all
I am trying to generate normal, unbalanced data to estimate the coefficients of LM, LMM, GLM, and GLMM and their standard errors. Also, I am trying to estimate the variance components and their standard errors. Further, I am trying to use the likelihood ratio test to test H0: sigma^2_b = 0 (the random-effects variance component), and the t-test to test H0: mu = 0 (the intercept of the model Yij = mu
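A minimal sketch of one piece of this (the LMM case) under assumed parameter values, using lme4; the simulated design is unbalanced simply because the groups have unequal sizes.
library(lme4)
set.seed(1)
ni  <- c(3, 5, 8, 2, 7)                                  # unequal group sizes -> unbalanced design
g   <- factor(rep(seq_along(ni), times = ni))
y   <- 2 + rnorm(length(ni), sd = 1)[as.integer(g)] +    # mu = 2, sigma_b = 1
       rnorm(length(g), sd = 0.5)                        # residual sd = 0.5
dat <- data.frame(y = y, g = g)
m1  <- lmer(y ~ 1 + (1 | g), data = dat, REML = FALSE)   # LMM with a random intercept
m0  <- lm(y ~ 1, data = dat)                             # the same model without it
lrt <- as.numeric(2 * (logLik(m1) - logLik(m0)))         # LR statistic for H0: sigma^2_b = 0
pchisq(lrt, df = 1, lower.tail = FALSE) / 2              # halved: H0 sits on the boundary
summary(m1)                                              # intercept, its SE and t value for H0: mu = 0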
2008 Aug 25
3
lme4 and variable selection
Dear list,
I am currently working with a rather large data set on body temperature
regulation in wintering birds. My original model contains quite a few
explanatory variables, but I do not (of course) wish to keep them all in my
final model. I've fitted the following model to the data:
>
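The usual route for comparing nested mixed models in lme4 is a likelihood-ratio comparison of fits with and without the candidate term; a sketch with hypothetical fixed effects x1, x2 and grouping factor bird standing in for the poster's (unshown) model.
library(lme4)
full <- lmer(bodytemp ~ x1 + x2 + (1 | bird), data = dat, REML = FALSE)
red  <- update(full, . ~ . - x2)   # the same model with one candidate term dropped
anova(red, full)                   # likelihood-ratio comparison of the nested fits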
2012 May 02
1
coxph reference hazard rate
Hi,
In the following results I interpret exp(coef) as the factor that multiplies
the base hazard rate if the corresponding variable is TRUE. For example,
when the bucket is ks008 and fidelity <= 3, then the rate, compared to the
base rate h_0(t), is h(t) = 0.200 h_0(t). My question is then, to what case
does the base hazard rate correspond? I would expect the reference to be
the first
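In survival::coxph the reference case is every factor at its first (reference) level and every numeric covariate at zero, unless the baseline is centred; a sketch with hypothetical variable names echoing the post.
library(survival)
fit <- coxph(Surv(time, status) ~ bucket + fidelity3, data = dat)
exp(coef(fit))                  # hazard ratios relative to that reference case
levels(dat$bucket)[1]           # the reference level of the 'bucket' factor
basehaz(fit, centered = FALSE)  # baseline cumulative hazard for the reference case
# basehaz(fit) with the default centered = TRUE refers instead to the mean covariate values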
2004 Jan 19
1
qda problem
Hi,
the following strange error appears when I use qda:
> qda1 <- qda(as.data.frame(mfilters[cvtrain,]),as.factor(traingroups))
Error: function is not a closure
That's also strange:
> qda1 <- qda(mfilters[cvtrain,],as.factor(traingroups))
Error in qda.default(mfilters[cvtrain, ], as.factor(traingroups)) :
length of dimnames must match that of dims
Some background:
>
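For reference, the two documented call forms of MASS::qda, as a sketch built on the objects named in the post; the grouping vector must have exactly one element per training row, which is a common source of dimension errors.
library(MASS)
x   <- as.matrix(mfilters[cvtrain, ])
grp <- as.factor(traingroups)            # length(grp) must equal nrow(x)
fit <- qda(x, grouping = grp)
# or via a formula on a data frame:
dtrain <- data.frame(x, grp = grp)
fit2   <- qda(grp ~ ., data = dtrain)
predict(fit2, newdata = dtrain)$class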
2009 Feb 23
1
why results from regression tree (rpart) are totally inconsistent with ordinary regression
Hi,
In my analysis of impacts of insecticide-treated bednets on malaria, I
look at the relationship between malaria incidence and mosquito
behaviors. The condensed data set is copied here. Ordinary regression
(lm) shows that Incidence was negatively related to Mortality. This
makes sense because the latter reflects how effectively the insecticide-treated
nets kill mosquitoes. Since the
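A side-by-side sketch of the two fits being contrasted, with the column names taken from the post (the rest of the condensed data set is not shown, so treat this as illustrative only).
library(rpart)
lin  <- lm(Incidence ~ Mortality, data = dat)
tree <- rpart(Incidence ~ Mortality, data = dat, method = "anova",
              control = rpart.control(minsplit = 5))   # a condensed data set needs a smaller minsplit
coef(lin)     # sign and size of the linear slope
print(tree)   # where the tree actually splits, which need not mirror that slope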
2007 Jul 10
1
exces return by mktcap decile for each year
I have a data frame, let's call it dat,
with 3 columns (mc, yr, ret) which represent market
cap, year, and return. mc is a factor, mc, and ret are
real numbers.
I want to add a column to the data calculated as
follows.
For each year, I want to split the data by mc decile,
then calculate the mean ret within that mc decile, and
finally subtract that year's decile mean from the raw
return. Then
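A base-R sketch of that calculation; it assumes mc is the raw, numeric market cap, so deciles are formed within each year first (if mc already encodes the decile, skip straight to the last line).
dat$decile <- ave(dat$mc, dat$yr, FUN = function(x)
  cut(x, breaks = quantile(x, probs = 0:10 / 10),
      include.lowest = TRUE, labels = FALSE))
dat$excess <- dat$ret - ave(dat$ret, dat$yr, dat$decile, FUN = mean)   # raw return minus its year-decile mean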