similar to: error message in plm

Displaying 20 results from an estimated 400 matches similar to: "error message in plm"

2009 May 29
1
save plm coefficients
Hi R-helpers, I want to determine the coefficients of the following regression for several subsets, and I want to save them in a dataframe: The data is in 'regaccdis', 'regaccdis$caedois' is the column that defines the subsets, and the function I have run is coef(plm(ff,data=regaccdis,na.action=na.omit,model="pooling",subset=(regaccdis$caedois==i))) I've created a dataframe named
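For context, a minimal sketch of the pattern being asked about: one pooled plm() fit per group, with the coefficients collected into a data frame. 'ff', 'regaccdis' and 'caedois' mirror the poster's names; the toy data below is purely illustrative.

library(plm)
## toy stand-in for the poster's 'regaccdis'
regaccdis <- data.frame(id = rep(1:4, each = 5), year = rep(2001:2005, 4),
                        y = rnorm(20), x = rnorm(20),
                        caedois = rep(c("A", "B"), each = 10))
ff <- y ~ x
groups <- unique(regaccdis$caedois)
coef_list <- lapply(groups, function(g) {
  sub <- regaccdis[regaccdis$caedois == g, ]   # subset the data up front
  coef(plm(ff, data = sub, index = c("id", "year"),
           model = "pooling", na.action = na.omit))
})
## assumes every subset yields the same set of coefficients
coef_df <- data.frame(caedois = groups, do.call(rbind, coef_list))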
2009 Oct 14
1
using mapply to avoid loops
Hello, I would like to use mapply to avoid using a loop but for some reason, I can't seem to get it to work. I've included copies of my code below. The first set of code uses a loop (and it works fine), and the second set of code attempts to use mapply but I get a "subscript out of bounds" error. Any guidance would be greatly appreciated. Xj, Yj, and Wj are also lists, and s2,
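As a point of reference, a minimal sketch of swapping an index loop for mapply() over parallel lists; Xj, Yj, Wj and f() below are placeholders standing in for the poster's objects, not their actual code.

Xj <- list(1:3, 4:6, 7:9)
Yj <- list(2:4, 5:7, 8:10)
Wj <- list(rep(1, 3), rep(2, 3), rep(3, 3))
f  <- function(x, y, w) sum(w * (x - y))   # hypothetical per-element computation
## loop version
out_loop <- numeric(length(Xj))
for (j in seq_along(Xj)) out_loop[j] <- f(Xj[[j]], Yj[[j]], Wj[[j]])
## mapply walks the lists in parallel and does the [[j]] indexing itself;
## lists of unequal length are a common source of "subscript out of bounds"
out_mapply <- mapply(f, Xj, Yj, Wj)
identical(out_loop, out_mapply)   # TRUE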
2009 May 08
1
plm: plm.data vs pdata.frame
Hello, I am trying to use the plm package for panel econometrics. I am just trying to get started and load my data. It seems from most of the sample documentation that I need to use the pdata.frame function to get my data loaded. However, even after installing the "plm" package, my R installation cannot find the function. I am trying to follow the example in plmEN.pdf (
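For what it is worth, the usual entry point in current versions of plm is pdata.frame(), which only becomes visible after the package is attached; a minimal sketch on the Grunfeld data shipped with plm:

library(plm)
data("Grunfeld", package = "plm")                  # firm/year panel
pG <- pdata.frame(Grunfeld, index = c("firm", "year"))
class(pG)                                          # "pdata.frame" "data.frame"
head(index(pG))                                    # the individual and time indexes plm will use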
2009 Oct 29
3
Weird error: Error in xj[i] : invalid subscript type 'list'
I got the error. I haven't been able to get a stand-alone case so that I can show it here. But could somebody give some clue on what could cause this error? Since I never defined xj[i], I don't understand where this error comes from. Error in xj[i] : invalid subscript type 'list'
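One way the same message can be provoked (a guess at the poster's situation, not their code) is a row index that turns out to be a list rather than a vector; [.data.frame subsets each column xj with that index, which is where the error is reported:

df <- data.frame(a = 1:3, b = c("x", "y", "z"))
i <- list(1, 3)        # a list where an integer/logical vector was intended
## df[i, ]             # Error in xj[i] : invalid subscript type 'list'
df[unlist(i), ]        # flattening the index (or fixing whatever built it) resolves it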
2008 Jul 01
1
[.data.frame speedup
Below is a version of [.data.frame that is faster for subscripting rows of large data frames; it avoids calling duplicated(rows) if there is no need to check for duplicate row names, i.e. when: i is logical; attr(x, "dup.row.names") is not NULL (S+ compatibility); i is numeric and negative; or i is strictly increasing. "[.data.frame" <- function (x, i, j,
2004 May 24
1
as.matrix.data.frame() in R 1.9.0 converts to character when it should (?) convert to numeric
Conversion of a data frame to a matrix using as.matrix() when a column of the data frame is POSIXt and all other columns are numeric has changed in R 1.9.0 from R 1.8.1. The new behavior issues a warning message and converts to a character matrix. In R 1.8.1, such an object was converted to a numeric matrix. Here is an example. #### R 1.9.0 #### > foo <- data.frame(
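A small illustration of the behaviour being described, as it stands in current R: once a POSIXct column is present, as.matrix() has to settle on a single common mode for the whole matrix, so everything becomes character.

foo <- data.frame(when = as.POSIXct("2004-05-24 12:00:00", tz = "UTC"),
                  x = 1:3, y = rnorm(3))
m <- as.matrix(foo)
mode(m)                      # "character"
## to keep a numeric matrix, convert the time column first,
## e.g. to seconds since the epoch
foo$when <- as.numeric(foo$when)
mode(as.matrix(foo))         # "numeric"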
2017 Dec 01
1
Bug is as.matrix.data.frame with nested data.frame
Converting a data.frame with a nested data.frame to a matrix fails: x <- structure(list(a = data.frame(letters)), class = "data.frame", row.names = .set_row_names(26)) as.matrix(x) #> Error in ncol(xj) : object 'xj' not found The offending code is here, in the definition of as.matrix.data.frame (source/base/all.R): for (j in pseq) {
2009 Nov 04
4
unexpected results in comparison (x == y)
Dear readers of the list, I have a problem with a comparison of two values from a vector. The comparison yields FALSE but should be TRUE. I have checked for mode(), length() and attributes(). See the following code (R2.10.0): ----------------------------------------------- # data vector of 66 double data X =
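The usual culprit in reports like this is floating-point representation: two doubles that print identically can still differ in their last binary digits. A minimal illustration with the standard checks:

x <- 0.1 + 0.2
y <- 0.3
x == y                        # FALSE
print(x - y, digits = 17)     # a tiny nonzero difference
isTRUE(all.equal(x, y))       # TRUE -- comparison with a tolerance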
2003 Jun 18
1
suggestion for make.names
I would like to suggest a modification to the make.names() function. The current implementation has two problems: 1. It doesn't check if a name matches an R keyword (like "function"). 2. The uniqueness algorithm is not invariant to concatenation. In other words, make.names(c("a","a","a"),unique=T) !=
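For reference, the uniqueness machinery in current R (the concatenation problem the post criticises was later addressed by make.unique(), whose documented examples include exactly this property):

make.names(c("a", "a", "a"), unique = TRUE)
#  "a"   "a.1" "a.2"
make.unique(c(make.unique(c("a", "a")), "a"))
#  "a"   "a.1" "a.2"  -- the same result as making all three unique in one pass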
2002 Oct 09
1
problems with missing values created by conversion using as.matri (PR#2130)
> version
         _
platform sparc-sun-solaris2.8
arch     sparc
os       solaris2.8
system   sparc, solaris2.8
status
major    1
minor    6.0
year     2002
month    10
day      01
language R
2006 Nov 07
1
data frame subscription operator
Hi all, I was looking at the data frame subscription operator (attached in the end of this e-mail) and got puzzled by the following line: class(x) <- attr(x, "row.names") <- NULL This appears to set the class and row.names attributes of the incoming data frame to NULL. So far I was not able to figure out why this is necessary - could anyone help ? The reason I am
2011 May 27
0
System is computationally singular error for plm random effects models
Dear all, I am using the plm package for both fixed and random effects models on my country-year panel data. However, for some of the random effects models I get the following error: Error in solve.default(OM) : system is computationally singular: reciprocal condition number = 1.78233e-18 The same models work fine for fixed effects. I have also noticed that once I remove some of my variables
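Before blaming the estimator, a quick, package-agnostic check for (near) linear dependence among the regressors is often worthwhile; the following is only a sketch with placeholder names, since the poster's data and formula are not shown:

check_rank <- function(formula, data) {
  X  <- model.matrix(formula, data = data)
  qX <- qr(X)
  list(rank = qX$rank, columns = ncol(X),
       dropped = colnames(X)[qX$pivot[-seq_len(qX$rank)]])
}
## e.g. check_rank(y ~ x1 + x2 + x3 + factor(year), data = mydata)
## rank < columns means some regressors are exact linear combinations of
## others, one common source of "system is computationally singular"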
2011 Sep 26
0
how to handle with gap's in panel data (plm package)
Hi everyone, I’m working with a panel of firm/year observations. My panel is not only unbalanced but also has some gaps in years. For example, firm 1 has 1999, 2000, 2001, 2004, 2005, firm 2 has 2000, 2001, 2003, 2005, and so on. I’m using the plm package and what I’m asking is how I can handle these gaps. Thank you very much, Cecília Carmo Universidade de Aveiro
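One base-R way to deal with such gaps is to insert explicit NA rows for the missing firm/year cells before building the pdata.frame (recent plm versions also ship helpers along the lines of is.pconsecutive()/make.pconsecutive(); check the installed version). A sketch with made-up data:

firms <- data.frame(firm = c(1, 1, 1, 1, 1, 2, 2, 2, 2),
                    year = c(1999, 2000, 2001, 2004, 2005,
                             2000, 2001, 2003, 2005),
                    y = rnorm(9))
full_grid <- expand.grid(firm = unique(firms$firm),
                         year = min(firms$year):max(firms$year))
filled <- merge(full_grid, firms, all.x = TRUE)   # NA rows where a year is missing
filled <- filled[order(filled$firm, filled$year), ]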
2011 Sep 05
1
plm package, R squared, dummies in panel data
Hi R-helpers, I have two questions I hope you can help me with: In the plm package how can I calculate the R2 within, R2 between and R2 overall? Is there any special reason not to display these values? When using first differences do I need to take special care with dummies (both year dummies and industry dummies)? (A friend who works with Stata told me that there is
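summary() on a plm fit reports an R-squared for the transformed model that was actually estimated; the Stata-style within/between/overall trio is not printed, but it can be pieced together by hand as squared correlations. The sketch below applies those textbook definitions to the Grunfeld data, which is an assumption about what is wanted rather than anything plm computes itself:

library(plm)
data("Grunfeld", package = "plm")
fit <- plm(inv ~ value + capital, data = Grunfeld,
           index = c("firm", "year"), model = "within")
## linear predictor without the fixed effects
xb <- as.numeric(as.matrix(Grunfeld[, c("value", "capital")]) %*% coef(fit))
y  <- Grunfeld$inv
g  <- Grunfeld$firm
r2_overall <- cor(xb, y)^2
r2_between <- cor(tapply(xb, g, mean), tapply(y, g, mean))^2
r2_within  <- cor(xb - ave(xb, g), y - ave(y, g))^2
round(c(within = r2_within, between = r2_between, overall = r2_overall), 3)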
2010 Nov 02
0
predict() for plm?
Hi, I have a small N large T panel which I am estimating via plm, with fixed effects. Is there any way to get predicted values for a new dataset? (I want to estimate parameters on a subset of my sample, and then use these to calculate model-implied values for the whole sample). Alternatively, is there some way of extracting the fixed effects from the plm fitted model object (then I can
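On the second question: plm does expose the estimated individual effects via fixef(), and a hand-rolled prediction for new rows of the same individuals can then be built from coef() plus each individual's effect. A sketch on the Grunfeld data (not the poster's model):

library(plm)
data("Grunfeld", package = "plm")
fe <- plm(inv ~ value + capital, data = Grunfeld,
          index = c("firm", "year"), model = "within")
alpha <- fixef(fe)                                  # one intercept per firm
newdata <- subset(Grunfeld, year >= 1950)           # stand-in for "new" data
X <- as.matrix(newdata[, c("value", "capital")])
pred <- drop(X %*% coef(fe)) + alpha[as.character(newdata$firm)]
head(pred)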
2011 Jul 28
1
Fixed effects using Within transformation in PLM package
Hi all, I am trying to do my own fixed effects regression using the Within function in PLM. I apply the Within function to all my pseries and then run OLS on the transformed vectors using lm(). When I compare the results to those obtained via plm ("within"), the estimates are not always the same. Specifically, if there are missing values (NA), the parameter estimates are not the same.
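A sketch of the comparison being described, on the Grunfeld data: with complete data the two routes give identical slope estimates, but once a value is missing the manual demeaning and the row-dropping inside plm no longer operate on the same sample, which is one way the discrepancy can arise (an illustration, not necessarily the poster's exact situation):

library(plm)
data("Grunfeld", package = "plm")
pG <- pdata.frame(Grunfeld, index = c("firm", "year"))
inv_w <- as.numeric(Within(pG$inv))
val_w <- as.numeric(Within(pG$value))
cap_w <- as.numeric(Within(pG$capital))
manual <- lm(inv_w ~ val_w + cap_w - 1)
direct <- plm(inv ~ value + capital, data = pG, model = "within")
all.equal(unname(coef(manual)), unname(coef(direct)))   # TRUE: within = OLS on demeaned data
## now introduce a missing value and refit both ways
G2 <- Grunfeld; G2$value[5] <- NA
pG2 <- pdata.frame(G2, index = c("firm", "year"))
manual_na <- lm(as.numeric(Within(pG2$inv)) ~ as.numeric(Within(pG2$value)) +
                as.numeric(Within(pG2$capital)) - 1)
direct_na <- plm(inv ~ value + capital, data = pG2, model = "within")
## coef(manual_na) and coef(direct_na) now differ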
2013 Jan 28
0
Using relaimpo or relimp with PLM and GLS
Dears, Unfortunately, the packages relaimpo and relimp do not seem to work with the plm function (plm package) or the gls function (in the nlme package). I've been studying how to adapt one of them for this purpose. In that sense, I have two questions regarding this work: 1) has anyone heard of any workaround for those incompatibilities, or at least any ideas on that, especially for plm? 2)
2009 Apr 07
0
summary.plm error
Dear plm Package users, I use the plm package a lot but I have not updated it for some time. Now I realized the following difficulty with the summary.plm function (demonstrated with the example from the ?plm documentation). library(plm) data("Produc", package="Ecdat") estimation_method<-"within" estimation_effect<-"individual" zz
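For reference, here is the standard ?plm example the post refers to, run end to end against a current installation (Produc now ships with plm itself, so the Ecdat dependency is optional); this is a reconstruction, not the poster's exact code:

library(plm)
data("Produc", package = "plm")
zz <- plm(log(gsp) ~ log(pcap) + log(pc) + log(emp) + unemp,
          data = Produc, index = c("state", "year"),
          effect = "individual", model = "within")
summary(zz)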
2012 Nov 09
0
Can pgmm in the plm package include additional endogenous variables?
Dear R-Users, I am using pgmm in the plm package to estimate dynamic models with panel data. Besides the lagged dependent variable, I also have some other endogenous variables. Does pgmm have an argument that allows me to specify these endogenous variables and their instruments? I didn't find this argument in the documentation or online. Thank you very much for your help!
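There is no separate argument: in pgmm() the formula has up to three parts separated by "|" (the regressors, the GMM-style instruments, and any ordinary instruments), and further endogenous variables are handled by adding their lags to the GMM part. A sketch built on the Arellano-Bond example shipped with plm, where treating the wage as endogenous is an illustrative assumption:

library(plm)
data("EmplUK", package = "plm")
ab <- pgmm(log(emp) ~ lag(log(emp), 1:2) + lag(log(wage), 0:1) +
             log(capital) + lag(log(output), 0:1)
           | lag(log(emp), 2:99) + lag(log(wage), 2:99),  # GMM instruments for both endogenous terms
           data = EmplUK, effect = "twoways", model = "twosteps")
summary(ab)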