Displaying 20 results from an estimated 2000 matches similar to: "Adjusting MaxNwts in MICE Package"
2013 Jan 14
0
Changing MaxNWts with the mi() function (error message)
Hello,
I am trying to impute data with the mi() function (mi package) and
keep receiving an error message. When imputing the variable "sex",
the mi() function calls the mi.categorical() function, which in turn
calls the nnet() function. I then receive the following error
message (preceded by my code below):
> imputed.england=mi(england.pre.imputed, n.iter=6, add.noise=FALSE)
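(A minimal sketch, not from the original thread: the cap being hit lives in nnet(), whose MaxNWts argument limits the number of allowed weights and can be raised when calling nnet() directly. The formula and size below are placeholders; whether mi() exposes a way to pass MaxNWts through is the open question here.)
library(nnet)
fit <- nnet(sex ~ ., data = england.pre.imputed, size = 10,
            MaxNWts = 5000)   # raise nnet's default cap of 1000 weights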
2013 Oct 29
3
Help with mice and polyreg
Greetings everyone, and thanks in advance for any help you can offer. I am
a beginner in R and I am using the mice package to run multiple imputation
on mostly categorical variables. The problem is that when I run the command
imp <- mice(dataset,method="polr",maxit=1)
where dataset is a data.frame, it throws this error:
iter imp variable
1 1 pial1a
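(A hedged sketch, not from the thread: method="polr" applies proportional-odds imputation to every incomplete variable, and polr expects ordered factors. One alternative is to let mice pick a default method per column and only override where needed; make.method() is standard mice, but swapping "polr" for "polyreg" is an assumption about the poster's data.)
library(mice)
meth <- make.method(dataset)           # default method for each column
meth[meth == "polr"] <- "polyreg"      # treat the factors as unordered (multinomial)
imp <- mice(dataset, method = meth, maxit = 1)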
2010 Jul 14
1
Changing model parameters in the mi package
I am trying to use the mi package to impute data, but am running into
problems with the functions it calls.
For instance, I am trying to impute a categorical variable called
"min.func." The mi() function calls the mi.categorical() function to
deal with this variable, which in turn calls the nnet.default()
function, and passes it a fixed parameter MaxNWts=1500. However, as
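(A heavily hedged workaround, not from the thread: if mi.categorical() really hard-codes MaxNWts=1500, one blunt session-local option is to patch the function inside the package namespace. fixInNamespace() is standard R; the exact line to edit inside mi.categorical() is an assumption about that version of mi.)
library(mi)
fixInNamespace("mi.categorical", ns = "mi")   # edit the hard-coded MaxNWts interactively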
2013 Oct 29
0
Fwd: Ayuda con Mice con polyreg
Saludo gente, antes que nada gracias por la ayuda que puedan aportarme, soy
iniciante en R, estoy usando el paquete Mice para realizar imputaciones
múltiples sobre variables en su mayoría categóricas. El problema está que
cuando expresó este comando imp <- mice(dataset,method="polr",maxit=1)
donde el dataset es un data.frame me tirá este error :
iter imp variable
1 1 pial1a
2005 Feb 08
1
Toying with neural networks
Hello all,
I've been playing with nnet (package 'nnet') and I've come across this
problem: nnet doesn't seem to like having more than 1000 weights. If I
do:
> data(iris)
> names(iris)[5] <- "species"
> net <- nnet(species ~ ., data=iris, size=124, maxit=10)
# weights: 995
initial value 309.342009
iter 10 value 21.668435
final value 21.668435
stopped after 10 iterations
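(A minimal sketch, not from the thread: the 1000-weight ceiling is just the default of nnet's MaxNWts argument, not a hard limit, so a larger network only needs the cap raised explicitly. The size here is arbitrary; size=200 on iris needs 200*5 + 201*3 = 1603 weights.)
library(nnet)
data(iris)
names(iris)[5] <- "species"
net <- nnet(species ~ ., data = iris, size = 200, maxit = 10,
            MaxNWts = 2000)   # the default MaxNWts = 1000 would refuse this fit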
2010 Jun 02
1
nnet: cannot coerce class c("terms", "formula") into a data.frame
Dearest all,
Objective: I am now learning neural networks. I want to see how well I can
train an artificial neural network model to discriminate between the two
files I am attaching with this message.
http://r.789695.n4.nabble.com/file/n2240582/3dMaskDump.txt 3dMaskDump.txt
http://r.789695.n4.nabble.com/file/n2240582/test_vowels.txt test_vowels.txt
Question: when I am attempting to run
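(A guess, flagged as such: that coercion error can appear when the formula ends up in the slot where nnet() expects its data, for instance through unnamed positional arguments. A minimal well-formed call, with df and class as placeholder names, looks like:)
library(nnet)
fit <- nnet(class ~ ., data = df, size = 5)   # formula first, data= passed by name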
2012 Jun 26
1
Error in mice
Hi all,
I am imputing missing values in 90 columns of a data frame using mice,
but mice gives back:
Error in nnet.default(X, Y, w, mask = mask, size = 0, skip = TRUE, softmax = TRUE, : too many (1100) weights
Any idea to solve this error is welcome,
Anera
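(A hedged pointer, not from the thread: in recent versions of mice, named arguments to mice() are passed down to the univariate imputation functions, and mice.impute.polyreg() accepts nnet.MaxNWts, which defaults to 1500. Raising it is a sketch of one fix; the right value depends on the number of factor levels and predictors.)
library(mice)
imp <- mice(df, nnet.MaxNWts = 3000)   # df: the 90-column data frame; 3000 is arbitrary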
2008 Feb 15
2
Softmax in nnet
Hi R help,
I ran my data through nnet with a skip layer, a factor response (with 0 & 1
values), and explicitly set softmax=T to compare the result with the
default nnet run that has no softmax specification. I assumed this should
give me the same result. I got the result for the default one, but not for
the softmax version, and I got an error message that I did not quite
understand.
test6.nn.skipT.softm.Yfac <-
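(A hedged note, not from the thread: for a two-level factor the formula interface of nnet() already fits a single-output logistic model via entropy = TRUE, whereas softmax models two or more output units, so the two specifications are not interchangeable for a 0/1 response. A minimal illustration with made-up data:)
library(nnet)
set.seed(1)
df <- data.frame(y = factor(rbinom(100, 1, 0.5)), x1 = rnorm(100), x2 = rnorm(100))
fit.default <- nnet(y ~ x1 + x2, data = df, size = 0, skip = TRUE)  # entropy fit
# adding softmax = TRUE here asks for >= 2 output units, which a 2-level
# factor response does not produce through the formula method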
2013 Oct 30
2
Sorry for the trouble... help with MICE
Greetings everyone, and thanks in advance for any help you can offer. I am
a beginner in R and I am using the mice package to run multiple imputation
on mostly categorical variables. The problem is that when I run the command
imp <- mice(dataset,method="polr",maxit=1)
where dataset is a data.frame, it throws this error:
iter imp variable
1 1 pial1a
2013 Oct 30
0
Sorry for the trouble... help with MICE
Amalia,
I cannot reproduce your results. I ran your formulas and data, and the result is
x <- structure(list(ï..psraid = c(202517L, 202518L, 202520L, 202523L,
+ 202527L, 202537L, 202543L, 202544L, 202551L, 202566L, 202570L,
+ 202571L, 202606L, 202619L, 202624L, 202629L, 202631L, 202632L,
+ 202633L, 202648L, 202657L, 202663L, 202676L, 202683L, 202685L,
+ 202706L, 202708L, 202709L, 202710L, 202734L,
2010 Jan 29
0
Help interpreting library(nnet) script output... URGENT
Hello,
I am pretty new to R. I am working on neural network classifiers and I am
feeding the nnet input from different regions of interest (fMRI data). The
script that I am using is this:
library(MASS)
heap_lda <- data.frame(as.matrix(t(read.table(file = "R_10_5runs_matrix9.txt"))) * 100000,
                       syll = c(rep("heap", 3), rep("hoop", 3), rep("hop", 3)))
library(nnet)
2013 Oct 30
1
Sorry for the trouble... help with MICE
Many thanks, but of course it runs on a sample of 50 records; on the
original sample of 1000 records
it throws the error :(
2013/10/30 daniel <daniel319@gmail.com>
> Amalia,
>
> I cannot reproduce your results. I ran your formulas and data, and the result is
> x <- structure(list(ï..psraid = c(202517L, 202518L, 202520L, 202523L,
> + 202527L, 202537L, 202543L, 202544L, 202551L,
2012 Jan 04
0
Error formal argument "softmax" matched by multiple actual arguments
I am running the nnet package as
> neural.soft<-nnet(custcat~region+ed+marital+tenure+age+address+income,size=3,softmax=TRUE)
This returns the error message: formal argument "softmax" matched by
multiple actual arguments
Here the dependent variable "custcat" is a factor with 4 levels. This error
does not crop up for any other arguments of nnet(), including
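(A plausible diagnosis, offered as an assumption: for a factor response with more than two levels, the formula method of nnet() already supplies softmax = TRUE when it calls nnet.default(), so passing it explicitly sends the argument twice, hence "matched by multiple actual arguments". Dropping it, and naming the data frame, gives the sketch below; 'df' is a placeholder.)
library(nnet)
neural.soft <- nnet(custcat ~ region + ed + marital + tenure + age + address + income,
                    data = df, size = 3)   # softmax is implied for a 4-level factor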
2006 Jan 25
16
Slideshow beta
Ok,
I finally got the slideshow code to a state worth showing
off. The site is a very rough cut of a site I'm building for my wife's
photography, so ignore the unfinished design for now :)
http://rachel.kathihill.com/
To see the ajax version, go to:
http://rachel.kathihill.com/?ajax=1
To randomize the order the images show:
http://rachel.kathihill.com/?random=1
To change
2012 Jan 05
2
difference of the multinomial logistic regression results between multinom() function in R and SPSS
Dear all,
I have found some differences between the results of the multinom()
function in R and multinomial logistic regression in SPSS.
The input data, model, and parameters are below:
choles <- c(94, 158, 133, 164, 162, 182, 140, 157, 146, 182);
sbp <- c(105, 121, 128, 149, 132, 103, 97, 128, 114, 129);
case <- c(1, 3, 3, 2, 1, 2, 3, 1, 2, 2);
result <- multinom(case ~ choles
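(One common source of such discrepancies, stated as an assumption rather than a diagnosis: multinom() uses the first factor level as the reference category, while SPSS's multinomial procedure defaults to the last. Relevelling the response makes the coefficient tables comparable; iteration limits and convergence tolerances can also differ.)
library(nnet)
case.f <- relevel(factor(case), ref = "3")   # match a last-category baseline
result <- multinom(case.f ~ choles + sbp)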
2009 Nov 13
2
help sample from large dataset - misleading error?
Hi All,
I want to take a simple random sample from a large dataset, gly, but I'm
getting an error message. Any help?
> dim(gly)
[1] 112371 37
> s1 <- sample(gly,100)
Error in `[.data.frame`(x, .Internal(sample(length(x), size, replace, :
cannot take a sample larger than the population when 'replace = FALSE'
Thanks,
Rachel
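(The error is arguably the misleading part: sample() treats a data frame as a list of its columns, so sample(gly, 100) tries to draw 100 of gly's 37 columns without replacement and trips the population check. To draw 100 rows, sample row indices instead.)
s1 <- gly[sample(nrow(gly), 100), ]   # 100 random rows out of all 112371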
2006 Jun 04
2
Can anyone help?
Quick question, please... A user logs into Windows XP and tries to create a folder/document, and the ownership on the new file/folder defaults to nobody:nobody. I have the user set up in Samba on the IRIX machine. All other users have no problem. Anyone have any suggestions?
Thanks
Rachel
2005 Apr 11
4
R: function code
Hi,
sorry to be a nuisance to all!
How can I see the code of a particular function?
E.g. nnet, just as an example.
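(A minimal sketch: typing a function's name prints its code; for S3 generics like nnet, methods() lists the specific implementations and getAnywhere() prints them even when they are not exported.)
library(nnet)
nnet                       # prints the generic
methods(nnet)              # e.g. nnet.default, nnet.formula
getAnywhere(nnet.default)  # shows a method's source even if unexported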
2009 Jul 09
3
Stratified data summaries
Hi All,
I'm trying to automate a data summary using summary or describe from the
Hmisc package. I want to stratify my data set by patient_type. I was
hoping to do something like:
Describe(myDataFrame ~ patient_type)
I can create data subsets and run the describe function one at a time,
but there's got to be a better way. Any suggestions?
Rachel
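(A minimal sketch using base R's by(), which splits a data frame by a factor and applies a function to each stratum; myDataFrame and patient_type are taken from the question.)
library(Hmisc)
by(myDataFrame, myDataFrame$patient_type, describe)   # one describe() per stratum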
2012 May 30
1
caret train() based on cross-validation - split dataset to keep sites together?
Hello all,
I have searched and have not yet identified a solution, so now I am sending
this message. In short, I need to split my data into training, validation,
and testing subsets that keep all observations from the same sites together,
preferably as part of a cross-validation procedure. Now for the longer
version. And I must confess that although my R skills are improving, they
are not so
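(A sketch under assumptions: caret releases later than this 2012 thread added groupKFold(), which builds resampling indices that keep each group's rows in the same fold; those indices then go to trainControl(). The column name 'site', the response, and the model method are placeholders.)
library(caret)
folds <- groupKFold(mydata$site, k = 5)          # all rows from a site stay together
ctrl  <- trainControl(method = "cv", index = folds)
fit   <- train(y ~ ., data = mydata, method = "rf", trControl = ctrl)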