similar to: Editing the "..." argument

Displaying 20 results from an estimated 500 matches similar to: "Editing the "..." argument"

2011 May 25
1
L-BFGS-B and parscale in optim()
Hi, When using method L-BFGS-B along with a parscale argument, should the lower and upper bounds provided be on the scaled or unscaled values? Thanks. Cheers, -- Seb
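A minimal sketch of the call being asked about, using a hypothetical two-parameter objective with badly matched scales; whether lower/upper are interpreted on the original or the scaled parameters is exactly the open question here, so one practical check is to make a bound deliberately binding and inspect the returned $par:

    ## Hypothetical objective whose two parameters live on very different scales
    fn <- function(p) (p[1] - 2)^2 + ((p[2] - 5000) / 1000)^2

    fit <- optim(par = c(1, 1000), fn = fn,
                 method  = "L-BFGS-B",
                 lower   = c(0, 0), upper = c(10, 3000),   # second upper bound made binding on purpose
                 control = list(parscale = c(1, 1000)))
    fit$par   # if the bound is honoured on the original scale, fit$par[2] stays at or below 3000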
2006 Jun 23
1
How to use mle or similar with integrate?
Hi I have the following formula (I hope it is clear - if not, I can try to do better next time): h(x, a, b) = integral(0 to pi/2) [ integral(D/sin(alpha) to Inf) f(x, a, b) dx ] dalpha and I want to do an mle with it. I know how to use mle() and I also know about integrate(). My problem is to give the parameter values a and b to the
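integrate() forwards extra named arguments to its integrand, which is the usual way to get a and b in; a self-contained sketch of the double integral, with a made-up integrand f and constant D standing in for the poster's:

    ## Made-up integrand; a and b are the parameters to be estimated, D a known constant
    f <- function(x, a, b) exp(-a * x) * x^(b - 1)
    D <- 1

    h <- function(a, b) {
        inner <- function(alpha)                       # integral over x, for each alpha
            sapply(alpha, function(al)
                integrate(f, lower = D / sin(al), upper = Inf, a = a, b = b)$value)
        integrate(inner, lower = 0, upper = pi / 2)$value
    }

    h(a = 2, b = 1.5)   # value of the double integral for one (a, b) pair

The resulting h() can then be called from inside the negative log-likelihood handed to mle().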
2008 Aug 13
2
messing with ...
I'm looking for advice on manipulating parameters that are going to be passed through to another function. Specifically, I am working on my version of "mle", which is a wrapper for optim (among other optimizers). I would prefer not to replicate the entire argument list of optim(), so I'm using ... to pass extra arguments through. However: the starting values are
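One common pattern for this (a sketch, assuming the extra arguments are all meant for optim itself): capture the dots as a list, adjust or add entries, and forward everything with do.call.

    ## Sketch of a wrapper that inspects and tweaks ... before handing it on to optim()
    mymle <- function(start, minuslogl, ...) {
        dots <- list(...)
        if (is.null(dots$method)) dots$method <- "BFGS"   # e.g. supply a default method
        do.call(optim, c(list(par = start, fn = minuslogl), dots))
    }

    ## extra arguments such as control pass straight through to optim()
    mymle(c(0, 0), function(p) sum((p - 1:2)^2), control = list(maxit = 500))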
2010 Mar 01
2
Advice wanted on using optim with both continuous and discrete par arguments...
Dear R users, I have a problem for which my objective function depends on both discrete and continuous arguments. The problem is that the number of combinations for the (multivariate) discrete arguments can become overwhelming (when it is univariate this is not an issue) hence search over the continuous arguments for each possible combination of the discrete arguments may not be feasible. Guided
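For context, the brute-force baseline the poster wants to avoid when the grid gets large looks roughly like this (hypothetical objective f(cont, disc)):

    ## Enumerate the discrete part, run optim() over the continuous part, keep the best
    f <- function(cont, disc) (cont[1] - disc[1])^2 + (cont[2] - disc[2] / 2)^2

    disc_grid <- expand.grid(d1 = 0:3, d2 = 0:3)          # all discrete combinations

    fits <- lapply(seq_len(nrow(disc_grid)), function(i) {
        d <- as.numeric(disc_grid[i, ])
        optim(par = c(0, 0), fn = f, disc = d)            # extra argument 'disc' is passed on to f
    })

    best <- which.min(vapply(fits, `[[`, numeric(1), "value"))
    list(discrete = disc_grid[best, ], continuous = fits[[best]]$par)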
2003 Feb 28
2
optim
Dear all, I have a function MYFUN which depends on 3 positive parameters TETA[1], TETA[2], and TETA[3]; x belongs to [0,1]. I integrate the function over [0,0.1], [0.1,0.2] and [0.2,0.3] and want to choose the three parameters so that these three integrals are as close as possible to 2300, 4600 and 5800, respectively. As I have three equations with three unknowns, I expect an exact fit, i.e., the SS
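A sketch of how such a least-squares fit of integrals to targets can be set up, with a stand-in MYFUN (the poster's function is not shown) and a log transform to keep TETA positive:

    ## Stand-in for the poster's MYFUN; TETA enters as a length-3 vector
    MYFUN <- function(x, teta) teta[1] * exp(-teta[2] * x) + teta[3] * x

    targets <- c(2300, 4600, 5800)
    breaks  <- c(0, 0.1, 0.2, 0.3)

    ss <- function(log_teta) {                 # optimise on the log scale so TETA stays positive
        teta <- exp(log_teta)
        ints <- sapply(1:3, function(i)
            integrate(MYFUN, breaks[i], breaks[i + 1], teta = teta)$value)
        sum((ints - targets)^2)
    }

    fit <- optim(log(c(1, 1, 1)), ss)
    exp(fit$par)                               # fitted TETA on the original scale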
2012 Apr 12
1
Help with vectorization
Hi everyone. I have an exponential function (3 fitting parameters) that I would like to use to produce data (6 series) without having to use a loop. Here wl = seq(300,500,1) k1 = c(1.2e-6, 4.9e-6, 9.6e-6, 2.7e-10, 6.7e-8, 7.44e-6) k2 = c(726, 352, 128, 5232, 1538, 128) k3 = c(-176, -224, -257, 88.7, -111, -256) stations = c('R5d', 'R5a', 'R9', '108',
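The excerpt is cut off before the model formula, so purely as an illustration (assuming a form like y = k1 * exp(-wl / k2) + k3 and inventing the last two station names), the six series can be produced without a loop via mapply():

    wl <- seq(300, 500, 1)
    k1 <- c(1.2e-6, 4.9e-6, 9.6e-6, 2.7e-10, 6.7e-8, 7.44e-6)
    k2 <- c(726, 352, 128, 5232, 1538, 128)
    k3 <- c(-176, -224, -257, 88.7, -111, -256)
    stations <- c("R5d", "R5a", "R9", "108", "S1", "S2")   # last two names are placeholders

    ## one column per station; the model form is a guess since the excerpt is truncated
    res <- mapply(function(p1, p2, p3) p1 * exp(-wl / p2) + p3, k1, k2, k3)
    colnames(res) <- stations
    dim(res)   # 201 rows (wavelengths) x 6 columns (stations)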
2003 Feb 17
2
returning argument names
Dear r-list folks, I have a problem which has been bugging me for a while now and I was hoping someone out there might be able to help. If I have a user-defined function with an indeterminate number of arguments, using the well-known "..." construct, how can I get the function to return the names of the items which were the arguments of the function as part of the function's
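A small sketch of the usual idioms: names(list(...)) gives the names supplied as name = value, while deparsing substitute(list(...)) recovers the expressions the caller actually wrote:

    argnames <- function(...) {
        supplied    <- names(list(...))                            # "" for unnamed arguments
        expressions <- sapply(substitute(list(...))[-1], deparse)  # what the caller typed
        list(supplied = supplied, expressions = expressions)
    }

    a <- 1:3; b <- letters
    argnames(a, b, extra = 42)   # expressions are "a", "b", "42"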
2006 Nov 30
3
writing function with ,... )
Hi to all, I did not find the right hints for functions with the dot-dot-dot argument. Is it possible to write your own functions with the three dots, and if so, what's wrong with the following example? test <- function(x, ...) { print(x); if (exists("y")) print(y); if (exists("z")) print(z) } test(4, y=2) With regards Carmen
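The y = 2 passed through "..." never becomes a variable named y inside test(), so exists("y") looks in the wrong place (and may even pick up an unrelated global y). A sketch of the usual fix, collecting the dots into a named list:

    test <- function(x, ...) {
        print(x)
        dots <- list(...)                        # capture the ... arguments by name
        if ("y" %in% names(dots)) print(dots$y)
        if ("z" %in% names(dots)) print(dots$z)
    }
    test(4, y = 2)   # prints 4, then 2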
2012 Feb 22
3
OpenMP and random number generation
Dear all, Now that R has OpenMP facilities, I'm trying to use them for my own package, but I'm still wondering if it is safe to use random number generation within an OpenMP block. I looked at the Writing R Extensions manual, both the OpenMP and the random number generation sections, but didn't find any information about that. Could someone tell me whether it is safe or not, please? Best, Mathieu
2005 Mar 09
1
nnet abstol
Hi, I am using nnet to learn transfer functions. For each transfer function I can estimate the best possible Mean Squared Error (MSE). So, rather than trying to grind the MSE to 0, I would like to use abstol to stop training once the best MSE is reached. Can anyone confirm that the abstol parameter in the nnet function is the MSE, or is it the Sum-of-Squares (SSE)? Best regards, Sam.
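Whichever quantity abstol is compared against, a target MSE is easy to convert: if the stopping criterion turns out to be a sum of squares, multiply the best achievable MSE by the number of training cases. A sketch with made-up data (the network size and the target MSE below are arbitrary):

    library(nnet)

    ## Hypothetical training set and a best achievable MSE for this transfer function
    set.seed(1)
    x <- matrix(runif(200), ncol = 2)
    y <- cbind(sin(x[, 1]) + 0.5 * x[, 2] + rnorm(100, sd = 0.05))
    best_mse <- 0.05^2

    ## If abstol is matched against a sum of squares rather than a mean,
    ## the target MSE has to be scaled by the number of cases:
    fit <- nnet(x, y, size = 5, linout = TRUE,
                abstol = best_mse * nrow(y), maxit = 2000)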
2002 Dec 17
3
Changing "..." inside a function: impossible? desirable?
This is something like a request for your comments and thoughts on the topic... Many of you will know that the "..." (aka \dots) argument is very useful for passing ``further graphical parameters'', but can be a pain when it is itself passed to too many plotting functions inside your own function. An artificial example being myplot <- function(x,y, ...) { plot(0:1, 0:1,
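One workaround that often comes up in this discussion (sketched here with an explicit whitelist of argument names, which is an assumption about what the inner calls should receive): split the dots into the pieces each inner function should see and forward them with do.call.

    myplot <- function(x, y, ...) {
        dots <- list(...)
        axis_only  <- c("las", "tck", "col.axis")           # arguments reserved for axis()
        axis_args  <- dots[names(dots) %in% axis_only]
        other_args <- dots[!names(dots) %in% axis_only]

        do.call(plot, c(list(x, y, xaxt = "n"), other_args))
        do.call(axis, c(list(side = 1), axis_args))
    }

    myplot(1:10, rnorm(10), col = "red", las = 2)   # col goes to plot(), las only to axis()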
2003 Jul 16
2
numerical differentiation in R? (for optim "SANN" parscale)
Dear R users, I am running a maximum likelihood model with optim. I chose the simulated annealing method (method="SANN"). SANN is not performing badly, but I guess it would be much more effective if I could set the `parscale' parameter. The help says: `parscale' A vector of scaling values for the parameters. Optimization is performed on `par/parscale' and these
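A rough way to get at suitable parscale values is to look at how sensitive the objective is to each parameter, e.g. with a simple central-difference gradient (numDeriv::grad would do the same job); the objective below is a made-up stand-in and the scaling rule is just one heuristic:

    ## Simple central-difference gradient, used to gauge per-parameter sensitivity
    num_grad <- function(f, par, eps = 1e-4, ...) {
        sapply(seq_along(par), function(i) {
            h <- eps * max(1, abs(par[i]))
            up <- dn <- par
            up[i] <- up[i] + h
            dn[i] <- dn[i] - h
            (f(up, ...) - f(dn, ...)) / (2 * h)
        })
    }

    nll <- function(p) (p[1] - 2)^2 + ((p[2] - 5000) / 1000)^2   # badly scaled toy objective
    g   <- num_grad(nll, c(1, 1000))

    ## make the scaled gradient components comparable (one heuristic, not the only one)
    optim(c(1, 1000), nll, method = "SANN",
          control = list(maxit = 5000, parscale = 1 / pmax(abs(g), 1e-8)))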
2005 Apr 26
2
"wild" function example in optim
Dear all, Firstly, I do apologize if my question is simple and posted in the wrong place but I had no reply from the R-help mailing list (maybe it is too simple!). I was wondering why parscale is set to 20 in the "wild" function example used in ?optim. This function has only one parameter and if we set parscale equal to 1 then the solution near the global minimum is not found. I
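For readers without the help page at hand, the example in question looks roughly like this (the function is reproduced from memory of ?optim, so check your local copy); swapping parscale = 20 for parscale = 1 and comparing the two results shows the effect being asked about:

    ## "wild" function from the ?optim example (quoted from memory)
    fw <- function(x) 10 * sin(0.3 * x) * sin(1.3 * x^2) + 0.00001 * x^4 + 0.2 * x + 80

    set.seed(1)
    res20 <- optim(50, fw, method = "SANN",
                   control = list(maxit = 20000, temp = 20, parscale = 20))
    res1  <- optim(50, fw, method = "SANN",
                   control = list(maxit = 20000, temp = 20, parscale = 1))
    c(parscale20 = res20$par, parscale1 = res1$par)   # which run lands near the global minimum?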
2005 Apr 19
1
Optim(...parscale...)
Hi there, The optim(par, fn, ...parscale...) function in R requires 'parscale' which is defined as: "A vector of scaling values for the parameters. Optimisation is performed on 'par/parscale' and these should be comparable in the sense that a unit change in any element (??) produces a unit change in the scaled value". I am just not understanding the
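Informally, parscale tells optim the typical size of each parameter: the search works on par/parscale, so a step of 1 in a scaled coordinate moves the original parameter by roughly its parscale entry. A toy illustration with a hypothetical objective whose two parameters differ by four orders of magnitude:

    fn <- function(p) (p[1] - 3)^2 + ((p[2] - 20000) / 10000)^2

    optim(c(0, 1000), fn)                                           # unscaled
    optim(c(0, 1000), fn, control = list(parscale = c(1, 10000)))   # scaled; compare $counts and $par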
2008 Mar 23
2
scaling problems in "optim"
Dear R users, I am trying to figure out the control parameters in "optim", especially "fnscale" and "parscale." In the R documentation: fnscale: An overall scaling to be applied to the value of fn and gr during optimization. If negative, turns the problem into a maximization problem. Optimization is performed on
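The fnscale part at least is easy to illustrate: the function value is divided by fnscale, so a negative fnscale (typically -1) turns optim's minimisation into maximisation. A toy sketch:

    loglik <- function(p) -sum((p - c(2, 3))^2)           # toy "log-likelihood", maximum at (2, 3)
    optim(c(0, 0), loglik, control = list(fnscale = -1))$par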
2007 Jun 23
2
Names of objects passed as ... to a function?
Dear list, I have a function whose first argument is '...'. Each element of '...' is a data frame, and there will be at least 2 data frames in '...'. The function processes each of the data frames in '...' and returns a list, whose components are the processed data frames. I would like to name the components of this returned list with the names of the original data
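A sketch of one way to do that, deparsing substitute(list(...)) to recover the names of the objects the caller passed (the cleaning step is just a stand-in for the real processing):

    process_all <- function(...) {
        dfs        <- list(...)
        names(dfs) <- sapply(substitute(list(...))[-1], deparse)   # names of the passed objects
        lapply(dfs, function(d) d[complete.cases(d), ])            # stand-in processing step
    }

    d1 <- data.frame(x = c(1, NA, 3)); d2 <- data.frame(x = 1:2)
    names(process_all(d1, d2))   # "d1" "d2"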
2008 Feb 08
0
scaling and optim
?optim says, in describing the control parameter, 'fnscale' An overall scaling to be applied to the value of 'fn' and 'gr' during optimization. If negative, turns the problem into a maximization problem. Optimization is performed on 'fn(par)/fnscale'. 'parscale' A vector of scaling values for the parameters.
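The two controls can also be combined, e.g. maximising (fnscale = -1) a toy log-likelihood whose parameters differ by several orders of magnitude, with parscale set to their typical sizes; the numbers below are invented for illustration:

    loglik <- function(p) -((p[1] - 0.01)^2 / 1e-4 + (p[2] - 1e4)^2 / 1e8)

    optim(c(0.1, 1000), loglik,
          control = list(fnscale = -1, parscale = c(0.01, 1e4)))$par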
2004 Sep 13
2
Problem with mle in stats4 (R 1.9.1)
Hi! This is a repost of an earlier message (with a clearer example demonstrating the problem I ran into). If you run the mle example in stats4 library(stats4) x <- 0:10 y <- c(26, 17, 13, 12, 20, 5, 9, 8, 5, 4, 8) ll <- function(ymax=15, xhalf=6) -sum(stats::dpois(y, lambda=ymax/(1+x/xhalf), log=TRUE)) (fit <- mle(ll)) plot(profile(fit),
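One thing that sometimes helps when the default fit misbehaves (sketched on the same data as the excerpt; the start values and bounds below are arbitrary choices) is to give mle() explicit start values and a bounded optimiser, with the extra arguments handed through to optim():

    library(stats4)
    x <- 0:10
    y <- c(26, 17, 13, 12, 20, 5, 9, 8, 5, 4, 8)
    ll <- function(ymax = 15, xhalf = 6)
        -sum(stats::dpois(y, lambda = ymax / (1 + x / xhalf), log = TRUE))

    fit <- mle(ll, start = list(ymax = 25, xhalf = 3),
               method = "L-BFGS-B", lower = c(1e-6, 1e-6))   # keep both parameters positive
    coef(fit)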
2008 Jul 21
1
Control parameter of the optim( ): parscale
Hi everybody, I am using the L-BFGS-B method of the mle2() function to estimate the values of 6 parameters. mle2 uses the methods implemented in optim. As I understand it from the descriptions available online, one can use the parscale parameter to tell R roughly what the values of the estimated parameters should be... Could somebody please help me understand what one actually has to do with the
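Assuming mle2() forwards a control list to the underlying optim() call, parscale can simply be set to the rough magnitudes of the parameters. A two-parameter stand-in for the poster's six-parameter model:

    library(bbmle)

    set.seed(1)
    dat <- rnorm(50, mean = 5000, sd = 20)
    nll <- function(mu, sigma) -sum(dnorm(dat, mean = mu, sd = sigma, log = TRUE))

    guess <- c(mu = 4000, sigma = 50)                   # rough magnitudes of the parameters

    fit <- mle2(nll, start = as.list(guess),
                method  = "L-BFGS-B",
                lower   = c(mu = 1, sigma = 1e-3),
                control = list(parscale = abs(guess)))  # parscale = typical parameter sizes
    coef(fit)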
2009 Nov 02
2
a problem with constrOptim
Hi, I apologize for the long message but the problem I encountered can't be stated in a few lines. I am having some problems with the function constrOptim. My goal is to maximize the likelihood of a product of K multinomials, each with four categories, under linear constraints on the parameter values. I have found that the function does not work for many data configurations. #The likelihood
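For a single multinomial with four categories, the constrOptim() setup looks something like the sketch below (hypothetical counts; the probabilities are parameterised by their first three components, with the linear constraints p_i >= 0 and p1 + p2 + p3 <= 1 written as ui %*% p - ci >= 0):

    counts <- c(10, 20, 30, 40)                    # hypothetical data

    negll <- function(p) {                         # negative multinomial log-likelihood
        probs <- c(p, 1 - sum(p))
        -sum(counts * log(probs))
    }

    ui <- rbind(diag(3), -rep(1, 3))               # feasible region: ui %*% p - ci >= 0
    ci <- c(0, 0, 0, -1)

    fit <- constrOptim(theta = rep(0.25, 3), f = negll, grad = NULL,
                       ui = ui, ci = ci)
    c(fit$par, 1 - sum(fit$par))                   # compare with counts / sum(counts)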