
Displaying 20 results from an estimated 40000 matches similar to: "Computing s^2 for non-negative least squares."

2001 Nov 20
0
Summary: non-negative least squares
Thank you Brian Ripley, Gardar Johannesson, and Marcel Wolbers for your prompt and friendly help! I will share any further learnings as I move through these suggestions. -Bob Abugov Brian Ripley wrote: I just use optim() on the sum of squares with non-negativity constraints. That did not exist in 1999. Gardar Johannesson wrote: You can always just use the quadratic programming library in R
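A minimal sketch of the optim() route Ripley describes, assuming a design matrix A and response vector b that are not part of the original post (method "L-BFGS-B" is what supports the box constraint):

    # Non-negative least squares via optim(): minimise the residual sum of squares
    # subject to a zero lower bound on every coefficient.
    nnls_optim <- function(A, b) {
      rss <- function(x) sum((b - A %*% x)^2)       # objective: ||Ax - b||^2
      fit <- optim(par = rep(1, ncol(A)), fn = rss,
                   method = "L-BFGS-B", lower = 0)
      fit$par                                       # constrained coefficient estimates
    }

For a handful of coefficients this is usually adequate; for larger or ill-conditioned problems a dedicated solver such as the nnls package discussed in the later entries is preferable.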
2001 Nov 16
1
non-negative least squares?
In July of 1999 Douglas Bates invited R users to implement an algorithm for non-negative least squares based on Bates and Wolf, 1984: Communications in Statistics, Part B 13:841-850. <http://www.ens.gu.edu.au/robertk/R/help/99b/0058.html> I'm wondering if anybody has implemented this or something similar so I won't have to reinvent the wheel. Thanks! Bob Abugov
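For anyone finding this thread now: the 2012 entry further down refers to the nnls package on CRAN, which implements the Lawson-Hanson algorithm. A brief sketch of its use, with an illustrative A and b in place of real data:

    # Lawson-Hanson non-negative least squares via the CRAN package 'nnls'
    library(nnls)
    A <- matrix(runif(100 * 3), nrow = 100)                   # illustrative design matrix
    b <- as.vector(A %*% c(2, 0, 1)) + rnorm(100, sd = 0.1)   # response; true second coefficient is 0
    fit <- nnls(A, b)
    fit$x                                                     # non-negative coefficient estimates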
2012 Jan 04
0
Non Negative Least Squares Regression with nnls
Hello R experts, I have two questions related to the nnls library (http://www.inside-r.org/packages/cran/nnls), and more broadly to linear regression with positive coefficients. Sample code is below the Qs. Q1: Regular regression (with lm) gives me the significance of each variable. How do I get variable significance with nnls? If there's no ready function, any easy way to manually derive
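nnls() itself reports no standard errors. One hedged workaround, not from the thread, is a case bootstrap of the non-negative fit (A and b below stand for the design matrix and response, which the post does not show):

    # Bootstrap standard errors for non-negative least squares coefficients
    library(nnls)
    boot_nnls <- function(A, b, R = 999) {
      n    <- nrow(A)
      reps <- replicate(R, {
        i <- sample(n, replace = TRUE)                 # resample cases with replacement
        nnls(A[i, , drop = FALSE], b[i])$x
      })
      est <- nnls(A, b)$x
      se  <- apply(reps, 1, sd)                        # bootstrap standard error per coefficient
      cbind(estimate = est, se = se, z = est / se)
    }

Coefficients pinned at the zero boundary make the usual normal approximation shaky, so the z column is rough guidance rather than a formal test.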
2004 Mar 01
1
non-negative least-squares
Hi all, I am trying to do an inversion of electromagnetic data with a non-negative least-squares method (Tikhonov regularisation) and have got it programmed in S-Plus. However, I am trying to move all my scripts from S-Plus to R. Is there an equivalent to nnls.fit in R? I think this can be done with pcls? Right? S-Plus script: A, L and data are matrices, lambda is a vector of possible lambda
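One way to reproduce an nnls.fit-with-Tikhonov-regularisation step in R, sketched under the assumption that A, L, data and lambda play their usual roles in min ||Ax - d||^2 + lambda^2 ||Lx||^2 with x >= 0 (the truncated post does not confirm this), is to stack the penalty onto the least-squares system and hand it to a non-negative solver:

    # Tikhonov-regularised non-negative least squares by augmenting the system
    library(nnls)
    tikhonov_nnls <- function(A, L, d, lambda) {
      A_aug <- rbind(A, lambda * L)                    # stack the scaled regularisation operator
      d_aug <- c(d, rep(0, nrow(L)))                   # zeros for the penalty rows
      nnls(A_aug, d_aug)$x
    }

Looping this over the candidate lambda values and picking one by an L-curve or cross-validation reproduces the usual regularised-inversion workflow; mgcv::pcls is another option but needs the problem recast in its penalised form.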
2005 Mar 17
1
Optimization of constrained linear least-squares problem
Dear R-ians, I want to perform a linear unmixing of image pixels into fractions of pure endmembers. Therefore I need to solve a constrained linear least-squares problem that looks like: min ||Cx - d||^2 where sum(x) = 1. I have a 3x3 matrix C, containing the values for endmembers, and I have a 3x1 column vector d (for every pixel in the image). In theory my x values should all be in the
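A common way to solve exactly this problem (not from the original thread) is quadratic programming via the quadprog package. The sketch below also imposes x >= 0, on the assumption, since the excerpt is truncated, that the fractions are meant to lie between 0 and 1:

    # Constrained linear unmixing:  min ||Cx - d||^2  subject to  sum(x) = 1, x >= 0
    library(quadprog)
    unmix_pixel <- function(C, d) {
      p    <- ncol(C)
      Dmat <- crossprod(C)                 # t(C) %*% C
      dvec <- crossprod(C, d)              # t(C) %*% d
      Amat <- cbind(rep(1, p), diag(p))    # column 1: sum-to-one (equality); rest: x >= 0
      bvec <- c(1, rep(0, p))
      solve.QP(Dmat, dvec, Amat, bvec, meq = 1)$solution
    }

With sum(x) = 1 and x >= 0 the upper bound x <= 1 is implied, so it needs no extra constraint; solve.QP does require crossprod(C) to be positive definite, i.e. linearly independent endmember spectra.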
2008 Mar 27
1
Significance of confidence intervals in the Non-Linear Least Squares Program.
I am using the non-linear least squares routine in "R" -- nls. I have a dataset where the nls routine outputs tight confidence intervals on the 2 parameters I am solving for. As a check on my results, I used the Python SciPy leastsq module on the same data set and it yields the same answer as "R" for the coefficients. However, what was somewhat surprising was the
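To see where those intervals come from, it can help to compare the Wald intervals implied by the reported standard errors with the profile-likelihood intervals that confint() computes for nls objects (a sketch, assuming a fitted object named fit that the excerpt does not show):

    # Wald versus profile-likelihood confidence intervals for an nls fit 'fit'
    est <- coef(summary(fit))[, "Estimate"]
    se  <- coef(summary(fit))[, "Std. Error"]
    dfr <- df.residual(fit)
    cbind(lower = est - qt(0.975, dfr) * se,     # Wald intervals from the asymptotic standard errors
          upper = est + qt(0.975, dfr) * se)
    confint(fit)                                 # profile-likelihood intervals

When the two disagree markedly, the quadratic approximation behind the standard errors (and behind SciPy's covariance output) is suspect.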
2006 Jul 11
3
least square fit with non-negativity constraints for absorption spectra fitting
I would really appreciate it if someone can give suggestions on how to do spectra fitting in R using ordinary least squares fitting with non-negativity constraints. The lm() function works well for ordinary least squares fitting, but how do I specify non-negativity constraints? It wouldn't make sense for the fitted coefficients to come out negative in absorption spectra deconvolution. Thanks.
2007 Feb 28
3
Packages in R for least median squares regression and computing outliers (Thompson tau technique etc.)
Hi, I am looking for suitable packages in R that do regression analyses using the least median of squares method (or better). Additionally, I am also looking for packages that implement algorithms/methods for detecting outliers that can be discarded before doing the regression analyses. Although some websites refer to an "lms" method under package "lps" in R, I am unable to find such a
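For the first half of the question, least median of squares lives in the recommended MASS package as lqs(..., method = "lms") (with lmsreg as a wrapper); a brief sketch on a built-in dataset:

    # Least median of squares regression via MASS::lqs
    library(MASS)
    fit <- lqs(stack.loss ~ ., data = stackloss, method = "lms")
    coef(fit)
    # Observations with residuals large relative to the robust scale are outlier candidates
    which(abs(residuals(fit)) > 2.5 * fit$scale[1])

The robust residuals give a simple outlier screen; the Thompson tau procedure itself is not covered here.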
2008 Mar 27
1
[Re: Significance of confidence intervals in the Non-Linear Least Squares Program.]
Thanks for the response. I was not very clear in my original request. What I am asking is if in a non-linear estimation problem using nls(), as the condition number of the Hessian matrix becomes larger, will the t-values of one or more of the parameters being estimated in general become smaller in absolute value -- that is, are low t-values a sign of an ill-conditioned Hessian? Typical
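One way to look at this empirically for a given nls() fit (a sketch; fit is assumed, and the gradient matrix returned by the fit plays the role of the Jacobian whose cross-product is the Gauss-Newton approximation to the Hessian):

    # Conditioning of the approximate Hessian versus the reported t-values for an nls fit 'fit'
    J <- fit$m$gradient()               # Jacobian of the model at the converged estimates
    kappa(crossprod(J))                 # condition number of J'J (Gauss-Newton Hessian)
    coef(summary(fit))[, "t value"]     # compare with the reported t statistics

An ill-conditioned J'J inflates some standard errors and so tends to drag the corresponding t-values toward zero, but a low t-value on its own is not proof of ill-conditioning.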
2006 Feb 21
2
How to get around heteroscedasticity with non-linear least squares in R?
I am using "nls" to fit dose-response curves but am not sure how to approach more robust regression in R to get around the problem of the my error showing increased variance with increasing dose. My understanding is that "rlm" or "lqs" would not be a good idea here. 'Fairly new to regression work, so apologies if I'm missing something obvious.
2008 Oct 08
0
genoud nonlinear least squares optimisation
Hello, I am trying to optimise a nonlinear model to derive 'best-fit' parameter estimates using the genoud function. I have been using the genetic algorithm - gafit - in order to do this, but I am getting parameter estimates that do not always reach the global minimum. I am very keen to apply genoud to optimising this model to see if my results will improve, and also out of personal
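A sketch of how genoud() from the rgenoud package is usually pointed at a residual sum of squares; the two-parameter exponential model, the data frame dat and the search box are placeholders, since the original model is not shown:

    # Global search for nonlinear least-squares parameters with rgenoud::genoud
    library(rgenoud)
    sse <- function(p) {                                   # p = c(a, k) for a placeholder model
      pred <- p[1] * exp(-p[2] * dat$x)
      sum((dat$y - pred)^2)
    }
    fit <- genoud(fn = sse, nvars = 2, max = FALSE,        # minimise the sum of squares
                  Domains = cbind(c(0, 0), c(100, 10)),    # per-parameter lower and upper bounds
                  pop.size = 1000, boundary.enforcement = 2)
    fit$par

genoud already polishes candidates with BFGS by default, but handing fit$par to nls() as starting values is a cheap way to confirm the solution and get standard errors.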
2006 Feb 17
3
Matrix indexing in a loop
How do you specify matrix location a[i,j] (or a[i-1,j], etc.) in a "for" loop? I am looking for a flexible method of indexing neighbors over a series of lags (1,2,3...) and I may wish to extend this method to 3D arrays. Example: Data matrix

    > fun
         [,1] [,2] [,3]
    [1,]    1    5    9
    [2,]    2    6   10
    [3,]    3    7   11
    [4,]    4    8   12

For each element a[i,j] in
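A sketch of one way to do this, guarding against indices that fall off the edge of the matrix (the lag-1 row neighbour is just an example; other lags and directions follow the same pattern):

    # Loop over a matrix and reference a lagged neighbour a[i - lag, j]
    a   <- matrix(1:12, nrow = 4)              # the 4 x 3 example matrix from the post
    lag <- 1
    nbr <- matrix(NA_integer_, nrow(a), ncol(a))
    for (i in seq_len(nrow(a))) {
      for (j in seq_len(ncol(a))) {
        if (i - lag >= 1) {                    # stay inside the matrix
          nbr[i, j] <- a[i - lag, j]
        }
      }
    }
    nbr                                        # each cell holds the value 'lag' rows above it (NA at the border)

For 3D arrays the same guard extends to a third index, and for speed the whole shift can be done without loops, e.g. rbind(matrix(NA, lag, ncol(a)), a[1:(nrow(a) - lag), ]).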
2007 Sep 05
2
question about non-linear least squares in R
Hi, everyone, My question is: the nls function does not always give a converged result. Is there any way for me to get a reasonable result when it fails to converge? For example:

    x <- c(-0.06, -0.04, -0.025, -0.015, -0.005, 0.005, 0.015, 0.025, 0.04, 0.06)
    y <- c(1866760, 1457870, 1314960, 1250560, 1184850, 1144920, 1158850, 1199910, 1263850, 1452520)
    fitOup <- nls(y ~ constant + A*(x-MA)^4 +
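One route that often converges where plain nls() gives up is the Levenberg-Marquardt implementation in the minpack.lm package. The sketch below keeps only the terms visible in the truncated call and feeds in rough, data-driven starting values:

    # Levenberg-Marquardt fit via minpack.lm, using only the visible part of the model
    library(minpack.lm)
    fit_q <- nlsLM(y ~ constant + A * (x - MA)^4,
                   start = list(constant = min(y),           # baseline near the smallest y
                                MA       = x[which.min(y)],  # centre near the x of the minimum
                                A        = 1e10))            # order-of-magnitude guess for the quartic scale
    coef(fit_q)

Plain nls() will often converge too once given starting values of the right order of magnitude, so the start list is doing most of the work here.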
2006 Sep 02
1
nonlinear least squares fitting "Trust-Region"
Dear Mr Graves, Thank you very much for your response. Nobody else from this mailing list ventured to reply to me for the two weeks since I posted my question. "nlminb" and "optim" are just optimization procedures. What I need is not just optimization, but a nonlinear CURVE FITTING procedure. If there is some way to perform nonlinear curve fitting with the
2006 Mar 13
1
Constrained least squares
Is there a function in R for constrained linear least squares? I used the Matlab function LSQLIN; my aim is to obtain non-negative regression coefficients which sum to 1. Thanks in advance, domenico vistocco
2009 Jul 12
2
Nonlinear Least Squares nls() programming help
Hi, I am trying to use the nls() function to closely approximate a vector of values, colC, and I'm running into trouble. I am not sure if I am asking the program to do what I think it's doing, because the same minimization in Excel's Solver does not run into problems. If anyone can tell me what is going wrong, and why I'm getting a singular convergence(7) error, please tell me. I
2007 Jun 11
0
Weighted least squares
As John noted, there are different kinds of weights, and different terminology:
* inverse-variance weights (accuracy weights)
* case weights (frequencies, counts)
* sampling weights (selection probability weights)
I'll add:
* inverse-variance weights, where var(y for observation) = 1/weight (as opposed to just being inversely proportional to the weight)
* weights used as part of an
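A sketch of the first kind in that list, which is how lm() interprets its weights argument (weight_i proportional to 1/var(y_i)):

    # Inverse-variance (precision) weights in lm()
    set.seed(1)
    x    <- 1:50
    sd_i <- 0.5 * x                            # error standard deviation grows with x
    y    <- 2 + 3 * x + rnorm(50, sd = sd_i)
    fit  <- lm(y ~ x, weights = 1 / sd_i^2)    # weight = 1 / variance of each observation
    summary(fit)$coefficients                  # standard errors reflect the unequal variances

Frequency and sampling weights call for different machinery (for example the survey package), which is why the distinctions above matter.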
2006 Oct 22
1
least median squares
Can anyone provide code to implement least median of squares regression in R (not using the lqs function or calling C functions)? Reason: teaching/learning purposes. Thanks PM
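A deliberately simple (and slow) sketch for teaching: search random two-point elemental subsets for the line whose squared residuals have the smallest median, which is the LMS criterion for simple regression:

    # Naive least median of squares for y ~ x, without lqs() or compiled code
    lms_simple <- function(x, y, nsamp = 2000) {
      best <- list(crit = Inf)
      n <- length(x)
      for (k in seq_len(nsamp)) {
        i <- sample(n, 2)                        # two points determine a candidate line
        if (x[i[1]] == x[i[2]]) next             # skip vertical candidates
        b1 <- diff(y[i]) / diff(x[i])
        b0 <- y[i[1]] - b1 * x[i[1]]
        crit <- median((y - b0 - b1 * x)^2)      # the least-median-of-squares criterion
        if (crit < best$crit) best <- list(coef = c(intercept = b0, slope = b1), crit = crit)
      }
      best
    }

lqs() searches the same kind of elemental subsets far more efficiently and adds the usual refinements, so this is only for illustrating the criterion.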
2007 May 09
1
generalized least squares with empirical error covariance matrix
I have a Bayesian hierarchical normal regression model, in which the regression coefficients are nested, which I've wrapped into one regression framework, y = X %*% beta + e. I would like to run data through the model in a filter style (Kalman-filter-ish), updating the regression coefficients at each step as new data are gathered. After the first filter step, I will need to be able to feed
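For the step of feeding an empirical error covariance matrix back into the regression, a sketch of plain generalised least squares (names are placeholders; the hierarchical and filtering parts of the post are not reproduced):

    # Generalised least squares:  beta_hat = (X' V^-1 X)^-1 X' V^-1 y
    gls_fit <- function(X, y, V) {
      Vi   <- solve(V)                           # inverse of the (empirical) error covariance
      XtVi <- crossprod(X, Vi)                   # t(X) %*% Vi
      beta <- solve(XtVi %*% X, XtVi %*% y)
      list(coefficients = drop(beta),
           cov = solve(XtVi %*% X))              # covariance matrix of the estimates
    }

Numerically it is preferable to whiten X and y with the Cholesky factor of V and then call lm.fit, rather than forming solve(V) explicitly, but the closed form above shows the update that a filtering scheme would repeat.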
2003 Sep 26
1
least squares regression using (inequality) restrictions
Dear R Users, I would like to perform a least squares regression similar to what is done by the command "lm". But additionally, I would like to impose some restrictions: 1) The sum of all regression coefficients should be equal to 1. 2) Each coefficient should assume a value between 0 and 1 (inequality restrictions). Which command is the best to use in order to solve this problem
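One package-based route for exactly these restrictions (not mentioned in the post) is lsei() in the limSolve package, which takes equality and inequality constraints together; a sketch with X and y as the design matrix and response:

    # Least squares with coefficients summing to 1 and each lying in [0, 1]
    library(limSolve)
    restricted_ls <- function(X, y) {
      p <- ncol(X)
      eq_lhs   <- matrix(1, nrow = 1, ncol = p)    # equality:     sum(beta) = 1
      ineq_lhs <- rbind(diag(p), -diag(p))         # inequalities: beta >= 0 and -beta >= -1
      lsei(A = X, B = y,
           E = eq_lhs,   F = 1,
           G = ineq_lhs, H = c(rep(0, p), rep(-1, p)))$X
    }

The sum-to-one constraint plus non-negativity already keeps each coefficient below 1, so the upper-bound rows are belt and braces; quadprog::solve.QP (see the 2005 unmixing entry above) solves the same problem.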