similar to: weighted kernel density estimation

Displaying 20 results from an estimated 400 matches similar to: "weighted kernel density estimation"

2006 Jun 14
1
Estimate region of highest probability density
Estimate region of highest probability density Dear R-community I have data consisting of x and y. To each pair (x, y) a z value (weight) is assigned. With kde2d I can estimate the densities on a regular grid and, based on this, make a contour plot (not considering the z-values). According to an earlier post on the list I adjusted kde2d to kde2d.weighted (see code below) to estimate the
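A minimal sketch of such a weighted variant, adapted from the unweighted MASS::kde2d (the normalisation by sum(w) and the default weights are assumptions, not the poster's exact code):

    library(MASS)
    kde2d.weighted <- function(x, y, w, h, n = 25, lims = c(range(x), range(y))) {
      nx <- length(x)
      if (missing(w)) w <- rep(1, nx)               # unweighted by default
      gx <- seq(lims[1], lims[2], length.out = n)   # grid for the x-axis
      gy <- seq(lims[3], lims[4], length.out = n)   # grid for the y-axis
      if (missing(h)) h <- c(bandwidth.nrd(x), bandwidth.nrd(y))
      h <- h / 4
      ax <- outer(gx, x, "-") / h[1]                # scaled distances, x
      ay <- outer(gy, y, "-") / h[2]                # scaled distances, y
      # weight each observation's kernel contribution by w_i
      z <- matrix(dnorm(ax), n, nx) %*% (w * t(matrix(dnorm(ay), n, nx))) /
        (sum(w) * h[1] * h[2])
      list(x = gx, y = gy, z = z)
    }

Feeding the result to contour() or contourLines(), with the level chosen so the enclosed mass reaches the target probability, then gives the highest-density region.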
2005 Jan 20
2
(no subject)
Hello I would like to compare the results obtained with a classical non-parametric proportional hazards model with a parametric proportional hazards model using a Weibull distribution. How can we obtain the equivalence of the parameters using coxph (non-parametric model) and survreg (parametric model)? Thanks Virginie
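For the Weibull case the mapping is standard: survreg() fits an accelerated-failure-time model, and dividing its coefficients by minus the estimated scale puts them on the Cox log-hazard-ratio scale. A small check on a stock dataset (the covariates are chosen for illustration only):

    library(survival)
    fit.aft <- survreg(Surv(time, status) ~ age + sex, data = lung, dist = "weibull")
    fit.cox <- coxph(Surv(time, status) ~ age + sex, data = lung)
    -coef(fit.aft)[-1] / fit.aft$scale   # Weibull coefficients on the PH scale
    coef(fit.cox)                        # close, but not identical: the Weibull
                                         # is only one member of the PH family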
2018 Feb 20
0
Unwanted behaviour of bw.nrd: sometimes, zero is returned as a valid bandwidth
Dear all, Sorry if I am posting to the wrong place, but I could not find the link for registration on the bug tracker, that's why I am writing here. I think there is an inconsistency between two R functions from the stats package, bw.nrd0 and bw.nrd. Consider the following vector: D <- c(0, 1, 1, 1, 1) bw.nrd(D) returns a zero bandwidth for this object, without even a warning. Considering the
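A minimal reproduction of the inconsistency described above:

    D <- c(0, 1, 1, 1, 1)
    IQR(D)      # 0, so the min(sd, IQR/1.34) term in bw.nrd() collapses to 0
    bw.nrd(D)   # 0, returned silently although it is unusable as a bandwidth
    bw.nrd0(D)  # positive: bw.nrd0() falls back to sd(D) when that term is 0
    density(D, bw = bw.nrd(D))   # fails with "'bw' is not positive."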
2017 Dec 03
1
Discourage the weights= option of lm with summarized data
Peter, This is a highly structured text. Just for the discussion, I separate the building blocks, where (D) and (E) and (F) are new: BEGIN OF TEXT -------------------- (A) Non-'NULL' 'weights' can be used to indicate that different observations have different variances (with the values in 'weights' being inversely proportional to the variances); (B) or equivalently, when the elements of
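The practical point behind (B) is easy to demonstrate on hypothetical toy data: frequency weights reproduce the coefficients of the fully expanded fit, but not its residual degrees of freedom, so the summary statistics differ:

    x <- c(1, 2, 3); y <- c(1.0, 1.9, 3.2); w <- c(2, 3, 5)  # hypothetical
    fit.sum <- lm(y ~ x, weights = w)        # summarized data, w_i as counts
    fit.exp <- lm(rep(y, w) ~ rep(x, w))     # each row replicated w_i times
    coef(fit.sum); coef(fit.exp)             # identical point estimates
    sigma(fit.sum); sigma(fit.exp)           # differ: 1 vs 8 residual df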
2006 Jan 23
1
weighted likelihood for lme
Dear R users, I'm trying to fit a simple random intercept model with a fixed intercept. Suppose I want to assign a weight w_i to the i-th contribution to the log-likelihood, i.e. w_i * logLik_i, where logLik_i is the log-likelihood for the i-th subject. I want to maximize the likelihood for N subjects: Sum_i {w_i * logLik_i} Here is a simple example to reproduce
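Since nlme's weights act on residual variances rather than on whole-subject log-likelihood contributions, one workaround is to maximize the weighted marginal likelihood directly. A rough sketch for a Gaussian random-intercept model, assuming a data frame dat with columns y, x, subj and a weight vector w aligned with the sorted subject levels:

    library(mvtnorm)
    wnll <- function(par, dat, w) {
      b <- par[1:2]                           # fixed intercept and slope
      s2u <- exp(par[3]); s2e <- exp(par[4])  # variances, log-parametrized
      ll <- mapply(function(d, wi) {
        V <- s2u + diag(s2e, nrow(d))         # sigma_u^2 * J + sigma_e^2 * I
        wi * dmvnorm(d$y, mean = b[1] + b[2] * d$x, sigma = V, log = TRUE)
      }, split(dat, dat$subj), w)
      -sum(ll)                                # minimize -Sum_i w_i * logLik_i
    }
    # fit <- optim(c(0, 0, 0, 0), wnll, dat = dat, w = w)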
2003 Dec 15
1
distribution of second order statistic
Hi, I am getting some weird results here and I think I am missing something. I am trying to program a function that, for a set of random variables drawn from uniform distributions, plots the distribution of the second order statistic of the ordered variables (i.e. I have n uniform distributions on [0, w_i], with w_i different from w_j, for i = 1..n). I want to plot the distribution of the second order
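Reading "second order statistic" as the second-highest value (the usual auction-theory meaning), its CDF is P(at most one X_i exceeds t), which is short to evaluate and plot; the weights below are hypothetical:

    F2 <- function(t, w) {            # X_i ~ U[0, w_i], independent
      F <- pmin(pmax(t / w, 0), 1)    # the n marginal CDFs evaluated at t
      prod(F) + sum(vapply(seq_along(F),
                           function(j) (1 - F[j]) * prod(F[-j]), numeric(1)))
    }
    w <- c(1, 2, 3)
    curve(sapply(x, F2, w = w), from = 0, to = max(w),
          ylab = "P(second highest <= t)")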
2010 Oct 27
1
GLM and Weights
Dear all, I am trying to use the 'glm' function as part of a semiparametric technique that involves weighting a likelihood in various ways, i.e. L(theta; data) = Sum_{i=1..n} W_i * log L(theta; data_i), where W_i can be a kernel weighting function, or W_i can be an indicator of 'non-missingness' divided by a propensity score. In a Monte Carlo exercise, the option glm(...,
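One way to check that glm's prior weights enter the likelihood in exactly this form is to compare against a direct optimization on simulated data (a sketch; with family = binomial, glm warns about non-integer weights but still fits by weighted IWLS):

    set.seed(1)
    x <- rnorm(100); y <- rbinom(100, 1, plogis(x)); W <- runif(100)
    fit <- glm(y ~ x, family = binomial, weights = W)
    wnll <- function(b) -sum(W * dbinom(y, 1, plogis(b[1] + b[2] * x), log = TRUE))
    optim(c(0, 0), wnll)$par   # agrees with coef(fit) up to optimizer tolerance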
2004 Aug 06
0
hello
2005 Jan 03
0
speed of the cluster.stats function
Hello list (happy new yeaR), Here's a copy of a message I just sent to Christian Hennig (who wrote the fpc package). It may interest some of you, and maybe someone has a better solution than mine. Romain. ------------------------------------------------------------------------------------------ Mister Hennig, [[[ I'm writing in English because I don't know German
2004 Dec 15
0
(no subject)
Hello, Just look at the examples in ?persp. There is a function called trans3d defined in them that will translate your 3D coordinates to 2D, so you will be able to draw lines with the lines function. Romain. Corey Bradshaw wrote: >I've created a perspective plot using 'persp' in the graphics package. >I'd like to add a second plane of z values to the existing plot, but I
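A minimal sketch of that approach, with a hypothetical surface and a hypothetical second plane at constant z = 50:

    z <- outer(1:10, 1:10)        # hypothetical surface
    pm <- persp(1:10, 1:10, z, theta = 30, phi = 30, zlim = c(0, 100))
    # draw the grid lines of a second, flat plane on top of the first
    for (i in 1:10) lines(trans3d(1:10, i, 50, pmat = pm), col = "red")
    for (j in 1:10) lines(trans3d(j, 1:10, 50, pmat = pm), col = "red")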
2007 Jun 28
0
maximum difference between two ECDF's
Hello, I have a vector of samples x of length N. Associated with each sample x_i is a certain weight w_i. All the weights are in another vector w of the same length N. I have another vector of samples y of length n (small n). All these samples have equal weights 1/n. The ECDF of these samples is defined, for example, at http://en.wikipedia.org/wiki/Empirical_distribution_function and I can
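Since both ECDFs are step functions, the supremum of their absolute difference is attained at one of the pooled sample points, so the weighted Kolmogorov-Smirnov-type statistic only needs to be evaluated there (the function name is hypothetical):

    max.ecdf.diff <- function(x, w, y) {
      w <- w / sum(w)              # normalize the weights
      t <- sort(unique(c(x, y)))   # all jump points of either ECDF
      Fw <- vapply(t, function(s) sum(w[x <= s]), numeric(1))  # weighted ECDF
      max(abs(Fw - ecdf(y)(t)))    # sup over the jump points
    }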
2013 Jul 02
0
MINLP optimization
Hi all, I have the following question/problem. I have successfully optimized a problem of this type: \sum f(x_i), where f is an exponential curve (a nonlinear function), subject to: a_i < x_i < b_i and \sum f(x_i) < Budget. In short, it is allocating a budget while forcing each i to receive at least a_i and at most b_i. I did this correctly using the package:
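One simple way to handle the extra budget constraint without a full MINLP solver is a quadratic penalty on top of optim()'s box constraints. A sketch with hypothetical f, bounds, and budget, under the assumption that the budget caps the total invested, sum(x_i):

    f <- function(x) 1 - exp(-x)                  # hypothetical exponential curve
    a <- rep(0.1, 5); b <- rep(2, 5); budget <- 4
    obj <- function(x) -sum(f(x)) + 1e3 * max(0, sum(x) - budget)^2
    optim(a, obj, lower = a, upper = b, method = "L-BFGS-B")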
2017 Nov 28
0
Discourage the weights= option of lm with summarized data
My local R-devel version now has (in ?lm) Non-'NULL' 'weights' can be used to indicate that different observations have different variances (with the values in 'weights' being inversely proportional to the variances); or equivalently, when the elements of 'weights' are positive integers w_i, that each response y_i is the mean of w_i unit-weight observations
2014 Oct 08
2
Optimization with linear constraints
Hi all, I am trying to solve an optimization problem in R with linear constraints, but I cannot manage to include those constraints. That is: f<-function(w){ sd(...) # standard deviation of certain data } optim(rep(1/2,8),fn = f,lower=0,upper=1,method='L-BFGS-B') # I don't know how to include the constraints here. The constraints are: the w_i sum to 1 and all the
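A common workaround for the sum-to-one constraint is the softmax reparametrization, which keeps every w_i in (0, 1) and makes the weights sum to 1 by construction, so the problem becomes unconstrained (f is the objective from the post):

    g <- function(p) {
      w <- exp(p) / sum(exp(p))   # softmax: w_i in (0, 1), sum(w) == 1
      f(w)
    }
    res <- optim(rep(0, 8), g)    # unconstrained Nelder-Mead
    w.hat <- exp(res$par) / sum(exp(res$par))   # the feasible weights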
2006 May 24
1
(PR#8877) predict.lm does not have a weights argument for
I am more than 'a little disappointed' that you expect a detailed explanation of the problems with your 'bug' report, especially as you did not provide any explanation yourself as to your reasoning (nor did you provide any credentials or references). Note that 1) your report did not make clear that this was only relevant to prediction intervals, which are not commonly used.
2006 Feb 10
1
Lmer with weights
Hello! I would like to use lmer() to fit data which are estimates and their standard errors, i.e. a kind of "meta" analysis. I wonder if the weights argument is the right one to use to include the uncertainty (standard errors) of the "data" in the model. I would like to use lmer(), since I would like to have "freedom" in modeling, if this is at all possible. For
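The weights argument of lmer() takes inverse-variance weights, so 1/se^2 is the natural choice; note that lmer still estimates a common residual scale on top of them, so this only approximates a meta-analysis with known standard errors. A sketch, assuming a data frame dat with columns est, se, and study:

    library(lme4)
    fit <- lmer(est ~ 1 + (1 | study), data = dat, weights = 1 / se^2)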
2005 Dec 07
2
Bandwidth selection for ksmooth( )
Dear R Users, Before running ksmooth( ), a suitable bandwidth selection is needed. I use some functions for this task and receive these results for my data: width.SJ(y,nb=100,method="ste") : 40.25 bcv(y,nb=100) : 40.53 ucv(y) : 41.26 bandwidth.nrd(y) : 45.43 After calling ksmooth(x, y, bandwidth = each of the above-mentioned bandwidths), I get some NAs
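A likely source of the NAs: width.SJ() and friends select density() bandwidths, while ksmooth() rescales its kernel so the quartiles sit at +/- 0.25 * bandwidth, so the two scales are not directly comparable, and any output point with no data inside its window comes back as NA. A sketch:

    library(MASS)
    bw <- width.SJ(y, nb = 100, method = "ste")
    sm <- ksmooth(x, y, kernel = "normal", bandwidth = bw)
    anyNA(sm$y)   # TRUE where no x-value fell inside a window; a wider
                  # bandwidth or x.points = x avoids the empty windows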
2012 Jul 26
2
density
Hi all, I have a question regarding the density function, which gives the kernel density estimator. I want to choose the bandwidth when using a Gaussian kernel, given a set of observations. I am not familiar with the different methods for bandwidth determination. Below are the different ways in R of choosing the bandwidth. Can anyone give an idea of which ones are preferred? Also, how can I take
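The selectors shipped with base R can be compared side by side; bw.SJ() (Sheather-Jones) is often recommended over the rule-of-thumb defaults, while bw.ucv() and bw.bcv() are the cross-validation options:

    x <- rnorm(100)                      # hypothetical sample
    c(nrd0 = bw.nrd0(x), nrd = bw.nrd(x),
      ucv = bw.ucv(x), bcv = bw.bcv(x), SJ = bw.SJ(x))
    plot(density(x, bw = "SJ"))          # a selector can also be named directly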
2012 Mar 21
1
enableJIT() and internal R completions (was: [ESS-bugs] ess-mode 12.03; ess hangs emacs)
Hello, JIT compiler interferes with internal R completions: compiler::enableJIT(2) utils:::functionArgs("density", '') gives: utils:::functionArgs("density", '') Note: no visible global function definition for 'bw.nrd0' Note: no visible global function definition for 'bw.nrd' Note: no visible global function definition for 'bw.ucv'
2011 Jan 13
1
Weighted Optimization
Hi All, I am trying to code an R script which gives me the time-varying parameters of the NIG and GH distributions. Further, because I think these time-varying parameters should be more responsive to recent observations, I would like to include a weighted likelihood estimation procedure where the observations have an exponentially decaying weighting rather than the equal weighting
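The decaying weights themselves are independent of the distribution being fitted; a sketch with a Gaussian stand-in for the NIG/GH likelihood (lambda and the series x are hypothetical):

    set.seed(1); x <- rnorm(250)             # hypothetical return series
    lambda <- 0.94; n <- length(x)
    w <- lambda^((n - 1):0)                  # newest observation weighted most
    w <- w * n / sum(w)                      # rescale so the weights sum to n
    nll <- function(par) -sum(w * dnorm(x, par[1], exp(par[2]), log = TRUE))
    optim(c(mean(x), log(sd(x))), nll)$par   # swap dnorm for a NIG/GH density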