similar to: Weibull- Random Censoring

Displaying 20 results from an estimated 400 matches similar to: "Weibull- Random Censoring"

2010 Sep 20
1
Removing selected values from the original vector and defining a new vector with the rest?
sampleSize <- 20 shape.true <- 1.82 scale.true <- 987 sampWB <- rweibull(sampleSize, shape=shape.true, scale=scale.true) print(sampWB) censidx <- sample(1:length(sampWB), length(sampWB)*0.3) Censored.data <- sampWB[censidx] noncensidx <- ? How do I define noncensidx as the remaining values of the vector that are not included in Censored.data?
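A minimal sketch of one way to answer this, reusing the object names from the post: negative indexing drops the censored positions, and setdiff() gives the complementary index set explicitly.

sampleSize <- 20
shape.true <- 1.82
scale.true <- 987
sampWB <- rweibull(sampleSize, shape = shape.true, scale = scale.true)
censidx <- sample(seq_along(sampWB), length(sampWB) * 0.3)
Censored.data <- sampWB[censidx]
Noncensored.data <- sampWB[-censidx]               # drop the censored positions, or:
noncensidx <- setdiff(seq_along(sampWB), censidx)  # the complementary index set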
2010 Sep 16
1
Weibull simulation- number of items to replace is not a multiple of replacement length
Hi, I wrote the code below to simulate from a Weibull distribution, estimating the parameters with the weibullMLE function. Although I define a matrix for the variables, I still get this message: number of items to replace is not a multiple of replacement length. Any suggestions? > est=matrix (NA, 2,2) > se=matrix (NA, 2,2) > for ( p in 1:2) { + sampleSize <- 20 + shape.true <- 1.82 + scale.true <- 987 +
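That error usually means a multi-element result is being assigned into a single matrix cell. A hedged sketch of one plausible fix is shown below: assign the length-2 estimate vector into a whole matrix row so the lengths match. The weibullMLE function is not shown in the post, so MASS::fitdistr is used here purely as a stand-in estimator.

est <- matrix(NA, nrow = 2, ncol = 2)   # rows = replications, cols = shape, scale
se  <- matrix(NA, nrow = 2, ncol = 2)
for (p in 1:2) {
  sampWB <- rweibull(20, shape = 1.82, scale = 987)
  fit <- MASS::fitdistr(sampWB, "weibull")   # stand-in for the poster's weibullMLE
  est[p, ] <- fit$estimate                   # length-2 vector into a length-2 row
  se[p, ]  <- fit$sd
}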
2011 Sep 09
3
Reliability metric
Hi, below is the code I wrote. I am trying to create a matrix over h and t whose values come from the reliability function R. First, I get a warning message; second, the matrix is not created. > h <- seq(0.1, 0.9, by=0.1) > t <- seq(0,11000, by=100) > z <- cbind(t) > eta=10000 > beta=2 > R <- array (1:1100, dim= c(110,10)) > R= exp(-(z/eta*(1-h))^(beta*(1-h))) Warning messages: 1:
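The warning comes from mixing vectors of incompatible lengths (111 values of t against 9 values of h), which triggers recycling. A hedged sketch, taking eta and beta from the post: outer() evaluates the reliability expression on the full t-by-h grid.

h    <- seq(0.1, 0.9, by = 0.1)
t    <- seq(0, 11000, by = 100)
eta  <- 10000
beta <- 2
R <- outer(t, h, function(tt, hh) exp(-(tt / eta * (1 - hh))^(beta * (1 - hh))))
dim(R)   # 111 x 9: one row per value of t, one column per value of h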
2010 Sep 17
2
Matrix - create mean/min/max/stdev on columns or rows of a matrix?
I ran a simulation with the Weibull distribution and created a matrix. How can I compute the mean/min/max/stdev over the columns or rows of the matrix? Thanks,
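A minimal sketch of the usual answer: apply() with MARGIN = 2 works column-wise and MARGIN = 1 row-wise (the matrix M below is an illustrative stand-in for the poster's simulation results).

M <- matrix(rweibull(200, shape = 1.82, scale = 987), nrow = 20)
colMeans(M)        # fast column means
apply(M, 2, min)   # column minima
apply(M, 2, max)   # column maxima
apply(M, 2, sd)    # column standard deviations
apply(M, 1, mean)  # row means (MARGIN = 1 selects rows)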
2008 Apr 13
2
Arrays and functions
Hi, I am doing a stats project using R to work out the size of a t-test and Wilcoxon test depending on the distribution and sample size. I just can't get it to work - I want to put my results from the function size() into an array. At the moment I keep getting the error message: Error in res[distribution, test, samplesize] <- results : subscript out of bounds. Can anyone tell me where
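A "subscript out of bounds" error often means the array is being indexed with a raw value (e.g. a sample size of 50) rather than its position. A hedged sketch: giving the array character dimnames and indexing by name sidesteps this. The dimension names and sizes below are assumptions for illustration, not taken from the post.

distributions <- c("normal", "exponential")
tests         <- c("t.test", "wilcoxon")
samplesizes   <- c(10, 20, 50)
res <- array(NA,
             dim = c(length(distributions), length(tests), length(samplesizes)),
             dimnames = list(distributions, tests, as.character(samplesizes)))
res["normal", "wilcoxon", "50"] <- 0.051   # index by name, not by the raw value 50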
2009 Sep 18
1
lapply - value changes as parameters to function?
Hi, I'm trying to get better at things like lapply but it still stumps me. I have a function I've written, tested and debugged using individual calls to the function, a la: ResultList5 = DoAvgCalcs(IndexData, Lookback=5, SampleSize=TestSamples, Iterations=TestIterations) ResultList8 = DoAvgCalcs(IndexData, Lookback=8, SampleSize=TestSamples, Iterations=TestIterations) ResultList13
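A minimal sketch of the lapply idiom this question is after: vary the one argument that changes (Lookback) and keep the rest fixed inside an anonymous function. DoAvgCalcs, IndexData, TestSamples and TestIterations are the poster's own objects and are assumed to exist.

lookbacks <- c(5, 8, 13)
ResultLists <- lapply(lookbacks, function(lb)
  DoAvgCalcs(IndexData, Lookback = lb,
             SampleSize = TestSamples, Iterations = TestIterations))
names(ResultLists) <- paste0("Lookback", lookbacks)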
2012 Nov 21
1
Listing elements of a 4D array
Dear list, I'm having trouble seeing how the elements of my 4-dimensional array are listed. For example, I generated the following array: junk.melt=melt(occ.data, id.var=c("Especie", "Site", "Rep", "Año"), measure.var="Pres") y=cast(junk.melt, Site ~ Rep ~ Especie ~ Año) Now, I want to be able to look at how my species (Especie) are listed, in
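A hedged sketch: on the cast array y from the post, dimnames() shows the level labels along each dimension (including Especie) in the order the array uses, and str() gives a compact overview.

dimnames(y)   # level labels for each dimension, in array order
str(y)        # dimensions, dimnames and storage at a glance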
2010 Jul 22
1
How do I get rid of list elements where the value is NULL before applying rbind?
Here is the function that makes the data.frames in the list: funweek <- function(df) if (length(df$elapsed_time) > 5) { res = fitdist(df$elapsed_time,"exp") year = df$sale_year[1] sample = df$sale_week[1] mid = df$m_id[1] estimate = res$estimate sd = res$sd samplesize = res$n loglik = res$loglik aic = res$aic bic = res$bic chisq =
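A minimal sketch of the usual answer: drop the NULL elements (the groups with too few observations) before row-binding. The list name reslist is hypothetical, standing in for the list produced by applying funweek.

reslist  <- Filter(Negate(is.null), reslist)   # keep only non-NULL data frames
combined <- do.call(rbind, reslist)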
2018 Oct 04
2
Bug : Autocorrelation in sample drawn from stats::rnorm (hmh)
Hi Hugo, I've been able to replicate your bug, including for other distributions (runif, rexp, rgamma, etc) which shouldn't be surprising since they're probably all drawing from the same pseudo-random number generator. Interestingly, it does not seem to depend on the choice of seed, I am not sure why that is the case. I'll point out first of all that the R-devel mailing list is
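A quick, self-contained way to check the claim being discussed: compute the lag-1 correlation of consecutive rnorm() draws over many independent replications; for an uncorrelated stream the average should sit near zero.

set.seed(1)
lag1 <- replicate(1000, {
  x <- rnorm(100)
  cor(x[-length(x)], x[-1])   # correlation between consecutive draws
})
mean(lag1)                    # should be close to 0 for an uncorrelated stream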
2008 Jan 07
3
Seeking a more efficient way to find partition maxima
Hi. Suppose I have a vector that I partition into disjoint, contiguous subvectors. For example, let v = c(1,4,2,6,7,5), partition it into three subvectors, v1 = v[1:3], v2 = v[4], v3 = v[5:6]. I want to find the maximum element of each subvector. In this example, max(v1) is 4, max(v2) is 6, max(v3) is 7. If I knew that the successive subvector maxima would never decrease, as in the example,
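A minimal sketch of one efficient answer: build a grouping factor from the subvector lengths and let tapply() compute each partition maximum in a single pass (values taken from the example in the post).

v    <- c(1, 4, 2, 6, 7, 5)
lens <- c(3, 1, 2)                     # sizes of v1, v2, v3 from the example
grp  <- rep(seq_along(lens), lens)     # 1 1 1 2 3 3
tapply(v, grp, max)                    # 4 6 7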
2001 Dec 09
1
Help for Power analysis
Dear colleagues, I am not sure whether this R code is correct. I would like to show, on the sample-size axis, the sample size where a line drawn from the power axis at 80% meets the curve produced by the R code. How do I show this and select the most appropriate of these power values (.79955687 - .80983575)? Thank you for your help and answer. Best Regards, Nikom Thanomsieng, Email: nikom at kku.ac.th .... #Power analysis: Sample size for
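A hedged sketch of the standard base-R route to this kind of question: power.t.test() solves directly for the sample size that achieves a target power. The effect size and standard deviation below are illustrative assumptions, not values from the post.

power.t.test(delta = 0.5, sd = 1, sig.level = 0.05, power = 0.80)
# with n omitted, the function returns the n needed to reach 80% power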
2011 Mar 25
1
Appending data to a data.frame and writing a csv
Dear R helpers exposure <- data.frame(id = c(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20), ead = c(9483.686,50000,6843.4968,10509.37125,21297.8905,50000,706152.8354, 62670.5625, 687.801995,50641.4875,59227.125,43818.5778,52887.72534,601788.7937, 56813.14859,4012356.056,1419501.179,210853.4743,749961,6599.0862), pd =
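A minimal sketch of the append-and-write step the subject asks about, reusing the exposure data frame from the post; new.rows is a hypothetical data frame with the same columns.

exposure <- rbind(exposure, new.rows)                         # append the new rows
write.csv(exposure, file = "exposure.csv", row.names = FALSE) # write the result out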
2018 Oct 05
2
Bug : Autocorrelation in sample drawn from stats::rnorm (hmh)
On 05/10/2018, 09:45, "R-help on behalf of hmh" <r-help-bounces at r-project.org on behalf of hugomh at gmx.fr> wrote: Hi, thanks William for this fast answer, and sorry for sending the first mail to r-help instead of r-devel. I noticed the bug while I was simulating many small random walks using c(0, cumsum(rnorm(10))). Then the negative
2010 Aug 11
1
sem & psych
Dear R users, I am trying to simulate some multitrait-multimethod models using the packages sem and psych, but whatever I do to deal with models that do not converge I always get stuck with error messages such as these: "Error in summary.sem(M1) : coefficient covariances cannot be computed" "Error in solve.default(res$hessian) : system is computationally singular: reciprocal
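A hedged sketch of a common way to keep a simulation loop running past non-converging fits: wrap each fit in tryCatch(), record NULL for failures, and drop them afterwards. fit_one_model is a hypothetical wrapper around the poster's sem call; nsim is illustrative.

nsim <- 100                              # number of simulated data sets (illustrative)
results <- lapply(seq_len(nsim), function(i) {
  tryCatch(fit_one_model(i),             # hypothetical wrapper around sem::sem
           error = function(e) { warning(conditionMessage(e)); NULL })
})
results <- Filter(Negate(is.null), results)   # keep only the fits that converged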
2007 Feb 01
3
Can this loop be delooped?
Hi. I have the following code in a loop. It splits a vector into subvectors of equal size. But if the size of the original vector is not an exact multiple of the desired subvector size, then the first few subvectors have one more element than the last few. I know that the cut function could be used to determine where to break up the vector, but it doesn't seem to provide control over
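A hedged sketch of one loop-free way to do this: build a group index with rep_len() and sort(), so the first length(v) %% k pieces get one extra element, matching the behaviour described above, then split().

split_equal <- function(v, k) {
  grp <- sort(rep_len(seq_len(k), length(v)))   # contiguous groups, sizes differ by at most 1
  split(v, grp)
}
split_equal(1:10, 3)   # pieces of size 4, 3, 3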
2012 Jan 13
2
beanplot-Error: sample is too sparse to find TD
Hi all, for two days I have been trying to find a way to create beanplots for my data. When I call the beanplot function the following error appears: > beanplot(y1 ~ x1, log="", what=c(1,1,1,0), ylim=c(0,1)) > Error in bw.SJ(x, method = "dpi") : sample is too sparse to find TD What is really strange: I have 32 different vectors and the problem occurs for
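The error comes from the Sheather-Jones bandwidth estimator (bw.SJ) failing on groups with too few distinct values. A hedged sketch, assuming your version of beanplot exposes a bw argument for the density bandwidth: switching to a more forgiving rule such as "nrd0" avoids bw.SJ entirely (y1 and x1 are the poster's own data).

library(beanplot)
beanplot(y1 ~ x1, log = "", what = c(1, 1, 1, 0), ylim = c(0, 1),
         bw = "nrd0")   # use Silverman's rule instead of the default SJ bandwidth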
2006 Mar 12
1
meta / lme
Hi, I'm conducting a meta-analysis using the meta package. Here's a bit of code that works fine - tmp <- metacont(samplesize.2, pctdropout.2, sddropout.2, samplesize.1, pctdropout.1, sddropout.1, data=Dataset, sm="WMD") I would now like to control for a couple of variables (continuous and categorical) that aren't in the equation. Is meta
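A heavily hedged sketch: meta-regression via the meta package's metareg() is one way to adjust a meta-analysis object for study-level covariates, assuming those covariates are columns of Dataset. The covariate names below are placeholders, and the metareg() interface has varied across package versions, so check your installed documentation.

library(meta)
tmp <- metacont(samplesize.2, pctdropout.2, sddropout.2,
                samplesize.1, pctdropout.1, sddropout.1,
                data = Dataset, sm = "WMD")
metareg(tmp, ~ age + treatment.type)   # hypothetical covariate names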
2009 Aug 25
3
Regular expression to define contents between parentheses
Hello dear R-helpers, I haven't been able to figure out or find a solution in the R-help archives for how to delete all the characters contained in groups of parentheses. I have a vector that looks more or less like this: myvector <- c("something (80 km/h, sd) & more (6 kg/L,sd)", "somethingelse (48 m/s, sd) & moretoo (50g/L , sd)") I want to extract all
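A minimal sketch of the usual regex answer: match an opening parenthesis, any run of non-")" characters, and the closing parenthesis, and delete it with gsub() (vector taken from the post).

myvector <- c("something (80 km/h, sd) & more (6 kg/L,sd)",
              "somethingelse (48 m/s, sd) & moretoo (50g/L , sd)")
gsub("\\s*\\([^)]*\\)", "", myvector)
# [1] "something & more"        "somethingelse & moretoo"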
2009 Apr 17
1
matching subvectors in vector sets
Hi, I've got a list of ~20000 elements that look like this: [1] "A00096:A00096:A00096:A00096:A02178:A02178:A07776" [2] "A00046:A00076:A01101:A04146:A05671:A07169" [3] "A00038:A00932:A02185:A02370:A02818:A02818:A02818:A02818:A04732:A07142:A07142" [4] "A00096:A01352:A01352:A02023:A05001:A05001:A07776" [5]
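A hedged sketch of the first step most answers to this take: strsplit() turns each colon-separated string into a character vector of IDs, after which membership tests are simple vector operations. The two-element vector x below stands in for the ~20000-element vector from the post.

x   <- c("A00096:A00096:A02178:A07776", "A00046:A00076:A01101")
ids <- strsplit(x, ":", fixed = TRUE)                        # list of ID vectors
sapply(ids, function(v) all(c("A00096", "A02178") %in% v))   # which elements contain both IDs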