similar to: not deleting from the root

Displaying 20 results from an estimated 10000 matches similar to: "not deleting from the root"

2007 Aug 31
2
memory.size help
I keep getting the 'memory.size' error message when I run a program I have been writing. It always says it cannot allocate a vector of a certain size. I believe the error comes in the code fragment below, where I have multiple arrays that could be taking up space. Does anyone know a good way around this? w1 <- outer(xk$xk1, data[,x1], function(y,z) abs(z-y)) w2 <- outer(xk$xk2,
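A minimal sketch (not from the thread; xk1 and x1 stand in for xk$xk1 and data[,x1], with made-up sizes) of computing the same absolute-difference matrix in column blocks. outer() with a function builds full recycled copies of both inputs before the result, so processing the long vector in blocks bounds the size of the temporaries.

xk1 <- runif(200)
x1  <- runif(50000)

# Process x1 in blocks so each outer() call only builds small intermediates;
# the final matrix is the same, but peak temporary memory is much lower.
block_size <- 10000
blocks <- split(seq_along(x1), ceiling(seq_along(x1) / block_size))
w1 <- do.call(cbind, lapply(blocks, function(idx) {
  outer(xk1, x1[idx], function(y, z) abs(z - y))
}))
dim(w1)  # length(xk1) x length(x1)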
2008 Mar 22
1
Simulating Conditional Distributions
Dear R-Help List, I'm trying to simulate data from a conditional distribution, and haven't been able to modify my existing code to do so. I searched the archives, but didn't find any previous post that matched my question. n=10000 pop = data.frame(W1 = rbinom(n, 1, .2), W2 = runif(n, min = 3, max = 8), W3 = rnorm(n, mean=0, sd=2)) pop = transform(pop, A = rbinom(n, 1,
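The post is cut off before the conditional draw, so as a minimal sketch continuing its pop data frame: one common way to simulate A given W1, W2, W3 is a Bernoulli draw whose probability comes from a logistic link. The coefficients below are placeholders, not from the original question.

set.seed(1)
n <- 10000
pop <- data.frame(W1 = rbinom(n, 1, .2),
                  W2 = runif(n, min = 3, max = 8),
                  W3 = rnorm(n, mean = 0, sd = 2))

# A | W1, W2, W3 ~ Bernoulli(plogis(linear predictor)); coefficients are illustrative only.
pop <- transform(pop, A = rbinom(n, 1, plogis(-1 + 0.8 * W1 + 0.2 * W2 - 0.5 * W3)))
head(pop)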
2020 May 05
2
"Earlyclobber" but for a subset of the inputs
Hi Quentin, > It sounds like you only need the earlyclobber description for the N, N > variant. > In other words, as long as you use different opcodes for widen-op NN and > widen-op WN, you model exactly what you want. > > What am I missing? > we are using different opcodes for widen-op NN and widen-op WN. My understanding is that not setting earlyclobber to the W, N
2008 Apr 05
2
How to improve the "OPTIM" results
Dear R users, I used "OPTIM" to minimize the obj. function below. Even though I used the true parameter values as initial values, the results are not very good. How could I improve my results? Any suggestion will be greatly appreciated. Regards, Kathryn Lord #------------------------------------------------------------------------------------------ x = c(0.35938587,
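The objective function itself was cut off, so here is a minimal sketch on a stand-in problem (a normal negative log-likelihood, chosen only for illustration) showing the knobs that usually help optim(): a gradient-based method, reparameterizing constrained parameters, and tighter convergence control.

set.seed(1)
x <- rnorm(200, mean = 2, sd = 3)

# Negative log-likelihood; the sd is optimized on the log scale so it stays positive
# without needing box constraints.
nll <- function(par) -sum(dnorm(x, mean = par[1], sd = exp(par[2]), log = TRUE))

fit <- optim(c(0, 0), nll, method = "BFGS",
             control = list(reltol = 1e-12, maxit = 1000))
c(mean = fit$par[1], sd = exp(fit$par[2]))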
2008 Jan 29
2
Using Predict and GLM
Dear R Help, I read through the archives pretty extensively before sending this email, as it seemed there were several threads on using predict with GLM. However, while my issue is similar to previous posts (cannot get it to predict using new data), none of the suggested fixes are working. The important bits of my code: set.seed(644) n0=200 #number of observations
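The code after n0=200 is truncated, so this is only a sketch of the usual pattern that trips people up with predict() and GLMs: newdata must be a data frame whose column names exactly match the predictor names in the fitting formula, and type = "response" returns probabilities rather than the linear predictor. The simulated data below are an assumption, not the poster's.

set.seed(644)
n0 <- 200
dat <- data.frame(x = rnorm(n0))
dat$y <- rbinom(n0, 1, plogis(-0.5 + 1.2 * dat$x))

fit <- glm(y ~ x, family = binomial, data = dat)

# newdata columns must match the names used in the formula (here "x").
newdat <- data.frame(x = seq(-2, 2, by = 0.5))
predict(fit, newdata = newdat, type = "response")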
2002 Jan 30
1
Hi,
Hi, Sorry for the confusion. I would like to estimate a model wherein the marginals of z with respect to w1 and w2 are smooth functions of x and y. I have data on z, x, y, w1 and w2. so E[dz/dw1] = f(x,y) and E[dz/dw2] = g(x,y) and I would like to estimate f(x,y) and g(x,y) I suppose I could try to fit something more general using projection pursuit, but the nature of the problem suggests
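If the target really is E[dz/dw1] = f(x,y) and E[dz/dw2] = g(x,y), that is a varying-coefficient model, and one way to fit it (an option, not necessarily what the poster settled on) is mgcv's "by"-variable smooths. A minimal sketch on simulated data; the true f and g are invented for illustration.

library(mgcv)
set.seed(1)
n  <- 500
x  <- runif(n); y <- runif(n)
w1 <- rnorm(n); w2 <- rnorm(n)
f  <- function(x, y) sin(2 * pi * x) + y^2   # illustrative f(x, y)
g  <- function(x, y) cos(2 * pi * y) - x     # illustrative g(x, y)
z  <- f(x, y) * w1 + g(x, y) * w2 + rnorm(n, sd = 0.2)

# s(x, y, by = w1) estimates a smooth surface that multiplies w1,
# i.e. the fitted dz/dw1 is a smooth function of (x, y); likewise for w2.
fit <- gam(z ~ s(x, y, by = w1) + s(x, y, by = w2))
summary(fit)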
2012 Nov 28
1
Help setting optimization problem to include more constraints
Dear R-helpers, I am struggling with an optimization problem at the moment and decided to write to the list looking for some help. I will use a very small example to explain what I would like to do. Thanks in advance for your help. We would like to distribute resources from 4 warehouses to 3 destinations. The associated costs are as follows: Destination From 1 2 3 Total
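The cost table was flattened and truncated in the snippet, so the numbers below are placeholders. As a minimal sketch, a 4-warehouse / 3-destination transportation problem with supply and demand constraints can be set up with lpSolve::lp.transport (assuming the lpSolve package is acceptable); extra constraints can be handled with the general lp() instead.

library(lpSolve)

# Placeholder costs: rows = 4 warehouses, columns = 3 destinations.
costs <- matrix(c(4, 6, 9,
                  5, 3, 8,
                  7, 5, 2,
                  6, 4, 5), nrow = 4, byrow = TRUE)

supply <- c(20, 30, 25, 25)   # capacity at each warehouse (assumed)
demand <- c(40, 30, 30)       # requirement at each destination (assumed)

sol <- lp.transport(costs, direction = "min",
                    row.signs = rep("<=", 4), row.rhs = supply,
                    col.signs = rep(">=", 3), col.rhs = demand)
sol$solution   # optimal shipment matrix
sol$objval     # minimum total cost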
2011 Mar 10
1
getting percentiles by factor
Hello, I'm trying to get percentiles (PERCENTRANK for Excel users) by factor in the following data.frame: myExample <- data.frame(Ret=seq(-2, 2.5, by=0.5), PE=seq(10,19), Sectors=rep(c("Financial","Industrial"),5)) myExample <- na.omit(myExample) Thanks to Patrick I managed to put together the following lines, which do it for the "Ret" column: myecdf
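A minimal sketch (one way among several, not necessarily Patrick's) of getting the percentile rank of Ret within each level of Sectors, continuing the myExample data from the snippet: ave() applies ecdf() per group and evaluates it at each observation.

myExample <- data.frame(Ret = seq(-2, 2.5, by = 0.5),
                        PE = seq(10, 19),
                        Sectors = rep(c("Financial", "Industrial"), 5))
myExample <- na.omit(myExample)

# Within-sector percentile rank: each group's empirical CDF evaluated at its own values.
myExample$RetPct <- ave(myExample$Ret, myExample$Sectors,
                        FUN = function(v) ecdf(v)(v))
myExample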
2013 Jan 09
2
Using objects within functions in formulas
Dear all, I'm looking to create a formula within a function to pass to glmer() and I'm having a problem that the following example will illustrate: library(lme4) y1 = rnorm(10) x1 = data.frame(x11=rnorm(10), x12=rnorm(10), x13=rnorm(10)) x1 = data.matrix(x1) w1 = data.frame(w11=sample(1:3,10, replace=TRUE), w12=sample(1:3,10, replace=TRUE), w13=sample(1:3,10, replace=TRUE)) test1 <-
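The example is cut off before the glmer() call, so this is only a sketch of the usual workaround: build the model frame inside the function, construct the formula explicitly, and give it the function's environment so glmer() can resolve everything locally. The variable names and simulated data here are assumptions, not the poster's.

library(lme4)

fit_model <- function(y, x, g) {
  # Put the pieces in one data frame so the formula refers to columns,
  # not to objects in the caller's workspace.
  dat <- data.frame(y = y, x = x, g = factor(g))
  f <- as.formula("y ~ x + (1 | g)")
  environment(f) <- environment()   # let glmer() see objects local to this function
  glmer(f, data = dat, family = binomial)
}

set.seed(1)
g <- sample(1:10, 200, replace = TRUE)
x <- rnorm(200)
y <- rbinom(200, 1, plogis(0.5 * x + rnorm(10)[g]))
fit_model(y, x, g)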
2020 Apr 26
2
assembly code for array iteration generated by llvm is much slower than gcc
Hi all developers, I'm switching the compiler from gcc to llvm on a RISCV target now, but I found that in some cases the assembly code generated by llvm is much longer than gcc's. It causes about a 40% decrease in my program's performance. The following is a simple test case that shows the problem. We can see that gcc prefers to use a pointer to iterate over the array, but llvm prefers to use an index to iterate
2011 Jan 10
2
Calculating Portfolio Standard deviation
Dear R helpers, I have the following data: stocks <- c("ABC", "DEF", "GHI", "JKL") prices_df <- data.frame(ABC = c(17,24,15,22,16,22,17,22,15,19), DEF = c(22,28,20,20,28,26,29,18,24,21), GHI = c(32,27,32,36,37,37,34,23,25,32),
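The snippet is cut off before the weights and the fourth price column, so as a minimal sketch on the three columns shown (equal weights are an assumption): compute period returns, the covariance matrix, and the portfolio standard deviation sqrt(w' S w).

prices_df <- data.frame(ABC = c(17, 24, 15, 22, 16, 22, 17, 22, 15, 19),
                        DEF = c(22, 28, 20, 20, 28, 26, 29, 18, 24, 21),
                        GHI = c(32, 27, 32, 36, 37, 37, 34, 23, 25, 32))

# Simple period-to-period returns for each stock.
returns <- apply(prices_df, 2, function(p) diff(p) / head(p, -1))

w     <- rep(1 / ncol(returns), ncol(returns))   # equal weights (assumed)
Sigma <- cov(returns)

portfolio_sd <- sqrt(t(w) %*% Sigma %*% w)       # sqrt(w' Sigma w)
drop(portfolio_sd)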
2003 Jun 01
1
daemon crashes
Linux: RedHat 7.1 Samba: 2.2.7 Windoze #1: 98SE Windoze #2: W2K Here is the situation: copy files from W1 to Linux. At the same time, transfer those files from Linux to W2. Of course, the transfer from Linux to W2 doesn't occur until the particular file has completed the transfer from W1 to Linux. This scenario of dual transfers from the same area on the Linux disc will ultimately cause the smbd
2020 May 04
2
"Earlyclobber" but for a subset of the inputs
Hi all, I'm working on a target whose registers have equal-sized subregisters and all of those subregisters can be named (or the other way round: registers can be grouped into super registers). So for instance we've got 16 registers W (as in wide) W0..W15 and 32 registers N (as in narrow) N0..N31. This way, W0 is made by grouping N0 and N1, W1 is N2 and N3, W2 is N4 and N5, ..., W15 is
2009 Jan 23
1
Anova and unbalanced designs
Dear R-list! My question is related to an Anova including within and between subject factors and unequal group sizes. Here is a minimal example of what I did: library(car) within1 <- c(1,2,3,4,5,6,4,5,3,2); within2 <- c(3,4,3,4,3,4,3,4,5,4) values <- data.frame(w1 = within1, w2 = within2) values <- as.matrix(values) between <- factor(c(rep(1,4), rep(2,6))) betweenanova <-
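Continuing the snippet, a minimal sketch of the car-based multivariate approach (this follows the usual car workflow; the poster's exact call was not shown): fit the two within-subject columns as a multivariate response, then describe the within-subject design via idata/idesign and request Type III sums of squares.

library(car)

within1 <- c(1, 2, 3, 4, 5, 6, 4, 5, 3, 2)
within2 <- c(3, 4, 3, 4, 3, 4, 3, 4, 5, 4)
values  <- as.matrix(data.frame(w1 = within1, w2 = within2))
between <- factor(c(rep(1, 4), rep(2, 6)))   # unequal group sizes: 4 vs 6

# Multivariate linear model: both within-subject measurements as the response.
mod   <- lm(values ~ between)
idata <- data.frame(time = factor(c("w1", "w2")))

# Repeated-measures ANOVA; Type III is commonly requested with unbalanced groups.
Anova(mod, idata = idata, idesign = ~ time, type = 3)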
2012 May 29
2
Wilcoxon-Mann-Whitney U value: outcomes from different stat packages
Given this example #start code a<-c(0,70,50,100,70,650,1300,6900,1780,4930,1120,700,190,940, 760,100,300,36270,5610,249680,1760,4040,164890,17230,75140,1870,22380,5890,2430) b<-c(0,0,10,30,50,440,1000,140,70,90,60,60,20,90,180,30,90, 3220,490,20790,290,740,5350,940,3910,0,640,850,260) wilcox.test(a, b, paired=FALSE) #sum of rank for first sample sum.rank.a <-
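Completing the truncated calculation as a minimal sketch: R's wilcox.test() statistic W is already the Mann-Whitney U for the first sample (sum of its pooled ranks minus na*(na+1)/2), while some other packages report the U of the second sample, U2 = na*nb - U1. That difference usually explains mismatched values across stat packages.

a <- c(0, 70, 50, 100, 70, 650, 1300, 6900, 1780, 4930, 1120, 700, 190, 940,
       760, 100, 300, 36270, 5610, 249680, 1760, 4040, 164890, 17230, 75140,
       1870, 22380, 5890, 2430)
b <- c(0, 0, 10, 30, 50, 440, 1000, 140, 70, 90, 60, 60, 20, 90, 180, 30, 90,
       3220, 490, 20790, 290, 740, 5350, 940, 3910, 0, 640, 850, 260)

r  <- rank(c(a, b))                              # pooled ranks (ties get average ranks)
na <- length(a); nb <- length(b)
U1 <- sum(r[seq_along(a)]) - na * (na + 1) / 2   # equals R's W statistic
U2 <- na * nb - U1                               # U for the second sample

c(U1 = U1, U2 = U2,
  W_from_wilcox = unname(wilcox.test(a, b, paired = FALSE)$statistic))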
2003 Feb 11
1
mean function on correlation matrices (PR#2540)
Full_Name: Raymond Salvador Version: R 1.6.2 OS: Windows ME Submission from: (NULL) (131.111.93.195) The mean function applied to individual components of several correlation matrices gives a wrong result (it gives the first value instead of the mean). Here is a simple example: x1 <- rnorm(10,1,1) y1 <- rnorm(10,1,1) z1 <- cbind(x1,y1) w1 <- cor(z1) x2 <- rnorm(10,1,1) y2
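Whatever the historical behaviour in R 1.6.2, the way to average several correlation matrices element-wise is to combine them explicitly rather than pass them all to mean(); a minimal sketch (matrix-building code assumed, following the pattern in the report):

set.seed(1)
make_cor <- function() cor(cbind(x = rnorm(10, 1, 1), y = rnorm(10, 1, 1)))
w1 <- make_cor(); w2 <- make_cor(); w3 <- make_cor()

# mean(w1, w2, w3) does NOT average the matrices element-wise;
# sum the matrices and divide by their number instead.
mats     <- list(w1, w2, w3)
mean_cor <- Reduce(`+`, mats) / length(mats)
mean_cor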
2007 Mar 06
1
A few more bugs...
Hello, I found a couple of bugs concerning my favourite Shade/Unshade function. 1) Sometimes the shaded window gets more space than it should and, of course, nobody takes care of it, so everything drawn in this space remains there. If I move a decorated window along, its shadow accumulates quickly to full black. Unfortunately, I don't know any reliable way to reproduce this. (But it
2014 Sep 02
3
[LLVMdev] LICM promoting memory to scalar
All, If we can speculatively execute a load instruction, why isn't it safe to hoist it out by promoting it to a scalar in the LICM pass? There is a comment in the LICM pass that if a load/store is conditional then it is not safe, because it would break the LLVM concurrency model (see commit 73bfa4a). There is an IR test checking this in test/Transforms/LICM/scalar-promote-memmodel.ll. However, I have
2008 Nov 13
2
Weighted Sum Optimization in R (Maximization)
Dear All, First of all, this is the first time I am using R for optimization. I searched the r-help postings and googled weighted sum optimization, but I could not find anything applicable. I would need to optimize the following function in R: MAXIMIZE function = w1*R1 + w2*R2 + w3*R3 + w4*R4 where the constraints are w1 + w2 + w3 + w4 = 1 and 0 <= w1, w2, w3, w4 <= 1. Does optim
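Since the objective and all constraints are linear, this is a linear program, and one option (an assumption on my part, not from the thread) is lpSolve::lp rather than optim(). A minimal sketch with placeholder R1..R4 values:

library(lpSolve)

R <- c(R1 = 0.08, R2 = 0.12, R3 = 0.05, R4 = 0.10)   # placeholder returns

# Maximize w %*% R subject to sum(w) = 1 and 0 <= w <= 1.
# lp() assumes w >= 0 by default; the identity rows enforce w <= 1.
# With only these constraints the optimum puts all weight on max(R);
# any extra restrictions would be added as further rows of const.mat.
sol <- lp(direction = "max",
          objective.in = R,
          const.mat = rbind(rep(1, 4), diag(4)),
          const.dir = c("=", rep("<=", 4)),
          const.rhs = c(1, rep(1, 4)))
sol$solution   # optimal weights
sol$objval     # maximized weighted sum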