similar to: Splitting a Vector

Displaying 20 results from an estimated 2000 matches similar to: "Splitting a Vector"

2004 Jun 30
1
linear models and colinear variables...
Hi! I'm having some issues, on both conceptual and technical levels, with selecting the right combination of variables for this model I'm working on. The basic, all-inclusive form looks like lm(mic ~ B * D * S * U * V * ICU), where mic, U, V, and ICU are numeric values and B, D and S are factors with about 16, 16 and 2 levels respectively. In short, there's a ton of actual explanatory
2012 Nov 16
1
Code works, but not as function.
Hi, I have some values in a list format generated by the following: Path_Number <- 0010 ID.Path <- formatC(0001:Path_Number, width=4, flag=0) # Make vector of ID's. No_of_Effectors <- sample(1:550, length(ID.Path), replace=TRUE) # Define Number of Effectors each individual gets. Effectors <- split(sample(1:10000, sum(No_of_Effectors), replace=TRUE), rep(ID.Path, No_of_Effectors))
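A minimal sketch of the same steps wrapped into a function; the function and argument names below are illustrative, not the poster's:

    make_effectors <- function(n_paths = 10, max_effectors = 550, pool = 10000) {
      id_path <- formatC(seq_len(n_paths), width = 4, flag = "0")  # zero-padded IDs
      n_eff   <- sample(1:max_effectors, n_paths, replace = TRUE)  # effectors per ID
      # draw all effectors at once, then split them into one vector per ID
      split(sample(seq_len(pool), sum(n_eff), replace = TRUE), rep(id_path, n_eff))
    }
    eff <- make_effectors(10)
    str(eff[1:2])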
2017 Dec 16
2
[RFC] - Deduplication of debug information in linkers (LLD).
>Wasn't our (lld/ELF's) position on debug info size that we should focus on providing a great split-dwarf workflow and not try to go too far out of our way to deduplicate or otherwise reduce debug info size inside LLD? I recall there being some patches that made linking of large debug binaries like 1.5GB+ clang faster, but we decided to reject those changes because split-dwarf was
2011 May 16
1
Linear Discriminant Analysis error: "Variables appear constant"
Hi R experts, I'm attempting to run Linear Discriminant Analysis using the lda function in the MASS package. I've got around 50 predictor variables and one response variable. My response variable has 5 numeric categories that represent different clusters of fish abundance data (clusters were developed using Bray-Curtis and NMDS), and my predictor variables are environmental variables that
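A short sketch of one way to locate the offending predictors before calling lda(); 'dat' and 'cluster' are placeholder names for the poster's data frame and response:

    library(MASS)
    preds <- setdiff(names(dat), "cluster")
    # standard deviation of each predictor within each cluster
    within_sd <- sapply(dat[preds], function(x) tapply(x, dat$cluster, sd, na.rm = TRUE))
    # predictors that are constant (or all NA) in some cluster trigger the error
    bad <- preds[apply(within_sd, 2, function(s) any(is.na(s) | s == 0))]
    bad
    fit <- lda(cluster ~ ., data = dat[, c("cluster", setdiff(preds, bad))])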
2010 Jan 07
1
logistic regression based on principal component analysis
Dear all: I am trying to analyse a dataset that contains one binary response variable and several predictor variables, but a multicollinearity problem exists in my dataset. Some papers suggest that logistic regression on principal components suits such noisy data, but I can only find that R does principal component regression via the "pls" package. Is there any package that can do the task I
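A minimal sketch of principal-component logistic regression using base R rather than pls; 'X' (numeric predictor matrix) and 'y' (binary response) are assumed names:

    pca   <- prcomp(X, center = TRUE, scale. = TRUE)
    k     <- which(cumsum(pca$sdev^2) / sum(pca$sdev^2) >= 0.9)[1]  # components covering ~90% variance
    pcdat <- data.frame(y = y, pca$x[, 1:k, drop = FALSE])
    fit   <- glm(y ~ ., data = pcdat, family = binomial)
    summary(fit)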
2017 Dec 16
3
[RFC] - Deduplication of debug information in linkers (LLD).
But could we not, for example, do split dwarf, but also do dedup of types? I do not mean right now, but in theory? Best regards, George | Developer | Access Softek, Inc ________________________________ From: David Blaikie <dblaikie at gmail.com> Sent: December 16, 2017, 22:25 To: George Rimar Cc: Sean Silva; llvm-dev at lists.llvm.org; Rui Ueyama; Rafael Espindola Subject:
2004 Oct 16
3
Cox PH Warning Message
Hi, Can anybody tell me what the message below means and how to overcome it. Thanks, Neil Warning message: X matrix deemed to be singular; variable 2 in: coxph(Surv(age_at_death, death) ~ project$pluralgp + project$yrborn + ......... >
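The warning usually means one model column is a linear combination of others, so coxph drops it and reports an NA coefficient. A sketch of a generic rank check on the design matrix (the remaining terms from the elided formula would be added the same way):

    library(survival)
    X   <- model.matrix(~ pluralgp + yrborn, data = project)[, -1]  # drop intercept
    qrX <- qr(X)
    colnames(X)[qrX$pivot[-seq_len(qrX$rank)]]  # columns judged linearly dependent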
2006 Jul 05
2
Colinearity Function in R
Is there a colinearity function implemented in R? I have tried help.search("colinearity") and help.search("collinearity") and have searched for "colinearity" and "collinearity" on http://www.rpad.org/Rpad/Rpad-refcard.pdf but with no success. Many thanks in advance, Peter Lauren.
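Base R has no function by that name, but variance inflation factors are a common check; a short sketch assuming the car package is available and using placeholder variable names:

    library(car)
    fit <- lm(y ~ x1 + x2 + x3, data = mydata)
    vif(fit)                              # values well above ~5-10 suggest collinearity
    cor(mydata[, c("x1", "x2", "x3")])    # pairwise correlations as a simpler check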
2004 Jul 16
1
Interaction between "wins support = yes" and "os level = 65"
I'm a little unclear about something. I want my Linux box to be the Local Browse Master -- so that the machine that's "on" all the time is the one that other computers look to. Is it correct that I want in my Global Settings: wins support = yes os level = 65 (or some higher number) And should my Windows XP workstations have the Linux box as the Wins Server? Or should I
2003 Sep 29
1
CP for rpart
Hi All, I have some questions on using the rpart library. Given my data below, plotcp gives me increasing 'xerror' across different cp's with huge xstd (plot attached). What causes the problem, or is it not a problem at all? I am thinking 'xerror' should be decreasing when 'cp' gets smaller. Also, what does 'xstd' really tell us? If the error bars for
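A minimal sketch of reading the cp table: xerror is the cross-validated relative error and xstd its standard error, and a common choice is the smallest tree whose xerror lies within one xstd of the minimum (the data set here is rpart's built-in kyphosis, used only for illustration):

    library(rpart)
    fit <- rpart(Kyphosis ~ Age + Number + Start, data = kyphosis, cp = 0.001)
    printcp(fit)
    plotcp(fit)                                        # error bars are xerror +/- xstd
    tab    <- fit$cptable
    thresh <- min(tab[, "xerror"]) + tab[which.min(tab[, "xerror"]), "xstd"]
    cp_1se <- tab[which(tab[, "xerror"] <= thresh)[1], "CP"]
    pruned <- prune(fit, cp = cp_1se)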
2002 Sep 15
7
loess crash
Hi, I have a data frame with 6563 observations. I can run a regression with loess using four explanatory variables. If I add a fifth, R crashes. There are no missing values in the data, and if I run a regression with any four of the five explanatory variables, it works. It's only when I go from four to five that it crashes. This leads me to believe that it is not an obvious problem with the data,
2006 Oct 24
2
colinearity?
I'm sorry to all those who are tired of seeing my email appear in need of help. But, I've never coded in any program before, so this has been a difficult process for me. Is there a simple function to test for colinearity in R? I'm running a logistic regression and a linear regression. Thanks for the help! [[alternative HTML version deleted]]
2017 Dec 15
3
[RFC] - Deduplication of debug information in linkers (LLD).
>Not quite sure what you mean by "on linker side" - but I guess you mean using linker features like comdats etc, rather than DWARF parsing/reassembly/etc. I mean that it is probably not a good idea for an external library. I feel it is much more convenient to do such processing in the linker. The linker does and knows much more about things like sections that are ICF'd or eliminated, about COMDATs
2017 Dec 17
2
[RFC] - Deduplication of debug information in linkers (LLD).
On Sat, Dec 16, 2017 at 11:40 AM George Rimar <grimar at accesssoftek.com> wrote: > Or following workflow: > > Split dwarf is used to make the linker process less, like relocations, > right ? > Partly, though the main motivation, as far as I know, was to provide fewer bytes to the linker at all. That's why something like Apple's scheme (leave the debug info in
2012 Nov 06
2
R and SPSS
Hi group: I have a data set which has a severe colinearity problem. While running linear regression in R and SPSS, I get different models. I am wondering if somebody knows how to make the two programs output the same results. (I guess the way R and SPSS handle singularity is different, which leads to different models.) Thanks. [[alternative HTML version deleted]]
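In line with the poster's guess, R keeps singular fits by default and marks redundant terms with NA coefficients, so the two programs need not agree term for term. A sketch of inspecting what R dropped, with placeholder variable names:

    fit <- lm(y ~ x1 + x2 + x3, data = mydata)
    summary(fit)   # aliased (redundant) terms appear as NA coefficients
    alias(fit)     # shows the exact linear dependencies among the terms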
2005 Aug 29
1
lme and ordering of terms
Dear R users, When fitting a lme() object (from the nlme library), is it possible to test interactions *before* main effects? As I understand, R conventionally re-orders all terms such that highest-order interactions come last - but I'd like to know if it's possible (and sensible) to change this ordering of terms. I've tried the terms() command (from aov) but I don't know if something
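Not a direct re-ordering of the sequential ANOVA, but one workable sketch is a likelihood-ratio comparison of nested lme fits with and without the interaction, fitted with ML so the fixed effects are comparable; the names y, a, b, grp and dat are illustrative:

    library(nlme)
    m_full <- lme(y ~ a * b, random = ~ 1 | grp, data = dat, method = "ML")
    m_main <- lme(y ~ a + b, random = ~ 1 | grp, data = dat, method = "ML")
    anova(m_main, m_full)   # tests the interaction given both main effects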
2008 Sep 04
1
Stepwise
Hi, Is there any facility in R to perform a stepwise process on a model, which will remove any highly-correlated explanatory variables? I am told there is in SPSS. I have a large number of variables (some correlated), which I would like to just chuck in to a model and perform stepwise and see what comes out the other end, to give me an idea perhaps as to which variables I should focus on. Thanks
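Base R's step() does AIC-based stepwise selection; it does not target correlated pairs explicitly, but one of each highly correlated pair can be dropped first. A minimal sketch with placeholder names, assuming numeric predictors and the caret package for findCorrelation:

    keep <- setdiff(names(mydata), "y")
    drop <- caret::findCorrelation(cor(mydata[keep]), cutoff = 0.9)  # indices of redundant columns
    full <- lm(y ~ ., data = mydata[, c("y", setdiff(keep, keep[drop]))])
    sel  <- step(full, direction = "both", trace = FALSE)
    summary(sel)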
2005 Oct 08
2
keeping interaction terms
Hello, while doing my thesis in habitat modelling I've come across a problem with interaction terms. My question concerns the usage of interaction terms for linear regression modelling with R. If an interaction term (predictor) is chosen for a multiple model, then, according to
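For reference, a small sketch of the formula conventions involved: a*b expands to both main effects plus the interaction, while a:b is the interaction alone, and keeping the main effects alongside a selected interaction preserves the usual marginality convention (names are placeholders):

    fit1 <- lm(y ~ a * b, data = dat)        # a + b + a:b
    fit2 <- lm(y ~ a + b + a:b, data = dat)  # the same model, written out term by term
    anova(fit1)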
2012 Nov 10
1
colinearity among categorical variables (multinom)
Dear all users, I'd like to ask you how to make a decision about colinearity among categorical independent variables when the model is multinomial logistic regression. Any help is appreciated, Niklas [[alternative HTML version deleted]]
2008 May 28
1
Fixing the coefficient of a regressor in formula
Dear R users, I want to estimate a Cox PH model with time-dependent covariates, so I am using a counting-process format with the following formula: Surv(data$start, data$stop, data$event.time) ~ cluster(data$id) + G1 + G2 + G3 + G4 + G5 + G6 The Gs represent B-spline basis functions, so they sum to 1, and I can't estimate the model as is without the last coefficient coming out as NA, which
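A sketch of one common workaround: since the Gs sum to 1, one of them is redundant, and an offset() term enters the linear predictor with its coefficient fixed at 1, so a chosen value can be imposed instead of estimated (the value 0.5 below is purely illustrative):

    library(survival)
    fit <- coxph(Surv(start, stop, event.time) ~ G1 + G2 + G3 + G4 + G5 +
                   offset(0.5 * G6) + cluster(id), data = data)
    summary(fit)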