similar to: If condition using accessors

Displaying 20 results from an estimated 4000 matches similar to: "If condition using accessors"

2023 Jun 11
1
Problem with filling dataframe's column
Dear Rui; Many thanks for your email. I used one of your lines, data2$LU[which(data2$Layer == "Level 12")] <- "Park", and it works correctly for me. Actually I need to expand the code to cover all the levels in the "Layer" column. There are more than a hundred levels in the Layer column. If I use your provided code, I have to write it
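A lookup-table approach avoids writing one assignment per level. A minimal sketch, assuming a named vector that maps each Layer level to its land-use label ("Level 12" -> "Park" is from the thread; the other entry is invented):

# named vector: one entry per Layer level; extend to all levels
lookup <- c("Level 12" = "Park", "Level 13" = "Road")
# index by the Layer values; levels missing from the lookup become NA
data2$LU <- unname(lookup[as.character(data2$Layer)])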
2023 Jun 11
1
Problem with filling dataframe's column
At 22:54 on 11/06/2023, javad bayat wrote: > Dear Rui; > Many thanks for your email. I used one of your lines, > data2$LU[which(data2$Layer == "Level 12")] <- "Park", and it works > correctly for me. > Actually I need to expand the code to cover all the levels in the > "Layer" column. There are more than a hundred
2009 Nov 13
2
why the same values cannot be judged to be the same in R
Hi R users, I have found that sometimes the same values cannot be judged to be the same in R. Does anybody know the problem? I think I overlooked some minor detail. Thanks. Here is the example. ############ data1 <- matrix(data=c(1,1.2,1.3,"3/23/2004",1,1.5,2.3,"3/22/2004",2,0.2,3.3,"4/23/2004",3,1.5,1.3,"5/22/2004"), nrow=4, ncol=4, byrow=TRUE)
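A likely cause in this example: a matrix holds a single type, so mixing numbers with date strings coerces every cell to character, and later tests compare text (or factor codes) rather than numbers. A small sketch of the effect and a safer construction:

m <- matrix(c(1, 1.2, "3/23/2004"), nrow = 1)
class(m[1, 1])    # "character": the numeric cells were silently coerced
# a data.frame keeps a separate type per column, so comparisons stay numeric
d <- data.frame(areaid = 1, x = 1.2, date = "3/23/2004")
d$x == 1.2        # TRUE, a genuine numeric comparison
# for values produced by arithmetic, prefer all.equal() over ==
isTRUE(all.equal(0.1 + 0.2, 0.3))   # TRUE, while 0.1 + 0.2 == 0.3 is FALSE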
2023 Jun 11
1
Problem with filling dataframe's column
At 21:05 on 11/06/2023, javad bayat wrote: > Dear R users; > I am trying to fill a column based on a specific value in another column of > a dataframe, but it seems there is a problem with the code! > The "Layer" and the "LU" are two different columns of the dataframe. > How can I fix this? > Sincerely > > > for (i in 1:nrow(data2$Layer)){ >
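The quoted loop fails because nrow() is defined for two-dimensional objects; data2$Layer is a vector, so nrow(data2$Layer) returns NULL and 1:NULL errors out. A sketch of two working variants, assuming the same data2:

# loop over rows of the frame, not of a single column
for (i in 1:nrow(data2)) {
  if (data2$Layer[i] == "Level 12") data2$LU[i] <- "Park"
}
# or skip the loop entirely with vectorised assignment
data2$LU[data2$Layer == "Level 12"] <- "Park"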
2008 Nov 05
2
how can I save the estimates of a regression model in a file?
Dear all, I need some help with R. How can I save the estimates of a regression model in a file? Here is what I did:
1) This is my regression model: fit1 <- lm(logmilk ~ logdays + days, data=data2)
2) However, I want to get the parameter estimates for each individual (by group), so I did the following: by(data2, list(data2$V2), function(.data2) lm(logmilk ~ logdays + days, data= .data2))
3)
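One way to collect the per-group coefficients into a single table and write it out; a sketch assuming the same data2 with grouping column V2:

# fit one model per group and keep the fits in a list-like object
fits <- by(data2, data2$V2, function(d) lm(logmilk ~ logdays + days, data = d))
# pull out each coefficient vector and stack them, one row per group
est <- t(sapply(fits, coef))
write.csv(est, file = "estimates.csv")   # file name is an assumption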
2011 Jul 16
1
Fwd: construct boxplots from data with varying column widths
From: David Winsemius <dwinsemius at comcast.net> On Jul 16, 2011, at 12:15 PM, Rory Campbell-Lange wrote: > On 16/07/11, David Winsemius (dwinsemius at comcast.net) wrote: >> >> On Jul 16, 2011, at 11:19 AM, Rory Campbell-Lange wrote: >> >>> I'm an R beginner, and I would like to construct a set of boxplots >>> showing database function runtimes.
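For runs of unequal length, boxplot() accepts a list, so the columns never need padding to a common height. A minimal sketch with invented runtimes:

# each list element holds one function's runtimes; lengths may differ
runtimes <- list(fn_a = c(0.12, 0.15, 0.11),
                 fn_b = c(0.40, 0.38, 0.45, 0.52, 0.41))
boxplot(runtimes, ylab = "runtime (s)")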
2009 Nov 06
1
problem merging data
Hi there,
data1 <- matrix(data=c(1,1.2,1.3,"3/23/2004",1,1.5,2.3,"3/22/2004",2,0.2,3.3,"4/23/2004",3,1.5,1.3,"5/22/2004"), nrow=4, ncol=4, byrow=TRUE)
data1 <- data.frame(data1)
names(data1) <- c("areaid","x","y","date")
data1
  areaid   x   y      date
1      1 1.2 1.3 3/23/2004
2      1 1.5 2.3 3/22/2004
3      2
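merge() joins two frames on a shared key column; a sketch assuming a second frame keyed by areaid (data2 below is invented):

data2 <- data.frame(areaid = c(1, 2, 3), pop = c(100, 250, 80))  # invented
merged  <- merge(data1, data2, by = "areaid")              # inner join
merged2 <- merge(data1, data2, by = "areaid", all = TRUE)  # keep unmatched rows too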
2010 Sep 06
1
combining collumns for data.frames
Hi, this question is far less simple than the title suggests, please read carefully, thanks. I have 2 sets of data, both read into R:
> data1 <- read.table("1.txt", header=T, sep="\t")
> data2 <- read.table("2.txt", header=T, sep="\t")
> data1
Taxon stage1 stage2 stage3 stage4
T1         0      0      1      1
T2         0
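If both tables share the Taxon column, merge() aligns rows by taxon name rather than by position; a sketch under that assumption:

# full outer join on the shared key; suffixes tell the stage columns apart
combined <- merge(data1, data2, by = "Taxon", all = TRUE,
                  suffixes = c(".1", ".2"))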
2011 Nov 16
3
plotting a double y axis when x and y lengths differ
Hello All, Many thanks for the help I have received so far. Here is an example data set I hope to plot.
Data1
  Year Data   SE
1 2005    2 0.01
2 2006    4 0.01
3 2007    5 0.01
4 2008    2 0.01
5 2009    3 0.01
6 2010    6 0.01
Data2
  Year Data SE
1 2006   32  1
2 2007  100  2
3 2008   60  4
4 2009   67  3
5 2010    8  1
Notice Data2 has one less year's worth of data than Data1 (which is my
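A base-graphics sketch for two series over different year ranges: draw the first, overlay the second on its own scale, and put its axis on the right. Column names follow the tables above:

plot(Data1$Year, Data1$Data, type = "b", xlab = "Year", ylab = "Data1")
par(new = TRUE)                      # next plot draws over the current one
plot(Data2$Year, Data2$Data, type = "b", col = "red", axes = FALSE,
     xlab = "", ylab = "", xlim = range(Data1$Year))  # shared x range
axis(4)                              # right-hand axis for Data2's scale
mtext("Data2", side = 4, line = 3)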
2008 Aug 21
1
problem merging two data sets (one with a header and one without)
I have two sets of data, Data1 and Data2. Data1 has a header and Data2 does not. I would like to merge the two data sets after removing some columns from Data2. I am having a problem merging, so I had to write out and re-read the final data with header=F so the merge could be done by "V1". Is there a way to avoid this step? The problem is that when I do cbind the FinalData has different column names
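Assigning column names at read time avoids the write-and-reread round trip; a sketch assuming Data2 has the same column layout as Data1 (file names invented):

data1 <- read.table("data1.txt", header = TRUE)
# reuse data1's header for the headerless file so both frames share names
data2 <- read.table("data2.txt", header = FALSE, col.names = names(data1))
data2 <- data2[, -c(3, 4)]           # drop unwanted columns (positions invented)
merged <- merge(data1, data2, by = names(data1)[1])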
2004 Feb 04
5
Date Time Conversion problems...
At one time (version 1.7), the code below used to work for converting and extracting based on the Date Time. In version 1.8.1, something changed I know, but I cannot for the life of me figure out what... Data:
UserName,RequestDate,PO,OrderDate,ExpDelivDate,Vendor,Total
"Woody, Jim",12/19/2002,AP15063,1/7/2003,2/10/2003,Ames ,8570
"Harrold,
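An explicit format string is the usual fix when date parsing changes between versions, since the default format is locale-dependent. A sketch for the data above (file name is an assumption):

orders <- read.csv("orders.csv")
orders$RequestDate <- as.Date(as.character(orders$RequestDate),
                              format = "%m/%d/%Y")
# extract, e.g., the orders requested in December 2002
subset(orders, format(RequestDate, "%Y-%m") == "2002-12")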
2009 Nov 12
2
redundant factor levels after subsetting a dataset
# I have a data frame with a numeric and a character variable.
x <- c(1,2,3,2,0,2,-1,-2,-4)
md <- c(rep("Miller",3), rep("Richard",3), rep("Smith",3))
data1 <- data.frame(x, md)
# I subset this data.frame in a way such that one level of the character variable does not appear in the new dataset.
data2 <- data1[x>0,]
data3 <- subset(data1, x>0)
# However, when I check the levels
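Subsetting keeps the full level set of a factor; dropping the now-empty levels afterwards is a one-liner. A sketch continuing the example:

data2 <- droplevels(data1[x > 0, ])  # drops "Smith" (needs R >= 2.12)
# version-safe equivalent, per column:
data2$md <- factor(data2$md)
levels(data2$md)                     # "Miller" "Richard"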
2010 May 28
1
Match 2 vectors
Hi, I have 2 dataframes of unequal length, and I would like to match a factor to them so that both dataframes will have the same number of rows. Example:
# create the 2 dataframes with unequal length
data1 <- data.frame(letters, 1:26)[-c(5,10,19:21),]
data2 <- data.frame(letters, 1:26)[-c(6,9,15:18),]
data2a <- match(data1[,1], data2[,1])
data2b <- data2[data2a,]
When I match
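The NA rows appear because match() returns NA wherever a key has no partner; merging on the key keeps the alignment explicit. A sketch continuing the example:

names(data1) <- names(data2) <- c("letter", "value")
# full outer join: one row per letter, NA where either frame lacks it
both <- merge(data1, data2, by = "letter", all = TRUE,
              suffixes = c(".1", ".2"))
nrow(both)   # 26: the frames now line up row for row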
2006 May 13
2
What does it mean to be "masked from data" when attaching? (Newbie question)
I have several data frames, each with six variables and several hundred cases broken out from a larger dataframe by eleven values of a factor called "Division". I have to perform the same analysis on each one. I would like to do it by creating a data frame called data2 eleven times, once with data corresponding to each value of the factor, and performing the same analysis on each of
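Splitting on the factor and looping avoids attach() altogether, which is what triggers the "masked" messages. A sketch with the analysis step left abstract ('fullframe' stands in for the larger dataframe):

# one sub-frame per Division level; nothing is attached, so nothing is masked
by_division <- split(fullframe, fullframe$Division)
results <- lapply(by_division, function(data2) {
  # ... the same analysis on each data2 ...
  summary(data2)
})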
2008 Oct 08
1
Lattice question: plotting two sets of data, defining groups for the second set
R friends, I'm running R 2.7.2 on Windows XP SP2. I have some data that's amenable to smoothing, and some that's not. I'm trying to plot smoothed lines for the former along with just points for the latter in a single panel. The problem comes when trying to break out the points by group. My sample code follows. data1 <-
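One common lattice pattern is a custom panel that smooths the first set and adds raw grouped points for the second; a sketch assuming both frames have x and y columns and data2 has a factor grp (all names invented):

library(lattice)
xyplot(y ~ x, data = data1,
       panel = function(...) {
         panel.loess(...)                   # smoothed line for data1
         panel.points(data2$x, data2$y,     # raw points for data2,
                      col = as.numeric(data2$grp), pch = 16)  # coloured by group
       })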
2008 Nov 04
1
fuse_setlk_cbk error
I'm building a two-node cluster to run vserver systems on. I've set up glusterfs with this config:
# node a
volume data-posix
  type storage/posix
  option directory /export/cluster
end-volume
volume data1
  type features/posix-locks
  subvolumes data-posix
end-volume
volume data2
  type protocol/client
  option transport-type tcp/client
  option remote-host
2007 Apr 10
2
Reverse-ATA : Using PSTN lines to connect to Asterisk
Hi, I'm looking for a few pointers on using an ATA to connect Asterisk to the PSTN. Basically, I'm running a Hosted PBX service, and in urban centers I can usually get SIP or PRIs. Since I sell my customers SIP hardphones, the data flow is like this: Customer's SIP hardphone ---- My own Asterisk ----- Outside lines But when it comes to smaller villages (I deal with people in tiny
2005 Sep 12
1
fit data with gamma distribution
hello, my data is data2: 2743 4678 21427 6194 10286 1505 12811 2161 6853 2625 14542 694 11491 14924 28640 17097 2136 5308 3477 91301 11488 3860 64114 14334
By calculating
shape <- (mean(data2))^2/var(data2)
scale <- var(data2)/mean(data2)
I get an idea of what the parameters of the gamma distribution would be. But if I try using the method mle() I get stuck and I don't know how to
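MASS::fitdistr() wraps the likelihood maximisation and can take the moment estimates above as starting values; note it parameterises the gamma by rate = 1/scale. A sketch:

library(MASS)
# moment estimates as starting values for the optimiser
start <- list(shape = (mean(data2))^2 / var(data2),
              rate  = mean(data2) / var(data2))
fit <- fitdistr(data2, "gamma", start = start)
fit$estimate   # maximum-likelihood shape and rate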
2011 Oct 07
2
Merge dataframes
Hello, I am having some problems using the 'merge' function. I'm not sure if I understand how it works. What I want to do is:
1) Suppose I have a dataframe like:
  height width
1    1.1   2.3
2    2.1   2.5
3    1.8   1.9
4    1.6   2.1
5    1.8   2.4
2) And I generate a second
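The preview cuts off before the second frame, but the usual choice for joining a generated frame onto this one is merge() on a shared key; a sketch with an invented id column:

data1 <- data.frame(height = c(1.1, 2.1, 1.8, 1.6, 1.8),
                    width  = c(2.3, 2.5, 1.9, 2.1, 2.4))
data1$id <- seq_len(nrow(data1))                 # key column is an assumption
data2 <- data.frame(id = c(2, 4, 5), label = c("a", "b", "c"))  # invented
merge(data1, data2, by = "id", all.x = TRUE)     # left join: keep every data1 row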
2012 May 27
3
Problem with strptime
Hello Forum, I have a problem with the strptime function. With the "data1" dataset below it works fine, but with the "data2" dataset something goes wrong (see final line below). Both data1 and data2 are in exactly the same original format; the only difference is that they span different dates. Please help, since it is driving me nuts! Many thanks. Best
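When one span of dates parses and another does not, a frequent culprit is a daylight-saving gap: clock times that never existed in the local zone come back NA. A sketch that spells out the format and parses in a DST-free zone (column name and format are assumptions):

# "GMT" has no DST transitions, so no valid timestamp turns into NA
data2$time <- strptime(data2$datetime, format = "%d/%m/%Y %H:%M", tz = "GMT")
which(is.na(data2$time))   # any rows that still fail to parse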