similar to: different way for a for loop for several columns?

Displaying 20 results from an estimated 50000 matches similar to: "different way for a for loop for several columns?"

2012 Feb 15
2
function similar to ddply? + calculations based on previous row
Hi all, I was wondering if there is a function, similar to ddply, that splits a dataframe, applies a function to each row, and returns a data frame. I know ddply, but it isn't useful in this situation. I have a dataframe with values for each day (rows) for different objects (columns). I have values for several years. Now, I want to do calculations on only the data of that year. With the
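A minimal sketch of one way to do per-year, row-order-dependent calculations with base R (my own illustration, not the thread's answer; the data frame `df`, the column `obj1` and the cumulative sum are hypothetical):

df <- data.frame(date = seq(as.Date("2010-01-01"), as.Date("2011-12-31"), by = "day"))
df$obj1 <- rnorm(nrow(df))
by_year <- split(df, format(df$date, "%Y"))        # one data frame per year
res <- do.call(rbind, lapply(by_year, function(d) {
  d$cumul <- cumsum(d$obj1)                        # a within-year, row-order dependent calculation
  d
}))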
2012 Feb 20
1
apply with as function ifelse with 2 logical conditions
Hi all, I have a question concerning using several conditions in an ifelse function used as the function in apply. I want to create a new value with the function ifelse, whose test is "an object which can be coerced to logical mode": test[n,] > 1 & test[n-1,] == 0. With n I mean the row. I don't know how I could do this without a loop. I want to avoid the usage of loops and was thinking about apply. This
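A vectorised sketch of the lag-comparison idea (my own illustration, assuming a hypothetical numeric matrix `test`): it flags entries whose value is > 1 while the previous row's value is 0, without an explicit loop.

test <- matrix(c(0, 2, 1, 0, 3, 0, 0, 5, 1, 0, 0, 4), ncol = 3)
prev <- rbind(NA, test[-nrow(test), ])          # the matrix shifted down by one row
flag <- ifelse(test > 1 & prev == 0, 1, 0)      # NA in the first row, 0/1 elsewhere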
2012 Jan 18
3
manipulating data of several columns simultaneously
Dear all, I have a question concerning manipulating data of several columns of a dataframe at the same time. I manage to do it for one column (with the use of the specific name for this column). In each column, I have 60 values. But I should reorganize the values (because I created this as an output before and I want to compare it with another dataset). I want the value on row 2 to become
2005 Aug 13
1
How to make a lagged variable in panel data?
Suppose we observe N individuals, for each of which we have a time-series. How do we correctly create a lagged value of the time-series variable? As an example, suppose I create: A <- data.frame(year=rep(c(1980:1984),3), person= factor(sort(rep(1:3,5))), wage=c(rnorm(15))) > A year person wage 1 1980 1 0.17923212 2 1981
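One common way to build such a lag (a sketch, not necessarily the thread's solution) is ave() within each person, after sorting by person and year so values never leak across individuals:

A <- data.frame(year   = rep(1980:1984, 3),
                person = factor(sort(rep(1:3, 5))),
                wage   = rnorm(15))
A <- A[order(A$person, A$year), ]                 # each panel in time order
A$wage_lag <- ave(A$wage, A$person,
                  FUN = function(x) c(NA, head(x, -1)))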
2013 Mar 14
1
ggplot2 problem
Hello all! I have a problem with the ggplot2 library. I want to do a heat map and the y variable is the year months. If I use the following code, the y values are in alphabetical order, but I want them in month order. The code is: library(reshape) library(ggplot2) library(scales) p <- ggplot(data.m, aes(variable, Month)) + geom_tile(aes(fill = value),
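The usual fix (a sketch, assuming the Month column holds abbreviated month names; the miniature data.m below is hypothetical) is to make Month a factor whose levels are in calendar order, so ggplot2 stops sorting alphabetically:

data.m <- data.frame(variable = rep(c("s1", "s2"), each = 12),
                     Month    = rep(month.abb, 2),
                     value    = runif(24))
data.m$Month <- factor(data.m$Month, levels = month.abb)   # calendar order, not alphabetical
library(ggplot2)
p <- ggplot(data.m, aes(variable, Month)) + geom_tile(aes(fill = value))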
2009 Mar 14
3
plotting question
Greetings all, I have two questions. I have a data set that is arranged in the example below. I wish to obtain a plot of the performance of each ID over Year on V1. It's not clear to me how to set this up.
ID Year V1
1 1980 1
1 1981 2
1 1982 6
1 1983 4
2 1980 5
2 1981 5
2 1982 5
2 1983 6
Also, I would like to transpose the data to have the
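A sketch of one way to do both (illustration only, using the small data set above): a grouped lattice plot per ID, and reshape() to go from long to wide:

dat <- data.frame(ID = rep(1:2, each = 4), Year = rep(1980:1983, 2),
                  V1 = c(1, 2, 6, 4, 5, 5, 5, 6))
library(lattice)
xyplot(V1 ~ Year, data = dat, groups = ID, type = "l", auto.key = TRUE)
wide <- reshape(dat, idvar = "ID", timevar = "Year", direction = "wide")   # one column per year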
2011 Nov 19
3
reshape data.frame
A late friday afternoon coding question. I'm having a hard time thinking of the correct search terms for what I want to do. If I have a df like this: a <- data.frame(name=c(rep('a',10),rep('b',15)),year=c(1971:1980,1971:1985),amount=1:25) name year amount 1 a 1971 1 2 a 1972 2 3 a 1973 3 4 a 1974 4 5 a 1975 5 6 a 1976
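A sketch of two base-R options (my own illustration), assuming the goal is a name-by-year layout of `amount`:

a <- data.frame(name = c(rep('a', 10), rep('b', 15)),
                year = c(1971:1980, 1971:1985), amount = 1:25)
xtabs(amount ~ name + year, data = a)                              # name-by-year table
## or, keeping it a data frame:
## reshape(a, idvar = "name", timevar = "year", direction = "wide")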
2011 Oct 12
1
exclude columns with at least three consecutive zeros
Hi everyone, I have a large data set with about 3'000 columns and I would like to exclude all columns which include three or more consecutive zeros (see below example). A further issue is that it should skip NA values, if any. How can I do this? In the below example R should exclude columns C and D (since in D, skipping the NA leaves three consecutive zeros). I would appreciate
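A sketch of one possible test (the data frame `dat` below is a hypothetical stand-in for the real data): drop the NAs in each column, then use rle() to look for a run of three or more zeros.

dat <- data.frame(A = c(1, 2, 0, 4, 5), B = c(0, 0, 3, 0, 1),
                  C = c(0, 0, 0, 2, 1), D = c(0, 0, NA, 0, 5))
has_3_zeros <- function(x) {
  x <- x[!is.na(x)]                      # skip the NA values
  r <- rle(x == 0)
  any(r$values & r$lengths >= 3)
}
dat[, !sapply(dat, has_3_zeros), drop = FALSE]   # keeps A and B, drops C and D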
2017 Sep 16
2
require help
You can just use the same code that I provided before but now use your dataset. Like this df <- read.csv(file="data2.csv",header=TRUE) dates <- as.Date(paste(df$year,"-01-01",sep="")) myXts <- xts(df,order.by=dates) head(myXts) #The last command "head(myXts)" shows you the first few rows of the xts object year cnsm incm wlth
2004 Mar 09
1
vector extraction
Hello, I could use some help on this one: From the data.frame "Test.dataset2" below (TSCS data for 151 "countries.to.map" for "year" 1973-95; each "country.to.map" is described by a unique code), I would like to extract a vector "color" that for each "country.to.map" takes on the value of "dv" (a categorical variable with
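A rough sketch under the assumption that one value of dv per country is wanted for a single chosen year (the excerpt is cut off, so the exact goal may differ; the year 1995, the miniature data and the vector `countries.to.map` of codes in plotting order are all assumptions):

Test.dataset2 <- data.frame(country.to.map = rep(c("A01", "B02"), each = 3),
                            year = rep(1993:1995, 2),
                            dv   = c(1, 2, 2, 3, 3, 1))
countries.to.map <- c("B02", "A01")                 # codes in the order needed for the map
one.year <- subset(Test.dataset2, year == 1995)
color <- one.year$dv[match(countries.to.map, one.year$country.to.map)]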
2017 Sep 16
0
require help
Okay, thank you very much to all of you. On Sat, Sep 16, 2017 at 2:06 PM, Eric Berger <ericjberger at gmail.com> wrote: > You can just use the same code that I provided before but now use your > dataset. Like this > > df <- read.csv(file="data2.csv",header=TRUE) > dates <- as.Date(paste(df$year,"-01-01",sep="")) > myXts <-
2017 Sep 22
2
require help
Assuming the input data.frame, DF, is of the form shown reproducibly in the Note below, to convert the series to zoo or ts: library(zoo) # convert to zoo z <- read.zoo(DF) # convert to ts as.ts(z) # Note: DF <- structure(list(year = c(1980, 1981, 1982, 1983, 1984), cnsm = c(174, 175, 175, 172, 173), incm = c(53.4, 53.7, 53.5, 53.2, 53.3), with = c(60.3, 60.5, 60.2, 60.1, 60.7)),
2010 Apr 26
2
Tapply.
Having some difficulty understanding how tapply works and getting the return values I expect. Data: dataframe DF (DF$Id, $D, $Year, ...) Id D Year Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec 11264402000 1 1980 NA NA NA NA NA 212 203 209 228 237 NA NA 11264402000 0 1981 NA NA 243 244 NA NA NA NA 225 NA 231 NA 11264402000 1
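A minimal illustration of how tapply's three arguments fit together (not the thread's actual answer; the miniature DF below only borrows the column names shown above):

DF <- data.frame(Id = rep(11264402000, 3), D = c(1, 0, 1),
                 Year = 1980:1982, Jun = c(212, NA, 230))
tapply(DF$Jun, DF$Id, mean, na.rm = TRUE)              # mean June value per Id
tapply(DF$Jun, list(DF$Id, DF$D), mean, na.rm = TRUE)  # Id-by-D table of means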
2009 Nov 22
1
Metaplot Axis Annotation
Hello, We are looking to adjust the font size of the axis annotation on the graph that results from use of the metaplot() function. Metaplot seems to respond to cex and cex.lab to change those graphical parameters, but it doesn't respond to cex.axis. Is there a way to work around this by creating a customized x-axis, and if so, how? Thanks for all your help. Syntax is below. Best, Dawn
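One untested idea, sketched under the assumption that metaplot() draws its axis with an ordinary axis() call (not verified here; `mn`, `se` and the labels are hypothetical inputs): set cex.axis through par() before plotting, since axis() picks up par settings it is not explicitly given.

library(rmeta)
mn <- c(-0.2, 0.1, 0.35)          # hypothetical effect estimates
se <- c(0.10, 0.15, 0.12)         # hypothetical standard errors
op <- par(cex.axis = 0.7)         # shrink axis annotation before plotting
metaplot(mn, se, labels = c("Study 1", "Study 2", "Study 3"))
par(op)                           # restore the previous settings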
2007 Jan 30
2
Simple Date problems with cbind
I am clearly misunderstanding something about dates, and my reading of the help and RSiteSearch has not turned up anything. I have a variable of class "Date" and I want to include it in a data.frame. However, when I do a cbind, the date var is coerced to numeric. However, when I tried to create an example I also seem to be doing something wrong, as I cannot seem even to create a
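A short illustration of what is going on: cbind() builds a matrix, which drops the "Date" class, while data.frame() keeps it.

d <- as.Date("2007-01-30")
class(cbind(d, 1))             # matrix: the date has become a plain number
class(data.frame(d, x = 1)$d)  # "Date": use data.frame() (or cbind.data.frame) instead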
2008 Nov 04
1
perform Kruskal-Wallis test without using the built-in command in R
Hi, again I am stuck in my presentation, and I have never learned R before in my life but need this to be done, so please help me out as a favour: http://www.nabble.com/file/p20333155/kew.dat kew.dat run this in R and this comes up: Month Year Rain 1 Jan 1900 74.400000 2 Feb 1900 80.500000 3 Mar 1900 23.600000 4 Apr 1900 23.600000 5 May 1900 25.100000 6
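A sketch of the Kruskal-Wallis statistic computed by hand, ignoring the tie correction and assuming kew.dat is a plain whitespace-separated file with a header (that file format is an assumption):

kew <- read.table("kew.dat", header = TRUE)
N  <- nrow(kew)
R  <- rank(kew$Rain)                          # ranks of all observations pooled
Ri <- tapply(R, kew$Month, sum)               # rank sum per group
ni <- tapply(R, kew$Month, length)            # group sizes
H  <- 12 / (N * (N + 1)) * sum(Ri^2 / ni) - 3 * (N + 1)
pchisq(H, df = length(ni) - 1, lower.tail = FALSE)   # approximate p-value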
2009 Jan 14
1
publication statistics from Web of Science
Dear list, This is a bit of an off-topic question, but I'm hoping to get some advice from more experienced people. I've used the website "Web of Science" to manually collect publication counts responding to several keywords as a function of date, since the 1960s. http://apps.isiknowledge.com/RAMore.do?product=UA&search_mode=&SID=P1g9lFJp9 at
2017 Sep 15
7
require help
hello to all. I am working on a macroeconomic data series of India, which is on a yearly basis. I am unable to convert my data frame into a time series. Kindly help me. I am also using the zoo and xts packages, but they take only monthly observations. 'data.frame': 30 obs. of 4 variables: $ year: int 1980 1981 1982 1983 1984 1985 1986 1987 1988 1989 ... $ cnsm: num 174 175 175 172 173 ... $ incm:
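A sketch for yearly data (the miniature `df` below only mimics the str() output above): ts() accepts frequency = 1, and zoo can be indexed directly by the year, so monthly observations are not required.

df <- data.frame(year = 1980:1984, cnsm = c(174, 175, 175, 172, 173),
                 incm = c(53.4, 53.7, 53.5, 53.2, 53.3))
tser <- ts(df[, c("cnsm", "incm")], start = 1980, frequency = 1)   # yearly ts
library(zoo)
z <- zoo(df[, c("cnsm", "incm")], order.by = df$year)              # yearly zoo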
1999 Jan 18
1
Program advice
Hi, Starting to use R as a serious tool, I have come across a programming problem that I can't see the answer to yet. Can someone advise me please? The problem is that I want to plot a series of lines which represent short-term growths. All the data is in a single vector and I can indicate the index via a second vector. In GLIM, if the second vector is a factor, a single $GRA Size Year
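A sketch of the base-graphics equivalent (hypothetical vectors `size` and `year`, with a factor `id` standing in for the index vector):

set.seed(1)
year <- rep(1:5, times = 4)
id   <- factor(rep(letters[1:4], each = 5))          # index vector marking each series
size <- ave(rnorm(20, mean = 1), id, FUN = cumsum)   # made-up short growth series
plot(year, size, type = "n")                         # draw the axes only
for (g in levels(id)) lines(year[id == g], size[id == g])
## with lattice: xyplot(size ~ year, groups = id, type = "l")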
2017 Jul 05
4
Help with reshape/reshape2 needed
Hi all: I'm struggling with getting my data re-formatted using functions in reshape/reshape2 to get from: 1957 0.862500000 1958 0.750000000 1959 0.300000000 1960 0.287500000 1963 0.675000000 1964 0.937500000 1965 0.025000000 1966 0.387500000 1969 0.087500000 1970 0.275000000 1973 0.500000000 1974 0.362500000 1976 0.925000000 1978 0.712500000 1979 0.337500000 1980 0.700000000 1981 0.425000000
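The target layout is cut off in the excerpt, so this is only a generic sketch of the long-to-wide step with reshape2::dcast, using hypothetical columns `site`, `year` and `value`:

library(reshape2)
long_df <- data.frame(site  = rep(c("A", "B"), each = 3),        # hypothetical grouping column
                      year  = rep(c(1957, 1958, 1959), 2),
                      value = c(0.86, 0.75, 0.30, 0.64, 0.41, 0.22))
wide <- dcast(long_df, site ~ year, value.var = "value")          # one column per year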