Displaying 20 results from an estimated 100 matches similar to: "Aggregating data -- table almost does it"
2016 Jun 03
4
[Bug 11949] New: A malicious sender can still use symlinks to overwrite files
https://bugzilla.samba.org/show_bug.cgi?id=11949
Bug ID: 11949
Summary: A malicious sender can still use symlinks to overwrite
files
Product: rsync
Version: 3.1.2
Hardware: All
OS: All
Status: NEW
Severity: normal
Priority: P5
Component: core
Assignee:
2008 Jul 15
3
Melt (reshape) question
Dear all,
I have a grid of 720 columns by 360 rows of global population density
values, and hope to convert this to column format using the 'melt' command
in the 'reshape' package. I'm not receiving
any errors as such, but when the code has finished running, my output
looks like this:
> head(PopDens.long)
Latitude Longitude PopDensity
1 -84.75 V1 0
2
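The "V1" in the Longitude column suggests melt() saw default data.frame column names rather than coordinates. A hedged sketch (toy 4 x 3 grid, not the poster's data) using reshape2, where dimnames on a matrix carry the coordinates through:

```r
# A toy 4 x 3 grid; dimnames become the Latitude/Longitude columns
# when a *matrix* (not a data.frame) is melted.
library(reshape2)  # melt() also exists in the older 'reshape' package

grid <- matrix(runif(12), nrow = 4, ncol = 3,
               dimnames = list(Latitude  = c("-84.75", "-84.25", "-83.75", "-83.25"),
                               Longitude = c("-179.75", "-179.25", "-178.75")))
PopDens.long <- melt(grid, value.name = "PopDensity")
head(PopDens.long)
```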
2008 Jun 17
2
Reshape or Stack? (To produce output as columns)
Dear all,
I have used 'read.table' to create a data frame of 720 columns and 360 rows (and assigned this to 'Jan'). The row and column names are numeric:
> columnnames <- sprintf("%.2f", seq(from = -179.75, to = 179.75, length = 720))
> rnames <- sprintf("%.2f", seq(from = -89.75, to = 89.75, length = 360))
> colnames(Jan) <- columnnames
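A hedged base-R alternative to reshape/stack for this layout (tiny 3 x 2 stand-in for the 360 x 720 grid): expand.grid() pairs every (lat, lon) with its value in column-major order, matching unlist():

```r
# Tiny 3 x 2 stand-in for the 360 x 720 grid; expand.grid() varies its
# first factor fastest, matching column-major unlist() order.
lons <- sprintf("%.2f", seq(from = -179.75, by = 0.50, length.out = 2))
lats <- sprintf("%.2f", seq(from = -89.75,  by = 0.50, length.out = 3))
Jan  <- as.data.frame(matrix(1:6, nrow = 3, dimnames = list(lats, lons)))

Jan.long <- data.frame(expand.grid(Latitude  = as.numeric(lats),
                                   Longitude = as.numeric(lons)),
                       Value = unlist(Jan, use.names = FALSE))
```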
2009 Apr 08
2
Reshape - strange outputs
Dear R Users,
I am using the reshape package to reformat gridded data into column format using the code shown below. However, when I display the resulting object, a single column is formed (instead of three) and all the latitude values (which should be in either column one or two) are collected at the bottom. Also, the NA values aren't removed, despite this being requested in the code.
Code:
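With the original code elided above, one hedged sketch: melting a matrix (rather than a data.frame) keeps the three-column layout, and na.rm = TRUE must be passed to melt() itself for the NAs to drop:

```r
library(reshape2)
g <- matrix(c(1, NA, 3, 4), nrow = 2,
            dimnames = list(Latitude  = c("-89.75", "-89.25"),
                            Longitude = c("-179.75", "-179.25")))
# na.rm goes to melt() itself; a matrix input keeps three columns
out <- melt(g, na.rm = TRUE, value.name = "PopDensity")
```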
2012 Nov 14
2
aggregate data from combinations
Dear R users,
I have a dataframe (matrix) with two columns (plot, and diameter (d)). I
want all diameter values for different combinations of plots.
For example I want all d values for all possible combinations, 100C2 (all d
values for plot 1 with all d values in the plot 2.......with all d values
from plot 1 with all d values from plot 100, ...... with all d values from
plot 99 with all d values
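A hedged sketch of the pairing (hypothetical toy data with columns 'plot' and 'd'): combn() enumerates the 100C2-style plot pairs, and expand.grid() crosses the diameters of each pair:

```r
# Hypothetical toy data: columns 'plot' and 'd'
df <- data.frame(plot = rep(1:3, each = 2),
                 d    = c(10, 12, 9, 11, 14, 8))
by.plot <- split(df$d, df$plot)                        # diameters per plot
pairs   <- combn(names(by.plot), 2, simplify = FALSE)  # all plot pairs
# every d from the first plot of a pair against every d from the second
crossed <- lapply(pairs, function(p)
  expand.grid(d1 = by.plot[[p[1]]], d2 = by.plot[[p[2]]]))
```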
2004 Jun 07
0
authentication, pam, etc.
Dear Samba folks,
I very recently replaced our SGI Challenge S file
server, which employed samba 2 to service Windows boxes,
with an Intel Linux box running Fedora core 2 Linux
with samba 3.0.3.
When I connect to the server, it takes a number
of *minutes* to get an authentication challenge
window. After entering name and password,
the connection proceeds, and shares are displayed
normally.
2004 Jun 08
0
authentication, pam, etc. (more)
Dear Samba Folks,
Re the message I sent you earlier (reproduced
below). The logs are also producing error
messages such as this:
smbd[12778]: pam_succeed_if: requirement "uid < 100" not met by user
"bonomo"
smbd[12821]: pam_succeed_if: requirement "uid < 100" not met by user
"bonomo"
smbd[12840]: pam_succeed_if: requirement "uid <
2016 Apr 15
0
aggregate combination data
Hello,
I'm cc'ing R-Help.
Sorry but your question was asked 3.5 years ago, I really don't
remember it. Can you please post a question to R-Help, with a
reproducible example that describes your problem?
Rui Barradas
Citando catalin roibu <catalinroibu at gmail.com>:
> Dear Rui,
>
> You helped me some time ago with some code regarding aggregated data
>
2009 Mar 16
0
Ignore switch to REVERSED Polarity on channel 1, state 4
Hi,
Trying to trace an Asterisk hang on a production (it had to be, didn't it?)
system. The last thing before it crashed was
[Mar 16 12:32:42] DEBUG[7754] chan_zap.c: Ignore switch to REVERSED
Polarity on channel 1, state 4
[Mar 16 12:54:34] DEBUG[7754] chan_zap.c: Ignore switch to REVERSED
Polarity on channel 2, state 4
[Mar 16 12:54:35] DEBUG[7754] chan_zap.c: Ignore switch to REVERSED
2017 Nov 14
0
Aggregating Data
R-Help
I created a "shortdate" for the purpose of aggregating each var (s72 ... s119)
by daily sum, but not sure how to handle using a POSIXlt object.
> myData$shortdate <- strftime(myData$time, format="%Y/%m/%d")
> head(myData)
time s72 s79 s82 s83 s116 s119 shortdate
1 2016-10-03 00:00:00 0 0 1 0 0 0 2016/10/03
2 2016-10-03 01:00:00
2017 Nov 14
0
Aggregating Data
R-Help
Please disregard as I figured something out, unless there is a more elegant
way ...
myData.sum <- aggregate(x = myData[c("s72","s79","s82","s83","s116","s119")],
                        FUN = sum,
                        by = list(Group.date = myData$shortdate))
> head(myData.sum)
Group.date s72 s79 s82 s83 s116 s119
1
2002 Nov 06
1
Aggregating a List
Hi all,
There must be a really obvious R solution to this, but I can't figure out
how to aggregate a list. For instance, if I read.table the following from
a file:
Val1 Val2
A 3 4
A 5 6
B 4 4
I would like to take the mean (or median) across any/all rows of type "A"
to end up with the structure:
Val1 Val2
A 4 5
B 4 4
in this case. How would I go about doing that w/o doing a
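The obvious R solution the poster suspected exists is aggregate() with a formula. A minimal sketch, assuming the row labels sit in a column named 'id':

```r
df <- data.frame(id   = c("A", "A", "B"),
                 Val1 = c(3, 5, 4),
                 Val2 = c(4, 6, 4))
# mean of every value column within each id group
agg <- aggregate(cbind(Val1, Val2) ~ id, data = df, FUN = mean)
```

Swapping FUN = median gives the median variant the poster also mentions.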
2006 Jan 19
0
aggregating variables with pca
hello R_team
having performed a PCA on my fitted model with the function:
data <- na.omit(dataset)
data.pca <- prcomp(data, scale = TRUE),
I've decided to aggregate two variables that are highly correlated.
My first question is:
How can I combine the two variables into one new predictor?
and secondly:
How can I predict with the newly created variable in a new dataset?
Guess I need the
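A hedged sketch of both questions (hypothetical variables x1 and x2): keep the first principal component of just the correlated pair as the combined predictor, then reuse the same centring and rotation on new data via predict():

```r
set.seed(1)
old <- data.frame(x1 = rnorm(50))
old$x2 <- old$x1 + rnorm(50, sd = 0.1)      # highly correlated pair
p <- prcomp(old[, c("x1", "x2")], scale. = TRUE)
old$combined <- p$x[, 1]                    # single merged predictor

new <- data.frame(x1 = rnorm(5))
new$x2 <- new$x1
new$combined <- predict(p, new)[, 1]        # same scaling/rotation reused
```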
2006 Apr 07
1
Aggregating an its series
I'm using a very long irregular time-series of air temperature and
relative humidity of this kind (this is an extract only)
its.format("%Y%d%m %X")
> base
                     T  H
20020601 12.00.00 27.1 47
20020601 15.00.00 29.1 39
20020601 18.00.00 27.4 39
20020601 21.00.00 24.0 40
20020602  0.00.00 22.0 73
20020602  3.00.00 19.2 49
20020602  6.00.00 19.5 74
20020602
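The 'its' package has since been archived on CRAN; as a hedged sketch, the same kind of irregular series aggregates to daily means as a zoo object (three-row stand-in for the data above):

```r
library(zoo)
stamps <- as.POSIXct(c("2002-06-01 12:00", "2002-06-01 15:00",
                       "2002-06-02 00:00"), tz = "UTC")
z <- zoo(cbind(T = c(27.1, 29.1, 22.0), H = c(47, 39, 73)), stamps)
daily <- aggregate(z, as.Date(index(z), tz = "UTC"), mean)  # one row per day
```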
2007 Mar 28
2
aggregating data with Zoo
Is there a way of aggregating 'zoo' daily data according to day of week? eg
all Thursdays
I came across the 'nextfri' function in the documentation but am unsure how
to change this so any day of week can be aggregated.
I have used POSIX to arrange the data (not as a 'zoo' series) according to day
of week, but am curious whether I've missed a similar option in 'zoo'.
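A hedged sketch of both tasks on a toy daily zoo series: map the index to a weekday number with format() for aggregation, or subset directly to keep only the Thursdays:

```r
library(zoo)
d <- seq(as.Date("2007-03-01"), by = "day", length.out = 14)  # starts a Thursday
z <- zoo(seq_along(d), d)
# mean per weekday: 0 = Sunday ... 6 = Saturday
by.wday   <- aggregate(z, as.integer(format(index(z), "%w")), mean)
# or simply keep all Thursdays
thursdays <- z[format(index(z), "%w") == "4"]
```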
2010 Aug 01
1
aggregating a daily zoo object to a weekly zoo object
Dear R People:
I'm trying to convert a daily zoo object to a weekly zoo object:
xdate <- seq(as.Date("2002-01-01"),as.Date("2010-07-10"),by="day")
library(zoo)
length(xdate)
xt <- zoo(rnorm(3113),order=xdate)
xdat2 <- seq(index(xt)[1],index(xt)[3113],by="week")
xt.w <- aggregate(xt,by=xdat2,mean)
Error: length(time(x)) ==
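The error arises because aggregate.zoo expects 'by' to be either a function of the index or a vector as long as the series itself, not a week-spaced date sequence. A hedged fix, mapping every day to the Monday of its week:

```r
library(zoo)
xdate <- seq(as.Date("2002-01-01"), as.Date("2010-07-10"), by = "day")
xt <- zoo(rnorm(length(xdate)), order.by = xdate)
# map each day to the first day of its week, then average within weeks
xt.w <- aggregate(xt, function(d) as.Date(cut(d, "week")), mean)
```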
2011 Oct 17
0
Aggregating Survey responses for weighting
I have about 27,000 survey responses from across about 150 Bus Routes, each with potentially 100 stops. I've recorded the total Ons and Offs for each stop on each bus run, as well as the stop pair each survey response corresponds to.
I wish to create weights based on the On and Off stop for each line and direction. This will create a very sparse "half table" (observations by
2005 Oct 24
0
aggregating using several functions
Dear R users,
I would like to aggregate a data frame using several functions at once
(e.g., mean plus standard error).
How can I make this work using aggregate()? The help file says scalar
functions are needed; can anyone help?
Below is the code for my "meanse" function which I'd like to use like this:
aggregate(dataframe, list(factorA, factorB), meanse)
Thanks for your help!
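aggregate() does in fact accept a vector-valued FUN, so mean plus standard error works directly. A minimal sketch with a stand-in for the elided 'meanse' function:

```r
# 'meanse' here is a stand-in for the poster's elided function
meanse <- function(v) c(mean = mean(v), se = sd(v) / sqrt(length(v)))
df  <- data.frame(factorA = rep(c("a", "b"), each = 4), x = 1:8)
out <- aggregate(x ~ factorA, data = df, FUN = meanse)
# out$x is a two-column matrix holding the mean and se of each group
```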
2006 Oct 18
0
Aggregating a data frame (was: Re: new R-user needs help)
Please use an informative subject for sake of the archives.
Here are several solutions:
aggregate(DF[4:8], DF[2], mean)
library(doBy)
summaryBy(x1 + x2 + x3 + x4 + x5 ~ name, DF, FUN = mean)
# if Exp, name and id columns are factors then this can be reduced to
library(doBy)
summaryBy(. ~ name, DF, FUN = mean)
library(reshape)
cast(melt(DF, id = 1:3), name ~ variable, fun = mean)
On
2009 Jul 28
2
aggregating strings
I am currently summarising a data set by collapsing data based on common identifiers in a column. I am using the 'aggregate' function to summarise numeric columns, i.e. "aggregate(dat[,3], list(dat$gene), mean)". I also wish to summarise text columns e.g. by concatenating values in a comma separated list, but the aggregate function can only return scalar values and so something