search for: 23.00

Displaying 20 results from an estimated 62 matches for "23.00".

2010 Nov 02
1
class changed after execution with sqldf
When I run sqldf to merge two datasets, it changes the Date column (class Date) to a factor. Not sure why. Appreciate any insight. Console output for the two datasets and the merged dataset (via sqldf) is listed below. > summary(df.aggregate) Date Hour x Min. :2010-07-01 0 : 64 Min. : 0.00 1st Qu.:2010-07-25 1 :
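One common workaround is to restore the Date class after the merge, since SQLite (which sqldf uses underneath) has no native Date type. A minimal sketch, assuming hypothetical data frames df1/df2 and a column x standing in for the poster's data:

    library(sqldf)
    # hypothetical join; df1, df2 and column x are placeholders
    merged <- sqldf("select a.*, b.x from df1 a join df2 b on a.Date = b.Date")
    # restore the Date class on the way back out of SQLite:
    merged$Date <- as.Date(as.character(merged$Date))              # if it came back as factor/character
    # merged$Date <- as.Date(merged$Date, origin = "1970-01-01")   # if it came back numeric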
1997 Jul 03
1
R-alpha: plot( pch = <character> ) is slow.. -- why ? --
Can anyone explain this to us : > unlist(version) platform arch os "sparc-sun-solaris2.5" "sparc" "solaris2.5" system status status.rev "sparc, solaris2.5" "Beta" "0"
2006 Jun 20
3
Create variables with common values for each group
Dear all, sorry, this is for sure really basic, but I searched a lot on the internet and just couldn't find a solution. The problem is to create new variables from a data frame which contains both individual and group variables, such as the mean age for a household. My data frame: df hhid h.age 1 10010020 23 2 10010020 23 3 10010126 42 4 10010126 60 5 10010142
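One base-R approach for this kind of per-group variable is ave(), which repeats a group statistic on every row of the group. A small sketch using the hhid and h.age columns shown in the snippet:

    df <- data.frame(hhid  = c(10010020, 10010020, 10010126, 10010126),
                     h.age = c(23, 23, 42, 60))
    # mean age per household, repeated on every row of that household
    df$mean.age <- ave(df$h.age, df$hhid, FUN = mean)
    df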
2008 Mar 17
1
summary of summaries
Hi, I have a few hundred files with numerical information of different lengths but with the same column structure. I use the following code to get summary statistics: fplist <- list.files(pattern=".*analysis") for (fp in fplist){ x2 <- read.delim(fp) summary(x2) } Summary gives something like: summary (x2) V1 V2
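The loop above computes each summary but never stores it. One way to keep the results is to build a row of statistics per file and bind them together; a sketch assuming the same file pattern and that a numeric column named V1 is of interest:

    fplist <- list.files(pattern = ".*analysis")
    # one row of summary statistics per file (V1 is an assumed column name)
    stats <- do.call(rbind, lapply(fplist, function(fp) {
        x2 <- read.delim(fp)
        data.frame(file = fp, t(summary(x2$V1)))
    }))
    stats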
2013 Mar 11
3
take two columns from a set of lists
say I have a matrix and lists like x <- matrix(c(12.1, 3.44, 0.1, 3, 12, 33.1, 1.1, 23), nrow=2) x.list <- lapply(seq_len(nrow(x)), function(i) x[i,]) if I want a column of the matrix x, I write x[, 2] for example. But how can I do something similar for a set of lists, x.list, above? > x.list [[1]] [1] 12.1 0.1 12.0 1.1 [[2]] [1] 3.44 3.00 33.10 23.00 unlist(x.list)[,2] does
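Since every element of x.list has the same length, the usual idioms are sapply() with `[`, or binding the list back into a matrix; a small sketch with the objects from the snippet:

    x <- matrix(c(12.1, 3.44, 0.1, 3, 12, 33.1, 1.1, 23), nrow = 2)
    x.list <- lapply(seq_len(nrow(x)), function(i) x[i, ])
    sapply(x.list, `[`, 2)          # second element of every list component
    do.call(rbind, x.list)[, 2]     # or rebuild the matrix and index as before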
2009 Aug 24
3
[LLVMdev] x86_64-apple-darwin Polyhedron 2005 benchmarks
The current llvm/llvm-gcc-4.2 2.6 branch passes all of the Polyhedron 2005 benchmarks built with its gfortran. The results compare as follows... Compile Command : gfortran -ffast-math -funroll-loops -msse3 -O3 %n.f90 -o %n benchmark gcc-4.2.4 llvm-gcc-svn llvm-gcc-2.6 llvm-gcc-2.6 at -m32 20081031 -m32 at -m32 at -m64 ac 18.30
2010 Mar 06
1
TukeyHSD model thing
Hi, I am trying to reproduce a Tukey test in R ========================== x=c(145,40,40,120,180, 140,155,90,160,95, 195,150,205,110,160, 45,40,195,65,145, 195,230,115,235,225, 120,55,50,80,45 ) y2=c( rep(as.character(1),5), rep(as.character(2),5), rep(as.character(3),5), rep(as.character(4),5), rep(as.character(5),5), rep(as.character(6),5) ) crd2=data.frame(x,y2)
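For reference, the standard route is to fit a one-way ANOVA with aov() on a factor grouping variable and hand the fit to TukeyHSD(); a sketch continuing from the crd2 data frame in the snippet:

    crd2$y2 <- as.factor(crd2$y2)   # the grouping variable must be a factor
    fit <- aov(x ~ y2, data = crd2)
    TukeyHSD(fit)                   # pairwise comparisons with adjusted p-values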
2016 May 25
6
Slow RAID Check/high %iowait during check after updgrade from CentOS 6.5 -> CentOS 7.2
I've posted this on the forums at https://www.centos.org/forums/viewtopic.php?f=47&t=57926&p=244614#p244614 - posting to the list in the hopes of getting more eyeballs on it. We have a cluster of 23 HP DL380p Gen8 hosts running Kafka. Basic specs: 2x E5-2650 128 GB RAM 12 x 4 TB 7200 RPM SATA drives connected to an HP H220 HBA Dual port 10 GB NIC The drives are configured as one large
2003 Feb 07
2
Data manipulation
I am interested in building a model with a subset of data from a column. The first 6 lines of my data look like this:
  QUAD YEAR SITE TREAT HERB TILL PLANT SEED Kweed
1   A4 2002    s     1    N    N     N    N 55.00
2  A10 2002    s     1    N    N     N    N 60.00
3   B2 2002    s     1    N    N     N    N 35.00
4   C2 2002    s     1    N    N     N    N 23.00
5   C9
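A minimal sketch of subsetting on a column in base R before modelling (the object name dat and the chosen conditions are hypothetical):

    # keep only the rows matching a condition on one or more columns
    dat1 <- subset(dat, TREAT == 1 & YEAR == 2002)
    # equivalent plain indexing
    dat1 <- dat[dat$TREAT == 1 & dat$YEAR == 2002, ]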
2010 Feb 24
2
How to read percentage and currency data?
I'm struggling to find any help on this seemingly simple question - how does one read data with percentage (%) or currency ($, £, etc.) signs? When I try to read a data file which has any of those symbols in the data fields, they are read as characters rather than values. Is there a function or library which can deal with such values? As an example, I use this sample from one of chinna's
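A common approach is to read those fields as character, strip the symbols, and then convert; a sketch with made-up values (the exact symbols to remove depend on the file):

    raw <- c("45%", "$1,200.50", "£23.00")
    # drop %, currency signs and thousands separators, then convert to numeric
    vals <- as.numeric(gsub("[%$£,]", "", raw))
    vals                            # percentage columns can then be divided by 100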
2011 Jun 24
2
SQL Changing Data Type
Passing two dates into a SQL statement (sqldf). It returns a factor. Tried setting it back to a Date via as.Date, but got the error: character string is not in a standard unambiguous format. Any thoughts appreciated. Code/Results listed below: > summary(df.possible.combos) Date Hour Min. :2011-03-01 Min. : 0.00 1st Qu.:2011-03-23 1st Qu.: 5.75
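When the column comes back as a factor, converting via character and giving as.Date an explicit format usually avoids that error; a small sketch (the "%Y-%m-%d" format is an assumption based on the summary output shown):

    df.possible.combos$Date <- as.Date(as.character(df.possible.combos$Date),
                                       format = "%Y-%m-%d")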
2013 Apr 30
0
lmer Error: Downdated X'X is not positive definite
Hi, This is the first time I've posted, and I apologize if I formulate this incorrectly. I am analyzing data from a multi-region carrot variety trial. 35 varieties of carrots were grown in 3 randomized complete blocks in organic and conventional fields in Wisconsin, Indiana, Washington, and California. In this example I am comparing the heights of the carrot tops at harvest. In other
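That lmer error typically points at a rank-deficient fixed-effects design, e.g. a variety that never occurs in some system/site combination. Purely as an illustrative sketch with hypothetical object and column names, not the poster's actual model:

    library(lme4)
    # check the crossing first: empty cells make X'X rank deficient
    with(carrots, table(variety, system, site))
    fit <- lmer(height ~ variety * system + (1 | site/block), data = carrots)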
2003 Sep 07
1
extracting monthly temperature data
I know that R is very advanced when it comes to DateTime handling. I am not quite as advanced as R, however. I just downloaded a stupendously ugly dataset of hourly air temperature from 1985 to 2003. It has a great many NAs. I want to extract mean, median, max, and min monthly values. So far I have read it in as an object. Date and Time are factors and Temp is an int. > summary(temp.dat)
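A sketch of one base-R route: turn the factor into a real date, derive a year-month key, and aggregate (assumes the Date column parses with the default format and that Temp is the temperature column):

    temp.dat$Date  <- as.Date(as.character(temp.dat$Date))
    temp.dat$month <- format(temp.dat$Date, "%Y-%m")
    # monthly mean, median, max and min, skipping the many NAs
    aggregate(Temp ~ month, data = temp.dat,
              FUN = function(v) c(mean   = mean(v,   na.rm = TRUE),
                                  median = median(v, na.rm = TRUE),
                                  max    = max(v,    na.rm = TRUE),
                                  min    = min(v,    na.rm = TRUE)))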
2018 Apr 11
2
Unreasonably poor performance of replicated volumes
Hello everybody! I have 3 gluster servers (*gluster 3.12.6, CentOS 7.2*; those are actually virtual machines located on 3 separate physical XenServer 7.1 servers). They are all connected via an InfiniBand network. Iperf3 shows around *23 Gbit/s network bandwidth* between each pair of them. Each server has 3 HDDs put into a *stripe* 3 thin pool (LVM2) with a logical volume created on top of it, formatted
2012 Oct 12
1
Problem with which function
Hi, I need the which() function to find the positions of an entry in a matrix. The entries I'm looking for are seq(begin,end,0.01), there are no empty spaces, and I'm searching in the right range. So I was looking at the results R can find, and I received this answer. for (l in
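Values built with seq(begin, end, 0.01) rarely equal typed decimals exactly because of floating-point representation, so an exact == inside which() can come back empty; comparing with a tolerance is the usual fix. A small sketch:

    x <- seq(0, 1, by = 0.01)
    which(x == 0.23)                # may return integer(0): 23 * 0.01 is not exactly 0.23
    which(abs(x - 0.23) < 1e-9)     # tolerance-based comparison finds the position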
2017 Feb 23
1
Scaling to 10 Million IMAP sessions on a single server
On 23 Feb 2017, at 23.00, Timo Sirainen <tss@iki.fi> wrote: > > I mainly see such external databases as additional reasons for things to break. And even if not, additional extra layers of latency. Oh, just thought that I should clarify this and I guess other things I said. I think there are two separate things we're possibly talking about in here: 1) Temporary state: This is
2018 Apr 12
0
Unreasonably poor performance of replicated volumes
Guess you went through the user lists and tried something like this already: http://lists.gluster.org/pipermail/gluster-users/2018-April/033811.html I have the same exact setup, and below is as far as it went after months of trial and error. We all have somewhat the same setup and the same issue with this - you can find the same post as yours on a daily basis. On Wed, Apr 11, 2018 at 3:03 PM, Anastasia Belyaeva
2017 Nov 03
5
Extreme bunching of random values from runif with Mersenne-Twister seed
This is cross-posted from SO (https://stackoverflow.com/q/47079702/1414455), but I now feel that this needs someone from R-Devel to help understand why this is happening. We are facing a weird situation in our code when using R's [`runif`][1] and setting seed with `set.seed` with the `kind = NULL` option (which resolves, unless I am mistaken, to `kind = "default"`; the default being
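For anyone wanting to poke at this, the basic pattern is just seeding with the default Mersenne-Twister and looking at the first draw across many consecutive seeds; a minimal sketch (the seed range and the bunching pattern from the report are not reproduced here):

    # first uniform draw for each of a run of consecutive seeds
    first.draws <- sapply(1:1000, function(s) {
        set.seed(s, kind = NULL)    # kind = NULL resolves to the default Mersenne-Twister
        runif(1)
    })
    summary(first.draws)            # inspect how spread out the first draws are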
2019 Apr 18
1
Problem with mysql backend and SSL ciphers
On 17.4.2019 23.00, Kostya Vasilyev via dovecot wrote: > I'm not Aki but hope you don't mind... > > On Wed, Apr 17, 2019, at 10:42 PM, TG Servers via dovecot wrote: >> Hi, >> >> MariaDB documentation says it accepts OpenSSL cipher strings in its >> ssl_cipher parameters like ssl_cipher="TLSv1.2". >> This is also mentioned when creating or
2018 Apr 13
1
Unreasonably poor performance of replicated volumes
Thanks a lot for your reply! You guessed it right though - mailing lists, various blogs, documentation, videos and even source code at this point. Changing some of the options does make performance slightly better, but nothing particularly groundbreaking. So, if I understand you correctly, no one has yet managed to get acceptable performance (relative to underlying hardware capabilities) with