search for: 1.500000

Displaying 19 results from an estimated 19 matches for "1.500000".

2009 Jun 16
4
confusion on levels() function, and how to assign a wanted order to factor levels, intentionally?
Dear R-helpers, I want to make a series of boxplots of several numeric variables with two grouping variables (species and population; population is nested in species, with population on the X-axis). To get a proper order of the individual populations on the X-axis, I need to assign a desired order to the factor (population). I used the levels() function to do this assignment, but it seemed
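A minimal sketch of the usual fix, with hypothetical data: the display order is set by factor(..., levels = ...), whereas assigning to levels() merely relabels the existing levels in place, which is the common source of confusion here.

pop <- c("popB", "popA", "popC", "popA", "popB")      # hypothetical populations
f <- factor(pop, levels = c("popC", "popA", "popB"))  # fixes the plotting order
levels(f)                # "popC" "popA" "popB"
boxplot(rnorm(5) ~ f)    # boxes appear in the specified order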
2012 Feb 17
4
How can I tabulate time series data (in RStudio or any other R editor)?
Hello, I have a question on how to tabulate time series data. I use RStudio, but if it can be done in any other R editor, it should work in RStudio as well.
> a1 <- 11:22
> a1ts <- ts(a1, frequency=4, start=c(1978,1))
> a1ts
     Qtr1 Qtr2 Qtr3 Qtr4
1978   11   12   13   14
1979   15   16   17   18
1980   19   20   21   22
If I click the variable "a1ts" on the
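The question is truncated, but here is a hedged base-R sketch of one way to tabulate such a series: coerce the ts object into a year-by-quarter matrix (the dimnames are my own choice).

a1 <- 11:22
a1ts <- ts(a1, frequency = 4, start = c(1978, 1))
tab <- matrix(a1ts, ncol = 4, byrow = TRUE,
              dimnames = list(1978:1980, paste0("Qtr", 1:4)))
tab   # same year-by-quarter layout as the printed ts, now a plain table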
2010 Dec 08
1
I want to get smoothed splines by using the class gam
Hi all, I am trying to interpolate a data set of the form:
  time        Erg
  0.000000    48.650000
  1.500000    56.080000
  3.000000    38.330000
  4.500000    49.650000
  6.000000    61.390000
  7.500000    51.250000
  9.000000    50.450000
 10.500000    55.110000
 12.000000    61.120000
 18.000000    61.260000
 24.000000    62.670000
 36.000000    63.670000
 48.000000    74.880000
I want to get smoothed splines by using the gam class. The first way I tried was
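A hedged sketch of one way to fit these data with gam() from the mgcv package (the data frame name and the basis dimension k are my assumptions, not from the post):

library(mgcv)
d <- data.frame(
  time = c(0, 1.5, 3, 4.5, 6, 7.5, 9, 10.5, 12, 18, 24, 36, 48),
  Erg  = c(48.65, 56.08, 38.33, 49.65, 61.39, 51.25, 50.45,
           55.11, 61.12, 61.26, 62.67, 63.67, 74.88)
)
fit <- gam(Erg ~ s(time, k = 10), data = d)    # penalized regression spline
plot(fit)                                      # smoothed curve with CI band
predict(fit, newdata = data.frame(time = 5))   # interpolate at a new time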
2007 Apr 17
1
predict.ar() produces wrong SE's (PR#9614)
Full_Name: Kirk Hampel Version: 2.4.1 OS: Windows Submission from: (NULL) (144.53.251.2) Given an AR(p) model, the last p SEs are wrong. The source of the bug is that the C code (version 2.4.0) assumes *npsi is the length of the psi vector (which is n+p), whilst the predict.ar function in R passes out as.integer(npsi), where npsi <- n-1. Some R code below reproduces the error. Let p=4,
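The reproduction code is cut off above; a minimal hedged sketch of the setup it describes (simulated AR(4) data, my own parameter choices):

set.seed(1)
x <- arima.sim(list(ar = c(0.5, -0.2, 0.1, 0.05)), n = 500)
fit <- ar(x, order.max = 4, aic = FALSE)   # fit an AR(4) model
p <- predict(fit, n.ahead = 8)
p$se   # per the report, the last p = 4 of these SEs were wrong in R 2.4.x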
2013 Apr 17
2
On matrix calculation
Hello again, let's say I have a matrix:
Mat <- matrix(1:12, 4, 3)
and a vector:
Vec <- 5:8
Now I want to do the following: each element of row i in 'Mat' is divided by the i-th element of 'Vec'. Is there any direct way of doing that? Thanks for your help
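Two standard base-R answers, shown as a sketch: plain division works because R recycles Vec down the columns and length(Vec) equals nrow(Mat); sweep() states the row-wise intent explicitly.

Mat <- matrix(1:12, 4, 3)
Vec <- 5:8
Mat / Vec                 # element [i, j] is divided by Vec[i] via recycling
sweep(Mat, 1, Vec, "/")   # same result, intent made explicit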
2013 Jul 17
0
using partial=TRUE in rake
Dear R users: I am using a program in which I call the rake function:
rake(ps, list(~A,~B,~C), list(pop.A, pop.B, pop.C), control = list(maxit=1000, epsilon = 1, verbose=TRUE))
I get the following result:
, , C = 1
    B
A           1        2        3        4        5
  A1 0.000000 0.000000 0.000000 0.000000 0.000000
  A2 3.000000 3.000000 0.000000
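For context, rake() here is from the survey package. A reduced, hedged sketch of a complete call (two margins instead of three; all data and names are illustrative, not from the original post):

library(survey)
dat <- data.frame(A = sample(c("A1", "A2"), 100, replace = TRUE),
                  B = sample(1:5, 100, replace = TRUE))
ps <- svydesign(ids = ~1, data = dat, weights = rep(1, 100))
pop.A <- data.frame(A = c("A1", "A2"), Freq = c(40, 60))
pop.B <- data.frame(B = 1:5, Freq = rep(20, 5))
r <- rake(ps, list(~A, ~B), list(pop.A, pop.B),
          control = list(maxit = 1000, epsilon = 1, verbose = TRUE))
svytable(~A + B, r)   # raked weights now reproduce the population margins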
2009 May 06
1
Asterisk with Sphinx
Hi, Has anyone tried speech recognition using Sphinx? I set up Sphinx following this website (http://scribblej.com/svn/), but when I run astsphinx I get the following error. Any clue what might have caused this problem? Thanks -Azher
INFO: s2_semi_mgau.c(1080): 1 mixture Gaussians, 256 components, 4 feature streams, veclen 51
INFO: s2_semi_mgau.c(748): Loading senones from dump file
2018 Mar 14
3
the same function returning different values when called differently..
Dear members, I have a function ygrpc which acts on the daily price increments of a stock. It returns the following values:
> ygrpc(PFC.NS, "h")
 [1]  2.149997  1.875000  0.750000  0.349991  2.100006  0.199997  4.000000  2.574996  0.500000  0.349999  1.500000  0.700001
[13]  0.500000  1.300003  0.449997  2.800003  2.724998 66.150002  0.550003  0.050003  1.224991  4.899994  1.375000
2009 Jan 22
1
subset exact values
Hi, I need to subset the following data by the column 'dal' for values that exactly equal points on the regular grid seq(0, 150, by=0.5), excluding rows with irregular 'dal' values such as c(2.888958, 2.891620), etc.
data <- data.frame(id=id, dal=dal, date=date, mu.x=mu.x)
$dal
 [1] 0.000000 0.500000 1.000000 1.500000 2.000000 2.500000 2.888958 2.891620 3.000000 3.245405
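A hedged base-R sketch using %in%, which tests exact membership in the target grid; multiples of 0.5 are exactly representable in floating point, but round() guards against values that carry tiny numerical noise.

grid <- seq(0, 150, by = 0.5)
regular <- data[data$dal %in% grid, ]            # exact matches only
regular <- data[round(data$dal, 6) %in% grid, ]  # tolerant of fp noise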
2011 Aug 08
1
read in cel file by ReadAffy and read.celfile
Hi there, I ran into a problem when trying to read in a .cel file using ReadAffy(). R code:
require(affy)
ReadAffy(filenames="CH1.CEL")
It failed with the error:
Error in read.celfile.header(as.character(filenames[[1]])) :
  Is CH1.CEL really a CEL file? tried reading as text, gzipped text, binary, gzipped binary, command console and gzipped command console formats
Also, I tried
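The error comes from the CEL parser trying every known encoding before giving up, so a hedged first diagnostic is to inspect the file's opening bytes (the magic values in the comments are my recollection and worth verifying):

readBin("CH1.CEL", "raw", n = 16)          # raw bytes at the start of the file
readLines("CH1.CEL", n = 1, warn = FALSE)  # first line, if it is text
# a text CEL should begin with "[CEL]"; a v4 binary CEL with the
# little-endian integer 64; anything else (e.g. an HTML error page
# saved during download) would explain the ReadAffy failure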
2018 Mar 14
0
Fwd: the same function returning different values when called differently..
Hi Akshay, (Please include r-help when replying) You have learned that PFC.NS and snl[[159]] are not identical. Now you have to figure out why they differ. This could also point to a bug or a logic error in your program. Figuring out how two objects differ can be a bit tricky, but with experience it becomes easier. (Some others may even have some suggestions for good ways to do it.) Basically
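The excerpt is cut off; a sketch of the standard base-R comparison tools it is leading toward (object names are from the thread):

identical(PFC.NS, snl[[159]])   # exact comparison, attributes included
all.equal(PFC.NS, snl[[159]])   # near-equality; reports what differs
str(PFC.NS); str(snl[[159]])    # compare the two structures by eye
attributes(PFC.NS)              # attribute mismatches are a common culprit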
2009 Apr 12
1
looking for one-liner for strsplit and regex
Hi, I have a line such as:
myline <- " 0.100000 1.5000 0.6000 538 0.369404"
and I would like to put the numbers into a vector. Some combination of tabs and spaces occurs between the numbers. I tried:
try1 <- strsplit(myline, "[[:blank:]]+")
> try1
[[1]]
[1] ""         "0.100000" "1.5000"
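Two hedged one-liners in base R: the leading empty string in try1 appears because the line starts with blanks, so either trim before splitting or let scan() parse the numbers directly.

as.numeric(strsplit(trimws(myline), "[[:blank:]]+")[[1]])  # trim, split, coerce
scan(text = myline, quiet = TRUE)                          # simpler one-liner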
2010 Dec 03
1
Linear separation
In https://stat.ethz.ch/pipermail/r-help/2008-March/156868.html I found what linear separability means. But what can I do if I find such a situation in my data? Field (2005) suggests reducing the number of predictors or increasing the number of cases. But I am not sure whether I can, as an alternative, take the findings from my analysis and report them. And if so, how can I find the linear
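One commonly cited alternative to dropping predictors is Firth's bias-reduced logistic regression, which stays finite under complete separation; a hedged sketch with the logistf package (data and variable names are illustrative):

library(logistf)
fit <- logistf(y ~ x1 + x2, data = mydata)  # Firth-penalized logistic fit
summary(fit)  # finite coefficients with penalized-likelihood CIs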
2012 Sep 12
2
Deadlock in btrfs-cleaner, related to snapshot deletion
Hello, (this is a recap of yesterday's discussion on BTRFS IRC, also to save relevant pastes before pastebins expire) I have my /home on btrfs; a cronjob makes one snapshot every 30 minutes; these snapshots are kept for 24-48 hours, then deleted in batches. This is a 16K Leaf/Node BTRFS on top of mdadm RAID1. As system uptime approached 2 weeks, I started noticing that the free space
2009 Jun 29
3
oggz-merge.exe
Hi folks, I'm joining this list because I've encountered difficulties with the Ogg tools. I'm running Windows and can't find binaries for the liboggz tools, such as oggz-merge.exe. Can someone provide oggz-merge.exe? For now I use ffmpeg (v19289) for muxing:
ffmpeg -y -i sync2.ogg -i sync.ogv -vcodec copy -acodec copy sync2.ogv
but the framerate fluctuates wildly on playback, and ogginfo
2013 Jun 10
1
btrfs-cleaner Blocked on xfstests 068
I'm running into a problem with the btrfs-cleaner thread becoming blocked on xfstests 068. The test locks up indefinitely without completing (normally it finishes in about 45 seconds on my test box). I've replicated the issue on 3.10.0_rc5 and the for-linus branch of 3.9.0. I ran a git bisect on the 3.9.0 for-linus branch and tracked my issue to the following commit: commit
2009 May 24
1
Animal Morphology: Deriving Classification Equation with Linear Discriminant Analysis (lda)
Fellow R users: I'm not extremely familiar with lda or R programming, but a recent editorial review of a manuscript submission has prompted a crash course. I am on this forum hoping I can solicit some much-needed advice for deriving a classification equation. I have used three basic measurements in lda to predict two groups: male and female. I have a working model, low Wilks' lambda,
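A hedged sketch of where the pieces of a classification equation live in a MASS::lda fit (variable and data names are placeholders for the three measurements):

library(MASS)
fit <- lda(sex ~ m1 + m2 + m3, data = morpho)  # morpho is a placeholder name
fit$scaling      # coefficients of the linear discriminant function
fit$means        # group centroids on the original measurements
predict(fit)$x   # LD score for each specimen
# with equal priors, the cutoff is midway between the group mean LD scores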
2012 Jul 31
2
Btrfs Intermittent ENOSPC Issues
I've been working on running down intermittent ENOSPC issues. I can only seem to replicate ENOSPC errors when running zlib compression. However, I have been seeing similar ENOSPC errors to a lesser extent when playing with the LZ4HC patches. I apologize for not following up on this sooner, but I had drifted away from using zlib, and didn't notice there was still an issue. My
2012 Aug 01
7
[PATCH] Btrfs: barrier before waitqueue_active
We need an smp_mb() before waitqueue_active to avoid missing wakeups. Previously Mitch was hitting a deadlock between the ordered flushers and the transaction commit, because the ordered flushers were waiting for more refs and were never woken up, so those smp_mb()'s are the most important. Everything else I added for correctness' sake and to avoid getting bitten by this again somewhere else.