similar to: help in R code

Displaying 20 results from an estimated 200 matches similar to: "help in R code"

2020 Oct 04
1
Help in R code
Hello, I am working on functional time series using multivariate time series data (hourly time series data). I am using a FAR model of order greater than one, for which no statistical package is available in R, so I converted my data into functional form and obtained the functional principal components, and from the FPCA I extracted the corresponding FPC scores. Now I use the VAR model on
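A minimal R sketch of the FPCA-then-VAR idea described above, assuming the fda and vars packages; the matrix Y of hourly curves (24 rows, one column per day) and all other object names are illustrative, not taken from the original post:

library(fda)    # basis smoothing and functional PCA
library(vars)   # vector autoregression on the score series

argvals <- seq(0, 23, length.out = nrow(Y))            # hourly grid within a day
basis   <- create.fourier.basis(c(0, 23), nbasis = 23)
Yfd     <- smooth.basis(argvals, Y, basis)$fd          # daily curves as functional data
pca     <- pca.fd(Yfd, nharm = 4)                      # first 4 functional principal components
scores  <- pca$scores                                  # (days x 4) FPC score matrix
fit     <- VAR(scores, p = 2)                          # VAR of order 2 on the scores
fc      <- predict(fit, n.ahead = 24)                  # forecast the score series

The forecast curves would then be rebuilt from the forecast scores together with pca$harmonics and the mean function pca$meanfd.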
2020 Oct 18
1
Help in R code
Good morning, please help me to write this code in R. I am working with multivariate time series data; my objective is a one-year forecast of the hourly time series, using the first five years as a training set and the remaining one year as validation. For this I transform the data into functional data through a Fourier basis expansion and apply functional principal components as dimensional
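A sketch of the train/validation split described above, assuming the hourly data are arranged as one curve per day (24 rows per column); fda does the Fourier smoothing, and the object names (hourly_mat, n_test, etc.) are illustrative:

library(fda)

n_day    <- ncol(hourly_mat)                  # one column per day
n_test   <- 365                               # final year held out for validation
train    <- hourly_mat[, 1:(n_day - n_test)]
test     <- hourly_mat[, (n_day - n_test + 1):n_day]

hours    <- 0:23
fbasis   <- create.fourier.basis(c(0, 23), nbasis = 23)
train_fd <- smooth.basis(hours, train, fbasis)$fd     # training curves only
train_pc <- pca.fd(train_fd, nharm = 4)               # FPCA fitted on the training set

Forecasts of the FPC scores (for example with a VAR, as in the sketch above) can then be compared day by day against the held-out test columns.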
2020 Oct 14
1
Help in Coding
Good morning dear administrators, please help me to write this code in R. I am working with multivariate time series data; my objective is a one-year forecast of the hourly time series, using the first five years as a training set and the remaining one year as validation. For this I transform the data into functional data through a Fourier basis expansion and apply functional principal
2010 Mar 26
0
fda Data2fd
I would like to create a functional data object where both domain and range are matrices. My erroneous attempts are below. Can you suggest corrections?
library(fda)
domain <- matrix(c(1,1.1,1.2, 2,2.1,2.1, 3,3.1,3.2, 4,4.1,4.2, 5,5.1,5.2), nrow = 5, ncol = 3, byrow = TRUE)
range  <- sin(domain) + matrix(rnorm(15, sd = 0.1), nrow = 5, ncol = 3)
myfd   <- Data2fd(argvals = domain, y = range)
#Error in
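One possible correction, sketched under the assumption of a recent fda release in which smooth.basis accepts a matrix of argument values (one column per record); supplying an explicit basis also avoids Data2fd's default basis construction, and note that the name range shadows base::range:

library(fda)
basis <- create.bspline.basis(rangeval = c(min(domain), max(domain)), nbasis = 4)
myfd  <- smooth.basis(argvals = domain, y = range, fdParobj = basis)$fd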
2005 Dec 22
2
tcpdump-smb won't work
I've read everything I've found on tcpdump-smb, and still can't get it to work right. I downloaded the binary from samba.org and executed the command like so (the command below is taken directly from the README.smb that comes with tcpdump-3.4a5.tar.gz):
./tcpdump -i eth0 port 139 host 192.168.0.1
tcpdump: parse error
How do I use it to get the decoded SMB output? BTW: I also
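One likely cause of the parse error above is the missing conjunction in the capture filter: pcap filter primitives have to be joined with and/or, so the invocation would read, for example, ./tcpdump -i eth0 port 139 and host 192.168.0.1 (interface and address taken from the original post).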
2013 Apr 05
4
[LLVMdev] Integer divide by zero
On Fri, Apr 5, 2013 at 2:40 PM, Joshua Cranmer 🐧 <Pidgeot18 at gmail.com> wrote: ...
> Per C and C++, integer division by 0 is undefined. That means, if it happens, the compiler is free to do whatever it wants. It is perfectly legal for LLVM to define r to be, say, 42 in this code; it is not required to preserve the fact that the idiv instruction on x86 and x86-64 will
2008 Dec 16
2
"could not find function" error in "R CMD check"
Hi, All: What might cause "R CMD check" to report, "could not find function" for a function that has long been in the 'fda' package? Both Jim Ramsay in Ottawa, Canada, and I in San Jose, CA, get this same error. I replicated it with a fresh, anonymous checkout from R-Forge (svn checkout svn://svn.r-forge.r-project.org/svnroot/fda). With this, I did
2005 Apr 26
0
psy version 0.65 released
Dear R users, psy version 0.65 is on CRAN now. psy provides several methods used in psychometry (kappa, icc, cronbach, screeplot (with simulations), non linear mapping, etc.) A bug has been fixed in function wkappa (weighted kappa): in particular circumstances, the 2*2 table presented levels in an order different to what was suggested in the help file. The fpca function (PCA plot with
2013 Apr 05
0
[LLVMdev] Integer divide by zero
On Apr 5, 2013, at 1:42 PM, Cameron McInally <cameron.mcinally at nyu.edu> wrote: > On Fri, Apr 5, 2013 at 2:40 PM, Joshua Cranmer 🐧 <Pidgeot18 at gmail.com> wrote: > ... > Per C and C++, integer division by 0 is undefined. That means, if it happens, the compiler is free to do whatever it wants. It is perfectly legal for LLVM to define r to be, say, 42 in this code; it is
2015 Dec 17
5
Assistance much appreciated
I have been struggling with this error message - and think I finally understand its context. Line-by-line debugging shows me the function works:
...
> saveRDS(val, mapfile)
> val
$variables
$variables$IANA_HTTP_status_code_db
[1]    0 1256
$variables$IANA_URI_scheme_db
[1] 1256 3458
$variables$table_of_HTTP_status_codes
[1] 4714  830
$references
named list()
$compressed
2013 Apr 05
0
[LLVMdev] Integer divide by zero
On 4/5/2013 1:23 PM, Cameron McInally wrote:
> Hey guys,
>
> I'm learning that LLVM does not preserve faults during constant folding. I realize that this is an architecture dependent problem, but I'm not sure if it's safe to constant fold away a fault on x86-64.
>
> A little testcase:
>
> #include <stdio.h>
>
> int foo(int j, int d) {
>
2018 Mar 21
0
Error in GDCprepare step of TCGAbiolinks
Dear Sir/Madam, I'm using R 3.4.4 and the TCGAbiolinks package for the analysis of GDC data. To date I have reinstalled R and RStudio five times, but an error occurs when I analyze the GDC data at the GDCprepare step. The data I am using are not legacy data from the GDC data portal. I think the problem is with my laptop only, because I have run the same commands on another PC and there was no
2013 Apr 05
3
[LLVMdev] Integer divide by zero
Hey guys, I'm learning that LLVM does not preserve faults during constant folding. I realize that this is an architecture dependent problem, but I'm not sure if it's safe to constant fold away a fault on x86-64. A little testcase:
#include <stdio.h>
int foo(int j, int d) {
  return j / d;
}
int bar(int k, int d) {
  return foo(k + 1, d);
}
int main( void ) {
  int r =
2005 Jun 07
0
Smooth monotone estimation on R
Hello, we would like to apply monotone smoothing to our data, which follow a non-linear function. We followed the example posted on the web, but in our case it did not work: we always get a straight line in response. Which parameters should we change?
ind.basis = create.bspline.basis(c(min(time), max(time)), nbasis = 38, norder = 4)
Wfdob =
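A minimal sketch of monotone smoothing with fda, assuming vectors time and y of equal length; the nbasis, Lfdobj and lambda values here are illustrative and typically need tuning for a given data set:

library(fda)
rng    <- c(min(time), max(time))
wbasis <- create.bspline.basis(rng, nbasis = 38, norder = 4)
WfdPar <- fdPar(fd(matrix(0, 38, 1), wbasis), Lfdobj = 2, lambda = 0.1)
fit    <- smooth.monotone(time, y, WfdPar)
yhat   <- fit$beta[1] + fit$beta[2] * eval.monfd(time, fit$Wfdobj)   # fitted values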
2015 Dec 17
3
Assistance much appreciated
On 17/12/2015 9:06 AM, Michael Felt wrote:
> More experimenting with calling commands:
>
> tools:::foobar()
> Error: Line starting 'Package: tools ...' is malformed!
>
> tools::foobar()
> Error: Line starting 'Package: tools ...' is malformed!

These both do a loadNamespace("tools").

> Tools:::foobar()
> Error in loadNamespace(name)
2008 Nov 07
1
Problems with packages fda and splines (PR#13263)
Full_Name: David D Degras
Version: 2.8.0
OS: Mac OS X
Submission from: (NULL) (128.135.239.11)
I have recently installed version 2.8.0 of R along with the fda package (v 2.0.2) and its dependencies (including the splines package, v 2.8.0). Here are my problems: 1) The splines package should feature functions such as predict.bs, predict.bSpline and such, and it does not! I can make calls to bs, ns,
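For what it's worth, predict.bs and friends are S3 methods that splines registers rather than exports by name, so they are normally reached through predict() on a basis object; a small sketch (not taken from the report):

library(splines)
b <- bs(women$height, df = 5)   # B-spline basis matrix for a built-in data set
predict(b, newx = 65)           # dispatches to the unexported predict.bs method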
2013 Apr 03
1
prop.test vs hand calculated confidence interval
Hi, This code:
n = 40
x = 17
phat = x/n
SE = sqrt(phat*(1-phat)/n)
zstar = qnorm(0.995)
E = zstar*SE
phat + c(-E, E)
Gives this result:
[1] 0.2236668 0.6263332
The TI graphing calculator gives the same result. Whereas this test:
prop.test(x, n, conf.level = 0.99, correct = FALSE)
Gives this result:
0.2489036 0.6224374
I'm wondering why there is a difference. D.
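The gap comes from the interval type: the hand calculation above is the Wald interval, while prop.test(correct = FALSE) returns the Wilson score interval. A short sketch that reproduces prop.test's limits from the same n, x and zstar:

n      <- 40
x      <- 17
phat   <- x / n
zstar  <- qnorm(0.995)
centre <- (phat + zstar^2 / (2 * n)) / (1 + zstar^2 / n)
half   <- zstar * sqrt(phat * (1 - phat) / n + zstar^2 / (4 * n^2)) / (1 + zstar^2 / n)
centre + c(-half, half)   # 0.2489036 0.6224374, matching prop.test()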
2008 May 13
1
Likelihood between observed and predicted response
Hi, I have two fitted models: a binomial model on presence-absence data that predicts the probability of presence, and a Gaussian model (normal or log-normal abundances). I would like to evaluate these models not on their goodness of fit but on their predictive ability, by calculating the (log-)likelihood of the observed values given the predicted values for each type of model. I found
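One simple way to do this is to sum the log densities of the observations under the predictions; a sketch with illustrative object names (obs01 for observed presence/absence, pred_prob for predicted probabilities, obs_abund and pred_abund for the abundance model):

loglik_binom <- sum(dbinom(obs01, size = 1, prob = pred_prob, log = TRUE))
sigma_hat    <- sd(obs_abund - pred_abund)    # plug-in residual sd for the Gaussian model
loglik_gauss <- sum(dnorm(obs_abund, mean = pred_abund, sd = sigma_hat, log = TRUE))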
2009 Mar 12
1
Cross-validation -> lift curve
Hi all, I'd like to do cross-validation on lm and get the resulting lift curve/table (or, alternatively, the estimates on 100% of my data with which I can get lift). If such a thing doesn't exist, could it be derived using cv.lm, or would we need to start from scratch? Thanks! -- Eric Siegel, Ph.D. President Prediction Impact, Inc. Predictive Analytics World Conference More info:
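One way to get there without extra packages is to compute out-of-fold predictions from lm() by hand and bin them into deciles; a sketch, with dat and its response column y as illustrative names:

set.seed(1)
k      <- 10
fold   <- sample(rep(1:k, length.out = nrow(dat)))    # random fold assignment
cvpred <- numeric(nrow(dat))
for (i in 1:k) {
  fit <- lm(y ~ ., data = dat[fold != i, ])
  cvpred[fold == i] <- predict(fit, newdata = dat[fold == i, ])
}
decile <- cut(rank(-cvpred), breaks = 10, labels = FALSE)   # 1 = highest predicted values
lift   <- tapply(dat$y, decile, mean) / mean(dat$y)         # lift per decile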