similar to: help needed with help

Displaying 20 results from an estimated 3000 matches similar to: "help needed with help"

2010 Oct 28
3
help with help()
Hi all. Just this morning I upgraded to R 2.12.0 (for Mac OS X 10.6.4). All went well until I needed to run help() or help.search() in my session, which I'm running within Emacs (ESS 5.3.7). Say I need help with the command 'density'. When I type help(density) or ?density, the ESS help buffer opens; it is titled *help[R](density)*, but it contains only a couple of lines saying,
2010 Sep 18
1
help manual on R on ESS
Hi folks,
Why is "R on ESS" unable to pop up the help manual?
?layout
Error in help("layout", htmlhelp = FALSE) : unused argument(s) (htmlhelp = FALSE)
?plot
Error in help("plot", htmlhelp = FALSE) : unused argument(s) (htmlhelp = FALSE)
etc.
But these work in the R console and pop up the manual without problem. TIA B.R. Stephen L
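Newer versions of R dropped the htmlhelp argument that ESS is passing in those calls; the replacement argument is help_type. A minimal sketch of the current base-R call:

help("layout", help_type = "text")   # plain-text page, what ESS usually wants
help("layout", help_type = "html")   # HTML page in the browser
options(help_type = "text")          # or set the preference for the whole session
?layout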
2010 Jun 14
2
how to change default help settings from factory default html
Hi all. Apologies if this is a trivial question; I have searched the lists and the online help files etc. but have not managed to find anything. I recently downloaded the latest version of R, which has the help type set to htmlhelp by default (according to http://127.0.0.1:18380/library/utils/html/help.html). I would very much like to be able to access the help files when I am offline by typing
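A minimal sketch of the usual fix, assuming a recent R where the controlling option is help_type: set it to "text" in the session, or in ~/.Rprofile to make it permanent, and help pages render offline in the console.

getOption("help_type")          # "html" means pages open via the local help server
options(help_type = "text")     # plain-text help; put this line in ~/.Rprofile
?density                        # now displays as text, no browser needed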
2005 Dec 13
3
Age of an object?
It would be nice to have a date stamp on an object. In S/Splus this was always available, because objects were files. I have looked around, but I presume this information is not available.
--------------------------------------------------------------------
Trevor Hastie, hastie at stanford.edu
Professor, Department of Statistics, Stanford University
Phone:
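R keeps no creation date on objects, but a hedged sketch of a common workaround is to stamp one on yourself as an attribute (the attribute name "created" here is just an illustration):

x <- rnorm(100)
attr(x, "created") <- Sys.time()   # hand-rolled date stamp
attr(x, "created")                 # when was x made?
## for objects saved to disk, the file's timestamp is another option
saveRDS(x, "x.rds")
file.info("x.rds")$mtime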
2004 Jun 24
3
problem with model.matrix
This works:
> model.matrix(~I(pos>3),data=data.frame(pos=c(1:5)))
  (Intercept) I(pos > 3)TRUE
1           1              0
2           1              0
3           1              0
4           1              1
5           1              1
attr(,"assign")
[1] 0 1
attr(,"contrasts")
attr(,"contrasts")$"I(pos > 3)"
[1] "contr.treatment"
2004 Aug 06
2
gam --- a new contributed package
I have contributed a "gam" library to CRAN, which implements "Generalized Additive Models". This implementation follows closely the description in the GAM chapter 7 of the "white" book "Statistical Models in S" (Chambers & Hastie (eds), 1992, Wadsworth), as well as the philosophy in "Generalized Additive Models" (Hastie & Tibshirani 1990,
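A minimal usage sketch, assuming the CRAN gam package and made-up data, just to show the smoothing-spline (s) and loess (lo) terms it provides:

library(gam)
## toy data, purely illustrative
d <- data.frame(x = runif(200), z = runif(200))
d$y <- sin(2 * pi * d$x) + d$z + rnorm(200, sd = 0.2)
fit <- gam(y ~ s(x, df = 4) + lo(z), data = d)   # additive fit with two smooth terms
summary(fit)
plot(fit, se = TRUE)                             # partial-effect plots with standard errors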
2003 Sep 14
3
Re: Logistic Regression
Christoph Lehman had problems with separated data in two-class logistic regression. One useful little trick is to penalize the logistic regression using a quadratic penalty on the coefficients. I am sure there are functions in the R contributed libraries to do this; otherwise it is easy to achieve via IRLS using ridge regressions. Then even though the data are separated, the penalized
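One hedged sketch of the quadratic (ridge) penalty idea, using the later glmnet package (alpha = 0 gives a pure L2 penalty) on deliberately separated toy data:

library(glmnet)
set.seed(1)
x <- matrix(rnorm(100 * 2), 100, 2)
y <- as.integer(x[, 1] > 0)            # perfectly separated classes; glm() would warn here
fit <- glmnet(x, y, family = "binomial", alpha = 0, lambda = 0.1)
coef(fit)                              # finite, shrunken coefficients despite separation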
2005 Aug 09
8
Digest reading is tedious
Like many, I am sure, I get R-Help in digest form. It's easy enough to browse the subject lines, but then if an entry interests you, you have to embark on a tedious search or scroll to find it. It would be great to have a "clickable" digest, where the topics list is a set of pointers, and clicking on a topic takes you to that entry. I can think of at least one way to do this via
2004 Jan 07
0
Statistical Learning and Datamining course based on R/Splus tools
Short course: Statistical Learning and Data Mining Trevor Hastie and Robert Tibshirani, Stanford University Sheraton Hotel Palo Alto, CA Feb 26-27, 2004 This two-day course gives a detailed overview of statistical models for data mining, inference and prediction. With the rapid developments in internet technology, genomics and other high-tech industries, we rely increasingly more on data
2004 Jul 12
0
Statistical Learning and Data Mining Course
Short course: Statistical Learning and Data Mining Trevor Hastie and Robert Tibshirani, Stanford University Georgetown University Conference Center Washington DC September 20-21, 2004 This two-day course gives a detailed overview of statistical models for data mining, inference and prediction. With the rapid developments in internet technology, genomics and other high-tech industries, we
2005 Jan 04
0
Statistical Learning and Data Mining Course
Short course: Statistical Learning and Data Mining Trevor Hastie and Robert Tibshirani, Stanford University Sheraton Hotel, Palo Alto, California February 24 & 25, 2005 This two-day course gives a detailed overview of statistical models for data mining, inference and prediction. With the rapid developments in internet technology, genomics and other high-tech industries, we rely
2002 Feb 05
1
htmlhelp() question
I wonder if anyone who has worked on the win32 version of R could help me with a HtmlHelp question? When you're building a win32 program using mingw (in my case, cross-compiling under GNU/Linux), what import library do you use to link against the HtmlHelp() function? I have got a copy of MS's htmlhelp.lib, but mingw doesn't seem to like this format; it wants an archive in .a format.
2006 Mar 07
0
Statistical Learning and Datamining Course
Short course: Statistical Learning and Data Mining II: tools for tall and wide data Trevor Hastie and Robert Tibshirani, Stanford University Sheraton Hotel, Palo Alto, California, April 3-4, 2006. This two-day course gives a detailed overview of statistical models for data mining, inference and prediction. With the rapid developments in internet technology, genomics, financial
2006 Jan 14
0
Data Mining Course
Short course: Statistical Learning and Data Mining II: tools for tall and wide data Trevor Hastie and Robert Tibshirani, Stanford University Sheraton Hotel, Palo Alto, California, April 3-4, 2006. This two-day course gives a detailed overview of statistical models for data mining, inference and prediction. With the rapid developments in internet technology, genomics, financial
2010 Nov 04
0
glmnet_1.5 uploaded to CRAN
This is a new version of glmnet that incorporates some bug fixes and speedups.
* a new convergence criterion, which offers 10x or more speedups for saturated fits (mainly affects logistic, Poisson and Cox)
* one can now predict directly from a cv.glmnet object - see the help files for cv.glmnet and predict.cv.glmnet
* other new methods are deviance() for "glmnet" and coef() for
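A minimal sketch of predicting from the cv object, as announced above (toy data, current glmnet API assumed):

library(glmnet)
set.seed(2)
x <- matrix(rnorm(200 * 10), 200, 10)
y <- x[, 1] - 2 * x[, 2] + rnorm(200)
cvfit <- cv.glmnet(x, y)                          # cross-validated lasso path
predict(cvfit, newx = x[1:5, ], s = "lambda.min") # predictions straight from the cv object
coef(cvfit, s = "lambda.1se")                     # coefficients at the 1-SE-rule lambda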
2003 Apr 30
0
Least Angle Regression packages for R
Least Angle Regression software: LARS "Least Angle Regression" ("LAR") is a new model selection algorithm; a useful and less greedy version of traditional forward selection methods. LAR is described in detail in a paper by Brad Efron, Trevor Hastie, Iain Johnstone and Rob Tibshirani, soon to appear in the Annals of Statistics. The paper, as well as R and Splus packages, are
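A hedged usage sketch with the CRAN lars package (the diabetes data set below ships with the package):

library(lars)
data(diabetes)                                      # example data included with lars
fit <- lars(diabetes$x, diabetes$y, type = "lar")   # least angle regression path
plot(fit)                                           # coefficient paths
summary(fit)                                        # Cp along the path helps pick a model size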
2008 Jun 02
0
New glmnet package on CRAN
glmnet is a package that fits the regularization path for linear, two- and multi-class logistic regression models with "elastic net" regularization (tunable mixture of L1 and L2 penalties). glmnet uses pathwise coordinate descent, and is very fast. Some of the features of glmnet:
* by default it computes the path at 100 uniformly spaced (on the log scale) values of the
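A minimal sketch of fitting and inspecting an elastic net path (alpha mixes the L1 and L2 penalties; toy data):

library(glmnet)
set.seed(3)
x <- matrix(rnorm(100 * 20), 100, 20)
y <- rbinom(100, 1, plogis(x[, 1] - x[, 2]))
fit <- glmnet(x, y, family = "binomial", alpha = 0.5)  # even mix of L1 and L2 penalties
plot(fit, xvar = "lambda")      # coefficient paths over the default grid of 100 lambda values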