Displaying 20 results from an estimated 2000 matches similar to: "venneuler() - customize a few things"
2010 Oct 07
1
venneuler() - customize a few things.
Esteemed UseRs and DevelopeRs,
Just coming to terms with the very attractive proportional Venn
generator, venneuler(), but I would like to customize a few things.
Is it possible to:
- suppress all circle labels?
- suppress only certain circle labels?
- print specific text strings at specified locations within the circles
  and unions?
- specify circle colors?
- specify label font, size & color?
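The replies are not shown here, but for reference, venneuler's fitted object and plot method do allow some of this. A minimal hedged sketch, assuming the venneuler package (and its rJava/Java dependency) is installed; component and argument names should be checked against ?plot.VennDiagram in your installed version:

```r
# Hedged sketch: customizing a venneuler plot. The fitted object exposes
# $labels, $colors, and $centers, and plot() takes col/alpha/cex arguments;
# details may differ by package version.
library(venneuler)  # requires Java via rJava

v <- venneuler(c(A = 1, B = 1, "A&B" = 0.4))
v$labels <- c("", "B only")                 # blank an entry to suppress that circle's label
plot(v, col = c("steelblue", "tomato"),     # explicit colors instead of 0-1 values
     alpha = 0.4, cex = 1.2)
# Extra text at chosen positions can be layered on with text();
# circle centers in v$centers help place strings inside circles or unions.
text(v$centers[1, 1], v$centers[1, 2], "A-specific items", cex = 0.8)
```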
2010 Oct 10
1
venneuler (java?) color palette 0 - 1
Dear UseRs and DevelopeRs
It would be helpful to see the color palette available in the
venneuler() function.
The relevant part of ?venneuler states:
"colors: colors of the circles as values between 0 and 1"
-which explains color specification, but from what palette? Short of
trial and error, I'd really appreciate it if someone could help me
locate a "0 - 1" palette.
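For what it's worth, the 0-1 values appear to be hues: venneuler's plot method maps each value through a default color function of roughly the form hcl(col * 360, 130, 60) (worth confirming against ?plot.VennDiagram in your installed version). A base-R sketch of that candidate palette:

```r
# Hedged sketch: render the candidate 0-1 venneuler palette, assuming the
# default mapping is hcl(col * 360, 130, 60); only base grDevices is needed.
vals <- seq(0, 1, by = 0.1)
swatch <- sapply(vals, function(col) hcl(col * 360, 130, 60))
barplot(rep(1, length(swatch)), col = swatch, names.arg = vals,
        axes = FALSE, main = "venneuler 0-1 color values (assumed hcl mapping)")
```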
2009 Jul 13
0
adjusting survival using coxph
I have what I *think* should be a simple problem in R, and hope
someone might be able to help me.
I'm working with cancer survival data, and would like to calculate
adjusted survival figures based on the age of the patient and the
tumour classification. A friendly statistician told me I should use
Cox proportional hazards to do this, and I've made some progress with
using the
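The snippet is cut off, but the usual route is survfit() on a fitted coxph model. A minimal sketch using the survival package's bundled lung data, with age and ph.ecog standing in for the poster's age and tumour classification:

```r
# Hedged sketch: covariate-adjusted survival curves from a Cox model,
# using survival's built-in lung data as a stand-in for the poster's data.
library(survival)

fit <- coxph(Surv(time, status) ~ age + ph.ecog, data = lung)
# Predicted survival curve for a chosen covariate profile:
nd <- data.frame(age = 60, ph.ecog = 1)
sf <- survfit(fit, newdata = nd)
plot(sf, xlab = "Days", ylab = "Adjusted survival probability")
```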
2011 Oct 02
1
generating Venn diagram with 6 sets
Dear r-helpers,
Here I would like to have your kind helps on generating Venn diagram.
There are some packages within R for this task, like venneuler, VennDiagram,
and Vennerable. But Vennerable cannot be installed on my MacBook, it seems
VennDiagram cannot work on my data, and venneuler may have generated a
wrong Venn diagram for me.
Do you have any experience/expertise with these Venn diagram packages?
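For what it's worth, venneuler() does accept more than five sets (unlike VennDiagram's venn.diagram(), which tops out at five), though an area-proportional fit with six sets is usually only approximate. A hedged sketch with made-up weights:

```r
# Hedged sketch: a 6-set diagram via venneuler (requires Java/rJava).
# The weights below are invented; "A&B"-style names give pairwise intersections.
library(venneuler)
sets <- c(A = 10, B = 12, C = 8, D = 9, E = 11, F = 7,
          "A&B" = 3, "B&C" = 2, "C&D" = 2, "D&E" = 3, "E&F" = 1, "A&F" = 2)
v <- venneuler(sets)
plot(v)
v$stress  # larger stress indicates a poorer area-proportional fit
```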
2012 Aug 30
1
segfault in gplots::heatmap.2
Hi all,
I am taking this over from r-help (see
http://permalink.gmane.org/gmane.comp.lang.r.general/273985).
I experience a segfault when calling gplots::heatmap.2(), but only when
certain other packages are loaded.
I am not sure of the correct place to send this bug report. Should I send
it to the package maintainers directly? If R-help is the wrong place,
please feel free to direct me to
2000 Feb 07
1
demo(nlm) error under R 0.99.0
I can't seem to get demo(nlm) to run under R version 0.99.0.
Anyone know a solution?
> fgh <- function(x) {
gr <- function(x1, x2) {
c(-400 * x1 * (x2 - x1 * x1) - 2 * (1 - x1), 200 * (x2 -
x1 * x1))
}
h <- function(x1, x2) {
a11 <- 2 - 400 * (x2 - x1 * x1) + 800 * x1 * x1
a21 <- -400 * .... [TRUNCATED]
> nlm(fgh,
2000 Feb 15
1
R installation
I've decided to install R in a directory other than /usr/local, but
I'm having difficulty setting the correct paths, i.e. PATH or path.
I've set my R_HOME, and sure enough there's a bin and lib directory
there, but R just won't execute (something like "GUI 'X11' is not
supported"). However, when R is installed in /usr/local everything is
OK... Any help or comments
2012 Aug 30
2
segfault in gplots::heatmap.2
Hi all,
I experience a segfault when calling gplots::heatmap.2(), but only when
certain other packages are loaded.
I am not sure of the correct place to send this bug report. Should I send
it to the package maintainers directly? If R-help is the wrong place,
please feel free to direct me to the correct one.
I am on debian (testing) linux 64 with the binary R distribution
from the
2002 Oct 10
1
make check when installing R-1.6.0
This is the result of my make check; could anyone help me out with
this one?
Formats: text example
running code in 'base-Ex.R' ...*** Error code 1
make: Fatal error: Command failed for target `base-Ex.Rout'
Current working directory /apps/R/R-1.6.0/tests/Examples
*** Error code 1
make: Fatal error: Command failed for target `test-Examples-Base'
Current working directory
2000 Feb 08
7
demo(dyn.load) error in R 0.99.0
I noticed this error in my demo from previous versions as well as
R 0.99.0. Is there a way around this one also? Thanks in advance...
> demo(dyn.load)
demo(dyn.load)
---- ~~~~~~~~
Type <Return> to start :
> dyn.load(file.path(R.home(), "demos", "dynload", paste("zero",
.Platform$dynlib.ext, sep = "")))
Error in
2009 Jul 10
1
[LLVMdev] fix for typo in llvm-c/Core.h
It is missing a *.
Peter
--
Peter O'Gorman
http://pogma.com
Index: include/llvm-c/Core.h
===================================================================
--- include/llvm-c/Core.h (revision 75249)
+++ include/llvm-c/Core.h (working copy)
@@ -853,7 +853,7 @@
template<typename T>
inline T **unwrap(LLVMValueRef *Vals, unsigned Length) {
#if DEBUG
- for
2005 Nov 16
1
PPC package-ppc.read.raw.nobatch (PR#8316)
Full_Name: Martin O'Gorman
Version:
OS:
Submission from: (NULL) (84.176.63.149)
I have been looking at the PPC package and have a question. As the input data is
comma separated, shouldn't the command to read in the raw (no batch) mass spec
data indicate that sep=',' (marked below)? Otherwise, the data read in is the
pair of values (m/z,intensity). It is not obvious why that should be.
2000 Apr 17
1
xgobi
I can't seem to get the examples running. Can anyone offer a
solution?
Thanks in advance; this is what happens.
> data(laser)
> xgobi(laser)
xgobi -title 'laser' -std mmx /tmp/xgobi-laserR7316S41c6 &
> Neither the file /tmp/xgobi-laserR7316S41c6 nor /tmp/xgobi-laserR7316S41c6.dat exists
2000 Apr 24
1
make problems
I'm having trouble making the manuals as per the INSTALL instructions.
Anyone have any hints as to what's wrong?
<123>-> make dvi
DVI/LaTeX documentation: reference index ...
*** Error code 255
make: Fatal error: Command failed for target `refman.dvi'
2000 Feb 29
1
R-1.0.0 make error
I'm getting this error when I try to make after configure; any
solutions? Thanks in advance...
make[2]: Entering directory `/export/trichodon2/R/R-1.0.0/demos/dynload'
/usr/local/bin/make zero.so
make[3]: Entering directory `/export/trichodon2/R/R-1.0.0/demos/dynload'
/export/trichodon2/R/R-0.99.0a/bin/SHLIB:
/export/trichodon2/R/R-0.99.0a/bin/SHLIB: cannot open
make[3]: ***
2014 Jul 06
1
[Bug 80980] New: Suspend is broken on NV11
https://bugs.freedesktop.org/show_bug.cgi?id=80980
Priority: medium
Bug ID: 80980
Assignee: nouveau at lists.freedesktop.org
Summary: Suspend is broken on NV11
QA Contact: xorg-team at lists.x.org
Severity: normal
Classification: Unclassified
OS: Linux (All)
Reporter: a.p.huntley at btinternet.com
1997 Dec 08
0
R-beta: Version 6 of R: Use Gnu Make
make help (and make latex, make html) fails with:
"can't make target ../src/library/*/man/*.Rd"
with SUN's make under Solaris 2.5.1
However, Gnu-make 3.74 works ok.
---------------------
Ina Dau
Computer Administrator - Room 101 - Pearson Building - UCL
email: i.dau at ucl.ac.uk
Phone: +44-171-4193636
snail: Dep. of Statistical Sciences, University College London
Gower
2007 Jun 08
0
help.search and Bayesian regression
Hi there,
Two questions.
1) Is there any possibility to look up the help pages within R for more
complex combinations of character strings, for example "Bayesian" AND
"regression" but not necessarily "Bayesian regression"?
2) Is there a package/command that does fully Bayesian linear regression
(if possible with variable selection)?
Thanks,
Christian
*** --- ***
2010 Jan 19
1
Sampling theory
Hi there,
are there any R packages for computations required in sampling theory
(such as confidence intervals under random, stratified, or cluster
sampling)? I'd be particularly interested in confidence intervals for the
population variance, which is difficult enough to find even in books.
Thanks,
Christian
*** --- ***
Christian Hennig
University College London, Department of Statistical
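A sketch of where one might start on that last question: the survey package handles design-based inference under stratified (and cluster) sampling, and svyvar() gives a design-based variance estimate. The data frame and its columns below are invented for illustration:

```r
# Hedged sketch: stratified-design confidence intervals with the survey
# package; the data (strata 'st', weights 'w', response 'y') are made up.
library(survey)
set.seed(1)
df <- data.frame(st = rep(c("a", "b"), each = 50),
                 w  = rep(c(2, 4), each = 50),
                 y  = rnorm(100))
des <- svydesign(ids = ~1, strata = ~st, weights = ~w, data = df)
m <- svymean(~y, des)
confint(m)       # CI for the population mean
svyvar(~y, des)  # design-based estimate of the population variance
```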