Displaying 20 results from an estimated 2000 matches similar to: "ggplot label problem"
2013 Apr 01
0
ggplot2 label
Hello all!
I have a problem plotting labels (Year) only for the significant values (in this
case spoz and sneg).
I use this code, but it does not work for the labels.
library(ggplot2)
ggplot(data1, aes(x = Year, y = value, fill = type, width = 1)) +
  geom_bar(stat = "identity", position = "identity") +
  scale_y_continuous(breaks = round(seq(-100, 100, by = 10), 10)) +
  theme_bw()
the data used is:
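One way to draw labels only for the significant bars is to add a geom_text() layer fed by a subset of the data. This is a minimal sketch, not the original poster's code; it assumes data1 has the columns Year, value and type used in the aes() call above, and that the significant rows are the ones whose type is "spoz" or "sneg":
library(ggplot2)
ggplot(data1, aes(x = Year, y = value, fill = type)) +
  geom_bar(stat = "identity", position = "identity", width = 1) +
  # label only the rows flagged as significant
  geom_text(data = subset(data1, type %in% c("spoz", "sneg")),
            aes(label = Year), size = 3, vjust = -0.5) +
  scale_y_continuous(breaks = seq(-100, 100, by = 10)) +
  theme_bw()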
2013 Apr 01
0
ggplot2 label problem
I have a problem plotting labels (Year) only for the significant values (in this
case spoz and sneg).
I use this code, but it does not work for the labels.
library(ggplot2)
ggplot(data1, aes(x = Year, y = value, fill = type, width = 1)) +
  geom_bar(stat = "identity", position = "identity") +
  scale_y_continuous(breaks = round(seq(-100, 100, by = 10), 10)) +
  theme_bw()
Thank you!
the data used is:
2013 Apr 01
0
overlapping barplot
Hello all!
I want to make a barplot with ggplot2 and show, in the same chart, the semn
values (the significant values, i.e. pointer over 50). I tried this code, but it
only plots the pointer values.
ggplot(data, aes(x = Year, y = pointer)) + geom_bar(stat = "identity")
Please help me with this problem.
I use this data:
Year variable pointer variable semn
1 1901 neg 0.00 sneg NA
2 1902 neg 0.00 sneg
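One way to show both in a single chart is to draw all the bars first and then overplot the significant ones. This is only a sketch: the column names come from the excerpt above, and the rule that the significant years are those with pointer > 50 is taken from the description, not from the (truncated) data.
library(ggplot2)
ggplot(data, aes(x = Year, y = pointer)) +
  geom_bar(stat = "identity", fill = "grey80") +
  # overplot the significant years in a different colour
  geom_bar(data = subset(data, pointer > 50),
           stat = "identity", fill = "steelblue") +
  theme_bw()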
2013 Apr 01
1
polygon error
Hello all!
I have a problem drawing a polygon in R. My data looks like this:
Year Nb.series Perc.pos Perc.neg Nature RGV_mean RGV_sd neg poz
1 1901 1 0.00 0.00 0 4.29 NA 0.00 0.00
2 1902 1 100.00 0.00 1 16.47 NA 0.00 100.00
3 1903 1 100.00 0.00 1 31.31 NA 0.00 100.00
4 1904 1 0.00 0.00 0 -9.62 NA 0.00 0.00
5 1905 1 0.00 100.00 -1 -22.55 NA -100.00 0.00
6 1906 1 0.00 100.00 -1 -12.09 NA
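With data in this shape, one option is to close each series back along the zero line and fill it with polygon(). A sketch under stated assumptions: the data frame is called d here (a made-up name) and has the Year, poz and neg columns shown above.
# draw poz and neg as filled polygons against the Year axis
with(d, {
  plot(Year, poz, type = "n", ylim = range(c(neg, poz)), ylab = "Percentage")
  polygon(c(Year, rev(Year)), c(poz, rep(0, length(Year))), col = "forestgreen")
  polygon(c(Year, rev(Year)), c(neg, rep(0, length(Year))), col = "firebrick")
})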
2013 Mar 29
1
problem with data
Hello all!
I have a problem with my data in R. When I plot the following data, the y scale
is wrong: the maximum value is about 10 degrees, but in R it comes out at about
100.
I use this code:
fasy <- read.table("gridd1.txt", sep = "\t", dec = ",", header = TRUE, row.names = 1)
# here are the years:
x <- as.numeric(rownames(fasy))
# extract a series that you want to plot:
y
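A factor-of-ten difference like this usually means the values were not read as they appear in the file. A short diagnostic sketch (the rescaling at the end is only an assumption, e.g. if the file stores tenths of a degree):
str(fasy)                        # if columns are character/factor, dec = "," does not match the file
range(fasy[, 1], na.rm = TRUE)   # compare with the expected maximum (about 10)
y <- fasy[, 1] / 10              # rescale only if the file really stores tenths of a degree
plot(x, y, type = "l")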
2013 Mar 13
2
merge data
Hello all!
I have a problem with R. I am trying to merge data like this:
structure(c(2.1785, 1.868, 2.1855, 2.5175, 2.025, 2.435, 1.809,
1.628, 1.327, 1.3485, 1.4335, 2.052, 2.2465, 2.151, 1.7945, 1.79,
1.6055, 1.616, 1.633, 1.665, 2.002, 2.152, 1.736, 1.7985, 1.9155,
1.7135, 1.548, 1.568, 1.713, 2.079, 1.875, 2.12, 2.072, 1.906,
1.4645, 1.3025, 1.407, 1.5445, 1.437, 1.463, 1.5235, 1.609, 1.738,
1.478,
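The excerpt is cut off before it shows how the two objects are keyed, so this is only a generic sketch: merge() joins two data frames on a common column (a shared Year column is assumed here).
d1 <- data.frame(Year = 1901:1905, a = rnorm(5))
d2 <- data.frame(Year = 1903:1907, b = rnorm(5))
merged <- merge(d1, d2, by = "Year", all = TRUE)   # keep all years from both
merged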
2008 Nov 04
1
perform Kruskal-Wallis test without using the built-in command in R
Hi,
Again I am stuck with my presentation, and I have never learned R before, but I
need this to be done, so please help me out as a favour:
http://www.nabble.com/file/p20333155/kew.dat kew.dat
Run this in R and the following comes up:
Month Year Rain
1 Jan 1900 74.400000
2 Feb 1900 80.500000
3 Mar 1900 23.600000
4 Apr 1900 23.600000
5 May 1900 25.100000
6
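The Kruskal-Wallis statistic can be computed directly from ranks. A sketch, ignoring the tie correction for brevity, and assuming the data above are already in a data frame kew with columns Rain and Month:
N  <- nrow(kew)
r  <- rank(kew$Rain)                     # rank all observations together
Ri <- tapply(r, kew$Month, sum)          # rank sum within each month
ni <- tapply(r, kew$Month, length)       # group sizes
H  <- 12 / (N * (N + 1)) * sum(Ri^2 / ni) - 3 * (N + 1)
p  <- pchisq(H, df = length(ni) - 1, lower.tail = FALSE)
c(H = H, p.value = p)
kruskal.test(Rain ~ Month, data = kew)   # cross-check with the built-in test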
2013 Apr 07
2
group data in classes
Hello all!
I have a problem grouping my data (years) into 10-year classes. For example,
for each year:
year decade
1598 1590-1600
1599 1590-1600
1600 1590-1600
1601 1600-1610
---
My data is like this:
[1] 1598 1599 1600 1601 1602 1603 1604 1605 1606 1607 1608 1609 1610 1611
1612
[16] 1613 1614 1615 1616 1617 1618 1619 1620 1621 1622 1623 1624 1625 1626
1627
[31] 1628 1629 1630 1631 1632 1633
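cut() does this directly. A sketch that matches the example above (1600 falls in "1590-1600", so the intervals are closed on the right); year is the numeric vector shown:
breaks <- seq(1590, 1640, by = 10)
decade <- cut(year, breaks = breaks,
              labels = paste(head(breaks, -1), tail(breaks, -1), sep = "-"))
data.frame(year, decade)   # e.g. 1598 -> 1590-1600, 1601 -> 1600-1610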
2012 Jun 06
2
package zoo, function na.spline with option maxgap -> Error: attempt to apply non-function?
Hello,
I'm trying to use na.spline (package zoo) to fill some missing data in a time series.
This works fine; however, if I apply the 'maxgap' argument, I always get the error:
<------
Error in na.spline.vec(x., coredata(object.), xout = xout., ...) : attempt to apply non-function
------>
I couldn't find a similar error for this case in the mailing lists and zoo vignette,
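One possible workaround (an assumption, not an official fix, and assuming the series is a zoo object called z): spline-fill everything with na.spline() and use na.approx()'s maxgap handling only to decide which gaps are too long to keep filled.
library(zoo)
z.filled <- na.spline(z, na.rm = FALSE)                     # fill every gap by spline
too.long <- is.na(na.approx(z, maxgap = 3, na.rm = FALSE))  # positions in gaps longer than maxgap (here 3)
z.filled[too.long] <- NA                                    # leave those gaps as NA
# note: leading/trailing NAs are also left unfilled by this mask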
2013 Mar 12
5
extract values
Hello all!
I have a problem extracting values greater than, for example, 1820.
I tried this code: x1 <- x[x[, 1] > 1820, ]
Please help me!
Thank you!
The data structure is:
structure(c(2.576, 1.728, 3.434, 2.187, 1.928, 1.886, 1.2425,
1.23, 1.075, 1.1785, 1.186, 1.165, 1.732, 1.517, 1.4095, 1.074,
1.618, 1.677, 1.845, 1.594, 1.6655, 1.1605, 1.425, 1.099, 1.007,
1.1795, 1.3855, 1.4065, 1.138, 1.514,
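Two short sketches, depending on where the years live (the dput excerpt is cut off, so both are assumptions):
# years stored in the first column of a data frame or matrix:
x1 <- x[x[, 1] > 1820, ]
# years stored as row names (common for tree-ring series):
x1 <- x[as.numeric(rownames(x)) > 1820, ]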
2016 Mar 12
0
Regression in strptime
OK, .Internal is not necessary to reproduce the oddity in this area. I also see things like (notice 1980):
> strptime(paste0(sample(1900:1999,80,replace=TRUE),"/01/01"), "%Y/%m/%d", tz="CET")
[1] "1942-01-01 CEST" "1902-01-01 CET" "1956-01-01 CET" "1972-01-01 CET"
[5] "1962-01-01 CET" "1900-01-01 CET"
2011 May 30
0
gls and phi1 >1 (phi larger than one)
Dear all,
I am stuck with a problem that might be trivial for most of you (and
therefore is a bit embarrassing for me...):
I want to calculate a generalized least squares regression using two
time series (Y depending on X) with an autoregressive correlation
structure of order two (the data along time are given below). I use
'gls' from package 'nlme':
Calib.gls <- gls(Y~X,
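A sketch of how an AR(2) error structure is typically specified with gls(); the data frame name calib is a placeholder, since the rest of the original call is not shown above:
library(nlme)
Calib.gls <- gls(Y ~ X, data = calib,
                 correlation = corARMA(p = 2, q = 0))   # AR(2) errors
summary(Calib.gls)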
2016 Jun 02
1
[PATCH -next 2/2] virtio_net: Read the advised MTU
Hi,
[auto build test ERROR on next-20160602]
url: https://github.com/0day-ci/linux/commits/Aaron-Conole/virtio-net-Advised-MTU-feature/20160603-000714
config: i386-allmodconfig (attached as .config)
compiler: gcc-6 (Debian 6.1.1-1) 6.1.1 20160430
reproduce:
# save the attached .config to linux build tree
make ARCH=i386
Note: the
2016 Mar 12
2
Regression in strptime
On 3/12/16 12:33 AM, peter dalgaard wrote:
>> On 12 Mar 2016, at 00:05 , Mick Jordan <mick.jordan at oracle.com> wrote:
>>
>> This is definitely obscure but we had a unit test that called .Internal(strptime, "1942/01/01", %Y/%m/%d") with timezone (TZ) set to CET.
> Umm, that doesn't even parse. And fixing the typo, it doesn't run:
>
>>
2013 Aug 26
4
transform variables
Dear all!
I have a data frame composed of 13 columns (year and 12 months). I want to
transform this data set into another one like this:
year month values
1901 1
1901 2
1901 3
.....
1901 12
1902 1
1902 2
....
1902 12
Is there a way to do that in R?
Thank you!
best regards!
CR
--
---
Catalin-Constantin ROIBU
Lecturer PhD, Forestry engineer
Forestry Faculty of Suceava
Str.
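A wide-to-long reshape does this. A base-R sketch; wide and its column names (year plus twelve month columns) are assumptions based on the description above:
long <- reshape(wide, direction = "long",
                idvar = "year", varying = names(wide)[-1],
                v.names = "values", timevar = "month", times = 1:12)
long <- long[order(long$year, long$month), ]   # year 1901 months 1..12, then 1902, ...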
2004 May 13
0
Rprof ignores top-level computation (PR#6883)
Full_Name: John Garvin
Version: 1.9.0
OS: Linux
Submission from: (NULL) (128.42.129.78)
This may or may not technically be a bug, but it's certainly an annoyance.
Rprof only takes into account computation that occurs inside functions. If a
time-consuming operation occurs outside a function, it doesn't record the time
it takes. Consider this program 'array.r':
Rprof()
foo <-
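The usual workaround (a sketch, not from the original report) is to wrap the top-level computation in a function so that Rprof() can attribute the time to something:
Rprof("array.out")
foo <- function() {
  m <- matrix(rnorm(1e6), nrow = 1000)       # stand-in for the expensive work
  solve(crossprod(m) + diag(1000))
}
foo()
Rprof(NULL)
summaryRprof("array.out")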
2018 Apr 22
0
[Bug 351] Conntrack loses connection entries
https://bugzilla.netfilter.org/show_bug.cgi?id=351
Shane <arlenslambert at gmail.com> changed:
What                      |Removed  |Added
------------------------------------------------------------
CC                        |         |arlenslambert at gmail.com
Attachment #134 is patch  |0        |1
2009 Oct 15
1
"Complex?" import of pdf files (criminal records) into R table
Hi there,
I'm trying to decide whether it would be possible to transform several more or
less complex PDF files into an R table format, or whether it has to be done
manually. I think it would be impudent to expect a complete solution, but I
would be grateful if anyone could give me advice on what the structure of such
an R program could look like, and whether it's possible at all.
Here
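One possible starting point (an assumption, not from the original thread): extract the raw text with the pdftools package and then parse it with regular expressions, which works when the records follow a regular layout.
library(pdftools)
txt   <- pdf_text("record.pdf")                  # hypothetical file; one string per page
lines <- unlist(strsplit(txt, "\n"))             # split pages into lines
lines <- trimws(lines[nzchar(trimws(lines))])    # drop empty lines
head(lines)                                      # inspect before writing parsing rules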
2011 Jul 07
2
subset from a dataset after comparing its one column to a related vector
Hello R users,
I have two data sets like the following. Form of dataset:
data:
X1 X2 X3 X4 X5
1902 RE 3 594 9
1903 RE 3 1340 7
1904 AA 3 760 14
1908 RE 4 1759 18
1909 EX 2 387 1
2901 AU 6 3116
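A sketch of the usual idiom for this kind of subsetting, where keep.years is a made-up name for the related vector of X1 values:
keep.years <- c(1902, 1904, 1909)            # hypothetical example vector
subset(data, X1 %in% keep.years)
# or equivalently: data[data$X1 %in% keep.years, ]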