Displaying 20 results from an estimated 1100 matches similar to: "Exporting PDF"
2010 Jan 07
2
R treating time
Hi all,
I have imported a value 3:00 from Excel into R using read.csv. I want R to
recognise it as 3:00am (time data). How do I do it?
Thanks in advance,
Chris
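A minimal sketch of one way to get there: strptime() parses the imported strings on a 24-hour clock, so "3:00" comes out as 03:00, i.e. 3:00am (the vector below is made up):
times <- c("3:00", "15:30")                 # hypothetical values as imported by read.csv
parsed <- strptime(times, format = "%H:%M") # POSIXlt times on today's date
format(parsed, "%I:%M %p")                  # "03:00 AM" "03:30 PM"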
2010 Apr 23
4
Remove duplicated rows
Hi all,
I have a dataset similar to the following
Name Date Value
A 1/01/2000 4
A 2/01/2000 4
A 3/01/2000 5
A 4/01/2000 4
A 5/01/2000 1
B 6/01/2000 2
B 7/01/2000 1
B 8/01/2000 1
I would like R to remove duplicates based on columns 1 and 3 only. In
addition, a row should only count as a duplicate of the row immediately
above or below it. For example, for A, I would like to remove row 2 only
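A minimal sketch of one way to do this, assuming the data frame is called df (hypothetical name) with the columns shown above:
# keep a row unless Name and Value both match the row directly above it
same_as_previous <- c(FALSE, df$Name[-1] == head(df$Name, -1) &
                             df$Value[-1] == head(df$Value, -1))
df_clean <- df[!same_as_previous, ]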
2010 Jan 08
2
R exponential regression
Hi all,
I have a dataset which consists of 2 columns. I'd like to plot them on an x-y
scatter plot and fit an exponential trendline. I'd like R to determine the
equation for the trendline and display it on the graph.
Since I am new to R (and statistics), any advice on how to achieve this will
be greatly appreciated.
Many thanks,
Chris
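A minimal sketch, assuming the two columns are named x and y in a data frame called dat (names and data made up): a log-linear lm() fit supplies starting values, nls() fits y = a * exp(b * x), and the fitted equation is written onto the plot.
dat <- data.frame(x = 1:20)
dat$y <- 2 * exp(0.3 * dat$x) + rnorm(20)       # made-up example data
start_fit <- lm(log(y) ~ x, data = dat)         # log-linear fit just for starting values
fit <- nls(y ~ a * exp(b * x), data = dat,
           start = list(a = exp(coef(start_fit)[1]), b = coef(start_fit)[2]))
plot(y ~ x, data = dat)
curve(coef(fit)["a"] * exp(coef(fit)["b"] * x), add = TRUE)
legend("topleft", bty = "n",
       legend = sprintf("y = %.3g * exp(%.3g * x)", coef(fit)["a"], coef(fit)["b"]))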
2010 Apr 20
3
Words appear to be bolded in the PDF output
Hi all,
I have written a note near each of my graphs using mtext.
mtext(text,side=1,line=4,cex=0.5,adj=0)
Then I have exported the graphs as a PDF file.
pdf(file=name,paper='a4',width=7.27,height=10.69)
The mtext appears OK in R. But it looks like it is bolded in the PDF file.
http://n4.nabble.com/file/n2016971/graph.png
I am not sure if this is actually my monitor/computer's
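For reference, a minimal reproduction of the export sequence described above (file name made up). dev.off() is needed to finish writing the PDF, and it is worth opening the result in more than one viewer: small text drawn with cex=0.5 is often rendered heavier by a viewer's anti-aliasing than it looks on the screen device.
pdf(file = "graph.pdf", paper = "a4", width = 7.27, height = 10.69)
plot(1:10)
mtext("note under the graph", side = 1, line = 4, cex = 0.5, adj = 0)
dev.off()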
2009 Sep 14
3
Exporting Numerous Graphs
Hi all,
I have got 27 graphs to export (not a lot... I know!). How can I fit all of
them into a single file, like a PNG, without adjusting the size of the graphs?
What I have in mind is something like pasting graphs into Word, where I can just
scroll down to view the graphs.
Thanks for your attention. Much appreciated.
Chris
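A single PNG can only hold one page, but the pdf() device keeps adding pages: every plot drawn while it is open becomes a new page in one scrollable file. A minimal sketch (file name and plots are stand-ins):
pdf("all_graphs.pdf", paper = "a4", width = 7.27, height = 10.69)
for (i in 1:27) {
  plot(rnorm(100), main = paste("Graph", i))    # stand-in for the real 27 graphs
}
dev.off()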
2008 May 20
7
Problems sending large results with backgroundrb
I'm working on an application that does extensive database searching.
These searches can take a long time, so we have been working on moving
the searches to a backgroundrb worker task so we can provide a sexy AJAX
progress bar, and populate the search results as they are available.
All of this seems to work fine until the size of the search results gets
sufficiently large, when we start
2001 Jun 15
2
openssh 2.9p1: data loss when stdout sent to a pipe
We recently tried upgrading openssh from 2.5.2p2 to 2.9p1
and discovered that it no longer worked to feed the output from a remote
command into a pipe, unless the output was short and the pipe was very
fast at processing its input.
Example 1: ssh remote_machine some_command | less
(where "some_command" generates a lot of output) now fails after
the first screenful, with a
2010 Apr 21
3
User inputs
Hi everyone,
I have been searching for answers for the following questions but I don't
have much success. The following questions may actually be quite simple. Any
help would be greatly appreciated.
(1) I have written a script which requires user input. I am using the
readline() command. However, every time I run the script, R does not wait
for the user input and proceeds to the next line.
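One relevant detail: readline() only waits for input in an interactive session; run non-interactively it returns "" immediately. A minimal sketch of a prompt that also works under Rscript (prompt text made up):
get_input <- function(prompt = "Enter a value: ") {
  if (interactive()) {
    readline(prompt)
  } else {
    cat(prompt)
    readLines(con = "stdin", n = 1)             # read one line from standard input
  }
}
ans <- get_input()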
2013 Aug 27
1
[plyr] Moving average filter with plyr
Dear all,
I'm stuck with a problem using plyr to process a rather large chunk of data. What I'm trying to do is apply a moving average to all the subparts of the dataframe (the example data can be found here https://dl.dropboxusercontent.com/u/2414056/testData.Rdata).
require(plyr)
load("testData.Rdata")
applyfilter <- function(x) {
  # 5-point moving average via stats::filter
  filter(x, rep(1/5, times = 5))
}
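With plyr loaded as above, a minimal sketch of applying that filter within each subgroup via ddply(); the data frame and column names (dat, id, value) are stand-ins, since the real names sit inside the .Rdata file:
dat <- data.frame(id = rep(c("a", "b"), each = 20), value = rnorm(40))  # stand-in data
smoothed <- ddply(dat, .(id), transform,
                  value_smooth = as.numeric(filter(value, rep(1/5, times = 5))))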
2011 Oct 18
1
How to read data sequentially into R (line by line)?
I have a data set like this in one .txt file (cols separated by !):
APE!KKU!684!
APE!VAL!!
APE!UASU!!
APE!PLA!1!
APE!E!10!
APE!TPVA!17122009!
APE!STAP!1!
GG!KK!KK!
APE!KKU!684!
APE!VAL!!
APE!UASU!!
APE!PLA!1!
APE!E!10!
APE!TPVA!17122009!
APE!STAP!1!
GG!KK!KK!
APE!KKU!684!
APE!VAL!!
APE!UASU!!
APE!PLA!1!
APE!E!10!
APE!TPVA!17122009!
APE!STAP!1!
GG!KK!KK!
it contains over 14 000 000 records. Now
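A sketch of reading such a file in blocks over a connection, so the 14 million records never have to sit in memory at once (file name and block size are made up):
con <- file("data.txt", open = "r")
repeat {
  block <- readLines(con, n = 100000)
  if (length(block) == 0) break                 # end of file reached
  fields <- strsplit(block, "!", fixed = TRUE)  # split each record on "!"
  # process 'fields' here before reading the next block
}
close(con)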
2004 Nov 06
3
Calling CreateFile on an instance of File - possible?
Hi all,
I'm going over win32-file this weekend. I'm creating instance methods for
setting (or unsetting) the various file attributes. So, you can do
something like:
f = File.open("foo.txt")
f.archive = true
f.hidden = true
f.close
This works for the basic attributes, but it requires extra work for others.
Specifically, I am having trouble trying to set the
2007 Jul 18
1
nested for loop
Hi,
I am new to programming and R. I am reading the manual and the R books by Dalgaard and Verzani to help answer my questions, but I am unable to figure out the following:
I have a data file that contains 1080 data points. Here's a snippet of the file:
[241] 0.3603704000 0.1640741000 0.2912963000 NA 0.0159259300 0.0474074100
I would like to break the file up into 30
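A minimal sketch, assuming the goal is 30 equal chunks of 36 values each (file name made up):
x <- scan("datafile.txt")                             # the 1080 data points
chunks <- split(x, rep(1:30, each = length(x) / 30))  # list of 30 vectors of 36 values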
2017 Sep 28
5
rename multiple files by file.rename or other functions
Hi,
I have 50 files whose names are
XYZW01Genesis_ABC.mp3
XYZW02Genesis_ABC.mp3
.......
XYZW50Genesis_ABC.mp3
As you can tell, the only difference across the files is the two-digit number: 01, 02,
03, ..., 50.
I would like to rename them to
01Gen01.mp3
01Gen02.mp3
.......
01Gen50.mp3
If I store them in one folder and write an R script in that folder, how can
it be done?
Thanks,
John
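A minimal sketch with file.rename(); the regular expression pulls the two-digit number out of each old name (run from the folder holding the files):
old <- list.files(pattern = "^XYZW[0-9]{2}Genesis_ABC\\.mp3$")
new <- sprintf("01Gen%s.mp3", sub("^XYZW([0-9]{2}).*$", "\\1", old))
file.rename(old, new)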
1998 Nov 18
5
PC Backup Script?
I am trying to set up my Sun running samba 1.9.18p10 so that it can
back up my dept's PCs. I can get it to work manually using the
smbclient command with no problem. What I would like to do is have
a script that is able to take a list of PCs, determine if a PC is online,
back up that PC, then move on to the next one. If a PC is down,
it can report an error to the admin to state that the
2008 Apr 12
6
Cutting down BackgrounDRb memory issue
Hi Folks,
I have been working to fix BackgrounDRb's memory usage. Ruby 1.8's GC
isn't fork friendly, and it seems to use a lot of memory because when you
fork, it sets a bit in all the objects in global scope, which causes the OS
to copy all pages in the child process.
A classic solution is to use fork and exec, rather than just fork. It's
working and pretty stable, but you lose the ability to pass
2004 Nov 07
2
Problems with DeviceIoControl()
Hi all,
Thanks to Wayne and Park, I've got something like this
now:
static VALUE file_set_compressed(VALUE self, VALUE rbBool){
   HANDLE h;
   BOOL rv;
   DWORD dwBytesReturned;
   int fn;
   USHORT inBuf = COMPRESSION_FORMAT_DEFAULT;
   if((rbBool != Qtrue) && (rbBool != Qfalse)){
      rb_raise(rb_eTypeError, "Argument must be true or false");
   }
2006 Nov 21
3
Fw: re. win32-process
Hi all,
Any ideas for the question below? I know how to do this in theory - make the 'inherit' flag true, and set the 'stdout' and 'stderr' startf_flags hash options to something in the startup_info hash, but I wasn't sure how to do this in practice.
It would be nice if the answer could be something like this:
require
2013 Nov 01
7
[PATCH] construct listener_fds Hash in 1.8 compatible way
This re-enables the ability for Ruby 1.8 environments to perform reexecs
---
lib/unicorn/http_server.rb | 7 ++++---
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/lib/unicorn/http_server.rb b/lib/unicorn/http_server.rb
index 2decd77..9a5795c 100644
--- a/lib/unicorn/http_server.rb
+++ b/lib/unicorn/http_server.rb
@@ -449,13 +449,14 @@ class Unicorn::HttpServer
end
2008 May 06
4
DeviceIoControl + IOCTL_DISK_GET_DRIVE_GEOMETRY problem
Hi all,
Ok, what am I doing wrong here?
require 'windows/device_io'
require 'windows/handle'
require 'windows/error'
include Windows::DeviceIO
include Windows::Handle
include Windows::Error
fh = File.open('test.txt') # Assume you have this
handle = get_osfhandle(fh.fileno)
if handle == INVALID_HANDLE_VALUE
puts
2009 Mar 18
1
Reading a file line by line - separating lines VS separating columns
Hello all.
I wish to read a large data set into R. My current issue is in getting the
data so that R can access it. Using read.table won't work
since the data is over 1GB in size (and I am using Windows XP), so my plan
was to read the file chunk by chunk and each time move it into bigmemory
(I'll play with that when the time comes; maybe ff is better?).
I encountered
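One chunked approach is to keep a connection open so each read.table() call continues where the last one stopped (file name and chunk size are made up; moving each chunk into bigmemory or ff is the separate step mentioned above):
con <- file("bigfile.txt", open = "r")
repeat {
  chunk <- tryCatch(read.table(con, nrows = 100000),
                    error = function(e) NULL)   # read.table errors once the file is exhausted
  if (is.null(chunk)) break
  # move 'chunk' into the big.matrix / ff object here
}
close(con)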