Talita Perciano wrote:
> Dear users,
>
> I'm generating some images in R to include in a document I'm producing
> in LaTeX. The document follows a predefined template that does not
> accept compilation with pdflatex, so I have to compile with latex ->
> dvi -> pdf. Because of that, I have to generate the images in R with
> postscript() (I want a vector format to keep the quality). The problem
> is that the image files are huge (~10 MB each) and I have many images
> to put into the PDF document.
> I want to know if there is a way to reduce the size of those images
> generated by R with postscript().
>
> Thank you in advance,
>
> Talita
>
>
Not in any extremely easy way. The fundamental problem is that if your
graph contains a very large number of points, it's hard to make them
take less file space even when they're overplotted (and hence not
visible in the final image). This has been discussed in various forms
on the R list in the past, but I can't locate those posts easily. It's
a little hard to say without knowing what kind of plot you're
generating, but I'm assuming you have many, many points or lines in the
graphic (or a very high-resolution image plot), and that the details
don't all show up in the figure anyway.
A few general strategies:
* thin the points down to a random subset
* use a 2D density plot or hexagonal binning
* create a bitmap (PNG) plot, then use image-manipulation tools
(ImageMagick etc.) to convert it back to a PostScript file
* there was some discussion earlier about whether one could embed a
bitmap of just the interior of the plot, leaving the axes, labels,
etc. in vector format, but I don't think that came to anything
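A rough sketch of the first two strategies, using only base R (the data
here are made up for illustration; substitute your own x and y):

```r
## Illustrative data: a million-point scatterplot that would produce a
## very large PostScript file if plotted directly.
set.seed(1)
n <- 1e6
x <- rnorm(n)
y <- x + rnorm(n)

## Strategy 1: thin to a random subset. With heavy overplotting,
## 10,000 points usually look nearly identical to the full million,
## and the file size shrinks by roughly the same factor.
keep <- sample(n, 1e4)
postscript("thinned.eps", width = 5, height = 5,
           horizontal = FALSE, onefile = FALSE, paper = "special")
plot(x[keep], y[keep], pch = ".",
     xlab = "x", ylab = "y", main = "thinned subset")
dev.off()

## Strategy 2: a 2D density plot. smoothScatter() (in base graphics)
## draws a smoothed density image plus the most extreme outliers, so
## the file size no longer depends on n at all.
postscript("density.eps", width = 5, height = 5,
           horizontal = FALSE, onefile = FALSE, paper = "special")
smoothScatter(x, y, main = "2D density")
dev.off()
```

(The hexbin package on CRAN does hexagonal binning along the same
lines; smoothScatter() is just what ships with base R.)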
  good luck
    Ben Bolker
--
View this message in context:
http://www.nabble.com/Help-with-postscript-%28huge-file-size%29-tp23003428p23004309.html
Sent from the R help mailing list archive at Nabble.com.