I am trying to write a definition that downloads and extracts a tar
file if the destination directory does not exist. The tar file won't
change and does not need to be re-downloaded and extracted. The
definition I am using now is:

    define remote_tar($source, $directory) {
        remotefile { "/tmp/$name.tgz":
            source => $source,
        }
        exec { "tar-$name":
            command => "tar -xzf /tmp/$name.tgz",
            directory => $directory,
            creates => "$directory/$name",
            require => File["/tmp/$name.tgz"],
        }
    }

The problem is that this keeps downloading the /tmp/foo.tgz file.
Also, remotefile sucks at downloading large files. One possibility is
to use a pipe in the command:

    command => "curl $source | tar -xz",

Another is to download the file to /tmp with multiple commands:

    command => "curl -o /tmp/$name.tgz $source; tar -xzf /tmp/$name.tgz"

It might be nice to have two separate exec resources where the
downloader one is only run when the tar one needs to run. Is there a
way to do this?

- Ian
On Jan 25, 2007, at 1:30 PM, Ian Burrell wrote:

> I am trying to write a definition that downloads and extracts a tar
> file if the destination directory does not exist. The tar file won't
> change and does not need to be re-downloaded and extracted.
[...]
> The problem is that this keeps downloading the /tmp/foo.tgz file.

It definitely should not be. Any idea why it is? I assume the file
isn't changing?

> Also, remotefile sucks at downloading large files. One possibility is
> to use a pipe in the command:
>
>     command => "curl $source | tar -xz",

This would be best, it looks like.

> Another is to download the file to /tmp with multiple commands:
>
>     command => "curl -o /tmp/$name.tgz $source; tar -xzf /tmp/$name.tgz"
>
> It might be nice to have two separate exec resources where the
> downloader one is only run when the tar one needs to run. Is there a
> way to do this?

There is currently no way to do this, but it's been getting requested
a bit recently, so it might appear at some point in the future.

--
The reasonable man adapts himself to the world; the unreasonable one
persists in trying to adapt the world to himself. Therefore all
progress depends on the unreasonable man.
    -- George Bernard Shaw
---------------------------------------------------------------------
Luke Kanies | http://reductivelabs.com | http://madstop.com
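The pipe-based approach endorsed above can be sketched as a single exec guarded by `creates`, so nothing runs once the tree exists and no tarball is left behind in /tmp. This is a sketch, not code from the thread: it reuses Ian's parameter names and assumes exec's `cwd` and `path` parameters, which the original (using `directory`) may predate.

```puppet
# Hypothetical rework of Ian's define: download and extract in one
# piped command. The `creates` guard means the exec is skipped
# entirely once $directory/$name exists, so the tarball is never
# re-downloaded and never stored on disk.
define remote_tar($source, $directory) {
    exec { "tar-$name":
        command => "curl -sSL $source | tar -xz",
        cwd     => $directory,
        path    => "/usr/bin:/bin",
        creates => "$directory/$name",
    }
}
```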
Matthew Palmer
2007-Jan-25 21:25 UTC
Re: download and install tarballs [was: Conditional exec]
On Thu, Jan 25, 2007 at 11:30:38AM -0800, Ian Burrell wrote:

> I am trying to write a definition that downloads and extracts a tar
> file if the destination directory does not exist. The tar file won't
> change and does not need to be re-downloaded and extracted.
[...]
> It might be nice to have two separate exec resources where the
> downloader one is only run when the tar one needs to run. Is there a
> way to do this?

Yes, of course. It was how I used to do my downloads until I said
"aaah, stuff it" and went to Debian packages (which simplified my
distribution and caching network significantly). I even went to the
point of writing a define for it, and I've just added it to PRMweb[1]
as the install_tarball_via_http recipe.

- Matt

[1] http://prmweb.hezmatt.org/

--
"I invented the term object-oriented, and I can tell you I did not
have C++ in mind." -- Alan Kay
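The contents of the install_tarball_via_http recipe are not reproduced in the thread. A hypothetical define in the same spirit, which also answers Ian's "two separate execs" question, is to guard both execs on the same `creates` path: the downloader then only fires when the extraction step would fire. All names and parameters here are illustrative, not taken from PRMweb.

```puppet
# Hypothetical two-exec define: download, then extract, both skipped
# once $creates exists. Deleting the tarball after extraction avoids
# keeping a stale copy around in /tmp.
define install_tarball_via_http($url, $dest, $creates) {
    exec { "download-$name":
        command => "curl -o /tmp/$name.tgz $url",
        path    => "/usr/bin:/bin",
        creates => $creates,  # only download if the tree is missing
    }
    exec { "extract-$name":
        command => "tar -C $dest -xzf /tmp/$name.tgz && rm /tmp/$name.tgz",
        path    => "/usr/bin:/bin",
        creates => $creates,  # same guard as the downloader
        require => Exec["download-$name"],
    }
}
```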
On 1/25/07, Luke Kanies <luke@madstop.com> wrote:

> On Jan 25, 2007, at 1:30 PM, Ian Burrell wrote:
> > The problem is that this keeps downloading the /tmp/foo.tgz file.
>
> It definitely should not be. Any idea why it is? I assume the file
> isn't changing?

It isn't downloading the file. I meant that it keeps the file around
on disk.

> > Also, remotefile sucks at downloading large files. One possibility
> > is to use a pipe in the command:
> >
> >     command => "curl $source | tar -xz",
>
> This would be best, it looks like.

That is what I ended up using.

- Ian
On Jan 25, 2007, at 5:40 PM, Ian Burrell wrote:

> It isn't downloading the file. I meant that it keeps the file around
> on disk.

Ah. *whew*

> > > Also, remotefile sucks at downloading large files. One
> > > possibility is to use a pipe in the command:
> > >
> > >     command => "curl $source | tar -xz",
> >
> > This would be best, it looks like.
>
> That is what I ended up using.

Okay.

--
I do not feel obliged to believe that the same God who has endowed us
with sense, reason, and intellect has intended us to forgo their use.
    -- Galileo Galilei
---------------------------------------------------------------------
Luke Kanies | http://reductivelabs.com | http://madstop.com