Hello folks,
We're starting to use Puppet in our production environment and we're
running into some performance issues.
For example, we deploy some large (200 MB) recursive directories
through Puppet, and that turned out to be totally inefficient (see
"Minimize recursive file serving": http://docs.puppetlabs.com/guides/scaling.html).
So as a test we built a custom Debian package (.deb) for our files and
libs. That way Puppet doesn't need to recursively check every file's
md5sum.
And now we have this catalog:
file { "/tmp/my-custom.deb":
  ensure => present,
  source => "puppet:///modules/test/deb/my-custom.deb",
}

package { "my-custom":
  ensure   => installed,
  source   => "/tmp/my-custom.deb",
  provider => dpkg,
  require  => File["/tmp/my-custom.deb"],
}
That works great (less than 30 seconds), but when we update our custom
package, Puppet just copies the new file and does not run dpkg to
install it.
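One pattern that should cover this (a sketch only, not something I've verified here; it keeps the same /tmp staging path and swaps the package resource for an exec) is to re-run dpkg whenever the staged .deb changes:

```puppet
file { "/tmp/my-custom.deb":
  ensure => present,
  source => "puppet:///modules/test/deb/my-custom.deb",
}

# Only runs when the file resource above reports a change,
# so dpkg is re-invoked each time a new .deb is shipped.
exec { "install-my-custom":
  command     => "/usr/bin/dpkg -i /tmp/my-custom.deb",
  refreshonly => true,
  subscribe   => File["/tmp/my-custom.deb"],
}
```

The downside is that you lose the package resource's model of installed state, which is one reason people usually recommend a real apt repository instead.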
How can I achieve this?
Is there a better way to manage large files?
Can someone point me to references on deployment best practices
(Puppet plus custom Debian packages, or something else)?
Best regards,
Sidarta Oliveira
--
You received this message because you are subscribed to the Google Groups
"Puppet Users" group.
To post to this group, send email to puppet-users@googlegroups.com.
To unsubscribe from this group, send email to
puppet-users+unsubscribe@googlegroups.com.
For more options, visit this group at
http://groups.google.com/group/puppet-users?hl=en.
Use apt

On Sep 20, 2011 11:24 AM, "Sidarta" <sidarta.rj@gmail.com> wrote:
On Sep 20, 3:12 pm, Scott Smith <sc...@ohlol.net> wrote:
> Use apt

To expand on that a little: make a local apt repo to put your packages
in and distribute an updated sources.list through Puppet. I'm in the
process of doing the same thing for my CentOS boxes, so it's good to
know the performance increase will be that noticeable.
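A minimal sketch of that approach (the repo URL and list filename below are placeholders, not from this thread): manage a sources.list entry with Puppet, refresh the apt cache when it changes, and let the normal apt provider handle upgrades:

```puppet
# Point clients at the internal apt repository (URL is hypothetical).
file { "/etc/apt/sources.list.d/internal.list":
  ensure  => present,
  content => "deb http://apt.example.com/debian stable main\n",
}

# Refresh the package index only when the repo config changes.
exec { "apt-update":
  command     => "/usr/bin/apt-get update",
  refreshonly => true,
  subscribe   => File["/etc/apt/sources.list.d/internal.list"],
}

# With a real repo, no file staging or dpkg provider is needed.
package { "my-custom":
  ensure  => latest,
  require => Exec["apt-update"],
}
```

With this layout, pushing a higher-versioned .deb into the repo is enough: on the next run `ensure => latest` sees the new candidate version and apt performs the upgrade.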