2007 Dec 17
BackgrounDRb release 1.0 available now
...can process requests from these guys
asynchronously. This mouse trap allows you to build truly
distributed workers across your network.
- Each worker comes with a "thread_pool" object, which can be used
to run tasks concurrently. For example:
thread_pool.defer(url) { |url| scrap_wiki_content(url) }
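The defer call above can be sketched with Ruby's core Queue and Thread. The ThreadPool class below is a hypothetical stand-in for illustration, not BackgrounDRb's actual implementation, and the block stands in for scrap_wiki_content:

```ruby
# Minimal sketch of a thread-pool "defer" pattern (hypothetical ThreadPool,
# not BackgrounDRb's real class).
class ThreadPool
  def initialize(size = 2)
    @jobs = Queue.new
    @threads = Array.new(size) do
      Thread.new do
        # Each thread pops jobs until it receives a nil "poison pill".
        while (job = @jobs.pop)
          args, block = job
          block.call(*args)
        end
      end
    end
  end

  # Enqueue a task; the block runs later on one of the pool's threads.
  def defer(*args, &block)
    @jobs << [args, block]
  end

  # Signal every thread to exit, then wait for them to finish.
  def shutdown
    @threads.size.times { @jobs << nil }
    @threads.each(&:join)
  end
end

pool    = ThreadPool.new(2)
results = Queue.new
# Mirrors the announcement's example; a trivial block stands in for
# scrap_wiki_content here.
pool.defer("http://example.com") { |url| results << url.length }
pool.shutdown
puts results.pop  # length of the URL string
```

Tasks are processed in FIFO order; shutdown pushes one nil per thread so every worker drains remaining jobs before exiting.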
- Each worker has access to a "register_status" method, which can be
used to update a worker's status or store results. The results of a worker
can be retrieved even after the worker has died.
By default, results are saved in the master process's memory, but
you can configure Bac...
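The register_status behavior above can be sketched as a thread-safe, in-memory store that outlives the worker. StatusRegistry and status_of are hypothetical names for illustration, not BackgrounDRb's API:

```ruby
# Sketch of a register_status-style results store held outside the worker
# (hypothetical StatusRegistry, not BackgrounDRb's actual code).
class StatusRegistry
  def initialize
    @mutex  = Mutex.new
    @status = {}
  end

  # A worker calls this to publish progress or final results.
  def register_status(worker_key, value)
    @mutex.synchronize { @status[worker_key] = value }
  end

  # Results can be read even after the worker thread has exited.
  def status_of(worker_key)
    @mutex.synchronize { @status[worker_key] }
  end
end

registry = StatusRegistry.new
worker = Thread.new do
  registry.register_status(:wiki_worker, pages: 42, state: :done)
end
worker.join                       # the worker has finished ("died")
p registry.status_of(:wiki_worker)  # results are still available
```

Because the registry lives in the parent process rather than in the worker thread, the stored results survive the worker's exit, which is the property the announcement describes.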