Displaying 4 results from an estimated 4 matches for "max_buffered_doc".
2008 Mar 04
0
Search memory usage
...a silly question. I was
wondering, where can I find details about search memory usage? I read
the O'Reilly booklet and googled but couldn't find much info. There is
a good explanation of how memory is used at indexing time [bound by,
amongst other things, :max_buffer_memory and :max_buffered_docs]. But
how does it work at search time - do the same options apply?
Will parts of the index be cached as they are accessed? How about
search results, do they get cached until :max_buffer_memory and/or
:max_buffered_docs is reached? I understand that the OS will perform
page caching - but that is...
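For context, the two options mentioned above are passed to Ferret::Index::Index.new when the index is created. A minimal sketch of bounding indexing-time memory, with an illustrative path and limits:

require 'ferret'

# Illustrative limits only -- tune them to your data and available RAM.
index = Ferret::Index::Index.new(
  :path              => '/tmp/search_index',  # hypothetical location
  :max_buffer_memory => 0x1000000,            # flush the in-memory buffer at ~16 MB
  :max_buffered_docs => 10_000                # ...or after 10,000 buffered documents
)

Whether any comparable bound applies at search time is exactly what the post above is asking.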
2006 Oct 11
0
Memory allocation bug with index.search
...don't know
what I could add to help you, I'm trying to reproduce it in a little
code...
Ah, just another detail, this is how our index is created:
INDEX_OPTIONS = {
  :path              => path,
  :auto_flush        => false,
  :max_buffer_memory => 0x4000000,
  :max_buffered_docs => 100000,
  :use_compound_file => false
}
INDEX = Ferret::Index::Index.new(INDEX_OPTIONS)
but I tried without any max_* options and it's the same...
Tell me what I can do to help if I don't manage to reproduce it in
pastable code.
Thanks in advance,
Regards,
Jeremie ...
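To chase a leak like this, one rough approach is to run searches in a tight loop and watch the process's resident memory. A sketch that reuses the INDEX constant from the configuration above; the query string and iteration counts are made up:

# Rough reproduction harness: run many searches and print RSS periodically.
10_000.times do |i|
  INDEX.search_each('title:foo') do |doc_id, score|
    # iterate the hits so results are actually materialised
  end
  # RSS in kilobytes, via ps (Linux/macOS)
  puts `ps -o rss= -p #{Process.pid}`.strip if (i % 1_000).zero?
end

If memory grows without bound across iterations, that points at the search path rather than the indexing options.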
2007 Jul 14
1
performance bottleneck
My database is in MySQL, and I used Ferret to index a table with 10
million rows. When I limit the selection to the first 1,000 rows, indexing
takes 200 seconds, but for the whole table it took more than four hours,
after which I had to close my indexing application. I used the
StandardAnalyzer for it. There is no problem on the database side, as
retrieval of all the data in the table
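One common mitigation for this kind of slowdown is to disable auto_flush and feed the index in batches, flushing between batches so memory use stays bounded. A sketch under those assumptions; the row-fetching helper, field names, and path are made up:

require 'ferret'

index = Ferret::Index::Index.new(
  :path       => '/path/to/index',                      # hypothetical path
  :auto_flush => false,
  :analyzer   => Ferret::Analysis::StandardAnalyzer.new
)

# fetch_rows_in_batches is a stand-in for however you page through MySQL
# (ideally keyset pagination on the primary key rather than LIMIT/OFFSET).
fetch_rows_in_batches(10_000) do |rows|
  rows.each { |row| index << { :id => row['id'], :body => row['body'] } }
  index.flush   # write the buffered documents to disk between batches
end
index.close

Since the poster says the database side is fine, batching mainly serves to limit how much Ferret buffers in memory at any one time.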
2006 Oct 11
6
Indexing problem 10.9/10.10
Sorry if this is a repost - I wasn't sure whether the www.ruby-forum.com
list works for postings.
I've been having trouble indexing a large number of documents (2.4M).
Essentially, I have one process that is following the tutorial,
dumping documents to an index stored on the file system. If I open the
index with another process and run the size() method, it is stuck at
a number
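For reference, checking the document count from a second process is a matter of opening the same on-disk index and calling size (the path here is made up):

require 'ferret'

# Second process: open the same on-disk index and report how many
# documents it currently contains. A reader only sees documents the
# writer has already flushed to disk, so this count can lag behind
# the writing process.
index = Ferret::Index::Index.new(:path => '/path/to/index')
puts index.size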