Displaying 20 results from an estimated 1000 matches similar to: "Ferret 0.10.7 released"
2006 Mar 09
1
Missing fields in search result
Hello ferret users,
I have a problem with Ferret dropping stored fields from the index.
Not all of the fields I want to store actually get stored, so they can be
searched but can't be retrieved from a search result.
Index creation:
INDEX = Index::Index.new(:path => '/home/gregor/wisa/index',
:analyzer => Analysis::WhiteSpaceAnalyzer.new)
SR =
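
A minimal sketch, using the 0.9-era Document/Field API quoted elsewhere on this page, of the usual storage rule (field name and value are illustrative, not from this thread): only fields added with Field::Store::YES can be read back from a hit.

  doc = Document.new
  # stored AND tokenized: searchable and retrievable from results
  doc << Field.new("title", "annual report", Field::Store::YES,
                   Field::Index::TOKENIZED)
  INDEX << doc
  INDEX.search_each("title:report") do |doc_id, score|
    puts INDEX[doc_id]["title"]   # only works because the field was stored
  end
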
2006 Aug 23
4
Ferret 0.10 and Fields
Hey ...
I just tried to convert my code to 0.10 .. but I'm currently not sure
how to use fields..
I really like some of the new API.. it's leaner, and I like the fact that
these strange consts are gone (like
Ferret::Search::BooleanClause::Occur::MUST) ..
I see that you're now using Ferret::Index::FieldInfo to describe the
fields of the index.. that's good.. and I now see that
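
A sketch of how the 0.10 FieldInfos API is typically set up (the path, field names, and the :field_infos option are given as I recall them, not taken from this thread):

  require 'rubygems'
  require 'ferret'
  include Ferret

  # defaults apply to any field that is not declared explicitly
  field_infos = Index::FieldInfos.new(:store => :no, :index => :yes)
  field_infos.add_field(:title,   :store => :yes, :boost => 2.0)
  field_infos.add_field(:content, :store => :no)

  index = Index::Index.new(:path => '/tmp/my_index', :field_infos => field_infos)
  index << {:title => 'Ferret 0.10', :content => 'the new, leaner API'}
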
2006 Sep 22
2
Searching untokenized fields
Hi ..
I tried to exclude certain objects from my search, by adding appropriate
term queries ..
i = Ferret::Index::Index.new
i.field_infos.add_field(:type, :index => :untokenized, :term_vector => :no)
i << {:type => "Movie", :name => "Indiana" }
i << {:type => "Movie", :name => "Forrest" }
i << {:type =>
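
One detail worth checking here (offered as an assumption, not a diagnosis of this exact report): untokenized terms are indexed exactly as given, so a TermQuery against :type has to use the exact case, e.g.:

  query = Ferret::Search::BooleanQuery.new
  query.add_query(Ferret::Search::TermQuery.new(:name, "indiana"), :must)
  # :type is untokenized, so the term must be "Movie", not "movie"
  query.add_query(Ferret::Search::TermQuery.new(:type, "Movie"), :must_not)

  i.search_each(query) do |doc_id, score|
    puts i[doc_id][:name]
  end
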
2006 Jun 27
2
Using QueryParser vs building my own query
Hello all
I finally caved in and decided I should build my own query instead of
relying on QueryParser to do the job for me, but I've hit a strange
problem..
Here's how I build my query:
#Main query
query = Ferret::Search::BooleanQuery.new
#Build query to match types
typesquery = Ferret::Search::BooleanQuery.new
@selected_types.each{|type|
typesquery.add_query(
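
The snippet is cut off; a sketch of the usual shape for this kind of query (variables and field names are assumptions, not the poster's actual code) is to OR the type clauses in a sub-query and then require that sub-query in the main one:

  types_query = Ferret::Search::BooleanQuery.new
  selected_types.each do |type|
    types_query.add_query(Ferret::Search::TermQuery.new(:type, type), :should)
  end

  query = Ferret::Search::BooleanQuery.new
  query.add_query(types_query, :must)   # must match one of the selected types
  query.add_query(Ferret::Search::TermQuery.new(:content, "ferret"), :must)
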
2006 Oct 20
0
Ferret 0.10.13 released
Hi Folks,
I've just released Ferret 0.10.13 (skip 0.10.12, it was a bad build).
There are two interesting additions in this release. You can now
access the Filter#bits method of the built-in filters so you can
use them in your own code, possibly within your own custom filters.
For example you could implement a custom filter like so:
class MultiFilter < Hash
def
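
The example above is cut off; purely as an illustration of the idea (and assuming Ferret::Utils::BitVector supports &, which is how I recall it), a custom filter only needs a #bits(index_reader) method, so it can combine the bit vectors of the built-in filters:

  class AndFilter
    def initialize(*filters)
      @filters = filters
    end

    # the searcher calls #bits with an IndexReader; AND-ing the wrapped
    # filters' bit vectors keeps only documents that pass every filter
    def bits(index_reader)
      result = @filters.first.bits(index_reader)
      @filters[1..-1].each { |f| result &= f.bits(index_reader) }
      result
    end
  end
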
2007 Mar 01
2
FerretHash
Dave, thank you so much for the 0.11 release(s). You have solved many
problems for me. As part of my appreciation for your good works, I am
offering up for public consideration a silly little class that I wrote.
(Code is below.) This class offers a simplified Hash-like interface to
(a very restricted subset of) Ferret. Hence I call it FerretHash.
FerretHash comes with its very own pet Ferret
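
The code the post refers to is not part of this excerpt; as a rough illustration of the idea only (this is not the original FerretHash), a Hash-like wrapper around an in-memory Ferret index might look like:

  require 'rubygems'
  require 'ferret'

  class TinyFerretHash
    def initialize
      @index = Ferret::Index::Index.new    # no :path, so it lives in memory
    end

    def []=(key, value)
      @index << {:key => key.to_s.downcase, :value => value.to_s}
    end

    def [](key)
      # works for single-token keys; real code would declare :key untokenized
      query = Ferret::Search::TermQuery.new(:key, key.to_s.downcase)
      @index.search_each(query) { |doc_id, score| return @index[doc_id][:value] }
      nil
    end
  end
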
2005 Dec 21
0
Ferret and Rails transaction
Hi,
following the discussion about acts_as_ferret on the Rails mailing list,
there was an issue about transactions, which could result in the
database and Ferret being out of sync. I have taken a different approach from
acts_as_ferret to try to resolve the transaction problem. Instead of adding
things to the Ferret index in the model, I have added them in the controller.
I have only the create part
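
A hedged sketch of what that controller-side approach might look like (the model, fields, and FERRET_INDEX constant are assumptions): do the database work inside the transaction and only touch the Ferret index once the save has succeeded:

  def create
    @post = Post.new(params[:post])
    Post.transaction do
      @post.save!
    end
    # the index write happens only after the database commit succeeded
    FERRET_INDEX << {:id => @post.id, :title => @post.title, :body => @post.body}
    redirect_to :action => 'show', :id => @post
  rescue ActiveRecord::RecordInvalid
    render :action => 'new'
  end
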
2007 Jul 07
2
Extending/Modifying QueryParser
Hi,
I've implemented synonym searching in my Rails application, but I have
an idea I'd like to add and can't figure out how to do it. The
idea is that I'd like to give the end user the choice of whether to
search for the synonyms of a word or not, preferably by extending the
query language to parse a construct similar to '%word1' and
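
One way to get that effect without touching QueryParser itself (the synonym table and the %word handling below are assumptions for illustration, not the poster's design) is to expand the marker into an OR group before the string is parsed:

  require 'rubygems'
  require 'ferret'

  SYNONYMS = {'fast' => ['quick', 'rapid']}   # illustrative table

  def expand_synonyms(query_string)
    query_string.gsub(/%(\w+)/) do
      word  = $1
      terms = [word] + (SYNONYMS[word] || [])
      '(' + terms.join(' OR ') + ')'
    end
  end

  parser = Ferret::QueryParser.new(:fields => [:title, :body])
  query  = parser.parse(expand_synonyms('%fast car'))  # => (fast OR quick OR rapid) car
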
2007 Jan 05
3
Confused about Search Results
Hi everyone,
I'm pretty new to Lucene and Ferret, so I feel that this is most likely
me not completely understanding the correct way to do this. I have
indexed ~2200 text files (of various sizes), and I am now running
searches on the index to get a feel for Lucene and Ferret.
In my first program, which uses Lucene, I search for 'influenza' and
get the
2006 Jan 20
4
Questions about Searching
Hi,
I have some questions about searching with Ferret. I have a user
index with first_name, last_name and full_name (which is just first
plus last with a space).
Here are a couple of questions:
1) If I store the fields tokenized, it appears as though queries are
case-insensitive. However, for untokenized, the query is
case-sensitive. How can I make the untokenized searches
case-insensitive?
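
On question 1, a common workaround (sketched with assumed field names, not taken from the replies) is to index a lowercased copy for exact matching and lowercase the term at query time as well:

  # :full_name_exact would be declared :index => :untokenized via FieldInfos
  index << {:first_name => 'Grace',
            :last_name  => 'Hopper',
            :full_name  => 'Grace Hopper',      # tokenized, case-insensitive
            :full_name_exact => 'grace hopper'} # untokenized, lowercased copy

  query = Ferret::Search::TermQuery.new(:full_name_exact, 'Grace Hopper'.downcase)
  index.search_each(query) { |doc_id, score| puts index[doc_id][:first_name] }
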
2007 Sep 07
5
Custom Analyser .. where to put it ??
Hi,
I'm trying to use a custom analyser to add my French stop words... I'm
reading the tutorial at:
http://projects.jkraemer.net/acts_as_ferret/wiki/AdvancedUsage
My problem is that I've no idea where to put my custom Analyser class,
like:
class GermanStemmingAnalyzer < Ferret::Analysis::Analyzer
include Ferret::Analysis
def initialize(stop_words = FULL_GERMAN_STOP_WORDS)
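
For what it's worth, a complete analyzer along those lines might look like the sketch below (FULL_FRENCH_STOP_WORDS is assumed to exist alongside the German list used in the wiki example). One common place for such a class is a file under lib/ that gets required from config/environment.rb, after which it can be passed as the :analyzer in the acts_as_ferret options (the exact option syntax depends on the acts_as_ferret version):

  require 'ferret'

  class FrenchAnalyzer < Ferret::Analysis::Analyzer
    include Ferret::Analysis

    def initialize(stop_words = FULL_FRENCH_STOP_WORDS)
      @stop_words = stop_words
    end

    def token_stream(field, text)
      # lowercase first, then drop the French stop words
      StopFilter.new(LowerCaseFilter.new(StandardTokenizer.new(text)), @stop_words)
    end
  end
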
2006 Dec 07
8
crash on repeated search
I have found another crash in ferret; this one just uses a regular
search. It's similar to an issue reported by Matt Schnitz a while ago,
but unlike his, mine does not go away if I turn off omit_norms. It does
go away if I turn on the garbage collector more often, but I'm not sure
that's a stable workaround under the circumstances.
This one isn't a
2006 Jul 15
2
FieldQuery not returning anything
Hey ..
The QueryParser RDoc page explains how to search for a specific
value in a specific field. This is not working the way I thought it
should; what am I doing wrong? Here's an example ..
I'm storing model data in the index like this:
doc << Field.new( "object_id", object.id, Field::Store::YES)
doc << Field.new( "type",
2006 Oct 02
4
Another web app using Ferret
I am part of a team that runs a student site called Studicious
(http://stu.dicio.us). We have been using Ferret from the beginning, and
recently added acts_as_ferret and sorting to the system.
As you can see if you try the search, sorting is not working as
expected. I am using this code (w/ find_by_content):
:sort => Ferret::Search::SortField.new(:school_sort, :reverse => false)
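
As a hedged sketch only (the exact acts_as_ferret plumbing is assumed), the :sort option is usually built from a Sort wrapping one or more SortFields, with the sort type stated explicitly:

  sort = Ferret::Search::Sort.new([
    Ferret::Search::SortField.new(:school_sort, :type => :string, :reverse => false)
  ])
  # then passed through as :sort => sort in the find_by_content call
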
2007 Aug 03
0
StandardTokenizer Doesn't Support token_stream method
According to the Analyzer doc and the StandardTokenizer doc:
http://ferret.davebalmain.com/api/classes/Ferret/Analysis/Analyzer.html
http://ferret.davebalmain.com/api/classes/Ferret/Analysis/StandardTokenizer.html
I ought to be able to construct a StandardTokenizer like this:
t = StandardTokenizer.new( true) # true to downcase tokens
and then later:
stream = token_stream(
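
Whatever the docs say about the constructor flag, the pattern used elsewhere on this page does work: build the tokenizer from the text and get lowercasing from a filter (the iteration below assumes TokenStream#next returns tokens until nil, as in Lucene):

  require 'rubygems'
  require 'ferret'
  include Ferret::Analysis

  stream = LowerCaseFilter.new(StandardTokenizer.new("Some TEXT To Tokenize"))
  while token = stream.next
    puts token.text        # => "some", "text", "to", "tokenize"
  end
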
2006 Sep 22
1
Query Objects vs. Query Strings
Hi ..
I tried to build some query objects to get some documents from my
index.. without success.. Is something wrong here?
q = Ferret::Search::BooleanQuery.new
q1 = Ferret::Search::TermQuery.new(:type, "movie")
q2 = Ferret::Search::TermQuery.new(:name, "Indiana")
q.add_query(q1, :should)
q.add_query(q2, :should)
Indexer.index.search_each(q) do |doc, score| puts doc end
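
A guess, offered only as an assumption about the usual cause: TermQuery terms bypass the analyzer, so they have to match the indexed (lowercased) form, i.e. "indiana" rather than "Indiana":

  q = Ferret::Search::BooleanQuery.new
  q.add_query(Ferret::Search::TermQuery.new(:type, "movie"),   :should)
  q.add_query(Ferret::Search::TermQuery.new(:name, "indiana"), :should)  # lowercased
  Indexer.index.search_each(q) { |doc, score| puts doc }
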
2006 Mar 08
1
indexing a document object fails
Hi,
I'm trying out the example (more or less) straight from the tutorial:
doc = Document.new
doc << Field.new("id", "a", Field::Store::NO,
Field::Index::UNTOKENIZED)
doc << Field.new("title", "b", Field::Store::YES, Field::Index::UNTOKENIZED)
doc << Field.new("data", "c",
2006 Sep 09
2
search_each segmentation fault and parser anomaly
The included test script turned up the following anomalies (run
against Ferret 0.10.3, but had the same problems with 0.10.2):
1. When the content word is not in the index the inclusion of a
wildcard file term causes search_each to throw a segmentation
fault.
$ ./test.rb zzz file:*.txt
query: +content:zzz +file:*.txt
./test.rb:28: [BUG] Segmentation fault
ruby 1.8.4 (2005-12-24)
2006 Dec 08
4
Using custom stem analyzer giving mongrel errors
I'm using the custom stem analyzer:
require 'rubygems'
require 'ferret'
include Ferret
module Ferret::Analysis
class FerretAnalyzer
def initialize(stop_words = FULL_ENGLISH_STOP_WORDS)
@stop_words = stop_words
end
def token_stream(field, text)
StemFilter.new(StopFilter.new(LowerCaseFilter.new(StandardTokenizer.new(text)),
2006 Sep 23
0
TermQuery problem
Hi,
Using the 0.10.4 gem under ruby 1.8.5 (2006-08-25) [i686-linux], I
get different results with a TermQuery and a search string. Namely,
using a search string seems to always work whereas using a TermQuery
often doesn't return any entries.
For example:
> x=@i[450][:message_id]
=> "9e7db9110509070759732b21c4 at mail.gmail.com"
>
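
A sketch of a side-by-side check (process_query and total_hits are used here as I recall them from the 0.10 API; treat them as assumptions), which usually shows whether the query parser is analyzing the term differently from the raw TermQuery:

  msg_id = @i[450][:message_id]

  term_query   = Ferret::Search::TermQuery.new(:message_id, msg_id)
  parsed_query = @i.process_query("message_id:\"#{msg_id}\"")

  puts @i.search(term_query).total_hits
  puts @i.search(parsed_query).total_hits
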