This article is misleading because the user is testing a very small
amount of data (90MB-300MB). Most likely the article and test were
done by someone who only recently learned about search engines or
wants to make his blog popular. The average server now has 16GB of
memory, so even a mediocre search engine like Lucene can show
satisfactory results when the data is smaller than the available
memory on the server.
If the user used at least 100GB of data, Lucene and many other open
source engines would be dead, whereas Xapian rocks beyond 100GB of
data with no problems. The year is now 2009 and we are talking in
Terabytes and Gigabytes, not Megabytes. Who are these people writing
these articles, confusing and misleading people? If you are dealing
with megabytes of data, MySQL is fine; you do not need a search engine.
Here is proof of how fast search goes on 500GB of data using Xapian.
Can Lucene do that on a single server? ... of course not.
http://myhealthcare.com
PS: When the blind lead the blind, they all fall off the cliff
together. This is Information Technology, not the Banking Industry; we
know the mathematics.
Thanks,
Kevin Duraj
http://myhealthcare.com
On Mon, Jul 6, 2009 at 7:19 AM, Charlie Hull<charlie at juggler.net> wrote:
> Hi all,
>
> You may find
>
http://developers.slashdot.org/story/09/07/06/131243/Open-Source-Search-Engine-Benchmarks
> interesting.
>
> Xapian was rather slated for large index sizes and slow indexing, but
> had comparable search performance to Lucene.
>
> Cheers
>
> Charlie
>
> _______________________________________________
> Xapian-discuss mailing list
> Xapian-discuss at lists.xapian.org
> http://lists.xapian.org/mailman/listinfo/xapian-discuss
>