RE: Full text search

This WebDNA talk-list message is from 1999. It keeps the original formatting.

How about setting the max found items pref to a low number?

> -----Original Message-----
> From: Peter Ostry [mailto:po@ostry.com]
> Sent: Thursday, January 07, 1999 11:40 AM
> To: WebDNA-Talk@smithmicro.com
> Subject: Re: Full text search
>
> At 17:35 -0800 06.01.1999, PCS Technical Support wrote:
>
> >> So a complete search will look into 6000 fields for a match of 3 fields per
> >> line and at the same time in 4000 fields for the occurrence of 3 characters.
> >>
> >> Will this be slow?
>
> > That's probably not too bad. There are sites searching thru 200,000
> > records in full-text descriptions along with other required fields.
>
> Oops, I made a mistake in my calculation; the correct numbers are these:
> one search session looks into 3 x 2,000 fields (which are required) and into
> 12 x 2,000 fields to find 3 letters. That makes 30,000 fields.
> I will have one live database and three years in the archive, so the biggest
> search will run through a maximum of 120,000 records. I am a little scared
> about the heavy wildcard search with wo in 12 (!) fields...
> Do you still say this could be OK?
>
> I am just wondering what has more effect on speed with wo???data:
> the number of records or the number of fields? Can I multiply records x fields
> to get a rule of thumb for a specific machine?
> My search requires grouping - does grouping speed things up or slow them down?
>
> I tried to push WebCat beyond the limit, with 500,000 records and 10 fields.
> Like all databases it is still usable when it finds one or a few records,
> but it gets exhausted (no wonder) when it finds many records.
> Maybe I should set up a learning "dirty words" system to store all requests
> which return more than 500 records or so and forbid them in the future.
> This technique makes sense with 4D - but does it make sense for WebCat too?
>
> Peter
>
> ++++++++++++++++++++++++++++++++++++++++++++++++
> Peter Ostry - Vienna/Austria - www.ostry.com
> Fon ++43-1-877 74 54  Fax ++43-1-877 74 54-21
> ++++++++++++++++++++++++++++++++++++++++++++++++

Olin Lagon
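For anyone who wants to try the max-found-items idea per search rather than in the global preference, here is a minimal WebDNA sketch. It assumes the usual prefix+FIELDNAME+data comparison parameters (wo for whole-word matches, as discussed above) plus a max limit on the returned found set; the database name archive1999.db, the field names TITLE, DESC1 and DESC2, and the form variable [searchtext] are invented placeholders, so check the exact spellings against your WebCatalog version.

    [!] Hypothetical sketch: whole-word (wo) search across three text fields,
        with the returned found set capped at 200 items via max. How several
        comparisons combine (AND/OR) depends on your other search parameters. [/!]
    [search db=archive1999.db&woTITLEdata=[searchtext]&woDESC1data=[searchtext]&woDESC2data=[searchtext]&max=200]
      [numfound] matching records (showing at most 200):<br>
      [founditems]
        [TITLE]<br>
      [/founditems]
    [/search]

If Peter's records x fields rule of thumb holds, the worst case he describes works out to roughly 120,000 records x (12 wo fields + 3 required fields) = 1.8 million field comparisons per search, which is why both capping the found set and refusing known-expensive queries come up in this thread.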

    
Associated Messages, from the most recent to the oldest:

  1. Re: full text search ? (Frank Nordberg 2002)
  2. Re: full text search ? (Kenneth Grome 2002)
  3. Re: Full text search (Peter Ostry 1999)
  4. Re: Full text search (PCS Technical Support 1999)
  5. RE: Full text search (Olin Lagon 1999)
  6. RE: Full text search (Olin Lagon 1999)
  7. Re: Full text search (Peter Ostry 1999)
  8. Re: Full text search (PCS Technical Support 1999)



Related Readings:

  Online reference (1997)
  Requiring that certain fields be completed (1997)
  New Command prefs ... (1997)
  DON'T use old cart file! (1997)
  ShowIf & CountChars (2000)
  Hogging all processing cycles (2004)
  [WebDNA] Amazon EC2 (2009)
  Special delete ... (1997)
  Configuration Q (1998)
  listcookies works only sometimes (1997)
  Server Freeze (1998)
  [WebDNA] read database from other domain (2015)
  2.0Beta Command Ref (can't find this instruction) (1997)
  Formating found categories (1997)
  do you have a webcatalog tool you want to sell? (1999)
  Did this just get cheaper ? (2003)
  [WebDNA] variable name limit - answer (2009)
  WebCatalog for guestbook ? (1997)
  Reversed words (1997)
  Bad card db - *mislabled post* (1999)