Re: how do we turn off staticpages page creation?
This WebDNA talk-list message is from 2003
That is why I suggested renaming the generated file from a .tpl to some other ext. So even if it is crawled, it will not get translated. For all it is worth, remove the file if you are not using it.

----- Original Message -----
From: Glenn Busbin
To: WebDNA Talk
Sent: Wednesday, May 07, 2003 9:41 AM
Subject: Re: how do we turn off staticpages page creation?

> >Set a robots.txt file in that directory and set whatever pages you do not
> >want crawled.
>
> That will work for good bots, but there are those that look for the names
> of pages and ignore the DISALLOW tag. That's why it's best to have no
> reference at all, either on a template or in the robots.txt file, to
> anything you don't want visited.
>
> The ^*#&@^$! spammers' address harvesters will crawl anything they can
> get to while searching for email addresses. Make a honeypot page that does
> nothing but record hits and mention it only in the robots.txt file with a
> DISALLOW tag. I'll bet a nickel it gets hit.
>
> Glenn

-------------------------------------------------------------
This message is sent to you because you are subscribed to the mailing list .
To unsubscribe, E-mail to:
To switch to the DIGEST mode, E-mail to
Web Archive of this list is at: http://webdna.smithmicro.com/
-------------------------------------------------------------
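The honeypot trick Glenn describes can be sketched as a robots.txt entry (the /trap.html path is a hypothetical example). The page is linked from nowhere else, so any request for it must come from a crawler that read robots.txt and deliberately ignored the Disallow rule:

```
# robots.txt at the site root -- a minimal sketch.
# /trap.html is a hypothetical honeypot page that does nothing but
# record hits; it is mentioned nowhere except this Disallow rule.
User-agent: *
Disallow: /trap.html
```

Well-behaved crawlers will skip /trap.html; address harvesters that mine robots.txt for page names will fetch it anyway, and the honeypot's hit log identifies them.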
Related Readings:
Nested tags count question (1997)
process SSI (1998)
off topic - dna snipets (1997)
[replace] has protection feature like [delete]? (2000)
Extended [ConvertChars] (1997)
Re[2]: Webcatalog 4.0 - When will we be able to beta test it (2000)
Possible solution to malformed pages in NN (2000)
[WriteFile] problems (1997)
[redirect] w/o showing args? (1999)
using showpage and showcart commands (1996)
Summing fields (1997)
quantity minimum problem (1997)
keep W* in front applescript? (1998)
PCS Frames (1997)
[WebDNA] character encodings in linux (2011)
Empty Shopping Carts? (1998)
A multi-processor savvy WebCatalog? (1997)
authenticating a second user (1997)
WebCat2 Append problem (B14Macacgi) (1997)
apostrophe in search item (1997)