Re: Multithreading of [replace]

This WebDNA talk-list message is from 1999.
It keeps the original formatting.
numero = 23094
interpreted = N
texte = > Brian,
> 
> What about another approach? How about using the following code instead:
> 
> [appendfile logs/[date %d%b%Y].txt][cart][date][thisurl]etc...[/appendfile]
> 
> This way, every day you get a new logfile with all of your hits appended to
> one large text file. Now every week, you can do a listFiles command and
> sweep up these databases, summarize on the cart field (you need to create
> .hdr files) to get unique records, then produce whatever reports you need.
> 
> This shouldn't impact your server much, as you are appending to an existing
> file (the first hit of the day auto-creates a new file). If you are logging
> a lot of information that relies on logic, you may want to wrap this in a
> spawn context.
> 
> Aloha, Olin

3 reasons I don't do this:

a) No visitor count on the fly.
b) How do you figure first page and last page, start_visit_time and end_visit_time?
c) How do you put this info into the shopping cart when they check out?
d) BONUS - How do you read into RAM the resulting 350 meg file (assuming one line per page per visitor, along with all of the other info captured)?

The database is the best solution, I just can't figure out how to make it work efficiently.

If I spawn a process off, does it remember the variables of the page that created the spawn (i.e. cart)?

Brian B. Burton
BOFH - Department of Redundancy Department
---------------------------------------------------------------
MMT Solutions - Specializing in Online Shopping Solutions
973-808-8644
http://www.safecommerce.com
Are you a Web Programmer? I am today.
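
As a minimal sketch of the daily-logfile idea Olin describes above: the logs/ path and date format are his, while the [tab] delimiter and [time] field are illustrative assumptions rather than code from either post. With a matching .hdr file, as Olin notes, the weekly sweep can treat each day's file as a database and summarize on the cart field.

[!] Sketch only: one line per page view, appended to a text file named after the day. [/!]
[!] First/last page and start/end visit times would then fall out of the sweep as    [/!]
[!] the earliest and latest lines logged for a given cart value.                     [/!]
[appendfile logs/[date %d%b%Y].txt][cart][tab][date][tab][time][tab][thisurl]
[/appendfile]
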

>> -----Original Message-----
>> From: Brian B. Burton [mailto:brian@burtons.com]
>> Sent: Friday, January 29, 1999 2:38 PM
>> To: WebDNA-Talk
>> Subject: Multithreading of [replace]
>> 
>> This question is mostly for Grant H., as I think he probably would be the
>> most qualified person to answer this.
>> 
>> Given that I have written a database to do logging of visitors to the
>> website, so that I can do my own analysis of who did what, I am running
>> into a problem with the sheer number of write commands being issued to
>> this database, and it is affecting the speed of the whole website. The
>> code shown below is at the top of every webpage. (The carriage returns are
>> mine, for this email, and are not in the actual code; also, the info
>> appended and replaced is reduced for this email; in reality, almost 20
>> fields are set.)
>> 
>> [search db=carts.db&eqCARTdata=[cart]&DATEtype=date&DATEsort=1&DATEsdir=de]
>> [showif [numfound]=0]
>> [Append db=carts.db]
>> cart=[cart]&date=[date]&firstpage=[thisurl]&lastpage=[thisurl]
>> [/append]
>> [/showif]
>> [showif [numfound]=1]
>> [replace db=carts.db&eqCARTdata=[cart]]
>> lastpage=[thisurl]
>> [/replace]
>> [/showif]
>> [/search]
>> 
>> Now, I am using a search instead of two lookups, but I am under the
>> impression that searching records isn't very time consuming. This code
>> works just fine until there are about 25-30 simultaneous connections on
>> the server, at which point this code almost locks the poor server up.
>> Please remember, this is at the top of each and every page viewed. I can
>> only assume that one page cannot load until the last finishes, because
>> this code causes a database write, thus locking read access to the
>> database until the write is complete. Also, with about 7000 records in the
>> database (two days' worth for one site), even at lower connection counts
>> (10+) this runs kind of slow.
>> 
>> I'm sure the server would be plenty snappy if I didn't have this on the
>> pages, but unfortunately, the only other option is to write it into cart
>> headers, causing the creation of a file (5000 a day) for every cart issued.
>> 
>> So, Grant, here is my question: Is there anything that can be optimized
>> that will alleviate the speed hit this code causes under load?
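
Grant's reply comes later in the thread. Purely as a sketch of the direction Olin's "wrap this in a spawn context" remark points at, the same logic could be arranged as below. It drops the sort parameters, on the assumption that an exact match on a single cart record does not need the database sorted on every page view, and it deliberately leaves open Brian's question of whether [cart], [date] and [thisurl] still resolve inside the spawned code.

[!] Sketch only, not the answer given later in the thread.                       [/!]
[!] The search/append/replace runs inside a spawn context so the page can finish [/!]
[!] serving while the database write happens in its own thread; whether the      [/!]
[!] page's values ([cart], [date], [thisurl]) carry into the spawned code is     [/!]
[!] exactly the question raised above.                                           [/!]
[spawn]
[search db=carts.db&eqCARTdata=[cart]]
[showif [numfound]=0]
[Append db=carts.db]cart=[cart]&date=[date]&firstpage=[thisurl]&lastpage=[thisurl][/append]
[/showif]
[showif [numfound]=1]
[replace db=carts.db&eqCARTdata=[cart]]lastpage=[thisurl][/replace]
[/showif]
[/search]
[/spawn]
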

    
Associated Messages, from the most recent to the oldest:

  1. Re: Multithreading of [replace] (Kenneth Grome 1999)
  2. Re: Multithreading of [replace] (Christer Olsson 1999)
  3. Re: Multithreading of [replace] (Kenneth Grome 1999)
  4. Re: Multithreading of [replace] (Brian B. Burton 1999)
  5. Re: Multithreading of [replace] (Grant Hulbert 1999)
  6. RE: Multithreading of [replace] (Olin Lagon 1999)
  7. Re: Multithreading of [replace] (Brian B. Burton 1999)
  8. RE: Multithreading of [replace] (Olin Lagon 1999)
  9. Multithreading of [replace] (Brian B. Burton 1999)


Talk List

The WebDNA community talk-list is the best place to get some help: several hundred extremely proficient programmers with an excellent knowledge of WebDNA and an excellent spirit will deliver all the tips and tricks you can imagine...

Related Readings:

  BUG REPORT: Delete context ignores max parameter (1998)
  [WebDNA] mail header timezone problem (2008)
  sorting and summarizing (1998)
  The code, one more time. (2003)
  [taxrate] question (1997)
  WebCatalog 2.1 for NT (1998)
  [Shell] question (2000)
  AAgghh!! Help, please. SSL strikes again. (1997)
  RAM variables (1997)
  [WriteFile] problems (1997)
  [WebDNA] Solaris version? (2008)
  & in Lookups (1997)
  [OT] Weird Characters (2004)
  wild question (1998)
  WebCatalog can't find database (1997)
  2nd WebCatalog2 Feature Request (1996)
  New Mac Public Beta Available (1997)
  WebCatalog/Mac 2.1b2 New Features (1997)
  Attention SMSI - DOCS Error (2004)
  ConvertChars (1998)