Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop

This WebDNA talk-list message (#105623) is from 2010. It keeps the original formatting.
> Ken, something else: do you use your code locally?
> on a remote server? using WebDNA.fcgi 7.0?
> with accelerator activated?

I'm doing everything on the same laptop development computer running v6.2. I don't know what the "accelerator" is that you're referring to, but when it comes time to put this site online I'm hoping to use v7.x and probably your accelerator, assuming it's faster than what I'm using now.

> i am doing some basic tests on an accelerated
> WebDNA.fcgi 7.0 (local) and so far, i can append
> 4,500 records per second and replacing 50,000
> records takes less than a second. Here my results
>
> Open a database (2 fields), append 50000 records using a loop 00:00:13
> Replaces 50000 records 00:00:00
> ReplaceFoundItems 50000 records 00:00:00
> Just delete the records 00:00:00
> global time: 00:00:13

I just ran your test on my computer and got these results:

Open a database (2 fields), append 50000 records using a loop 00:00:06
Replaces 50000 records 00:00:00
ReplaceFoundItems 50000 records 00:00:00
Just delete the records 00:00:00
global time: 00:00:06

I ran it several times and came up with 5 seconds instead of 6 on two other tries, so mine is twice as fast as yours. By the way, you could use [elapsedtime] to produce more precise results. Here, try this code if you wish:

[!] ## Check for the Database first - create it if not there --[/!]
[showif T=[fileinfo file=speed.txt][exists][/fileinfo]]
[closedatabase db=speed.txt]
[deletefile file=speed.txt]
[/showif]

[showif F=[fileinfo file=speed.txt][exists][/fileinfo]]
[writefile file=speed.txt]sku value
[/writefile]
[/showif]

[!] ## Delete the records in the DB to ensure the search is 'clean' --[/!]
[delete db=speed.txt&neSKUdata=find_all]
start time = [elapsedtime] ticks

Open a database (2 fields), append 50000 records using a loop = [math show=f]startTicks=[elapsedtime][/math] [!] the append loop goes here, between the two [elapsedtime] captures [/!] [math show=f]endTicks=[elapsedtime][/math] [math]thisTicks=endTicks-startTicks[/math] ticks

Replaces 50000 records = [math show=f]startTicks=[elapsedtime][/math] [!] the [replace] context goes here [/!] [math show=f]endTicks=[elapsedtime][/math] [math]thisTicks=endTicks-startTicks[/math] ticks

ReplaceFoundItems 50000 records = [math show=f]startTicks=[elapsedtime][/math] [!] the [replacefounditems] context goes here [/!] [math show=f]endTicks=[elapsedtime][/math] [math]thisTicks=endTicks-startTicks[/math] ticks

Just delete the records = [math show=f]startTicks=[elapsedtime][/math] [!] the [delete] goes here [/!] [math show=f]endTicks=[elapsedtime][/math] [math]thisTicks=endTicks-startTicks[/math] ticks

total time for all tasks = [elapsedtime] ticks

Here's my results:

start time = 1 ticks
Open a database (2 fields), append 50000 records using a loop = 469 ticks
Replaces 50000 records = 0 ticks
Just delete the records = 0 ticks
total time for all tasks = 470 ticks

Sincerely,
Kenneth Grome
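For reference, a minimal sketch of how the first timed section above might be filled in, assuming WebDNA's standard [loop] and [append] syntax; the loop body below is illustrative, not the code actually used in the test:

[!] hypothetical example: time a 50000-record append with [elapsedtime] [/!]
[math show=f]startTicks=[elapsedtime][/math]
[loop start=1&end=50000]
[append db=speed.txt]sku=[index]&value=test[/append]
[/loop]
[math show=f]endTicks=[elapsedtime][/math]
Append 50000 records = [math]thisTicks=endTicks-startTicks[/math] ticks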
> i am doing some basic tests on an accelerated WebDNA.fcgi 7.0 (local) and so far, i can append 4,500 records per second and replacing 50,000 records takes less than a second.
>
> Here my results
>
> Open a database (2 fields), append 50000 records using a loop 00:00:13
> Replaces 50000 records 00:00:00
> ReplaceFoundItems 50000 records 00:00:00
> Just delete the records 00:00:00
> global time: 00:00:13
>
> Here the test code:
> ----------------------------------------------------------------------
>
> [!] ## Check for the Database first - create it if not there --[/!]
> [showif T=[fileinfo file=speed.txt][exists][/fileinfo]]
> [closedatabase db=speed.txt]
> [deletefile file=speed.txt]
> [/showif]
>
> [showif F=[fileinfo file=speed.txt][exists][/fileinfo]]
> [writefile file=speed.txt]sku value
> [/writefile]
> [/showif]
>
> [!] ## Delete the records in the DB to ensure the search is 'clean' --[/!]
> [delete db=speed.txt&neSKUdata=find_all]
>
> [text]start_time_global=[time][/text]
>
> Open a database (2 fields), append 50000 records using a loop
> [Math time]{[time]}-{[start_time]}[/Math]
>
> Replaces 50000 records
> [Math time]{[time]}-{[start_time]}[/Math]
>
> ReplaceFoundItems 50000 records
> [Math time]{[time]}-{[start_time]}[/Math]
>
> Just delete the records
> [Math time]{[time]}-{[start_time]}[/Math]
>
> global time: [Math time]{[time]}-{[start_time_global]}[/Math]
> ----------------------------------------------------------------------
>
> - chris
>
> On Jul 19, 2010, at 9:25, Kenneth Grome wrote:
>
>> Here's the code in my trigger template which is requested once per second. Instead of trying to create 45 new records in a couple of seconds I have resorted to creating only 1-4 new records per second (although I want to create many more for a truly realistic auction simulation).
>>
>> The tickets.db holds all the bidderID (sku) values. I need to search this db and retrieve a series of random bidderID's for the replace contexts:
>>
>> ***************************************************************
>>
>> [hideif [getchars start=4&end=5][time][/getchars]\2]
>> [showif [getchars start=7&end=8][time][/getchars]\11]
>> [search db=test/tickets.db&eqemaildatarq=XXX&raemailsort=1&max=[math]ceil((1+[random])/25)[/math]][founditems]
>> [replace db=test/bids.db&eqbidderIDdatarq=[blank]&asidxsort=1&idxtype=num&max=1]bidTime=[time]&bidderID=[sku][/replace]
>> [/founditems][/search]
>> [/showif]
>> [/hideif]
>>
>> [showif [getchars start=4&end=5][time][/getchars]\6]
>> [showif [getchars start=7&end=8][time][/getchars]\17]
>> [search db=test/tickets.db&eqemaildatarq=XXX&raemailsort=1&max=[math]ceil((1+[random])/25)[/math]][founditems]
>> [replace db=test/bids.db&eqbidderIDdatarq=[blank]&asidxsort=1&idxtype=num&max=1]bidTime=[time]&bidderID=[sku][/replace]
>> [/founditems][/search]
>> [/showif]
>> [/showif]
>>
>> ***************************************************************
>>
>> Naturally I cannot use this technique with replacefounditems because I'm searching in a different db than where the records need to be replaced. But there may be another way to approach this problem. I think this might work faster:
>>
>> 1. Search the tickets.db for (example) 45 random sku values
>> 2. Store these skus in an indexed table
>> 3. Search the bids.db for the next 45 records with blank bidderID values
>> 4. Use replacefounditems to change these bidderID's based on lookups of the index values in the table
>>
>> I don't know if this will be faster or not, but it's probably worth a try as soon as I have some free time again.
>>
>> Sincerely,
>> Kenneth Grome
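A rough sketch of the four-step approach described at the end of the quoted message might look like the following. It assumes the in-memory [table] context and [lookup table=...] syntax, and it assumes [replacefounditems] evaluates its body once per found record with [index] available, which is what the plan itself relies on; the table name randomSkus is an illustrative choice:

[!] steps 1-2: pull 45 randomly ordered skus from tickets.db into an in-memory table keyed by position [/!]
[table name=randomSkus&fields=idx,sku][search db=test/tickets.db&eqemaildatarq=XXX&raemailsort=1&max=45][founditems][index]	[sku]
[/founditems][/search][/table]

[!] step 3: find the next 45 bids whose bidderID is still blank, in idx order [/!]
[search db=test/bids.db&eqbidderIDdatarq=[blank]&asidxsort=1&idxtype=num&max=45]
[!] step 4: a single replacefounditems pass; each record's new bidderID comes from the table row at the same position [/!]
[replacefounditems]bidTime=[time]&bidderID=[lookup table=randomSkus&lookInField=idx&value=[index]&returnField=sku&notFound=][/replacefounditems]
[/search]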

    
Associated Messages, from the most recent to the oldest:

  1. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (christophe.billiottet@webdna.us 2010)
  2. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (Kenneth Grome 2010)
  3. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (Kenneth Grome 2010)
  4. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (christophe.billiottet@webdna.us 2010)
  5. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (Kenneth Grome 2010)
  6. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (Kenneth Grome 2010)
  7. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (christophe.billiottet@webdna.us 2010)
  8. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (Kenneth Grome 2010)
  9. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (christophe.billiottet@webdna.us 2010)
  10. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (Kenneth Grome 2010)
  11. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (christophe.billiottet@webdna.us 2010)
  12. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (Kenneth Grome 2010)
  13. Re: [WebDNA] CORRECTION: 60% failure rate using replace in a loop (christophe.billiottet@webdna.us 2010)
  14. [WebDNA] CORRECTION: 60% failure rate using replace in a loop (Kenneth Grome 2010)
Kenneth Grome


Related Readings:

Web Delivery First Time Setup Trouble (2000) TCPConnect (2002) [WebDNA] [thisurlplusget] not working (2012) hideif [x]=1,2,3.. (2003) syntax question, not in online refernce (1997) webcatalog 3.0 upgrade (1998) Almost a there but..bye bye NetCloak (1997) WebMerchant when CC network is down (1998) More on the email templates (1997) [CART] inside a [LOOP] (1997) SETCOOKIE Problems (2003) How can I Add several Items into the cart at once? (1997) Quickie question on the email templates (1997) Webcat/Webmerchant (1998) Moving Files (2000) Merging databases (1997) Tax Troubles (2003) autosensing lanague selection (1997) RE: WebDNA-Talk searchable? (1997) [shownext] and sort (1998)