Re: [WebDNA] Not even sure what to ask for help on. . . :(

This WebDNA talk-list message is from 2008. It keeps the original formatting.
numero = 101636
interpreted = N
texte = OK, sorry, but I have to be brutally honest here; it's just me, and nothing personal. I just want to try to understand the problem.

From your previous email you said: "Ok, so I just did a soft launch of a site on Friday and my site traffic jumped over 200%. Normally, that would be great, except the site has now slowed to a crawl."

Which to me means that under normal, low-load conditions your lookups are working fine and everything functions normally. If it were a problem in the code, you would also see it under low load, yes/no?

Lookups and searches in WebDNA are normally extremely fast, way faster than the available bandwidth, because they are usually done in RAM, unless you're pulling them via SQL or using an old 386/System 6 processor. Pulling from RAM means it doesn't have to read from disk, and I wouldn't use SQL unless I had several thousand records anyway.

IMHO, I still think it's bandwidth. On a 6 Mb/sec line you might max out, and start grinding to a halt, at about 23 streaming connections (avg. 256 kb/s per connection) all pulling at once. I currently have to split out loads for the same reason: I park all intensive loads on a high-bandwidth network and use our servers, on a completely separate network, just to serve out the code. It also has the advantage of differentiating between a coding problem and a bandwidth problem. I actually purchase space on the backbone for the same reason, for about $3.00-$4.00 per month per site from directnic.

Just my 2 cents...

Rob

On 15-Dec-08, at 7:00 PM, David Bastedo wrote:

> I stream through the same pipe and can handle up to 6 Mb a second -
> which I have come close to, but not quite attained. The max I have hit
> in the last week is 5.5. The a/v is streaming fine - though that
> server is also maxed out and is being updated and reconfigured. That
> has been rectified temporarily - there is a memory leak in Flash Com
> Server, though my connection can handle several hundred streaming
> connections.
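Rob's 23-connection figure is just line capacity divided by per-stream bitrate. A quick sketch of the arithmetic (in Python purely for illustration, assuming 1 Mb = 1000 kb):

```python
# How many 256 kb/s streams fit in a 6 Mb/s pipe before it saturates?
line_capacity_kbps = 6 * 1000   # 6 Mb/s line, in kilobits per second
stream_bitrate_kbps = 256       # average bitrate per streaming connection

max_streams = line_capacity_kbps // stream_bitrate_kbps
print(max_streams)  # 23
```

Past that point every additional viewer degrades all existing streams, which matches the "grinding to a halt" symptom under load.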
> I did do a test and am spending my night doing more. One culprit is
> the lookups in that search. Doing a nested search is way faster. I
> hope to go through all the major chunks and see what I can streamline.
>
> I'll post some results of side-by-side tests in a few hours, if that
> helps, which is pretty well what I need to do.
>
> D.
>
> On Mon, Dec 15, 2008 at 8:25 PM, Rob wrote:
>> Sounds more like a bandwidth problem than a WebDNA problem... What
>> kind of line is this on?
>>
>> Rob
>>
>> On 15-Dec-08, at 2:59 PM, David Bastedo wrote:
>>
>>> Ok, so I just did a soft launch of a site on Friday and my site
>>> traffic jumped over 200%. Normally, that would be great, except the
>>> site has now slowed to a crawl.
>>>
>>> I have many images on a separate server, and I have just added 6 GB
>>> to the server - emergency-like, hoping it will help - it has,
>>> marginally - and now I am in the process of adding a third server -
>>> I also have one for streaming - and am planning on moving everything
>>> to MySQL - I think - though it would not be my preference.
>>>
>>> Anyway, before I can even contemplate that - doing that will take a
>>> fair bit of time - I need to get the current site as fast as
>>> possible, to buy me some time to do this new update.
>>>
>>> I guess my biggest question is on tables. I am using tables on this
>>> site and I think that this may be the biggest issue. I need to do a
>>> lot of sorting and it "seemed" like the best, most convenient way to
>>> do it, though now I am wondering if this has caused way more
>>> problems than it has solved.
>>>
>>> Is it better to write to a temp db and then sort those results, if I
>>> have to, rather than a table?
>>>
>>> Here is a sample piece of code.
>>> (I am making custom music playlists, BTW.)
>>>
>>> [table name=MyPlayListData&fields=PlayListItemID,PlayListID,Sequence,ConcertID,FLV_FileName,UserID,DateCreated,LastUpdate,FLV_Length,PlayListName,PlayListDescription,AlbumID,HSPSongID,PlayListID,PlayListDescription,UserID,PlayListType,DateCreated,timeTotal,MySongName,AlbumName,releaseDate,rating,MyRating][/table]
>>>
>>> [Search db=[pagePath]databases/aaa.db&gePlayListIDdata=0&eqAlbumIDdata=303&albumIDtype=num&[SO]sort=1&[SO]dir=[SB]&[SO]Type=num]
>>> [founditems]
>>> [replace table=MyPlayListData&eqPlayListIDdatarq=[PlayListID]&PlayListIDtype=num&eqUserIDdatarq=[UserID]&UserIDtype=num&eqHSPSongIDdatarq=[HSPSongID]&append=T][!]
>>> [/!]PlayListItemID=[PlayListItemID][!]
>>> [/!]&PlayListID=[PlayListID][!]
>>> [/!]&Sequence=[Sequence][!]
>>> [/!]&ConcertID=[ConcertID][!]
>>> [/!]&FLV_FileName=[FLV_FileName][!]
>>> [/!]&UserID=[UserID][!]
>>> [/!]&DateCreated=[DateCreated][!]
>>> [/!]&LastUpdate=[LastUpdate][!]
>>> [/!]&FLV_Length=[FLV_Length][!]
>>> [/!]&PlayListName=[PlayListName][!]
>>> [/!]&PlayListDescription=[PlayListDescription][!]
>>> [/!]&AlbumID=[LOOKUP db=[pagePath]databases/yyy.db&value=[PlayListID]&lookInField=PlayListID&returnField=AlbumID][!]
>>> [/!]&HSPSongID=[HSPSongID][!]
>>> [/!]&PlayListName=[LOOKUP db=[pagePath]databases/yyy.db&value=[PlayListID]&lookInField=PlayListID&returnField=PlayListName][!]
>>> [/!]&PlayListDescription=[LOOKUP db=[pagePath]databases/yyy.db&value=[PlayListID]&lookInField=PlayListID&returnField=PlayListDescription][!]
>>> [/!]&UserID=[LOOKUP db=[pagePath]databases/yyy.db&value=[PlayListID]&lookInField=PlayListID&returnField=UserID][!]
>>> [/!]&PlayListType=[LOOKUP db=[pagePath]databases/yyy.db&value=[PlayListID]&lookInField=PlayListID&returnField=PlayListType][!]
>>> [/!]&DateCreated=[LOOKUP db=[pagePath]databases/yyy.db&value=[PlayListID]&lookInField=PlayListID&returnField=DateCreated][!]
>>> [/!]&rating=[LOOKUP db=[pagePath]databases/yyy.db&value=[PlayListID]&lookInField=PlayListID&returnField=rating]&MyRating=[search db=[pagePath]databases/xxx.db&eqStoryIDdatarq=[PlayListID]&eqUserIDdatarq=[GETCOOKIE name=xxx]][founditems][TheRating][/founditems][/search][/replace]
>>> [/founditems]
>>> [/Search]
>>>
>>> -> Then I have to do two more searches: one for the results and one
>>> for next/prev.
>>>
>>> [search table=MyPlayListData&gePlayListIDData=0&eqalbumIDdatarq=303&PlayListIDsumm=T&[SB]sort=1&[SB]sdir=[SO]&[SB]type=[SB_type]&startAt=[startat]&max=10]
>>>
>>> I know I can make this code more streamlined, but I am not sure if
>>> it is the tables that are the problem.
>>>
>>> Without a load, these pages work great, but with the increased
>>> traffic, it now takes - well, WAY too long to load a page. Anyway,
>>> I am going through and making my code thinner, as it were - I can
>>> get rid of a bunch of the lookups above and replace them with
>>> another search, but I am wondering if I should replace all the
>>> tables in the site with a temp .db.
>>>
>>> Any thoughts or advice? Thanks in advance.
>>>
>>> D.
>>>
>>> --
>>> David Bastedo
>>> Ten Plus One Communications Inc.
>>> http://www.10plus1.com
>>> 416.603.2223 ext.1
>>> ---------------------------------------------------------
>>> This message is sent to you because you are subscribed to
>>> the mailing list .
>>> To unsubscribe, E-mail to:
>>> archives: http://mail.webdna.us/list/talk@webdna.us
>>> old archives: http://dev.webdna.us/TalkListArchive/
>
> --
> David Bastedo
> Ten Plus One Communications Inc.
> http://www.10plus1.com
> 416.603.2223 ext.1

Associated Messages, from the most recent to the oldest:

    
  1. Re: [WebDNA] Not even sure what to ask for help on. . . :( ("David Bastedo" 2008)
  2. Re: [WebDNA] Not even sure what to ask for help on. . . :( ("David Bastedo" 2008)
  3. Re: [WebDNA] Not even sure what to ask for help on. . . :( (christophe.billiottet@webdna.us 2008)
  4. Re: [WebDNA] Not even sure what to ask for help on. . . :( (Kenneth Grome 2008)
  5. Re: [WebDNA] Not even sure what to ask for help on. . . :( ("David Bastedo" 2008)
  6. Re: [WebDNA] Not even sure what to ask for help on. . . :( (Frank Nordberg 2008)
  7. Re: [WebDNA] Not even sure what to ask for help on. . . :( ("David Bastedo" 2008)
  8. Re: [WebDNA] Not even sure what to ask for help on. . . :( (Kenneth Grome 2008)
  9. Re: [WebDNA] Not even sure what to ask for help on. . . :( (Rob 2008)
  10. Re: [WebDNA] Not even sure what to ask for help on. . . :( ("David Bastedo" 2008)
  11. Re: [WebDNA] Not even sure what to ask for help on. . . :( (Rob 2008)
  12. [WebDNA] Not even sure what to ask for help on. . . :( ("David Bastedo" 2008)
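David's finding that "a nested search is way faster" than stacked [LOOKUP]s has a general explanation: each [LOOKUP] in the code above re-scans yyy.db, once per field per found item, while fetching the whole matching record once retrieves every field in a single pass. A minimal Python sketch of the same refactor (the records and field names are made up for illustration; this is not WebDNA's internal implementation):

```python
# Per-field repeated scans vs. one pre-built index: the refactor behind
# replacing many [LOOKUP]s with a single search per playlist.
# These records stand in for yyy.db rows; values are hypothetical.
playlists = [
    {"PlayListID": 1, "AlbumID": 303, "PlayListName": "Mix A", "rating": 4},
    {"PlayListID": 2, "AlbumID": 303, "PlayListName": "Mix B", "rating": 5},
]

# Slow pattern: one linear scan per field, per row (like stacked [LOOKUP]s).
def lookup(records, value, look_in, return_field):
    for rec in records:
        if rec[look_in] == value:
            return rec[return_field]
    return None

name = lookup(playlists, 2, "PlayListID", "PlayListName")   # full scan
rating = lookup(playlists, 2, "PlayListID", "rating")       # full scan again

# Fast pattern: locate the record once, then every field is a direct read.
index = {rec["PlayListID"]: rec for rec in playlists}
row = index[2]
print(row["PlayListName"], row["rating"])  # Mix B 5
```

With N found items and F fields, the slow pattern does N*F scans of the database; the single-fetch pattern does the matching work once per item, which is why the difference only becomes painful under load.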

