I did a similar thing a while ago on two high-traffic sites.
Basically, the concept was this: every page had 3-level-deep menu pulldowns. Those menus were driven by a backend CMS and stored across various tables. The site was serving well over 1 million page views a month, and I was starting to feel the weight of it in performance.
Keep in mind that each pulldown was a recursive search for each level, so it really added up.
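To illustrate the cost being described, here is a rough WebDNA sketch of that per-page pattern (database and field names are hypothetical): one [search] per menu level, nested inside the [founditems] of the level above, executed on every single page view.

```
[!] Hypothetical sketch: a fresh [search] per level, on every page view [/!]
[search db=menus.db&eqPARENTIDdatarq=0]
[founditems]<li>[NAME]
  [search db=menus.db&eqPARENTIDdatarq=[ID]]
  [founditems]<li>[NAME]</li>[!] ...and a third [search] nests here for level 3 [/!]
  [/founditems][/search]
</li>
[/founditems][/search]
```

With three levels, the number of searches grows with the number of menu items at each level, which is why it "really added up" under real traffic.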
Instead, I retooled the CMS so that adding, deleting or editing any menu in the admin resulted in the menu being written out completely to a single include file. This moved the work of the recursive searching to the backend, and it only ran when a change was actually needed.
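A rough WebDNA sketch of that retooling (file, database and field names here are hypothetical): the admin save handler renders the whole menu once and writes the result with [writefile], and the public templates only ever [include] the finished file.

```
[!] Run only when a menu is added, edited or deleted in the admin [/!]
[writefile file=includes/mainmenu.inc][search db=menus.db&eqPARENTIDdatarq=0]
[founditems]<li>[NAME]</li>
[/founditems][/search][/writefile]

[!] Public pages then do just this -- no searching at all [/!]
[include file=includes/mainmenu.inc]
```

Since [writefile] processes the WebDNA inside it before writing, the include file ends up containing plain pre-rendered HTML, which is about as cheap to serve as it gets.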
It made an immediate difference in page performance.
Later, this was done on another site with similar traffic that was entirely driven by SQL connections. Because of the ODBC performance hit, speed improved even more dramatically. This was especially true when the tables had 1.5 million records in them.
If you're unsure of the impact, throw an [elapsedtime] tag on the page and you will instantly know just how much performance you squeezed out of the system.
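For reference, [elapsedtime] reports how long WebDNA has spent building the current page, in clock ticks (60ths of a second, if memory serves), so a quick footer like this makes a before/after comparison easy:

```
[!] Drop at the bottom of a template to see how long the page took to build [/!]
<!-- built in [elapsedtime] ticks ([math][elapsedtime]/60[/math] seconds) -->
```

Put it on the page before and after switching to the static include, and the difference shows up immediately in the page source.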
HTH
Alex
On Jul 3, 2018, at 2:24 PM, Lawrence Banahan wrote:

I was thinking more of something like a CMS, with the engine online.
Doesn't it make sense to have content that changes once a year be in static pages?
Wouldn't it be faster than having WebDNA in the middle?
I'm also working on WordPress websites, and it's so slow... That's how, through my searches, I came across some websites using static pages.
---------------------------------------------------------
This message is sent to you because you are subscribed to the mailing list talk@webdna.us
To unsubscribe, E-mail to: talk-leave@webdna.us
Archives: http://www.webdna.us/page.dna?numero=55
Bug Reporting: support@webdna.us
Alex Mccombie
Related Readings:
SAVECART (1997)
Problems with command=replace in realm (2000)
Where is f2? (1997)
Kaaaaahhhhhhhnnnnnnn! (1997)
[OT] How'd it go, Sal? (2006)
RE: redirect with more than 256 characters (1999)
ooops...WebCatalog [FoundItems] Problem - LONG - (1997)
lookup and two records? (1997)
Cart Number sequence (1997)
Fwd: Problems with Webcatalog Plug-in (1997)
[sendmail] questions... (1997)
Summing fields (1997)
Robots fill event log (1997)
[OT] Quick Java Help (2004)
Upgrade to 4.0 (2001)
[WebDNA] Clarifying talklist emails (2020)
Forms Search Questions (1997)
Nested tags count question (1997)
No comment (1997)
Sense/Disallow HTML tags during $Append (1997)