So if you can eliminate (more than) 90% of all requests through simple caching, you get a huge boost in page rendering, especially when the latency to the server is high. For me that boost is worth the additional effort of setting up a dispatcher.
(Usually I don't care about invalidating these files. When a new release is deployed on authoring, the cache is dropped as part of the deployment process, so that's not a problem.)
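The setup described above can be sketched roughly as follows in dispatcher.any. This is only an illustrative example, not a configuration from the thread: the docroot path and glob patterns are assumptions, and you would adapt them to the paths that actually dominate your author traffic. Note that in the /rules section the last matching rule wins, so the catch-all deny comes first:

```
/cache
  {
  # illustrative docroot, adjust to your environment
  /docroot "/var/www/author-cache"
  /rules
    {
    # deny everything by default so editable content is never cached ...
    /0000 { /glob "*" /type "deny" }
    # ... then allow the static, rarely-changing resources
    /0001 { /glob "/etc.clientlibs/*" /type "allow" }
    /0002 { /glob "/libs/*" /type "allow" }
    }
  }
```

Since the cache is wiped on every deployment anyway (as described above), stale files in these paths are not a concern.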
More importantly, if you access the author instance over a slow network, also configure gzip on the Apache server so that widgets.js (~6-7 MB) and other heavy files are delivered compressed.
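Enabling compression in Apache is typically done with mod_deflate. A minimal sketch (in httpd.conf or a vhost), assuming the module ships with your Apache build and the module path matches your installation:

```apache
# load mod_deflate (path is an assumption; adjust to your install)
LoadModule deflate_module modules/mod_deflate.so

<IfModule mod_deflate.c>
    # compress text-based responses, including large files like widgets.js
    AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json
</IfModule>
```

Binary formats such as images are already compressed and are deliberately left out of the filter.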
Thanks Jörg, marking this as the correct answer. Changing /allowAuthorized to 1 does give the benefits I was expecting. Perhaps someone from Adobe could update the files they provide (in the KB article) to make this the default, as it seems strange that the configuration they offer doesn't cache much.
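For anyone landing here later, the change in question is a one-line setting in the /cache section of dispatcher.any. With the default of "0", any request carrying authentication (which is every request on an author instance) bypasses the cache; setting it to "1" lets authenticated requests be served from cache, so make sure your /rules only allow paths that are safe to share between users. A sketch of just the relevant fragment:

```
/cache
  {
  # serve cached files even for authenticated (author) requests
  /allowAuthorized "1"
  # ... /docroot, /rules etc. as in your existing configuration
  }
```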
Thanks Nainan, good point. I'd thought of setting Expires headers to ensure browser caching, but you're right that widgets.js would clearly benefit from gzip.