This is likely due to the following waterfall:
- Page loads (documentReady fires)
- vw/base.js sees TOC present, and subsequently requests toc.js
- vw/base.js sees sections present, and subsequently requests section.js
- once toc.js is loaded, TOC links will work.
- once section.js is loaded, it jumps to the vwsec element matching the requested anchor, if that anchor actually refers to a wiki section.
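The lazy-loading step in that waterfall can be sketched roughly as follows. This is an illustration only, assuming hypothetical selectors and function names; it is not VaultWiki's actual API:

```javascript
// Hypothetical sketch of the decision vw/base.js makes on documentReady:
// detect which features are on the page, then request only those scripts.
// The selectors ('.vw-toc', '.vwsec') and the helper name are assumptions.
function scriptsToRequest(hasToc, hasSections) {
  var scripts = [];
  if (hasToc) scripts.push('toc.js');
  if (hasSections) scripts.push('section.js');
  return scripts;
}

// In the browser, the loader would then inject the chosen scripts:
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', function () {
    scriptsToRequest(
      !!document.querySelector('.vw-toc'),  // assumed TOC marker
      !!document.querySelector('.vwsec')    // assumed section marker
    ).forEach(function (src) {
      var s = document.createElement('script');
      s.src = src;
      s.async = true;
      document.head.appendChild(s);
    });
  });
}
```

The key point is that each script request only begins after documentReady, which is why anchor jumps that depend on section.js arrive late in the waterfall.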
Unfortunately, to prevent a phishing technique, user-generated anchors must be prefixed with vwsec and jumped to using JavaScript after the document is ready, rather than via the browser's built-in jump or XenForo's. However, we may be able to shorten the waterfall after that point. In general, though, some timing delay will occur for any JavaScript functionality, because of lazy-loading. The alternative to lazy-loading is to front-load scripts, which means loading many scripts when they are not actually needed (something other requests have asked VaultWiki not to do). So the trick is to find a balance.
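To illustrate why the JavaScript jump is needed: the browser's native jump fires before section.js has loaded, so the script must re-check the fragment once it is ready. A minimal sketch, where the prefix check stands in for the phishing guard described above and the helper name is hypothetical:

```javascript
// Only anchors carrying the vwsec prefix are treated as wiki sections;
// anything else is ignored rather than jumped to.
function isWikiSectionAnchor(hash) {
  // strip the leading '#' and require the vwsec prefix
  var id = (hash || '').replace(/^#/, '');
  return /^vwsec/.test(id) ? id : null;
}

// Once section.js is ready, it can perform the deferred jump:
if (typeof document !== 'undefined') {
  var id = isWikiSectionAnchor(window.location.hash);
  var target = id && document.getElementById(id);
  if (target) target.scrollIntoView();
}
```

Because this can only run after section.js arrives, any latency in the waterfall before that point delays the jump.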
Unfortunately, in the case of anchor links, it is not possible to learn the target from the incoming request when PHP generates the page, because the requested anchor (the URL fragment) is never sent to the server; it exists only client-side. So we must rely on client-side code (JavaScript), or else assume that toc.js will be needed in some cases and load it even when it may not be.
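A quick way to see this: only the path and query string go into the HTTP request line, so the fragment never reaches PHP. A small illustration using the standard URL API (the helper name is mine):

```javascript
// What the server actually receives as the request target:
// the fragment (u.hash) is dropped by the browser before the request is sent.
function httpRequestTarget(url) {
  var u = new URL(url);
  return u.pathname + u.search; // '#vwsec2' never appears here
}

// httpRequestTarget('https://example.com/wiki/page?x=1#vwsec2')
//   → '/wiki/page?x=1'
```

This is why the anchor can only be inspected client-side, via location.hash, after the page is already in the browser.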
In 4.2, we eliminate the dependency on YUI and move to jQuery, which XenForo already loads, so there should be no timing delay from that. I will see if we can do more.
Two steps you can already take are as follows:
- Make sure your timing tests are not done in debug mode. Debug mode bypasses the bundled script files we already provide, which reduce latency.
- Make sure you are using a CDN for static wiki assets (Options > VaultWiki: Site Config). This further reduces latency and works around the limit of roughly 6 simultaneous connections per domain that browsers impose.
In the browser's developer tools, watch the request waterfall. Any further insight you can provide into how it flows on your site will help us determine the best ways to improve latency for your specific case.