Here are a number of wonderful bits I picked up from what I've been able to watch:
- The 99th percentile, or another percentile "deep into the long tail", is much more responsive to condition changes than the median is. Monitor it when possible.
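  A quick sketch of why that is (the numbers here are my own illustration, not from the talk): if a regression hits only a small slice of requests, the median barely moves, but the 99th percentile jumps immediately.

  ```python
  import random

  def percentile(values, p):
      """Nearest-rank percentile of a list of samples."""
      ordered = sorted(values)
      k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
      return ordered[k]

  random.seed(42)

  # Baseline: most requests are fast, ~100ms.
  baseline = [random.gauss(100, 10) for _ in range(10_000)]

  # Regression: 5% of requests hit a slow path (say, a cold cache) and gain ~900ms.
  regressed = [t + 900 if random.random() < 0.05 else t for t in baseline]

  for label, sample in (("baseline", baseline), ("regressed", regressed)):
      print(f"{label}: p50={percentile(sample, 50):.0f}ms  p99={percentile(sample, 99):.0f}ms")
  ```

  The median shifts by under a millisecond, while p99 climbs by hundreds of milliseconds — the tail metric surfaces the problem long before the median does.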
- There's a lot of time between when a user begins to express an intention (by typing a URL or a search query) and when they act on it. That idle time is a potential moment we can use to improve load speeds, if we can figure out how to use it. Google Chrome uses prerendering with both the Omnibox and Google search result pages. It sounds like a very good idea to help Firefox, Chrome, and any other browser that supports these hints by emitting <link rel="prefetch"> and <link rel="prerender"> tags when we have a strong hint where the user will go next.
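  For instance, a page that's fairly confident about the user's next destination could emit hints like these (the URL is hypothetical):

  ```html
  <!-- Ask the browser to fetch the likely next page at low priority. -->
  <link rel="prefetch" href="https://example.com/likely-next-page">
  <!-- Ask a supporting browser (Chrome) to fully prerender it offscreen. -->
  <link rel="prerender" href="https://example.com/likely-next-page">
  ```

  Browsers that don't recognize a given hint simply ignore it, so these degrade gracefully.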
- There's a Page Visibility API.
- IE now exposes information about a "restarted tokenizer", offering insight into how to make pages load resources earlier. The speaker said that the IE Blog has more info; it looks like that might be referring to this post, or perhaps this one about the Fiddler tool.
- WebPageTest has iPhone testing and ever-deeper tooling, including downloading detailed Chrome timeline information. Thanks to Akamai for open-sourcing the Blaze.io code that makes this possible!
- Google PageSpeed Insights has a Critical Path Explorer that traces asset loading to its cause, documenting what the blocking elements are.
- The paper "How Complex Systems Fail" is reportedly critical reading on "resilience engineering."