7 votes

Smaller HTML Payloads with Service Workers

7 comments

  1. smores

    This approach actually reminded me a lot of Tildes' usage of intercooler.js and HATEOAS. The service worker approach has a neat advantage of using service workers to save the common pieces (e.g. the Tildes header and footer) across independent page loads (i.e. if you close your browser and open it again later). This lets you speed up and minimize all page loads after the first.
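
A minimal sketch of the stitching this describes, kept as pure functions so the logic is clear; the helper names and the `partial=content` URL convention are illustrative assumptions, not anything from the article or from Tildes:

```javascript
// Map a full page URL to a hypothetical "content only" partial URL.
// The `partial=content` query parameter is an assumed server convention.
function contentUrlFor(pageUrl) {
  const url = new URL(pageUrl, "https://example.com");
  url.searchParams.set("partial", "content");
  return url.toString();
}

// Wrap a freshly fetched content partial in the shell pieces (header and
// footer) that the service worker cached on the first visit.
function assemblePage(shellHeader, contentHtml, shellFooter) {
  return shellHeader + contentHtml + shellFooter;
}

// In a real service worker, a fetch handler would respond to navigations
// roughly like:
//   assemblePage(cachedHeader, await (await fetch(contentUrlFor(req.url))).text(), cachedFooter)
// so only the content partial crosses the network on repeat page loads.
```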

    4 votes
  2. [3]
    joplin

    This sounds neat at first, but given how awful Javascript has turned out, I don't want web pages to be able to send me any executable code. It's a huge vector for malware and tracking and doesn't usually add anything.

    2 votes
    1. [2]
      Diff

      I can totally get behind this line of thinking. I blacklist Javascript by default, but I'll whitelist a site that doesn't break with it disabled. Just because there's a lot of bad behavior doesn't mean there's no value to be found. The vast majority of websites use Javascript to do nothing much of value while adding entire mebibytes of weight.

      But I find articles like this interesting for my own use, and for the few sites that don't completely abuse JS. There are many ways to use Javascript to make tiny positive progressive enhancements, and this is an interesting one. If you could download the basic templates just once, then everything after that is pure content, with little to no markup overhead, and without breaking for Javascript-less browsers. That kind of thing is Javascript at its best; sad we don't see it more often.

      3 votes
      1. ThatFanficGuy

        You may find instant.page (GitHub repo) interesting.

        I am, for whatever reason, opposed to implementing JS on what is supposed to be a plain, static website. Instant.Page is one of the things I want to implement: if it works, it improves the feel during navigation – it gives joy, which is the opposite of friction. If it fails, it fails gracefully and seamlessly, without ever interrupting navigation.
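
The "fails gracefully" property comes from the technique itself: preloading on hover is purely additive, and a failed preload just means a normal click. A rough sketch of the core decision (this is the idea behind instant.page, not its actual code; the function name is mine):

```javascript
// Decide whether a hovered link is worth preloading: same-origin links
// only, and skip URLs that have already been preloaded this session.
function shouldPreload(href, currentOrigin, alreadyPreloaded) {
  let url;
  try {
    url = new URL(href, currentOrigin);
  } catch {
    return false; // not a resolvable URL
  }
  if (url.origin !== new URL(currentOrigin).origin) return false;
  if (alreadyPreloaded.has(url.href)) return false;
  return true;
}

// In the browser, a mouseover listener would call this and, when it returns
// true, append <link rel="prefetch" href=…> to document.head. If the
// prefetch errors out, the eventual click still navigates normally.
```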

        1 vote
  3. [2]
    ThatFanficGuy

    It's an interesting approach.

    I've been researching HTML templating and partials for a while now, given the obvious positive implications of their usage. Most approaches I've seen rely either on client-end rendering (meaning they have to also download a massive JS payload they won't need otherwise) or on build-process rendering (which may well be automated, though it feels like too big an overhead during development for me).

    Server-side rendering might work better than either of those, but I haven't had the opportunity to make sense of it yet.

    The service worker approach has overhead of its own. Having to make two versions of the same page, instead of having the service worker retrieve only the relevant portion, makes no sense to me. Neither does maintaining a separate codebase only to serve the full page anyway should the worker fail.

    I'm also not seeing anything about cache invalidation. Suppose you change your shell partials: how does this system handle retrieving the new version? You could manually ask users to clear their cache... which makes the whole idea needlessly more complicated and invalidates any sort of convenience such an approach might have.
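
For what it's worth, the usual answer to this in service worker land is to version the cache name and clean up in the worker's `activate` event; no user action is required. A sketch, with the cache-naming convention assumed rather than taken from the article:

```javascript
// Bump this whenever the shell partials change; deploying the new sw.js
// triggers a fresh install and re-fetches the shell.
const CACHE_VERSION = "shell-v2";

// Pure helper: given all existing cache names, pick the stale shell caches.
function staleCaches(allNames, current) {
  return allNames.filter((name) => name.startsWith("shell-") && name !== current);
}

// In sw.js, the activate handler would run something like:
//   event.waitUntil(
//     caches.keys().then((names) =>
//       Promise.all(staleCaches(names, CACHE_VERSION).map((n) => caches.delete(n)))
//     )
//   );
// so old shell versions are deleted as soon as the new worker takes over.
```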

    But I like the general idea. It's something I would employ myself, should I come upon a better solution.

    2 votes
    1. Greg
      (edited)

      > I've been researching HTML templating and partials for a while now, given the obvious positive implications of their usage. Most approaches I've seen rely either on client-end rendering (meaning they have to also download a massive JS payload they won't need otherwise) or on build-process rendering (which may well be automated, though it feels like too big an overhead during development for me).
      >
      > Server-side rendering might work better than either of those, but I haven't had the opportunity to make sense of it yet.

      A lot of people here seem to really dislike React and similar libraries, and based on how they're sometimes used I can't say that's entirely unjustified, but with proper configuration they can be incredibly efficient. React itself is ~30KB gzipped, and Vue is a third smaller again. Even after adding all of your components and templates you're likely looking at less than 100KB gzipped, as a one-off cached download - in the context of the modern web that's next to nothing. [Edit to clarify] Perhaps more importantly, that small JS download is happening in the background after the initial page load; it doesn't delay rendering.

      There's a non-trivial amount of setup to be done, for sure, so I wouldn't necessarily recommend it in situations where a static site generator will be enough for the foreseeable future, but in the more general case a good setup looks something like this:

      • Server-side rendering is enabled for all pages, and the full state can be constructed based solely on the URL and any cookies. This gives a fast initial paint, but more importantly gives a fallback which can be used even if JS is disabled entirely. Near-zero extra development overhead, as the same React components are running on the server and the client.
      • The initial server-rendered pages also include the state object that they were built from inside a <script> tag. Once the JS bundle has loaded, this transparently initialises React on the client with the same data and allows it to take over rendering. This is a slight inefficiency as duplicated data is sent (rendered HTML plus the same state info as JSON).
      • The site then functions as a standard SPA - any user actions are handled client side, with only minimal JSON exchanges with the server where necessary. Page renders can often be made fast enough to be imperceptible at this point. If the user suddenly decides to refresh the page, or turn off JS, or whatever, the experience will be uninterrupted (although slightly slower) as the site falls back to server rendering.
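
The second step (embedding the state object in a `<script>` tag) can be sketched in a few lines. This is a generic illustration of the pattern, not React's or any framework's actual serialisation code; the function and element names are mine:

```javascript
// Embed server-side state as JSON inside the rendered HTML. Escaping "<"
// prevents a state string containing "</script>" from terminating the
// script tag early.
function renderWithState(bodyHtml, state) {
  const json = JSON.stringify(state).replace(/</g, "\\u003c");
  return (
    "<div id=\"root\">" + bodyHtml + "</div>" +
    "<script id=\"state\" type=\"application/json\">" + json + "</script>"
  );
}

// On the client, hydration would read the same object back:
//   const state = JSON.parse(document.getElementById("state").textContent);
// and initialise the client-side renderer with it, taking over from the
// server-rendered markup without a repaint.
```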

      The service worker technique here could actually be applied to this, as well - rather than requesting the whole server-rendered HTML on fresh page loads, the worker could request only the JSON state object and pass it straight to React for client-side rendering - it removes the inefficiency I mentioned in the second step. This is a double win as it's faster for the client and takes load off the server by reducing the number of requests for rendered HTML. It neatly avoids the issue you mentioned with duplicated development work, too.
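
The only new machinery the worker needs for this is a mapping from a navigation URL to the URL of its JSON state. A trivial sketch; the `format=json` convention is my assumption, not something from the article:

```javascript
// Map a page URL to a hypothetical JSON-state endpoint for the same page.
function stateUrlFor(pageUrl) {
  const url = new URL(pageUrl);
  url.searchParams.set("format", "json");
  return url.toString();
}

// In the fetch handler, navigations would be answered by fetching
// stateUrlFor(event.request.url) and handing the JSON to the client-side
// renderer, instead of requesting the full server-rendered HTML.
```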

      Cache invalidation in my case would be handled by invalidating the JS bundle containing the React components (i.e. the templates) - this wouldn't invalidate the service worker, but would cause it to fall back to server side rendering for that specific load, while the new JS was downloaded in the background.

      I might actually spend a bit of time looking into this. The HTML partial technique in the article isn't entirely convincing to me, but the concept of using the worker to intercept initial loads seems sound. I haven't seen anyone doing service worker accelerated React, and it could definitely be an efficiency gain.

      3 votes
  4. Greg

    I think there's a typo in the title - looks like the article is discussing service workers, which are client side.

    2 votes