17 votes

Remote code execution vulnerability in the cdnjs JavaScript CDN run by Cloudflare, which could have enabled tampering with over 10% of all websites

10 comments

  1. [10]
    suspended
    For the technically uninitiated, is there anything that we could do to protect our websites from something like this?

    4 votes
    1. [9]
      Greg
      Subresource integrity was pretty much designed for this - you can add a hash to the script or link tag to ensure the file you receive is the one you expected.

      Alternatively, you can stick to hosting your own dependencies, at which point they're as secure as the rest of your site. You can even still pipe them through your choice of CDN if you like, although there is a very slight loss of efficiency: CDN edges and users' browsers need to cache your copy of awesomeLibrary.js rather than reusing the copy they've already got from someone else's site, if both are pulling it from cdnjs.

      Come to think of it, browsers should probably be able to cache based on matching subresource hash, regardless of URL. Assuming collisions are sufficiently unlikely, at least. If it weren't already past midnight here I'd do some digging on that...
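      As a minimal sketch of what that looks like (the library name, URL, and digest below are placeholders, not real values):

      ```html
      <!-- The browser refuses to execute the file if its SHA-384 digest
           doesn't match the one in the integrity attribute -->
      <script
        src="https://cdnjs.cloudflare.com/ajax/libs/awesomeLibrary/1.0.0/awesomeLibrary.min.js"
        integrity="sha384-BASE64_DIGEST_OF_THE_EXPECTED_FILE"
        crossorigin="anonymous"></script>
      ```

      The digest itself can be generated with something like `openssl dgst -sha384 -binary awesomeLibrary.min.js | openssl base64 -A`.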

      7 votes
      1. [6]
        skybrian
        Cross-site caching of JavaScript, fonts, and other resources is now disabled to prevent privacy leaks, so there's no longer any performance benefit from this. You might as well host them on your own domain.

        Or at least it's disabled in Safari and Chrome. I'd be surprised if Firefox didn't do the same, but I couldn't find a page confirming it.

        It's still a good idea to put your entire site behind a CDN such as Cloudflare, though, to guard against denial-of-service attacks.

        (And although this is a serious bug, I'd trust Cloudflare to fix bugs faster than most web hosting services.)

        9 votes
        1. [4]
          Wes
          (edited)
          Cross-site caching of JavaScript, fonts, and other resources is now disabled to prevent privacy leaks, so there's no longer any performance benefit from this.

          I wish it were a setting. I'd gladly take the performance boost of cross-site caching over the rather niche concern of privacy leaks. Especially if it were turned on for common CDNs, or even just the most commonly linked versions of files. I can only imagine how often the latest release of each major jQuery version is requested, or the top ten webfonts on Google Fonts.

          It feels like we're just throwing bits away.

          edit: I'd suggest installing the top Google Fonts locally to save network requests, but the generated CSS doesn't even prioritize a local lookup. It would take a browser extension to intercept the request to even make that work.
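          For what it's worth, a self-hosted @font-face rule can put that local lookup first, which is exactly what the generated Google Fonts CSS omits; the family name and file path here are illustrative:

          ```html
          <style>
            @font-face {
              font-family: "Roboto";
              font-style: normal;
              font-weight: 400;
              /* Check for an installed copy before downloading our hosted file */
              src: local("Roboto"), local("Roboto-Regular"),
                   url("/fonts/roboto-regular.woff2") format("woff2");
            }
          </style>
          ```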

          3 votes
          1. [3]
            skybrian
            It’s questionable how much bandwidth it would save when the maintainers release new versions of resources often. The result is that even when a resource is popular, most websites use a different version of it and no bandwidth is saved. (This also makes privacy leaks easier.) But maybe that’s more true of JavaScript than fonts?

            4 votes
            1. [2]
              Wes
              I think if we looked at heatmaps for the most common distributions of JS libraries, there'd be a handful that have significantly more downloads than others. My example above was the latest jQuery version of each branch (1.x, 2.x, 3.x), but another example is the version automatically bundled with WordPress sites (a modified 2.x if I'm not mistaken).

              I expect there are similarly often-bundled versions of jQueryUI, Modernizr, moment.js, and others. Bootstrap would be another good candidate (though more on the CSS side).

              I don't know how practical it would be for browsers to play favourites and have a special cache for these assets, but I've long felt there was room for optimization here.

              It could be argued that this is still leaky, but if assets were only eligible after being vetted as appearing on more than 10,000 sites, as an example, a cache hit really wouldn't tell you anything about where that user has been.

              Of course maybe that traffic is inconsequential in the grand scheme of things and this is all needless complexity. I'd be curious to know, honestly.

              2 votes
              1. skybrian
                There might be a way to do it but it’s tricky. Usage won’t get to 10,000 sites unless you make the resource available on the CDN when it has zero sites. Someone has to be the first to link to a new version. Furthermore you can’t take it down once you’ve published it or websites will break, and usage of any version will drop below 10,000 sites at some point.

                Each new version of a popular library will start out used by nobody. Usage will gradually increase as many sites start using it, then trail off after a newer version is released. If new versions are released frequently enough then the peak for each version won’t go above 10,000 sites, even though the library is popular.

                2 votes
        2. Greg
          Interesting, thanks - I wasn't aware of that change. I'd already come down on the self-hosting + CDN side on sites I control, even at the cost of a little extra bandwidth overhead, so it's good to know that even the bandwidth trade-off is no longer in play.

          3 votes
      2. [2]
        Bauke
        The Decentraleyes web extension does this, sort of: it blocks connections to CDNs and serves local copies directly.

        7 votes
        1. Greg
          Oh that's cool - sounds kind of like what @Wes was hoping for, too, although I guess it doesn't really make a dent in whole-network efficiency unless it's preinstalled.

          1 vote