A problem with this extended metaphor is that the Internet never was very robust, so there's nothing to go back to. It started as an experimental academic network, with no encryption and lots of security holes. The most widely used software that runs the modern Internet is probably more secure than it ever was, because it's been attacked frequently and has expert maintainers who keep improving it.
There's still plenty to do. There's a lot of obscure software that isn't reviewed very much. We need to be better at preventing supply-chain attacks.
Still, if you asked me whether to go with a mainstream browser or to try an obscure one, from people I've never heard of, that makes a lot of grand privacy claims, I'd stick with the mainstream browser. For encryption, I'd rather go with well-known algorithms that have been scrutinized and implemented by security experts and published as widely-used open source libraries.
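To make that concrete, here's a minimal sketch of what I mean, using the widely-reviewed Python `cryptography` package rather than anything hand-rolled (the key handling and message are placeholders, not a full design):

    # Fernet is the library's high-level recipe: AES-128-CBC plus
    # HMAC-SHA256, with key generation and token framing done for you.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()           # 32 random bytes, urlsafe-base64 encoded
    f = Fernet(key)

    token = f.encrypt(b"attack at dawn")  # authenticated, timestamped token
    assert f.decrypt(token) == b"attack at dawn"

The point is that every decision I'd be tempted to improvise (cipher mode, padding, MAC construction) has already been made and reviewed by experts.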
This doesn't seem like a "wild" approach at all? It's a design-oriented approach. There's still evolution of a sort, but it doesn't favor diversity for core infrastructure. Once a good-enough open standard takes over, it's very difficult to dislodge. There's no viable competitor to Unicode, and look at how the IPv6 upgrade is going.
Instead, we see plenty of diversity around the edges, in specialized software that relies on underlying infrastructure and takes it for granted.
I felt the essay at least partially addressed this point:
It’s important to share that ecological rewilding is a work in progress. What do you rewild to? Humans have shaped and cultivated landscapes for tens of thousands of years, so what does “wild” even mean? Just as there’s no ecosystem on Earth untouched by human actions, there’s no “true” wildness to return habitats to. And what scale is needed for rewilding to succeed? It’s one thing to reintroduce wolves to the 3,472 square miles of Yellowstone, quite another to cordon off about 20 square miles of a reclaimed polder near Amsterdam. Large and diverse Yellowstone is likely complex enough to adapt to change, but the small Dutch reserve known as Oostvaardersplassen has struggled.
In any case, I didn't feel that the point being made was that we should aim to continuously update protocols even when, say, there is a perfectly good one in place. Rather, the point I took away was that we should prioritize (in fact, enforce) interoperability in standards and infrastructure as a means of maintaining an environment where new approaches can be developed and niches carved out.
Thanks. It's a long article and I'll admit I skimmed it.
Getting into a situation where we can't upgrade key components certainly is a problem. One approach is to use randomization to prevent counterparties from relying on protocol details that should be allowed to change. Here's an example in Chrome.
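If the Chrome example is GREASE (RFC 8701), the idea is easy to sketch. This is an illustration of the mechanism, not Chrome's actual code:

    import random

    def grease_value() -> int:
        # RFC 8701 reserves the 16 TLS code points of the form 0xNANA
        # (0x0A0A, 0x1A1A, ..., 0xFAFA): values no peer will ever recognize.
        n = random.randrange(16)
        return (n << 12) | 0x0A00 | (n << 4) | 0x0A

    # Mixing a GREASE value in with the real extension codes flushes out any
    # peer that chokes on unknown values right away, before the ecosystem can
    # ossify around today's value set.
    real_extensions = [0x0000, 0x000A, 0x002B]  # server_name, supported_groups, supported_versions
    offered = [grease_value()] + real_extensions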
Maybe that could be seen as artificially injecting diversity?