Damn. This looks incredible.
I've been loosely following P2P Web tech for a while: Beaker, IPFS... I like the idea a lot.
That said: how hard is this sort of P2P on the computer hosting it? Would I need a gaming-level rig to be able to run it reliably for an unknown number of users?
Also, is this something I could reliably do in parallel with mainstream server-hosted Web? I don't think I can switch completely to P2P unless I'm ready to eschew mainstream Web users entirely, but supporting a new democratic technology is something I could do.
These are good questions! I will try to answer to the best of my ability.
As far as I know, networking is usually the bottleneck, which means most of your computer's specs don't really matter except for bandwidth. You definitely don't need an expensive graphics card or anything (in fact, you don't need a graphics card at all). Also, each peer rehosts the site that it's looking at (and any they voluntarily choose to help), which means that as your site gets more popular, there is more collective bandwidth serving it. Think about the difference between a torrent with one dedicated peer versus a popular one with 150. You'll see a dramatic difference in download speeds without putting too much pressure on your upload.
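The torrent analogy can be put in back-of-envelope numbers (the per-peer upload figure below is an assumption for illustration, not a measured value):

```python
# Aggregate serving capacity grows roughly linearly with the swarm: every
# peer that rehosts the site adds its own upload bandwidth to the pool.
UPLOAD_MBPS = 5  # assumed upload speed of a typical home connection

for peers in (1, 10, 150):
    total = peers * UPLOAD_MBPS
    print(f"{peers:>3} peers -> {total} Mbps of collective bandwidth")
```

So the popular site ends up with more serving capacity, not less, while each individual peer contributes only a small slice of upload.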
I think they are still working out how best to control how much of your resources are dedicated at a time, but the hardware requirements should be very reasonable.
Yes! Dat/Hypercore just deals with straight-up files. Like Git, it leaves all of the files as-is and stores all of its state in a dot file. You can definitely run a Hypercore daemon and an HTTP server pointed at the same files. There are even tools explicitly aimed at this dual-hosting setup[1]!
[1]: I'm not sure if this project supports Hyperdrive v10 yet (v10 was released yesterday, concurrently with the beta Beaker release), but it's managed by the same people who make Beaker: https://github.com/beakerbrowser/homebase
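A minimal sketch of the dual-hosting idea, using only the Python standard library. The HTTP half is real; the Hypercore half is left as a comment because its exact CLI/API varies by release:

```python
# One directory, two transports: a plain HTTP server for mainstream visitors,
# while a P2P daemon could seed the very same files in parallel.
import tempfile
import threading
import urllib.request
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler
from pathlib import Path

site = Path(tempfile.mkdtemp())
(site / "index.html").write_text("<h1>hello</h1>")

# 1) Plain HTTP for mainstream web users:
handler = partial(SimpleHTTPRequestHandler, directory=str(site))
httpd = HTTPServer(("127.0.0.1", 0), handler)  # port 0 = pick a free port
threading.Thread(target=httpd.serve_forever, daemon=True).start()

# 2) In parallel, a Hypercore daemon would seed the SAME directory over the
#    P2P network -- like Git, it keeps its state in a dot file and leaves
#    the files as-is, so the two servers never conflict.

port = httpd.server_address[1]
print(urllib.request.urlopen(f"http://127.0.0.1:{port}/index.html").read().decode())
# -> <h1>hello</h1>
```

Because neither side rewrites the files, you can add or drop either transport at any time without touching the other.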
How is the "cache invalidation" equivalent of P2P hosting resolved, in this case? How does the browser/protocol know which version to trust to be real and the most up-to-date?
So, in this case, a feed is append-only[1] and should never fork. Since feeds are validated by private keys, you can always be sure that if someone offers you a newer version, it is in fact a newer version. Since the peers form a swarm around the data, they can very quickly communicate the newest changes to each other.
The only ways, I believe, for you to not get the most up-to-date data are (1) that you have no peers or are offline, which should be obvious, or (2) that all of your peers coordinate against you, which should be very difficult for any site with more than a few peers.
[1]: This site is an older explanation and doesn't represent the current state of the Dat ecosystem, but I believe the relevant principles still apply.
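Here's a toy model of the append-only rule in Python. Everything in it is invented for illustration: real Hypercore signs a Merkle tree with Ed25519 keys, whereas this sketch substitutes a keyed HMAC just to stay stdlib-only. The core idea survives, though: a "newer version" is only accepted if it is a signed, strict extension of what you already have.

```python
import hashlib
import hmac

KEY = b"author-private-key"  # stand-in for the author's real signing key


def sign_feed(entries):
    """Author side: sign the feed length plus a rolling hash of all entries."""
    h = hashlib.sha256()
    for e in entries:
        h.update(hashlib.sha256(e).digest())
    return hmac.new(KEY, str(len(entries)).encode() + h.digest(), "sha256").digest()


def is_valid_update(old_entries, new_entries, new_sig):
    """Reader side: accept new_entries only as an append-only extension."""
    if len(new_entries) < len(old_entries):
        return False  # shorter than what we have = not newer
    if new_entries[: len(old_entries)] != old_entries:
        return False  # rewrote history = a fork, reject it
    return hmac.compare_digest(new_sig, sign_feed(new_entries))


old = [b"post 1", b"post 2"]
good = old + [b"post 3"]                      # honest append
fork = [b"post 1", b"EDITED", b"post 3"]      # history rewritten

print(is_valid_update(old, good, sign_feed(good)))  # True
print(is_valid_update(old, fork, sign_feed(fork)))  # False
```

So even a malicious peer can only withhold updates, not forge them: anything that doesn't extend your copy or doesn't verify is simply dropped.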
The Beaker Browser uses the ~~Dat protocol~~ Hypercore protocol in addition to normal HTTP to share files. Hypercore is a peer-to-peer protocol, similar to BitTorrent, but much better suited to sharing mutable data like websites.

Not needing to run a server nor depend on a tech giant to host a personal (or even popular) website is a powerful feeling, and will hopefully help reduce the strict dichotomy between producer/consumer that we see on the modern web.
Beaker Browser is the flagship project for the Hypercore protocol. While they have been in a private beta for a while now, they just announced their public beta today. They have put a lot of effort into both the underlying tech and the UX, leading to a really cool release! I really recommend downloading it and poking around; it's hard to explain its appeal in words alone.
EDIT: Dat -> Hypercore. They used to use Dat. It is unclear to me if Dat now uses Hypercore or if Dat has been superseded by Hypercore. Either way, I figure this is more accurate.
I think this is mostly a branding update. There have always been two different Dats: the Dat project/foundation and Dat the protocol. Dat the protocol used various hyper*-prefixed libraries under the hood, and it seems like with this update, hypercore was promoted to be the main user-facing library. Maybe they thought that hypercore and hyperdrives are better names than dat and dat archives.
This covers a lot about the browser: https://www.youtube.com/watch?v=PDpTkbBxyz0
One concern: how will a peer handle websites that are pretty big (several hundred megs or even a gig) ?
They talk about that some in the announcement for Hyperdrive v10:

Importantly, drives support efficient random-access file reads, meaning that you can seek through a video and it will download only the portions of the video you're viewing, on-demand. We call this property "sparse downloading", and it's great for things like large websites (think all of Wikipedia mirrored to a drive) where readers only view single pages at a time.
But I think this thread is even more impressive: they show it efficiently handling a site that is 255 GB!
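A toy model of how a peer can cope with huge sites: fetch only the fixed-size blocks a reader actually touches. The block size here is an assumption; real hyperdrives use a Merkle-tree-backed block structure rather than this flat scheme.

```python
# Sparse fetching, sketched: given the byte range a reader requests,
# compute which blocks must be downloaded -- everything else stays remote.
BLOCK = 64 * 1024  # 64 KiB, an assumed block size


def blocks_needed(offset, length):
    """Indices of the blocks covering bytes [offset, offset + length)."""
    first = offset // BLOCK
    last = (offset + length - 1) // BLOCK
    return list(range(first, last + 1))


# Seeking 1 MiB into a file and reading 200 KiB touches only 4 blocks,
# no matter how large the whole file (or site) is:
print(blocks_needed(1 * 1024 * 1024, 200 * 1024))  # [16, 17, 18, 19]
```

The cost of a visit scales with what you read, not with the total size of the drive, which is why a multi-gigabyte site stays cheap for casual readers.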
Is this finally a platform independent, simple to use way to transfer files peer to peer without a thumb drive? The last big challenge finally solved?
Yeah, this software would definitely fit!
Other solutions I know of, with varying trade-offs:
Don't we have enough Blink based browsers?
You should read @9000's comment or the blog post. It's not just another browser. It uses peer-to-peer to host websites on your local machine. I don't like Blink either but this is actually really cool.