I'd love to see an increase in browser and browser engine diversity. Hopefully this and other browser projects succeed. It seems like an incredibly daunting task, but they do seem to have an optimistic outlook.
In the meantime, I'm using Firefox almost purely because it's an underdog.
Gods bless them, building a new browser on a novel engine today seems basically impossible. Browsers are so complicated now that they're practically little virtual operating systems. Even companies like Microsoft have thrown in the towel on maintaining their own rendering engine.
I like this from the FAQ:
Q: Why bother? You can’t make a new browser engine without billions of dollars and hundreds of staff.
Sure you can. Don’t listen to armchair defeatists who never worked on a browser.
I think it's one of those things where writing a new browser is substantially easier if you're willing to just discard overly complicated stuff like legacy compatibility and JIT support. If you're not a megacorp, you don't need to worry as much about breaking things.
I mean…
This is true, in a sense, but requires that you bend the definition of "browser" significantly. I'd argue that you need a definition like "usably renders more than some n percentage of web pages, normalized by traffic volume", otherwise some pedantic ass like me will try to argue that curl is a "browser" as a reductio ad absurdum.
And for any n that passes the sniff test, you do need all that complicated stuff, or at least enough of it that yeah, you do (apparently) need multiple millions of dollars worth of engineers working on your project. (I'm guessing that the difference in investment between 50% and 90% is marginal; both require that you do everything, which costs too much. And of course, a "browser" which only works for 50% of pageviews is not a viable replacement for The Big One Rendering Engine or Gecko.) This is the great tragedy of the web; site authors use all these stupid features, frequently on pages where they provide little to no additional value and just make the page significantly more difficult to render.
So… what do we do with browsers running engines other than The Big One and Gecko? Regrettably, we evidently don't use them to displace use of Google products. It would be an interesting experiment to try to establish a minimalist web using only HTTP/HTML features that are widely and easily supported by fringe browser engines (a la the Gopher renaissance or Gemini), but unfortunately, being embedded in the broad, complicated web makes that challenging because it's easy to stumble outside the bounds of this minimalist subset (a Gemini document can contain HTTP links, but a Gemini browser is certainly not going to silently follow them) or for formerly-compatible sites to adopt incompatible features.
I don't have an answer for this other than to throw my hands in the air and say "the web's fucked". I hate that it is this way, and I truly despise Google for making it like this. (I mean, not just for that, obviously. But it's high on the list.) And maybe I'm wrong, and you can hit an adequate n with a small-ish community project! I certainly hope that's true, and I absolutely wish Ladybird all the luck in the world, even if I'm not going to bet on their success.
I'd argue that you need a definition like "usably renders more than some n percentage of web pages, normalized by traffic volume"
Honestly, what I think is most frustrating is that it's often fairly critical online services that are really bad at standards compliance. It would be one thing if it were mostly random, fly-by-night edge cases goofing around, but mostly it's banks, utility companies, airlines, and these other big, clumsy institutions that don't understand the web.
The random small businesses just use WordPress or Squarespace or something and it comes out fine. It's always when I'm trying to check the status of my tax refund or schedule a doctor's appointment that I'm confronted with bullshit about how the site doesn't work in Safari. If they can't be bothered to test on anything but vanilla Chrome, to the point where shit breaks in WebKit, what hope does anyone else have?
My question to both you and @dblohm7 is... what's the problem with a semi-compatible subset? People love TypeScript.
A lot of the web is complicated merely because we've turned it into a JavaScript-VM app-delivery and multimedia platform rather than a text-based one. Not to say that's a bad thing... it has a lot of benefits. But those things are not intrinsically necessary for (and are sometimes counterproductive to) a good web experience. How much browser complication is due to features most people don't actually want, like messing with the scrollbar or right-click menus? Or to things that aren't really websites but platform-agnostic applications, like Office or Discord? To that you might say, "Oh, but that's the important part of the web." But for a significant percentage of the internet... that's not as true anymore. Phone apps have replaced large swaths of those SPA needs. And how much complexity (and thus security risk) comes from having a virtual machine perpetually running random, arbitrary code from every website you visit (especially on default browser configs with no addons)?
If we hark back to the early days of the web, when there was plugin programming-language hell... the vast majority of use cases outside multimedia and game delivery were for UI, a lot of which has since been integrated into raw CSS and HTML.
If I go through my most-used sites in Lynx (which is about as broken-web as you can get), nothing "works," but little is straight-up unusable.
Usable:
Wikipedia
XKCD, which lets me load the image in a native image viewer.
Tildes. Some interesting double-posting of text and some HTML errors, but usable enough on its own.
The Guardian. Actually quite nice for a news-reading experience.
Medium. Lot of broken stuff, but a lot of working stuff too.
Stackoverflow
Unusable:
Youtube
My banks (mostly due to janky login buttons)
I'd wager (as a non-dev) if you can get ACID compliance, display images, and deploy just enough JavaScript to let most form submissions work... you've got a huge percentage of the not-an-application web working again, even if not perfectly.
Even then, incremental improvement is a thing. It might take time, but a fresh, ground-up browser could be a massive game-changer, especially if there's a good regression-testing suite built in from the early phases.
My question to the both you and @dblohm7 is.... and what's the problem with a semi-compatible subset?
I would say nothing; at the same time, though, I think all the stakeholders need to agree on and understand that. If their users hear "the Web" and then find, for example, that WebRTC doesn't work, they're going to be unsatisfied. Maybe this is just my developer side talking, especially now that I spend so much time working with networking protocols, but naming is important so that everybody involved understands what their expectations should be.
I'd wager (as a non-dev) if you can get ACID compliance
I think their focus on Acid3 is misguided. It was important for its time, but it's a really old test. Since its day, some parts of the specs that produced Acid3 were revised such that the major engines are no longer perfectly compliant with Acid3, even though they're compliant with the actual specs. Furthermore, Acid3 does not capture changes that have been made to layout specs since 2008; flexbox and CSS grid, in particular, come to mind. Both of those are really important to support, IMHO.
I would suggest that in 2022, the better way to ensure compliance is to use the Web Platform Tests (WPT). I know that WPTs are not as exciting as being able to view a test result on a single page, but the WPT suite is continuously updated along with the specs, and it has been a huge boon with respect to improving compatibility between browser engines.
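For what it's worth, a page can check for those two layout features at runtime with `CSS.supports`; a minimal sketch (the `typeof` guard is only there so the snippet is also harmless outside a browser, where the `CSS` global doesn't exist):

```javascript
// Runtime check for the two layout features mentioned above.
// Guarded so it runs harmlessly outside a browser (e.g. in Node),
// where there is no CSS global.
const supportsFlexbox =
  typeof CSS !== "undefined" && CSS.supports("display", "flex");
const supportsGrid =
  typeof CSS !== "undefined" && CSS.supports("display", "grid");

console.log({ supportsFlexbox, supportsGrid });
```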
I did not realize! That said, there is immense value in being able to toss a page at a user to say "yes your browser does support XYZ".
Yeah, it would be good if somebody wrote a front-end to aggregate WPT results and present them in a simple, understandable way for end users.
Mozilla runs WPT on every push, and their failures are reported like most test failures: just a bunch of textual error messages dumped into a log -- not exactly exciting presentation!
Just so you know, there already is a browser that's sort of close to what you're talking about: NetSurf. It's actually one of my favorite open-source projects because it's so tiny, it runs everywhere (it started as a RISC OS project, and it's still maintained there), and it renders webpages beautifully. It just doesn't do dynamic pages.
It also doesn't do flexbox :-(
One thing that would be really nice is if more websites gracefully fell back on simple HTML/CSS in case of browser incompatibility. Force site builders to make the minimum viable version as simple as possible first, so that every browser has a chance, then put all the other complex layers on top so that the Chrome-verse can have their fancy web apps where it’s supported.
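A sketch of that layering, with a hypothetical form (nothing here is from a real site): the plain `<form>` does a normal HTML POST in every browser, and the script, where it happens to run, upgrades submission in place:

```javascript
// Progressive enhancement: the underlying <form> works with a plain
// HTML POST everywhere; this script, if it loads and fetch exists,
// upgrades submission to an async request. If it never runs, the
// page still works.
function enhanceForm(form) {
  if (typeof fetch !== "function") return false; // old browser: keep the plain POST
  form.addEventListener("submit", (event) => {
    event.preventDefault();
    fetch(form.action, { method: "POST", body: new FormData(form) });
  });
  return true;
}

// Only touch the DOM when one exists (i.e. when running in a browser).
if (typeof document !== "undefined") {
  const pageForm = document.querySelector("form");
  if (pageForm) enhanceForm(pageForm);
}
```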
If only user agents could advertise their capabilities somehow so that minimum standards could be met (rather than every browser that isn’t one of the top 3 advertising a user agent string from one of those top 3).
If only user agents could advertise their capabilities somehow
Generally speaking, browsers already do this; they just don't necessarily list their capabilities "up front," which is perhaps what you're thinking of. As browser developers, we don't really want sites using the user agent string for anything, but that cat was let out of the bag 25 years ago and it isn't going back in, it seems...
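To illustrate why sniffing the string tells you so little (the version numbers below are arbitrary, but the token soup is real): a stock Chrome user agent string claims to be several browsers at once.

```javascript
// A representative Chrome UA string. For historical compatibility
// reasons it name-drops Mozilla, AppleWebKit, Gecko, Chrome, and
// Safari, so a naive sniffer can "detect" any of the big engines here.
const chromeUA =
  "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36";

const matchedTokens = ["Mozilla", "AppleWebKit", "Gecko", "Chrome", "Safari"]
  .filter((token) => chromeUA.includes(token));

console.log(`${matchedTokens.length} engine tokens match in one UA string`);
```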
Agreed. As somebody who used to work on browsers, they’re clearly redefining “the web” to mean what they want it to mean, and then using that as their target.