I haven't finished reading the proposal yet, so this is just my understanding of it so far and I might be wrong about something.
This would create what amounts to a DRM scheme that would allow websites to block you from accessing them unless you are using an approved browser that has not been modified (by things such as ad blockers).
From the diagram in the proposal, it appears to work like this:
1. A user visits a website.
2. The website requests "environment attestation" of the browser/device being used.
3. The browser contacts a third party (such as Google, Mozilla, or Apple) to "attest" to the integrity of its environment, sending the details of the browser/device, and receives an "attestation" of the environment's integrity.
4. The browser passes that attestation along to the website.
5. The website checks that the attestation is valid and from an approved attester, and decides whether to allow the user access.
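As a toy sketch of that flow (all names, types, and the signature scheme here are my own stand-ins, not the proposal's actual API – if I recall correctly, the explainer sketches something like a `navigator.getEnvironmentIntegrity()` call, but nothing below is taken from it; a real attester would use public-key signatures, not a shared HMAC secret):

```typescript
import { createHmac } from "node:crypto";

// Hypothetical stand-in for the attester's private key.
const ATTESTER_SECRET = "attester-signing-key";

interface Attestation {
  verdict: { browser: string; unmodified: boolean };
  signature: string;
}

// Step 3: the attester (e.g. Google/Apple) signs a verdict about the client.
function attest(browser: string, unmodified: boolean): Attestation {
  const verdict = { browser, unmodified };
  const signature = createHmac("sha256", ATTESTER_SECRET)
    .update(JSON.stringify(verdict))
    .digest("hex");
  return { verdict, signature };
}

// Step 5: the website verifies the signature, then applies its own policy.
function websiteAllows(a: Attestation): boolean {
  const expected = createHmac("sha256", ATTESTER_SECRET)
    .update(JSON.stringify(a.verdict))
    .digest("hex");
  if (expected !== a.signature) return false; // forged or tampered verdict
  return a.verdict.unmodified; // policy: reject modified browsers
}

// A stock browser passes; one the attester flags as modified is rejected.
const clean = attest("Chrome 115", true);
const modified = attest("Chrome 115", false);
console.log(websiteAllows(clean));    // → true
console.log(websiteAllows(modified)); // → false
```

The point the thread is making falls out of the last line: once the website only has to trust the attester's signature, "modified by an ad blocker" becomes something it can refuse outright.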
They list many reasons for this relating to security and protecting users, but it seems clear to me that the main goal is to prevent ad blocking.
With how outrageous this proposal is, it seems like it will probably not succeed, but the fact that they even felt confident enough to publish it is worrying.
It may fail now, but I doubt that’ll stop them from trying to get buy-in from other browsers/large websites for more attempts in the future until some version of it succeeds.
EDIT: I just noticed this on their current specification:
6. Security and privacy considerations
6.1. Security considerations
6.1.1. Secure context only
Web environment integrity MUST only be enabled in a secure context. This is to ensure that the website is not spoofed.
Todo
6.2. Privacy considerations
Todo
I get that it's still a work in progress, but it's funny that months after the "explainer" document was written, the part of the specification most related to their claimed reasons for doing this is just "todo".
My knee-jerk reaction is that I don’t want Google or any other huge company proposing any standards, because it’s almost certainly an embrace/extend/extinguish plan to take over more of the internet. See also AMP links: https://en.wikipedia.org/wiki/Accelerated_Mobile_Pages
They don't need to propose standards to create standards. The vast majority of development languages and frameworks have or had corporate backing. It's not necessarily a bad thing for corporations to assist open source projects.
The trouble is when the standards aren't open or reproducible. We're getting very close to Chromium becoming the modern Internet Explorer, where Google can just decide to create systems that websites must follow to stay relevant (like you mentioned with AMP).
We're getting very close to Chromium becoming the modern Internet Explorer, where Google can just decide to create systems that websites must follow to stay relevant (like you mentioned with AMP).
Which is why I try to use Firefox whenever possible, or even WebKit (Safari) based browsers where available.
But still, I have already come across a couple of websites that wouldn’t allow me to view their actual content outside of a Chromium environment – typically while using Safari or one of its derivatives. Of course, with the recommendation to download Google Chrome or MS Edge included.
Every single site doing this is one too many…
We're getting very close to Chromium becoming the modern Internet Explorer, where Google can just decide to create systems that websites must follow to stay relevant (like you mentioned with AMP).
I heard a disturbing-yet-valid argument for making your website accessible, a while back. It goes:
The percentage of web users on Firefox is about 2%. The percentage of blind people is slightly above that. Why are people testing their website on Firefox but not testing it with a screen reader?
Maybe, but they don’t seem to be acting on it yet.
Chrome has done things to discourage browser-specific websites for a long time. For example, they use origin trials to try out experimental features, to make sure they don’t become accidental standards. Websites have to register to use them and they automatically expire.
(Also, AMP works in all major browsers.)
That very article you link to doesn't talk about standardizing things at all, just about testing them before broad release. Google themselves have had no problem distributing browser-specific websites. Gmail mobile, for instance, was broken for years because they only used prefixed CSS values and didn't specify the non-prefixed ones. AFAIK, Google Docs to this day still has an offline version based on WebSQL rather than IndexedDB.
Giving power over the web to a single ad company is an insane concept to me, but people and web devs seem dead set on doing it.
It’s true that Google often tests experimental web APIs on its own websites. This is what origin trials are for, and they use them too.
I don’t use the Gmail or Google Docs websites on mobile and wouldn’t trust them for offline access. But I suspect they’re closer to “working” than “not working” for most people in most browsers? Is the trend towards or away from sticking to web standards?
This is literally their first use case scenario. It's pretty unambiguously about ads, with everything else being an added bonus. The word "integrity" to me was a red flag. It's a weasel word; it...
Users like visiting websites that are expensive to create and maintain, but they often want or need to do it without paying directly. These websites fund themselves with ads, but the advertisers can only afford to pay for humans to see the ads, rather than robots. This creates a need for human users to prove to websites that they're human, sometimes through tasks like challenges or logins.
This is literally their first use case scenario. It's pretty unambiguously about ads, with everything else being an added bonus.
The word "integrity" to me was a red flag. It's a weasel word; it could have a lot of different meanings. It talks about a third party for attestation but the fact that this is a Google proposal is practically guaranteeing that Google is going to be effectively the only attestation service anyone is going to use.
I wouldn't disregard the possibility of this becoming a reality, because I foresee websites that will lock people out if they fail the integrity check.
This is why we have to stop relying on Google so much and de-Google our lives, starting with, maybe most importantly, Google Chrome and Chromium. We need a free internet, which heavily relies on a free browser such as Firefox – pretty much anything not Chromium-based, so Google can't dictate what web technology is used by the masses.
Google Maps is still the one Google product I kinda rely on, though last week I unfortunately had to use Meet for work for the first time.
Apple Maps exists, but there are no directions where I live, so it's functionally useless.
Google Maps is a great product indeed, but I never relied that heavily on it. There are several Android apps based on OpenStreetMap that do a good enough job, at least for my use case. They are not as good as Google Maps, but using a worse product is unfortunately the price we have to pay now, or we will pay a much higher price later.
This will probably be what pushes me to become a luddite, or at least use the web far less. Over the years I have developed a great disdain for ads, and they have grown so numerous on the web that the only way I can tolerate watching videos or going through social media is with an ad blocker. The moment I can no longer do that is the moment I cut as much of the web out of my life as possible.
My understanding is that it wouldn't directly block DNS-level ad blocking, but because it prevents ad-block extensions, I think anti-ad-blocker libraries (something that many ad-block extensions stop) would be able to detect that the ads failed to load.
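To illustrate that point with a toy model (everything here is hypothetical; real anti-ad-block scripts use tricks like bait elements, but the logic reduces to this): once attestation rules out a tampered browser, a failed ad request becomes strong evidence of network-level blocking, such as a DNS blocker.

```typescript
// Outcome of each ad resource the page tried to fetch. With a DNS-level
// blocker, requests to ad domains fail even though the browser itself is
// unmodified and would pass attestation.
type LoadResult = "loaded" | "failed";

function adsWereBlocked(results: Record<string, LoadResult>): boolean {
  // If any known ad resource failed to load, assume a blocker is active.
  return Object.values(results).some((r) => r === "failed");
}

console.log(adsWereBlocked({ "ads.example.com/banner.js": "failed" })); // → true
console.log(adsWereBlocked({ "ads.example.com/banner.js": "loaded" })); // → false
```

Today, extensions can neuter the script doing this check; under the proposal, the script itself would be part of the attested, unmodifiable environment.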
Mozilla is against it. Although, they do leave the door open slightly for some future proposal:
Detecting fraud and invalid traffic is a challenging problem that we're interested in helping address. However this proposal does not explain how it will make practical progress on the listed use cases, and there are clear downsides to adopting it.
I don't have enough context to evaluate this (though it seems pretty early-stage), but I found this bit interesting:
To protect against both risks, we are evaluating whether attestation signals must sometimes be held back for a meaningful number of requests over a significant amount of time (in other words, on a small percentage of (client, site) pairs, platforms would simulate clients that do not support this capability). Such a holdback would encourage web developers to use these signals for aggregate analysis and opportunistic reduction of friction, as opposed to a quasi-allowlist: A holdback would effectively prevent the attestation from being used for gating feature access in real time, because otherwise the website risks users in the holdback population being rejected.
Although a holdback would prevent the attestation signal from being used for per-request enforcement decisions, there remains immense value for measurement in aggregate populations.
However, a holdback also has significant drawbacks. In our use cases and capabilities survey, we have identified a number of critical use cases for deterministic platform integrity attestation. These use cases currently rely on client fingerprinting. A deterministic but limited-entropy attestation would obviate the need for invasive fingerprinting here, and has the potential to usher in more privacy-positive practices in the long-term.
We ask for feedback from the community group on the idea of a holdback, and are very interested in alternative suggestions that would allow both goals to be met.
If these attestations were randomly disabled then I think it might go a long way towards making this proposal acceptable? Sure, it nerfs it, but it's probably good enough for advertisers to know that sometimes their ads aren't blocked, for example.
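Here's my reading of the holdback idea as a sketch (the 5% rate, the names, and the hash construction are all made up for illustration, not taken from the proposal). The key property is that the holdback is deterministic per (client, site) pair: if it were re-rolled on every request, a site could simply retry until the signal appeared.

```typescript
import { createHash } from "node:crypto";

// Illustrative rate only; the proposal doesn't commit to a number.
const HOLDBACK_RATE = 0.05;

// For a fixed fraction of (client, site) pairs, the platform pretends
// the attestation capability doesn't exist at all.
function inHoldback(clientId: string, site: string): boolean {
  const digest = createHash("sha256").update(`${clientId}|${site}`).digest();
  // Map the first 4 bytes of the hash to [0, 1) and compare to the rate.
  const x = digest.readUInt32BE(0) / 2 ** 32;
  return x < HOLDBACK_RATE;
}

// A site that hard-gates on attestation would lock out ~5% of honest
// clients, which is the pressure meant to keep fallback paths alive.
let held = 0;
for (let i = 0; i < 10_000; i++) {
  if (inHoldback(`client-${i}`, "news.example")) held++;
}
console.log(held / 10_000); // close to 0.05
```

The aggregate signal survives (advertisers can still measure roughly what fraction of traffic attests), but per-request gating becomes too costly.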
Here's a message on the mailing list that also advocates for holdback groups and gives some context around previous things Chromium has done.
After reading that, I don't know if they even know what problem they're trying to solve. Three of the four use cases in the intro would be obviously broken under that scheme. The only one that wouldn't is the first, about ads, because it's the only one that works at all on probabilistic data.
The immediate conspiracy-theory answer is "they only care about that one; the others were thrown in as distractions", but I'm reluctant to believe that because it's not even a good way to solve any of these problems, even scoped to ads. It doesn't increase privacy (ad targeting is a thing; at most you've added another bit of signal to fingerprint on). And I don't think it can be implemented correctly on anything without SGX-like hardware functionality ("hi, attester! I am totally a normal Chrome instance with no extensions, mind signing this for me?").
I sometimes read something I so thoroughly do not understand that I start wondering "is the author of this significantly smarter than me, or significantly dumber?". This is one of those times.
I believe this is about fraud. Fraud and anti-fraud are big businesses. Companies have fraud detection engines to figure out how risky a transaction is, and if it seems risky they will ask for more verification.
A simple example of this is CAPTCHAs. There are services to do bot detection, like Google's reCAPTCHA service. They use a wide variety of signals, none of which is entirely accurate, and combine them to compute a score. Credit card transaction risk scores are another example.
There's a fundamental tradeoff with privacy. If the reCAPTCHA service knows who you are (for example, you are a logged in Google user) then the risk calculation is easy and it can return a lower risk score than otherwise. Browser fingerprinting is also used to detect configurations that are suspicious. Using the reCAPTCHA service is sort of a way for a website to benefit from browser fingerprinting without doing it themselves or learning anything any more private than "probably not a bot."
Unfortunately, people who use browser configurations that avoid leaking any privacy data tend to see captchas a lot more. If they have no idea who you are then they're going to be suspicious.
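A toy version of that kind of risk engine makes the tradeoff concrete (the signals and weights below are entirely made up; real systems use hundreds of signals and learned weights, not a hand-tuned sum):

```typescript
// Several weak signals, none decisive on its own, combined into one score.
interface Signals {
  loggedIn: boolean;         // a known identity lowers risk a lot
  knownFingerprint: boolean; // a browser config seen many times before
  datacenterIp: boolean;     // traffic from a cloud provider raises risk
}

// Returns a risk score in [0, 1]; higher means more likely a bot.
function riskScore(s: Signals): number {
  let score = 0.5; // neutral prior
  if (s.loggedIn) score -= 0.3;
  if (s.knownFingerprint) score -= 0.1;
  if (s.datacenterIp) score += 0.3;
  return Math.min(1, Math.max(0, score));
}

// The privacy-conscious user (not logged in, unusual config) scores worse
// than the logged-in Google user, which is why they see more captchas.
console.log(riskScore({ loggedIn: true, knownFingerprint: true, datacenterIp: false }));
console.log(riskScore({ loggedIn: false, knownFingerprint: false, datacenterIp: false }));
```

Attestation, in this framing, is just one more signal for the sum, but an unusually strong one.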
One way to commit fraud is to run client software in the cloud, get the user to enter their information into a phishing website, and then have a bot enter the same information into the real website. To detect this, it would be really useful to have a signal telling you that the user is actually running a browser on their phone and it's not all fake. Even if the signal isn't always there, they can let the more verifiable users go through and concentrate on verifying the others.
These systems always need to have fallbacks, which is why it doesn't have to work 100% of the time. Companies often do want to serve people who care about privacy, since it's still another sale. Also, companies do tolerate a certain amount of fraud, as long as it's low enough.
Deliberately making this signal not work 100% of the time is a way to make sure the fallbacks are still built and still work. This can be seen as a way of making sure that the open web is still a thing, despite having better anti-fraud measures. You can judge for yourself how sincere they are about that, but I believe that at least some of the engineers working on these things are sincere, and they know that other browser vendors aren't going to go along with a new standard if they don't do something along those lines. This proposal is the start of a negotiation between the browser vendors, and they know they'll need to make design tradeoffs to satisfy the other vendors.
Anti-fraud measures are controversial because they're fairly closely related to other things people want to do all the time. You could even think of ad blocking as a very small-scale sort of fraud (using a website without paying for it), though they'd never prosecute for it. Cheating in video games is sort of like fraud since there's a similar cat-and-mouse game.
The kinds of fraud they're really worried about tend to be larger scale. Fraudsters use bots to click on ads to increase ads revenue, and Google will likely be more concerned about that than ad blocking.
Dude... They claimed in their proposal that they "looked forward to feedback". They got a lot of feedback on Github. And immediately locked all issues "over the weekend" as a response. And still...
You can judge for yourself how sincere they are about that, but I believe that at least some of the engineers working on these things are sincere
Dude... They claimed in their proposal that they "looked forward to feedback". They got a lot of feedback on GitHub, and immediately locked all issues "over the weekend" as a response. And they still haven't unlocked them, well past the weekend.
These are straight-up evil people, acting in bad faith and fucking over society to get theirs.
Who do you suppose they want to hear from? Other browser vendors, I’d imagine? People who might actually use these APIs? Probably not random people who have only a vague idea what this new API is for, but are sure they don’t like it.
When a Github repo starts attracting that kind of attention, they can either lock it down or it will become useless. It’s like any other community that lacks good moderation.
Who do you suppose they want to hear from? Other browser vendors, I’d imagine? People who might actually use these APIs?
Fully agreed - Corporate interests are the only stakeholder that Google cares about here, and they're willing to steamroll over smaller web developers and technical users to push their proposal forward.
Probably not random people who have only a vague idea what this new API is for, but are sure they don’t like it.
Don't you think that's a bit of an aggressively contemptuous description of the professional web developers who make up GitHub's user base, myself included? But I do think you're capturing Google's viewpoint well here.
When a Github repo starts attracting that kind of attention, they can either lock it down or it will become useless. It’s like any other community that lacks good moderation.
Given that >99% of the feedback is civil and well-considered, I'm not sure I agree that it's a moderation issue. Depends on what you consider useless though: if demonstrating unanimous opposition from the technical community is useless in your book, then sure.
Consider that they may have already deleted some abusive comments, and we can only guess about what those were.
I don’t know what they’ve seen, but I think I’m being realistic about social dynamics on the Internet. What do you think is going to happen in response to negative press articles that link to the repo, considering many people’s well-known opinions about privacy and Google? Are people going to be well-behaved? Anyone can create a GitHub account.
Fair points about comments already being cleaned up and the potential for mob mentality to get fueled by the media.
Totally agreed on the social dynamics too - the fact that they have a well-earned reputation for abusing their monopoly position to undermine privacy (manifest v3/widevine being classic examples) is exactly why I'm arguing it's flat-out duplicitous for them to be pretending to care about user feedback: they already know what they're going to hear and don't care to take it into consideration.
I think you're right that they probably aren't looking for broad feedback from the general public. This was shared in an obscure repo and on a mailing list. They weren't exactly putting out press releases calling for feedback from all over. Someone who's serious about that would probably run a survey instead, and they'd probably wait until the project is further along.
Why is it public at all? Web standards people like to do things in public (it's how the process works), and Chromium is committed to working in public. But it's usually fairly obscure and technical. Everything Google does in public risks getting negative attention if it's controversial. I doubt they're all that surprised.
But I think this is more of a nitpick about wording than a serious lie. It's polite to say you're looking forward to feedback, and it may even be true in some cases, but they'd probably be happier to get some kinds of feedback than others. And I think that's true of everyone?
On the Internet, people really want to leave feedback on anything that catches their attention, and often think of it as their right. They get very upset that their feedback isn't being considered, but I think that's the truth of things. We are often just observers.
On this point, I do passionately disagree with you.
I'm very familiar with web standards, and I think it's a dogwater proposal from a technical perspective that will cause massive externalities without actually delivering the proposed value. Scrapers/scalpers/spammers/scammers will just pivot to buying clean devices, screen scraping, and scripting peripheral inputs; a TPM doesn't stop that.
You can be fatalist and let Google bend you over if you like, but like you say, web standards are a public process open to stakeholders, and I am one. It's absolutely anti-competitive and evil of Google to abuse their position to push through this proposal. Their monopoly should be shattered and their devs should be blackballed at every respectable shop.
I'm not a fatalist, I just don't think we have any real say here. Not directly, anyway. Do you? Web standards are decided by browser vendors. However, Mozilla and Apple do get a say, and they do care about privacy, in their own ways. (And Mozilla just came out against this proposal.)
Web standards are decided by browser vendors.
Where'd you pick that idea up from?
De facto, Chrome can implement whatever they want. That's absolutely not how W3C standards are intended to work though - it's a public process designed to take industry and user input into account.
To be fair, the W3C has been usurped by the WHATWG for like... 15 years now. I know they signed that Memorandum of Understanding a few years back, but they mostly just tend the HTML spec while the real work on CSS and JS is driven by the browser vendors.
Just as an FYI, @skybrian is an ex-software engineer for a large tech company and is generally pretty informed on these topics.
Specifically, Google. It’s in my bio. It’s been five years since I left, though, so my knowledge is increasingly out of date. I haven’t participated in writing web standards, though I do read them.
In theory, Chrome could do whatever they want, but in practice they’re unlikely to stick with something that other browser vendors veto. For example, Mozilla vetoed WebSQL, and it’s deprecated and being removed from Chrome. A similar thing happened with Google’s Portable Native Client: Firefox refused and went with asm.js, which was later replaced by WebAssembly.
That’s for things that actually shipped in Chrome, before they had entirely learned this lesson. Other proposals didn’t make it that far and were improved or replaced by something better.
The reason this roughly works is that Chrome is actually pretty committed to following web standards. They want new web APIs to solve problems Google has, but they don’t want to go their own way. Google’s websites need to work in other browsers.
You are right that it’s a public process. As an individual, it’s possible to have influence over web standards. But nobody is required to listen, and drive-by posts aren’t likely to be all that influential. When someone acts like a troll then they have the influence of a troll, and who listens to them?
I saw a nice blog post about how to participate in web standards efforts on Hacker News, but I can’t find it now.
Edit: here it is. The author is apparently a Googler who stepped in after this blew up.
Google and others must be keenly aware that they have outsized market power, and they've been enjoying it more or less unchallenged by the government for a long time, despite having been issued a series of puny-tive fines, har har. But no tech giant has had to be broken up.
Yet, each action is one more straw laid on the camel's back.
Whether or not the primary intention of web attestations is to protect user privacy and combat bots online, a consequence, intended or not, will be the effective gatekeeping of new browser entrants to the market.
I haven't finished reading the proposal yet, so this is just my understanding of it so far and I might be wrong about something.
This would create what amounts to a drm scheme that would allow websites to block you from accessing them unless you are using an approved and unmodified (by things such as adblockers) browser.
From the diagram in the proposal, it appears that how it would work is that:
They list many different reasons for this relating to security and protecting users, but it seems clear to me that the main goal of this is to prevent adblocking.
With how outrageous this proposal is, it seems like it will probably not succeed, but the fact that they even felt confident enough to publish it is worrying.
It may fail now, but I doubt that’ll stop them from trying to get buy-in from other browsers/large websites for more attempts in the future until some version of it succeeds.
EDIT: I just noticed this on their current specification:
I get that it's still a work in progress, but it's funny that months after the "explainer" document was written, the part of the specification most related to their claimed reasons for doing this is just "todo".
My knee jerk reaction is that I don’t want google or any other huge company proposing any standards because it’s almost certainly an embrace/extend/extinguish plan to take over more of the internet. See also AMP links
https://en.wikipedia.org/wiki/Accelerated_Mobile_Pages
They don't need to propose standards to create standards. The vast majority of development languages and frameworks have or had corporate backing. It's not necessarily a bad thing for corporations to assist open source projects.
The trouble is when the standards aren't open or reproducible. We're getting very close to chromium becoming the modern internet explorer where Google can just decide to create systems that websites must follow to stay relevant (like you mentioned with AMP).
Which is why I try to use Firefox whenever possible, or even WebKit (Safari) based browsers where available.
But still, I already have come across a couple of websites that wouldn’t allow me to view their actual content outside of a Chromium environment – this was typically while using Safari or one of its derivatives. Of course with the recommendation to download Google Chrome or MS Edge included.
Every single site doing this is one too many…
I heard a disturbing-yet-valid argument for making your website accessible, a while back. It goes:
The percentage of web users on Firefox is about 2%. The number of blind people is slightly above that. Why are people testing their website on Firefox but not testing their website with a screen-reader?
Maybe but they don’t seem to be acting on it yet.
Chrome has done things to discourage browser-specific websites for a long time. For example, they use origin trials to try out experimental features, to make sure they don’t become accidental standards. Websites have to register to use them and they automatically expire.
(Also, AMP works in all major browsers.)
That very article you link to doesn't talk about standardizing things at all, just about testing them before broad release. Google themselves have had no problem distributing browser specific websites.. Gmail mobile for instance was broken for years because they only used prefixed css values and didn't specify the non prefixed ones. Afaik Google docs to this day still has an offline version based on websql and not indexed db.
Giving power over the web to a single ad company is an insane concept to me, but people and webdevs seem deadset on doing it.
It’s true that Google often tests experimental web APIs on its own websites. This is what origin trials are for, and Google uses them too.
I don’t use the GMail or Google Docs websites on mobile and wouldn’t trust them for offline access. But I suspect they’re closer to “working” than “not working” for most people in most browsers? Is the trend towards or away from sticking to web standards?
This is literally their first use case scenario. It's pretty unambiguously about ads, with everything else being an added bonus.
The word "integrity" to me was a red flag. It's a weasel word; it could have a lot of different meanings. It talks about a third party for attestation but the fact that this is a Google proposal is practically guaranteeing that Google is going to be effectively the only attestation service anyone is going to use.
I wouldn't disregard the possibility of this becoming a reality, because I foresee websites that will lock people out if they fail the integrity check.
This is why we have to stop relying on Google so much and degoogle our lives, starting with – and maybe most importantly – Google Chrome and Chromium. We need a free internet, which heavily relies on a free browser such as Firefox, and pretty much everything not Chromium-based, so Google can't dictate what web technology is used by the masses.
Google Maps is still the Google product I kinda rely on; last week I unfortunately had to use Meet for work for the first time, though.
Apple Maps exists, but there are no directions where I live, so it's functionally useless.
Google Maps is a great product indeed, but I never relied that heavily on it. There are several Android apps based on OpenStreetMap that do a good enough job, at least for my use case. They're not as good as Google Maps, but using a worse product is unfortunately the price we have to pay now, or we'll pay a much higher price later.
This will probably be what pushes me to become a luddite, or at least use the web far less. Over the years I have developed a great disdain for ads, and they have grown so numerous on the web that the only way I can tolerate watching videos or going through social media is with an ad blocker. The moment I can no longer do that is the moment I cut as much of the web out of my life as possible.
At a glance, this would prevent browser extensions from blocking ads but wouldn't do anything for DNS level blocking. Is that correct?
My understanding is that it wouldn't directly block dns level adblocking, but because it prevents adblock extensions, I think anti-adblocker libraries (something that many adblock extensions stop) would be able to detect that the ads failed to load.
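For what it's worth, the detection part is already simple today. Here's a minimal sketch of how an anti-adblock script might notice that an ad script was blocked, using AdSense's `adsbygoogle` global as the example (the function name and the injected-scope design are invented for testability):

```javascript
// Sketch of client-side adblock detection: if the ad network's script was
// blocked (by an extension or at the DNS level), the global it normally
// defines will be missing. The scope is passed in so this runs outside a
// browser; in a real page the argument would be `window`.
function adsBlocked(globalScope) {
  // AdSense's loader defines window.adsbygoogle as an array when it runs.
  return typeof globalScope.adsbygoogle === "undefined";
}
```

Note that this check trips regardless of whether the ad was blocked by an extension or by DNS. What attestation would change is the site's confidence that no extension is interfering with the detection script itself.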
Mozilla is against it, although they do leave the door open slightly for some future proposal:
Not usually a fan of these links, but Hacker News doesn’t like this either.
I don't have enough context to evaluate this (though it seems pretty early-stage), but I found this bit interesting:
If these attestations were randomly disabled then I think it might go a long way towards making this proposal acceptable? Sure, it nerfs it, but it's probably good enough for advertisers to know that sometimes their ads aren't blocked, for example.
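To make that concrete, a holdback on the attester side might look something like this sketch (the 5% rate, the function name, and the response shape are all invented for illustration; the proposal specifies none of this):

```javascript
// Sketch of a randomized "holdback": even when the environment passes every
// check, some fraction of attestation requests is refused, so a missing
// attestation can never be treated as proof of a modified browser.
const HOLDBACK_RATE = 0.05; // assumed value for illustration

function attest(environmentOk, rand = Math.random) {
  if (!environmentOk) {
    return { attested: false, reason: "failed-check" };
  }
  if (rand() < HOLDBACK_RATE) {
    // Indistinguishable (to the website) from a client that can't attest.
    return { attested: false, reason: "holdback" };
  }
  return { attested: true };
}
```

The point is that websites would be forced to keep a working fallback path, since even fully "clean" clients would fail attestation some of the time.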
Here's a message on the mailing list that also advocates for holdback groups and gives some context around previous things Chromium has done.
After reading that, I don't know if they even know what problem they're trying to solve. 3/4 of the use cases in the intro would be obviously broken under that scheme. The only one that wouldn't is the first one, about ads, because that's the only one that'll work at all on probabilistic data.
The immediate conspiracy theory answer is "they only care about that one, the others were thrown in as distractions" but I'm reluctant to believe that because it's not even a good way to solve any of these problems, even scoped to ads. It doesn't increase privacy (ad targeting is a thing, at most you added another bit of signal to fingerprint on). I don't think it can be implemented correctly on anything without SGX-like hardware functionality ("hi, attester! I am totally a normal chrome instance with no extensions, mind signing this for me?").
I sometimes read something I so thoroughly do not understand that I start wondering "is the author of this significantly smarter than me, or significantly dumber?". This is one of those times.
I believe this is about fraud. Fraud and anti-fraud are big businesses. Companies have fraud detection engines to figure out how risky a transaction is, and if it seems risky they will ask for more verification.
A simple example of this is captchas. There are services to do bot detection like Google's reCAPTCHA service. They use a wide variety of signals, none of which are entirely accurate, and combine them to compute a score. Credit card transaction risk scores are another example.
There's a fundamental tradeoff with privacy. If the reCAPTCHA service knows who you are (for example, you are a logged in Google user) then the risk calculation is easy and it can return a lower risk score than otherwise. Browser fingerprinting is also used to detect configurations that are suspicious. Using the reCAPTCHA service is sort of a way for a website to benefit from browser fingerprinting without doing it themselves or learning anything any more private than "probably not a bot."
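The scoring side is usually described as a weighted combination of weak signals. A toy sketch (the signal names, weights, and clamping here are invented for illustration; real services keep theirs secret):

```javascript
// Toy risk scorer: start from a neutral prior and nudge the score up or
// down for each signal that is present, then clamp to [0, 1].
// Higher = riskier.
function riskScore(signals) {
  const weights = {
    loggedIn: -0.5,         // a known identity lowers risk a lot
    knownFingerprint: -0.3, // this browser config has been seen before
    headlessHints: 0.6,     // automation markers raise risk
    datacenterIp: 0.4,      // traffic from a datacenter, not a home ISP
  };
  let score = 0.5; // neutral prior
  for (const [name, present] of Object.entries(signals)) {
    if (present && name in weights) score += weights[name];
  }
  return Math.min(1, Math.max(0, score));
}
```

This also illustrates the privacy tradeoff: the strongest risk-lowering signals are exactly the identifying ones, so a privacy-preserving configuration starts at the neutral prior with nothing pulling its score down.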
Unfortunately, people who use browser configurations that avoid leaking any identifying data tend to see captchas a lot more. If the service has no idea who you are, it's going to be suspicious.
One way to do fraud is to run client software in the cloud, get the user to enter their information into a phishing website, and then have the bot enter the same information into the real website. To detect this, it would be really useful to have a signal to tell that the user is actually running a browser on their phone and it's not all fake. Even if the signal isn't always there, they can let the more verifiable users go through and concentrate on verifying the others.
These systems always need to have fallbacks, which is why it doesn't have to work 100% of the time. Companies often do want to serve people who care about privacy, since it's still another sale. Also, companies do tolerate a certain amount of fraud, as long as it's low enough.
Deliberately making this signal not work 100% of the time is a way to make sure the fallbacks are still built and still work. This can be seen as a way of making sure that the open web is still a thing, despite having better anti-fraud measures. You can judge for yourself how sincere they are about that, but I believe that at least some of the engineers working on these things are sincere, and they know that other browser vendors aren't going to go along with a new standard if they don't do something along those lines. This proposal is the start of a negotiation between the browser vendors, and they know they'll need to make design tradeoffs to satisfy the other vendors.
Anti-fraud measures are controversial because they're fairly closely related to other things people want to do all the time. You could even think of ad blocking as a very small-scale sort of fraud (using a website without paying for it), though they'd never prosecute for it. Cheating in video games is sort of like fraud since there's a similar cat-and-mouse game.
The kinds of fraud they're really worried about tend to be larger scale. Fraudsters use bots to click on ads to inflate ad revenue, and Google will likely be more concerned about that than about ad blocking.
Dude... They claimed in their proposal that they "looked forward to feedback". They got a lot of feedback on Github. And immediately locked all issues "over the weekend" as a response. And still haven't unlocked it well past the weekend.
These are straight-up evil people, acting in bad faith and fucking over society to get theirs.
Who do you suppose they want to hear from? Other browser vendors, I’d imagine? People who might actually use these APIs? Probably not random people who have only a vague idea what this new API is for, but are sure they don’t like it.
When a Github repo starts attracting that kind of attention, they can either lock it down or it will become useless. It’s like any other community that lacks good moderation.
Fully agreed - Corporate interests are the only stakeholder that Google cares about here, and they're willing to steamroll over smaller web developers and technical users to push their proposal forward.
Don't you think that's a bit of an aggressively contemptuous description of the professional web developers that make up Github's userbase, myself included? But I do think you're capturing Google's viewpoint well here.
Given that >99% of the feedback is civil and well-considered, I'm not sure I agree that it's a moderation issue. Depends on what you consider useless though: if demonstrating unanimous opposition from the technical community is useless in your book, then sure.
https://github.com/RupertBenWiser/Web-Environment-Integrity/issues
Consider that they may have already deleted some abusive comments, and we can only guess about what those were.
I don’t know what they’ve seen, but I think I’m being realistic about social dynamics on the Internet. What do you think is going to happen as a response to negative press articles that links to the repo, considering many people’s well-known opinions about privacy and Google? Are people going to be well-behaved? Anyone can create a Github account.
Fair points about comments already being cleaned up and the potential for mob mentality to get fueled by the media.
Totally agreed on the social dynamics too - the fact that they have a well-earned reputation for abusing their monopoly position to undermine privacy (manifest v3/widevine being classic examples) is exactly why I'm arguing it's flat-out duplicitous for them to be pretending to care about user feedback: they already know what they're going to hear and don't care to take it into consideration.
I think you're right that they probably aren't looking for broad feedback from the general public. This was shared in an obscure repo and on a mailing list. They weren't exactly putting out press releases calling for feedback from all over. Someone who's serious about that would probably run a survey instead, and they'd probably wait until the project is further along.
Why is it public at all? Web standards people like to do things in public (it's how the process works), and Chromium is committed to working in public. But it's usually fairly obscure and technical. Everything Google does in public risks getting negative attention if it's controversial. I doubt they're all that surprised.
But I think you have more of a nitpick about wording than a serious lie. It's polite to say you're looking forward to feedback and may even be true in some cases, but they'd probably be happier to get some kinds of feedback than others. And I think that's true of everyone?
On the Internet, people really want to leave feedback on anything that catches their attention, and often think of it as their right. They get very upset that their feedback isn't being considered, but I think that's the truth of things. We are often just observers.
I'm reminded of an old article by Paul Ford.
On this point, I do passionately disagree with you.
I'm very familiar with web standards, and I think it's a dogwater proposal from a technical perspective that will cause massive externalities without actually delivering the proposed value. Scrapers/scalpers/spammers/scammers will just pivot to buying clean devices, screen scraping, and scripting peripheral inputs; TPM doesn't stop that.
You can be fatalist and let Google bend you over if you like, but like you say, web standards are a public process open to stakeholders, and I am one. It's absolutely anti-competitive and evil of Google to abuse their position to push through this proposal. Their monopoly should be shattered and their devs should be blackballed at every respectable shop.
I'm not a fatalist, I just don't think we have any real say here. Not directly, anyway. Do you? Web standards are decided by browser vendors. However, Mozilla and Apple do get a say, and they do care about privacy, in their own ways. (And Mozilla just came out against this proposal.)
Where'd you pick that idea up from?
De facto, Chrome can implement whatever they want. That's absolutely not how W3C standards are intended to work though - it's a public process designed to take industry and user input into account.
https://www.w3.org/standards/review/
To be fair, the W3C has been usurped by the WHATWG for like... 15 years now. I know they signed that Memorandum of Understanding a few years back, but they mostly just tend the HTML spec while the real work on CSS and JS is driven by the browser vendors.
Just as an FYI, @skybrian is an ex-software engineer for a large tech company and is generally pretty informed on these topics.
Specifically, Google. It’s in my bio. It’s been five years since I left, though, so my knowledge is increasingly out of date. I haven’t participated in writing web standards, though I do read them.
In theory, Chrome could do whatever they want, but in practice they’re unlikely to stick with something that other browser vendors veto. For example, Mozilla vetoed WebSQL, and it’s deprecated and being removed in Chrome. A similar thing happened with Google’s Portable Native Client: Firefox refused and went with asm.js, which was later replaced by WebAssembly.
That’s for things that actually shipped in Chrome, before they had entirely learned this lesson. Other proposals didn’t make it that far and were improved or replaced by something better.
The reason this roughly works is that Chrome is actually pretty committed to following web standards. They want new web APIs to solve problems Google has, but they don’t want to go their own way. Google’s websites need to work in other browsers.
You are right that it’s a public process. As an individual, it’s possible to have influence over web standards. But nobody is required to listen, and drive-by posts aren’t likely to be all that influential. When someone acts like a troll then they have the influence of a troll, and who listens to them?
I saw a nice blog post about how to participate in web standards efforts on Hacker News, but I can’t find it now.
Edit: here it is. The author is apparently a Googler who stepped in after this blew up.
Google and others must be keenly aware that they have outsized market power, and they've been enjoying it more or less unchallenged by the government for a long time, despite having been issued a series of puny-tive fines, har har. But so far, no tech giant has been broken up.
Yet, each action is one more straw laid on the camel's back.
Whether or not the primary intention of web attestations is to protect user privacy and combat bots online, a consequence, intended or not, will be the effective gatekeeping of new browser entrants out of the market.
That's a heavier-than-normal straw to add on.