The biggest problem from my perspective is that we're flying blind. We know that our personal information is valuable—I mean "they" (even this part is nebulous) are paying big bucks to get at it. Intuitively I think it makes sense that some bad actors could use our information for something undesirable, but we haven't seen a lot of concrete evidence of it. What I have seen:
Targeted ads. To be honest, I don't see these as a big deal per se. They're kind of a canary in a coal mine, in that they reveal to us how much we're being spied on, but I don't think they're interesting in and of themselves.
Differential pricing. There were examples several years ago of OS X users paying more for flight tickets than Windows users (websites were showing different prices based on your browser's user-agent string; a rough sketch of the mechanism follows this list). This has become a bit more subtle now. Companies not only know what we want to buy, but they (probably) have a good idea of how much we're willing to pay for it.
International travel. We've now seen lots of examples of people being banned from flights, refused visas, etc., based on their social media. We can say "serves you right for being stupid enough to put contentious info on a public social media profile", but that's really only stage 1. What's coming next is discrimination not just on your public social media posts, but your private posts, even your most private and intimate messages. We know via Snowden that government agencies have a direct pipe to information that we explicitly mark as non-public.
Hiring. Similar to the above: companies discriminating against job candidates based on our private information.
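To make the pricing mechanism concrete, here is a minimal, purely hypothetical sketch of what user-agent-based pricing could look like on the server side. The base price, the 15% markup, and the "Macintosh" check are all invented for illustration; they are not taken from any specific reported case.

```python
# Hypothetical sketch of user-agent-based differential pricing.
# All numbers and the "Macintosh" heuristic are invented for illustration.

BASE_PRICE = 300.00   # nominal ticket price in dollars
MAC_MARKUP = 1.15     # assume Mac users will tolerate ~15% more

def quote_price(user_agent: str) -> float:
    """Return a ticket price, nudged upward for browsers reporting macOS."""
    if "Macintosh" in user_agent or "Mac OS X" in user_agent:
        return round(BASE_PRICE * MAC_MARKUP, 2)
    return BASE_PRICE

if __name__ == "__main__":
    mac_ua = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15"
    win_ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
    print(quote_price(mac_ua))  # 345.0
    print(quote_price(win_ua))  # 300.0
```

A real system would of course lean on far more signals than the user agent, but the basic shape (read a client fingerprint, adjust the quote) is the same.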
Things are kind of vaguely creepy right now, but from a practical standpoint, the average person isn't really affected in any meaningful way. But it's the nature of information that once it's out there, it can't be taken back. I suspect part of the reason our information is so valuable is that it has potential value for the future. Some company in the future is going to figure out how to make a fortune discriminating based on our private interactions.
Some things that I think led us to this point:
People have no idea how electronic devices work. I'm a CS educator and had high hopes for the "digital natives", but ended up being alarmed to find that children born in 2000 are more ignorant of how a computer works than children born in 1980 were at the same age.
People have no idea how the Internet works. People have really bizarre behaviours, like pointing to a company's Privacy Policy to decide if their information is safe, which suggests they don't really understand what's going on.
Tech capital has concentrated itself in only a few giants. 15 years ago, if you wanted access to 90% of the population's private email, you'd have to deal with probably hundreds of companies and organizations. Now you have to deal with 1. You can hardly fault Google for it—they flat-out just did email better than everyone else—but it sucks.
The network effect, which exacerbates the oligopoly. If Google decides it wants to scrap IMAP (an open standard), we can't do anything about it. If Google decides it wants to scrap HTTP, well, it actually got frighteningly close to doing that (thankfully it was going to replace it with another open standard, but it might not be so benevolent forever). Google controls the world's email, searches, smartphones, maps, and more. If it wants to squeeze out a competitor, even a free software competitor, it can use its heft in another domain to make things incompatible.
I don't know exactly what privacy should be. I'm an Orwell fan and I think his description of how people act in privacy vs non-privacy in 1984 has a lot of truth to it. We've seen psychological studies that back up the chilling effect: people behave less creatively when they know they're being watched. I don't think it's psychologically healthy to act as if you're being watched 24/7. At some level you need a private space to let something out, knowing that it's not going to affect your social standing or your bank account or your ability to travel or your ability to find a job.
I think the prevalence of [locked down] phones is part of what leads to people being more technologically illiterate, along with there being no in-school education on the basics of how electronics and computers in general work.
Could you provide an example of this? I had the same high hopes for the "digital natives", and now you've crushed them. I'm curious about how this manifests in the classroom and why it may be the case.
While there are some great responses here, I think another big reason is that too many things are being oversimplified when presented to schools and kids. It's really a bunch of little things melding together into one big mound of technological illiteracy.
For example, take all of the Hour of Code activities schools push on students–these are generally drag and drop "move left", "move right" type activities where it's really just logical reasoning. Sure, this is an important skill when it comes to coding, but this is by no means what writing code is like.
Also, Chromebooks. These things are being touted as laptops and put into every classroom they can cram them into. They are by no means laptops and are really tablets at best. Their extremely simple interface teaches kids to expect the same from all computers, so they never learn how to (a) use a regular computer, or (b) figure stuff like this out on their own.
Some tasks that have a very high failure rate among middle school (and even many high school) students:
"Open your browser". Of course they have all used a browser a million times, but many of them don't know what it's called. I've learned that many of them just call their browser (even if it's Edge or Firefox or something) "Google".
Write a URL on the whiteboard and say "go here". Most of them have never typed in a URL before (some will say they've never seen one before; others will say "isn't that a programming code in Google?" or something).
"Download this file and then open it later". The task of "download this file and have your browser open it immediately" is easy, but finding it and opening it later can be very challenging. This is a broader topic, but many students do not have a solid concept of a filesystem, the concept of local storage vs cloud storage, etc. The idea of a file being stored on their computer (and not accessible from other computers) can often be confusing.
Many students seem to intuitively think in terms of "programs" (or "apps") instead of "data". The concept of creating something in one program and then opening it in another program can really mess with them. As an example, the steps of: (1) Open up Notepad; (2) Write a small amount of HTML in Notepad; (3) Save it; (4) See that a new file has been created on your computer; (5) Right-click the file and open it in your browser. Each one of those steps in isolation can be okay, but trying to explain conceptually how two of the steps relate to one another can be a real mindfuck.
Edit: A pet peeve of mine, but they are generally awful at typing. When I was a kid, we had mandatory typing classes: one in grade 2-ish (mostly just "here's what a keyboard looks like, get comfortable with it"), one in grade 6-ish (learning how to touch type, a bit of a struggle) and one in grade 9-ish (this one was for real, with tests on speed and accuracy). Of the kids I've talked to, none of them have ever learned how to type. Some of them pick it up a little bit, of course, but even the best students are generally pretty poor at it.
How old are the kids you're working with?
I kind of forced my younger sister into learning some comp stuff, mostly by refusing to do basic things for her on the PC and setting up touch typing tutorials with pocket money incentives. She was miles ahead of the class when she had it (age 12) a few years later: the teacher gave them a sheet to fill out and said to use the computer, and she just googled the title and found the answer sheet. I was very proud.
This is so unexpected for me. I was born in the late 90s, so I grew up with technology. I would've thought that those born after me would be far better with technology than I am.
Do these students regularly use PCs, or are they using iPads? I can't imagine anyone using a PC and not having a basic understanding of file systems, URLs and typing. Also, what's happened to the nerds? When I was at school there was always a small group in class that would be messing around with the command line etc.
It's like in the 1970s we invented cooking, but children today grew up in a world of TV dinners and take out. That doesn't make you a chef.
The big problem is that the transparency and privacy invasion isn't turned on the people who control and build the tech, and isn't turned on politicians. There was an article about how Steve Jobs wouldn't let his kids use iPads, and Mark Zuckerberg bought the houses surrounding his own to have some semblance of privacy. If privacy and a clear mind undisturbed by interruption are so valuable to billionaires, why do we subjugate the rest of the population?
I bet all those ad execs pushing for more targeted advertising either use ad-block themselves or never click on ads and make a conscious effort to avoid them, or just don't use the internet the same way as everyone else.
The real problem is the power differential.
Firstly, the majority of ad-tech companies don't actually know much about you. They are mostly making wild guesses and selling their assets based on that. The people who buy ad space know this and adjust their payments accordingly. The company that has by far the most information about you is Facebook, because most people hand over the information that advertisers want when they sign up. This is why Facebook's ads are worth more than most ad-tech companies'. Google also has a lot of personalized data that's inferred from searches and Google-service logins.
Secondly, most ad-tech companies, including Facebook and Google, don't hand out very much information. A client will attempt to buy an ad for x dollars targeting y demographic and get data back on how many impressions, clicks, and conversions they got. They rarely have any idea who it was actually served to.
There is also tracking software on most websites, but that tracking is often not tied to ad-tech vendors, directly or even indirectly (although Google and Facebook are probably a different story). Often, analytics services are used to find out what a website's demographics are, both for direct ad sales and so that the company can target the correct demographic with the advertisements it purchases elsewhere.
In my opinion the greatest threats to your personal privacy are Google, Facebook, and data breaches.
Google and Facebook know basically everything there is to know about you and are using it as effectively as they can to serve ads and do who knows what else. The main worry with these two companies is that they are so big it's hard to know what they are doing, and the insane amount of market share they own.
Data breaches are a privacy nightmare because a lot of companies store data in a way that says "we will never get hacked" and then they get hacked. So your name, address, social, and a bunch of other data are often easy to pull down in one big chunk.
If I had to make one law to improve privacy and data protection, it would be a law that forces transparency. Users would be able to easily understand what data is held about them and how it is being used. This would probably put the brakes on the most nefarious business practices from the people with the most data, and make it obvious how little data most other ad-tech companies actually have.
There could also be some kind of rating system that reveals how much data a company has. D1 would be the highest level (that would be Google or Facebook, meaning they basically know everything about you), and it could go down to D10, meaning the company holds zero data.
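As a toy sketch of how such a scale could be computed (the data categories and the bucketing below are invented for the example, not part of any real proposal):

```python
# Toy encoding of the D1-D10 idea: the more categories of personal data a
# company holds, the closer it sits to D1. Categories and bucketing are
# invented for illustration.

PERSONAL_DATA_CATEGORIES = {
    "name", "contact details", "location history", "browsing history",
    "search history", "private messages", "purchase history",
    "social graph", "biometrics", "financial records",
}

def data_rating(categories_held: set) -> str:
    """Map the share of known categories a company holds to D1 (everything)
    through D10 (nothing)."""
    share = len(categories_held & PERSONAL_DATA_CATEGORIES) / len(PERSONAL_DATA_CATEGORIES)
    return f"D{10 - round(share * 9)}"   # 0% held -> D10, 100% held -> D1

print(data_rating(PERSONAL_DATA_CATEGORIES))      # D1
print(data_rating({"name", "contact details"}))   # D8
print(data_rating(set()))                         # D10
```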
Side Note: I find it hilarious how people complain about ad-tech companies giving them super poorly targeted ads and also complain about how much data those companies have. If they had a lot of data, the ads wouldn't be poorly targeted. :)
I guess we know different people, the people I know complain about getting ads that are unrelated to them. I suppose that's the problem with anecdotes over data.
My annoyance with it is the bait and switch. You start with a good product, people join it, they depend on it, then you slowly strip out the core functionality of what made it good and start to introduce trackers and advertising and more advertising. Then no one wants to delete their account because they're already in the ecosystem and they'll then "live with it". It makes me very angry.
The sad thing is that this could be said about the world wide web in general and not just "services" like Facebook and Google. It's like a worldwide "This is why we can't have nice things" phenomenon, and it all boils down to money and power, and it's fucking infuriating.
Well, it's not really about privacy. It's about power. We (as in, anyone not in the ruling or wealthy classes) live in a society where states and corporations wield vast power over our lives, both subtly and not.
When states and corporations lose privacy we gain power over them --- this is why whistleblowers are special and require protection. Whistleblowers can expose abuse, oppression, and exploitation committed against us. And this means we can do something about it.
When we give up privacy to states and corporations, they gain power. This seems an awful thing to me. More specifically, giving up privacy means:
elections can be manipulated, undermining democracy in favour of authoritarianism
our political preferences can similarly be undermined
consumers have no real choice, due to minimal real competition between producers and ever-present advertising
all of our debates and conversations are at risk of manipulation, by "smart newsfeeds" or censorship or the chilling effect
controversial ideas such as free culture, anarchism, offensive art can disappear from public discourse
And so on. I'm sure for people involved in the contemporary privacy conversation the above list is familiar --- it shouldn't be hard to find corresponding literature.
Dealing with the above issues is hard. Getting off Facebook, Twitter, Google et al is doable, although you may find out who your real friends are if people can't communicate through a corporate channel. Fixing email is tough (I do have some work in this space). Dealing with the state is a near impossible task. It seems that throughout the West, governments are bent on total monitoring of civilians, a total perversion of the role of a representative democracy. You may have some success in engaging in the political process --- write letters, protest, vote --- but I fear that most of this is just normal background noise to pollies. I don't have a good solution on this point.
I think Riseup's security guide does a good job at explaining why privacy matters in an easy way:
Why security matters
The increasing importance of information and communication has brought with it another phenomenon: the rise of a surveillance society. You can think of surveillance as an attempt by the powerful to maintain their dominance by asserting control over communication.
Nation states have responded to new communications technology by pursuing an infrastructure that facilitates mass surveillance and can easily be re-purposed for total social control. Many governments also contract with unethical private corporations to track activists and break into their devices.
Corporations have discovered that the gathering and analysis of massive amounts of personal data is necessary if they want to remain competitive in an information-rich world. In particular, nearly all advertising is shifting toward surveillance-based tracking of our personal behavior.
Criminals have discovered that it is very lucrative to attack personal devices and cloud accounts to ransom data or blackmail the user.
In this context, digital security has become vitally important.
State surveillance has a long history of resulting in the repression of social movements.
Even indirectly, rampant surveillance has a chilling effect on social movements.
Corporate surveillance is just as serious as state surveillance. Not only can the massive amounts of data kept on internet users be easily re-purposed for direct state repression, but corporations are now on the verge of obtaining unprecedented power over consumers.
When people start to learn about the rise in surveillance they start to feel overwhelmed. Some decide that it is impossible to be secure, so they resign themselves to live under perpetual surveillance or to forsake all forms of digital communication. At Riseup, we believe there is a third way: our goal is to make a high degree of security easy and accessible for everyone.
The rest of their security guide does a good job at providing an easy-to-follow guide to protect your privacy & security. I like how they've split it into human, device, message, and network security.
My biggest privacy concern is how all the privacy noise is drowning out the real issues.
Privacy is far from all or nothing, yet a lot of "privacy advocates" problematize causes that are simply non-issues.
The people who keep crying wolf lead everyone else to zone out on privacy completely. Meanwhile, there are real, important issues being drowned out and left unaddressed.
I'm sorry, but this sounds like something a company would say to steer the conversation away from the real issues. Can you elaborate?
You can't fight every battle at once. You need to get at the most important things first, then branch out. There can be many large issues at once, but if you try to take on more than a handful, it becomes overwhelming for anyone who isn't deeply invested in the topic themselves.
My greatest privacy concern right now is that the people who advocate for taking privacy seriously often make it out to be this all-or-nothing thing, then go to insane lengths to hide all information about themselves.
Essentially, privacy advocates often come off as crazy folks who obviously must be hiding something serious because they go to absolutely extreme lengths to hide all trace of their online activity.
This leads people to zone out from the real issues, like election meddling, large volume of data everyone gives away to anyone who asks without a thought, government spying on its citizens without legal oversight and so on.
Regular people don't give a shit about those issues in a meaningful way because the people speaking on those topics seem crazy since they do so many over-the-top things to protect absolutely all information about themselves. It makes privacy concerns seem like a giant overreaction from a reactionary group that's way outside the norm.
Ah, OK, I understand. Although I find most people in the places I frequent online to be somewhat reasonable about the privacy issue, I can see what you mean.
I'm concerned with how corporations have trained people to not even consider pseudo-anonymity as a valid option, when it was one of the pillars of the early internet. So many issues become huge problems when you have just one identity online and it's the same one attached to your social security number, your day job, or your family portraits. We don't have this problem in real life, so why are we creating it online? To give ads valid clicks?
The problem with privacy is that people are stupid. The news is saturated with stories about companies screwing over the masses who gave them their personal information, and yet people keep using these companies' services and making money for them. Even before Facebook hit the scene, it was common knowledge to never give identifying information online, and yet people flocked to these services and sold themselves out. They created peer pressure that convinced even 'internet veterans' to sell their private information.
It's part of a larger problem with humanity; people are just plain unwilling to consider the viewpoints and motivations of people outside their groups. They just look for more confirmation about their beliefs. It makes them extremely easy to manipulate. Russia didn't break any laws when they meddled in the US presidential election, they just took advantage of the echo chambers people construct for themselves.
The thing that bothers me so much about this is that this should be such an easy fix. The value of empathy is something you're supposed to learn in preschool.
The problem with privacy is that it's so difficult to explain why privacy is so important.