What's wrong with WhatsApp? As social media has become more inhospitable, the appeal of private online groups has grown. But they hold their own dangers – to those both inside and out
Link information
- Title: What's wrong with WhatsApp
- Authors: William Davies
- Published: Jul 2 2020
- Word count: 3677 words
Shouldn't the title be "What's wrong with people?"
WhatsApp isn't like most social media or news reporting. It doesn't bubble up items that will make you enraged. It simply facilitates discussions. If this is a problem, that is on us as human beings.
Sometimes technology accelerates human problems. There were jokes and chain letters and pyramid schemes and urban legends in postal mail, but email made it easier to forward them. A meme that produces outrage and encourages replication is likely to spread, even without any other algorithmic encouragement.
To slow down "viral" content you need to add some kind of speedbump between hops and to prevent it from escalating you need ways of getting replication to fall below 1.0. Electronic social distancing, you might say.
Yes, but the accelerant is the internet, and no one is shutting off the internet.
Oh come on, that would be throwing out the baby, bathwater, and entire house.
I can see your point. As you mention, the problem pre-dates WhatsApp. It was literally worse when inter-office faxes were a thing, but it was so much slower. It's just as bad on email, but email is slower and has more friction.
I think we agree that friction won't solve the problem, but in theory it should significantly slow the problem down.
I simply disagree that your solution is feasible in today's day and age, because as soon as you add speedbumps to WhatsApp, someone will come up with a new and improved WhatsApp that does not have the speed bumps.
I'm not sure about that. Many people do seem to prefer forums with better filtering over unmoderated forums that tend to fall apart. Gmail got where it is through better spam filtering, for example.
The key would be to add delay to actions so that it doesn't seem to get in your way, but with a good emergent effect.
Well, you have me there. I am all for speed bumps. And I have no good reason for assuming others are not.
Haven't we all agreed on the fact that human beings are flawed and susceptible to flawed systems? Shouldn't we regulate the tech that exposes us to content that makes us vulnerable? It would be nice to have some amount of moderation on WhatsApp; I mean, we have seen what it can do to people.
But WhatsApp isn't a social media site like its parent Facebook. It's a messaging app.
Would it not be problematic for there to be moderation on end-to-end encrypted private messages between people? Because that inherently means there can't be end-to-end encrypted messages. WhatsApp can't moderate traffic that's encrypted on their end.
Even though it allows for crazies and criminals to talk to each other, I think there SHOULD always be a way to privately communicate with someone else without the government, or any company, with the ability to "moderate" what you say. That's pretty uncomfortable as an idea.
Well, we are gonna have to disagree on that. Yes, it's a messaging app, but people use it like any other social media to spread hateful and misinformed content. I do agree on the end-to-end encryption part, but I just don't think a tool which has been used to radicalize people to cause violence multiple times can make that argument when talking moderation.
When you are around educated people you can sometimes be blind to the implications of tools like this on undereducated communities. If you want end to end encryption you can use other similar less popular tools and let them moderate the popular ones.
I didn't advocate for curbing the tech; in fact, in my original comment I said the very thing you are implying. But I think we can reduce the propagation of these fundamental issues by moderating content on tools like WhatsApp, which surely helped to escalate the spread of misinformation to a wider audience.
That can go even worse, though. WeChat is a perfect example of the anti-WhatsApp way to do things. Are platforms that can be moderated necessarily better in areas where people can be uneducated, and leaders can have a... generous definition of "hate speech"?
I don't understand where you are going with this :/ . First you seemed worried about losing E2E encryption; now you are concerned that leaders could stop free speech? This free speech argument can be made against social media platforms as well, right? Also, WhatsApp is not from an authoritarian country like China, and neither are the countries where it caused damage; there are institutions in democratic countries which can hold leaders accountable for their actions.
I mean, those are pretty connected concepts? In particular, the point being made by the article is that WhatsApp is used to drive radicalization, while unencrypted platforms can equally be used to drive radicalization, just not "naturally".
Well, yeah. In fact, the reason I differentiate WhatsApp from a social media platform like Facebook, is that Facebook, which curates posts displayed to you, should be held to a higher standard.
I mean, we've seen how well some of those have worked in the US. It's certainly better than China; the point is that this puts everything on the good faith attempts to moderate fairly by the entity that owns the chat app.
I believe this is not that simple. WhatsApp is responsible for knowingly amplifying some of the worst aspects of human nature for the purpose of profit. That’s frequently okay, but in this particular case the problem became big enough to have actual, measurable, negative consequences.
When the outcomes are knowable, both companies and individuals are morally responsible for the consequences of their actions, regardless of their legality.
I am confused by this. Can you clarify what you mean? (and provide some sources on this, if possible?)
Suppose cocaine is legal and Bayer holds its patent. Also suppose that Bayer has access to research showing that consumption of cocaine increases the natural human tendency for impulsive and criminal behavior, but chooses to keep selling it over the counter anyway. They market it as a cure all for people of all ages. In that scenario, Bayer is morally responsible for all the impulsive and criminal behavior that becomes more likely due to the consumption of its product, regardless of its legality.
The fact that a behavioral pattern is already part of human nature does not exclude those that take advantage of it from responsibility.
The above is an illustration of that pattern, so try interpreting it with an open mind. This is not about the specifics of the example but rather the correctness of the argument form.
I don't think your analogy really holds up.
Cocaine is equivalent to an app that helps people communicate? really?
And you are trying to make an equivalence of the detrimental effects of cocaine with a platform that allows people to communicate?
Have I missed some research that points to WhatsApp causing criminal behavior in its users?
Well, you came up with a sensational analogy, but an analogy not really pertinent to my initial question. I am trying to understand what your argument actually is, from my initial question when you said:
WhatsApp uses the Signal protocol (developed by Open Whisper Systems), which is an E2E protocol, so Facebook cannot really see what people are messaging each other (they can still snoop on metadata and build graphs of your interaction patterns, but that is beside the point). So they cannot really amplify what is being communicated on their platform.
So, my question once again is, how is WhatsApp amplifying the worst aspects of human nature knowingly for profit? (Have you looked into what WhatsApp's revenue streams are?)
Primarily, what I am trying to get at is, that at the end of the day, WhatsApp is a communication platform, that reduces barriers for people to communicate.
Is your argument against any such platform?
There’s really nothing sophisticated about this argument. I don’t think it makes sense for me to go into details; the article does a good job of that.
Also this: https://en.m.wikipedia.org/wiki/Principle_of_charity
I have trouble with
Especially when the X we are discussing is a communication platform. Since what people are doing on WhatsApp can be done on a bunch of other communication platforms, should we just close them all?
There is evidence for the harmfulness of WhatsApp.
That’s a false dilemma.
The way I see it, there is evidence from this article for the harmfulness of allowing people to communicate freely.
WhatsApp was just one of the platforms that allowed people to engage in this communication.
There is no evidence that the same isn't happening on other platforms like Telegram or even Signal.
Nope.
From the WikiPage:
You are arguing that WhatsApp is harmful and hence Facebook must ~~shut it down~~ be condemned. My argument is, the harmfulness that you are trying to condemn is an artifact of an open communication platform, not just limited to WhatsApp; and if you are arguing to shut it down, to be consistent, you should also shut down all other communication platforms which cause similar harm. I for sure have seen mass texts in Telegram with bogus information. Reddit helps with promoting a ton of crazy theories; we should ~~shut~~ condemn them all ~~down~~, right? This is not a false dichotomy, sorry! All I am saying is you are being extremely reductionist with a very complex subject!
Moreover you have still not answered my original question
How is WhatsApp (not Facebook, but the platform WhatsApp)

> knowingly amplifying some of the worst aspects of human nature for the purpose of profit

Are they providing some fake internet points for sending messages? Do they somehow advertise groups with malicious ideologies to more people? How are they amplifying your so-called worst aspects of human nature?
I literally never said anything even remotely to that effect!
You may have an argument to make, but I don’t think you’re conversing with me.
I concede that you weren't arguing for that. I have edited my reply above.
Any rebuttal for the rest?
So what? This is not a good defense. Besides, WhatsApp is the most popular, so it can cause more damage, and it is natural for it to receive more attention.
That is false. Platforms are intrinsically designed to allow and incentivize different kinds of behavior.
I don’t have the knowledge and patience to look this up right now. I think the article makes good points.
Not really, I merely posed a general argument which is likely to be true given the conditions I assumed, in order to demonstrate how companies can be unethical. I think you’re mistaking simplicity for reductionism.
I am sorry, you're correct there!
I think this was my initial question with your statement, where you said WhatsApp is amplifying negative behavior.
The article actually does not do a good job of showing how any of this is specific to WhatsApp; it definitely has the largest user base, and hence a lot of the examples cited in the article were most definitely circulated amongst WhatsApp groups.
However, the point I am trying to make is that WhatsApp (or Signal, Telegram, etc.) all handle a certain set of base use-cases: transferring messages person-to-person and person-to-group. And they try to reduce the friction in getting messages (and audio/video clips, etc.) between people.
What I fail to see is how these platforms promote and incentivize these kinds of behavior.
Which was the crux of my initial question to you :)
The specificity of an issue is not that relevant. It’s better to just ask:
I think this is a key difference on how we approached this.
I am actually concerned with what exactly is the issue and is it right to be blaming WhatsApp in the first place.
In fact, It could be any of the various communication platforms we have which can in fact cause these issues.
In light of that, How do we properly address this?
I don't think just targeting WhatsApp would be properly addressing this issue.
(I am not even discussing how we would target a communication platform to solve this, yet. I am only making a case against just targeting one platform)
One of the reasons WhatsApp gained such a huge market share was because of how simple it ~~was~~ is to use. If we impose restrictions on WhatsApp, the bulk of the people will migrate to another platform, and the underlying issue will still continue. So, I don't agree with you when you say
How can you properly address an issue, when you don't understand it!
You made several mentions of the fact that these issues occur on other platforms. I’m stating that this is not that relevant: fixing them on a single platform is a worthy goal regardless of their uniqueness.
Guess we will have to agree to disagree :)
Pushing a solution on a single platform without considering the underlying problems seems like it wouldn't really solve the issue and could only lead to a very hard game of whack-a-mole!
By that standard most large scale change would become impossible. Progress doesn’t happen by unanimous universal decree. Reality is dirty.
Well now that is a false dilemma isn't it!
I say, lets study the issue, do our due diligence and try to come up with a solution that tries to solve it holistically (it still may not, which is reality) and you shoot back with, well, then most large scale changes would become impossible :)
I would argue a lot of our issues today stem from short-sighted, large-scale changes that were implemented as a knee-jerk reaction, without considering underlying issues!
This is not a false dilemma in the terms you proposed: it is an actual dilemma because in those terms there are actually only two options.
I cannot interpret this new argument you are proposing in the same way because it's an entirely new one.
~~Kinda feels like you really need the last word in here!~~ So this would be my last reply in this thread, as this has kinda gone stale! I couldn't help myself with my previous reply, because that is such a clear example of a false dichotomy!
Cause you propose that most large scale changes would become impossible if we have to understand the underlying issues to enact change! Which clearly isn't the case; there is a huge spectrum of options in between.
Also,
The framing of those two options was from what you wrote, since you drew the conclusion (from my reply above) saying "By that standard most large scale change would become impossible."
I framed it as an either/or to show the false dichotomy that you were drawing!
Anyways, as I said, regarding this WhatsApp issue we must agree to disagree! Cheers!
Hey buddy, what's the value of having the last word? What is the prize, really? I merely seem to disagree with you, that's all. It's a natural part of life, isn't it? I may very well be wrong, I may very well be correct, but that is not a sport for me.
No need to take things personally one way or the other.
Welp, since you asked.
With every reply of yours, you subtly change what we are discussing!
I generally tend to associate this kind of behavior with people who need to have the last word in a conversation (for whatever reason)!
After all this back and forth, you still have not actually answered my initial question :)
From me asking you why you believe WhatsApp intentionally promotes bad behavior, we have come to why we should not really spend time on the underlying issue and just impose a solution on WhatsApp!
PS: This isn't personal, and I have struck out that statement from my reply above.
Please ascribe any erratic pattern entirely to my incompetence rather than bad faith. I assure you that's the case.
I feel like this article is focusing too much on WhatsApp. Just look at this paragraph
Well, yeah? Is that supposed to be a complaint? That WhatsApp allows people to communicate, some of whom happen to be crazy?
Of course a message group, SMS or otherwise, exists without anyone else knowing about it.
They focus on WhatsApp because it is one of the most globally used apps, more convenient (i.e. easier to use), and more effective at facilitating group conversation. Just like forks can be used to kill, but guns are more effective at it, so there is a lot of debate regarding guns.
The issue is that big tech companies create platforms, mine people's data, but don't care how they are affecting each individual community. If you manage to create a big platform, then you shouldn't just get big profits, you should also get increased responsibility.
Here on Tildes you might get banned if you don't respect the terms of use, but on WhatsApp you might spread fake news all day and redirect harmful messages, pictures, videos, and nobody tells you a thing.
But they're actually not, in this case, which is the issue. They can't read your messages at all. Like, you can do the same thing on Signal; in fact, that's why Signal exists, so you can securely and conveniently message anyone without any single entity being able to listen in.
Like, it's one thing to complain about Facebook promoting hate groups because they drive engagement up. I find it hard to fault Whatsapp, though owned by FB, for providing E2E messaging. Yes, that means that hate groups, insane people, criminals, etc. can communicate on it, but at least IMO, the alternative is an unacceptable level of government oversight, which can easily be abused.
I really, really don't think stuff like EARN IT should exist, and prohibit companies from having E2E encrypted messaging channels. It's far more harm than you gain.
They still mine data, metadata that is. They know your contact graph, how much time you spend talking to each person, how many pictures you send, etc. Moreover, only the chats are encrypted; your folders with media are not. But that is not even my point here. E2E is good, I never said the contrary. However, you should moderate your platform in other ways; recently they decreased the number of chats you can forward messages to in one action [1]. Maybe they should decrease group sizes. Maybe they should limit how much one person can post in a single day in group chats. Social media is far from solved, because it is "social": there is the human factor involved, and people cannot "be solved".
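For what it's worth, the forward-limit idea works without any server ever reading message content: the client enforces the cap, and a hop counter travels with the (encrypted) message as metadata. Here is a hypothetical sketch in the spirit of WhatsApp's 2020 forwarding limits; the names, thresholds, and structure are my own illustration, not WhatsApp's actual code:

```python
# Hypothetical client-side forwarding limits. A normal message can be
# forwarded to a few chats per action; a "highly forwarded" message to
# only one. All constants and names here are illustrative assumptions.
from dataclasses import dataclass

NORMAL_FORWARD_LIMIT = 5   # chats per forward action for a fresh message
VIRAL_FORWARD_LIMIT = 1    # chats per action once a message counts as viral
VIRAL_THRESHOLD = 5        # forward-hops before a message counts as viral

@dataclass
class Message:
    text: str
    forward_count: int = 0  # hop counter carried as metadata with the message

def forward(msg: Message, target_chats: list) -> list:
    """Forward to as many targets as the current limit allows; return the copies sent."""
    limit = VIRAL_FORWARD_LIMIT if msg.forward_count >= VIRAL_THRESHOLD else NORMAL_FORWARD_LIMIT
    sent = []
    for chat in target_chats[:limit]:
        # each forwarded copy carries an incremented hop counter
        sent.append((chat, Message(msg.text, msg.forward_count + 1)))
    return sent
```

A fresh message forwarded at 10 chats reaches only 5 of them, and a copy that has already hopped past the threshold reaches just 1, so the effective replication rate drops without anyone decrypting the payload.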
While those are fine, they're not really moderation against hate groups per se, quite like
is for instance. It would also affect, say, a protest movement's ability to communicate and coordinate with its members against a local government. It just cuts everyone's ability to spread any message. Whether or not those tradeoffs are worth it is worth asking, but I don't think it really solves the core issues at hand.
Obviously you can't have micro-moderation with E2E, but they have to do something; they are the tech geniuses after all. It's all good when your product makes investors very rich, but when you mess with the fabric of society, suddenly it's too much for you to deal with?
They have billions of users, can you even picture that? I can't.
It's not so simple as to just "do something" about it, though. Some problems, many problems, can't be solved just by throwing more money and more smart people at them, like researching something in a Civilization game.
This, imo, is a very complex issue because it really stems from giving people a better ability to talk to each other. To reduce that is to stop not just hate groups from talking to each other, but support groups as well. Moving forwards and backwards is a double-edged sword, and it's difficult to ascertain how sharp each edge is.
Communication itself is a double-edged sword. If no one could communicate with each other at all, there'd certainly be a lot less hate in the world, given that you'd be stuck in your local bubble of interaction. But clearly that extreme example is throwing the baby out with the bathwater.
It's a super difficult, complex, gray area, where any move that looks good, can instead cause great damage.
I'm not sure that trying to "solve" this is the right way to think about it. If we see technology as an accelerant for viral spread then maybe we could slow it down? Maybe it would help to add a half-hour delay for group-to-group forwarding, and perhaps more by default? Maybe it would be a good idea to limit the size of groups again?
This won't keep things from going viral but it slows the spread via encrypted channels and lets the rest of the world catch up.
The downside would be slowing down urgent messages. But you could still send things faster as long as you name the recipients.
I don't think of this as a case where we need to avoid messing with the status quo because we think the status quo is good, or even suspect it's good using a Chesterton's fence argument. Social apps were built via uncontrolled experimentation and we might as well experiment more.
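The half-hour speedbump described above could be sketched as a client-side delay queue: group-to-group forwards wait out a delay before delivery, while messages to explicitly named recipients go out immediately. Everything here, including the names and the 30-minute default, is my own illustrative assumption:

```python
# Sketch of a forwarding speedbump: group forwards sit in a priority queue
# until their delay elapses; directly addressed messages skip the delay.
import heapq

FORWARD_DELAY_SECONDS = 30 * 60  # half-hour default for group-to-group forwards

class OutgoingQueue:
    def __init__(self):
        self._heap = []  # entries are (deliver_at, message, destination)

    def send(self, now, message, destination, is_group_forward):
        """Queue a message; group forwards get the speedbump, direct sends don't."""
        delay = FORWARD_DELAY_SECONDS if is_group_forward else 0
        heapq.heappush(self._heap, (now + delay, message, destination))

    def deliverable(self, now):
        """Pop and return every (message, destination) whose delay has elapsed."""
        out = []
        while self._heap and self._heap[0][0] <= now:
            out.append(heapq.heappop(self._heap)[1:])
        return out
```

The point of the design is that urgent one-to-one messages keep their immediacy, while bulk group-to-group replication, the "viral" path, is the only thing slowed down.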
But they didn't exist until recently. This is a problem that they amplified by being a product well designed by some metric that didn't take into account the problems that are appearing now. Maybe they should start researching metrics that measure social impact across cultures and people from different backgrounds.
But I agree with you, it is complex.
I don't know about that one. Everything in that article existed before, it was simply on a smaller scale because it was more difficult to communicate.
How do you create a metric for "social impact"? How do you distinguish positive and negative social impact? How do you do that without confounding factors when WhatsApp can't read any of the messages?
If the answer is "well, they have money and smart people, go do it", that's not really an answer. Money and intelligence can't make the subjective objective.
When I said they, I was talking about Whatsapp. They are recent in terms of human history. What I was trying to say is that yes they are
However they gave that ability, collected the benefits, but didn't take into account all the other "abilities" that they gave to people.
You cannot give an "ability" to people in San Francisco and expect the same "ability" to have similar effects in India.
The point is that it's not cut and dry like that. Who says they didn't take into account what people would do with it? What if they took both the benefits, and the cons to society into consideration?
It is not an obvious decision that they should've never launched in India, or should cut back on what they offer people. The overall +/- is not obvious at all. The opposite of WhatsApp is WeChat, where the company in ownership can moderate what goes on.
My opinion remains as: They have the responsibility on this issue.
Anyways, I have to go. Thanks for the chat.
I suppose my take would be: they have responsibility, and depending on the circumstances, their responsibility may be to do nothing. They are responsible, but the responsible thing isn't necessarily to limit group sizes, etc. and certainly not to remove E2E encryption.
Here in India it is nearly impossible to stop using WhatsApp. Schools and workplaces use WhatsApp groups as their default information-sharing platform. Misinformation through WhatsApp has been a widely discussed topic for a while in India after incidents like the WhatsApp lynchings. But eventually most people, especially young adults, internalized the idea of not believing anything that gets shared on WhatsApp. There is even a meme called "WhatsApp University".
I don't think the Indian government has taken any action to control the misinformation through WhatsApp, as the article suggests. If they are talking about arrests made in pedophilia groups, then yes, but that doesn't count as misinformation, does it? Also, Modi supporters thrive on WhatsApp.
For e.g., when the country was in lockdown for about a month, Modi, as an appreciation of the hard work by health workers, told the country to make sounds by clapping hands or banging utensils. A lot of people criticized him for not actually caring about health workers or migrant workers, who at that time were stuck in different states. At the same time, there has been an incredible amount of conspiracies running around WhatsApp, like claims that making sounds will destroy the coronavirus, with even celebrities supporting these claims, which led to some people gathering in places to make sounds. This destroyed any achievement of the month-long lockdown.
I'd like to correct your comment, which propagates misinformation.
Modi announced the Janta Curfew on 22 March 2020 and requested people to clap and bang on metal plates at 5pm while remaining inside. It lasted for 14 hours only. The lockdown was announced three days after that.
I don't understand where you got the "about one month" figure from?
Edit Source
Oh, I mixed it up with the lights-off thing, sorry. I am from Kerala; we started lockdown two weeks before the whole country, which is why I said one month.
I crossed out that section. I will edit it with more information when I have time, thank you :)
Oh, didn't know that. It's fine. :D
Also, I was talking about the confusion he created by urging people to do the clapping and light candles. His supporters were propagating outright conspiracies about his actions through WhatsApp. He didn't insist on staying inside the second time, during the candle-lighting fiasco, and there were multiple incidents of people gathering for fireworks at that time as well.
I replied again to clear some confusion I may have caused. Yes, I was wrong on some facts, but it was not deliberate, and the things I said are not completely wrong. Also, conspiracies through WhatsApp and other social media are one of the major concerns for the democratic process in India, and Modi is fueling them.