I find myself very disappointed by the tild.es comments here.
Let's start with this: I argue that all unverified content on Pornhub falls into one of the following three categories:
Pirated content, i.e. professionally produced porn uploaded by a party other than the copyright holder.
Illegitimate amateur content, i.e. non-professionally produced porn featuring individuals who were underage, did not consent, etc.
Legitimate amateur content, i.e. non-professionally produced porn where all participants legally could and did consent to all the acts recorded, the recording thereof, and their publication online.
The argument that opponents of this action are presenting, then, is that the vast majority of unverified content falls into the third bucket, to such an extent that the impact of the second bucket, and of the first (to the extent that piracy of pornography is harmful to its participants, a multi-level discussion I would rather not have here; let's stick to the axiom that the harm is greater than zero), is negligible in comparison.
(My personal observation is that almost all unverified pornographic content, on Pornhub or elsewhere, is in the first bucket.)
I don't see any evidence presented that this is the case, but I do see compelling evidence—linked through TFA—that there are significant numbers of videos in the second bucket, and of course it is obvious at a glance that there are massive quantities in the first.
Also, while the harm from the distribution of child, rape, and revenge porn is clear and well-documented, the benefit to society of making legit amateur porn easily available over the Internet has gone unstated so far. I'm unlikely to be convinced it's valuable enough to override the harm, even if it does, as assumed, make up the vast majority; these are not major intellectual works, nor do the vast majority of them carry significant artistic value. The major plausible benefit I know of is displacing harm incurred by the professional porn industry, which, while definitely non-zero, is also a mixed bag (it's a significant employer, and while acknowledging the harm it does, I'm not interested in excluding the experiences of sex workers who claim to find their work fulfilling or empowering).
The use of the "but what about 'but what about the children'" argument I find especially disappointing, because I think the use of "but what about the children" as an excuse for intrusion into free expression and privacy is utterly abhorrent; but its counterarguments are only weakened when, as here, they are used in situations where in fact it is documented that children are being harmed.
I don't think anyone is saying get rid of amateur porn in favour of professional porn-- from what I can see PH left their verified amateur videos intact -- which is what should be the standard for amateur content.
I'm going to apologize for the bad analogy in advance, but it's no different than selling chili. If you want to give some chili to your neighbours, that's fine. But as soon as you want to sell it or run a soup kitchen, you need to comply with whatever regulations there are about chili as a product, which presumably prevent you from selling or giving away 'beef' chili that's actually got mouse meat in it or whatever.
I guess I should have been clearer: I'm only talking about unverified content, since (as far as I can see reported) Pornhub hasn't removed any verified content. I'm contesting the implicit assertion that unverified legit amateur porn provides a benefit that offsets the harm from other unverified content.
Verified amateur porn has been confirmed (theoretically—we are assuming that Pornhub's verification procedures are reliable) to not be causing harm, so it has no need to justify its presence. And indeed, I agree wholeheartedly that verification of some sort is a minimum standard to which all porn should be held.
Hopefully they will update their policy, but their verification process was submitting a picture of "yourself" holding up a piece of paper with pornhub.com written on it. It does nothing about age, basically nothing about coercion, and I believe it only covered the account holder (i.e. you didn't have to get everyone who was in your videos to hold up a piece of paper).
My mistake, my tendency to read at pace missed the double negative in "but what about 'but what about the children'" and it caused me to frame your comment differently.
but its counterarguments are only weakened when, as here, they are used in situations where in fact it is documented that children are being harmed.
Agreed in full as well.
(My personal observation is that almost all unverified pornographic content, on Pornhub or elsewhere, is in the first bucket.)
[i.e., copyrighted content uploaded illegally]
I would say that this is observer bias. While I would concur that most of the unverified content appearing on the first pages of “Top” searches on adult entertainment sites indeed does fall into this category, there are literally millions of amateur videos with fewer than a hundred views that are perfectly legal. As for the argument that such content is not of “significant artistic value” or “not… major intellectual works,” this argument can just as well be applied to Facebook, Twitter, Instagram, YouTube, Reddit,… to justify the removal of roughly 90% to 99% of their content. I agree that the content is not objectively extremely valuable, but not with removing it for that reason.
but I do see compelling evidence… that there are significant numbers of videos in the second bucket
[i.e., illegal porn]
less than 1‰ of the total for the stuff featuring minors, hardly more than 1% if involuntary stuff is added. On serious adult entertainment sites like PornHub, such content (as well as pirated content) gets constantly reported and removed by both the community and third parties; the problem is that it has so far been too easy to re-upload it.
The NYT article linked from the topic article and from one of the comments below blames PornHub for allowing users to directly download videos, unlike YouTube. Given the recent scandals around youtube-dl, how long would it take for pornhub-dl to appear should PornHub disable downloading?
As for the argument that such content is not of “significant artistic value” or “not… major intellectual works,” this argument can be as well applied to Facebook, Twitter, Instagram, YouTube, Reddit,… to justify the removal of more or less 90% to 99% of their content
I'm not stating that amateur porn lacks artistic or intellectual value as a reason for removing it. I'm stating that artistic or intellectual value cannot be an argument for keeping it, because it lacks such value.
less than 1‰ of the total for the stuff featuring minors, hardly more than 1% if involuntary stuff is added.
One percent of ten million is ten thousand videos. Is that not a "significant number"?
Also, I'd appreciate a citation for that number. I strongly suspect those are lower bounds, and the vast majority of the 10M unverified videos have never been meaningfully evaluated.
Given the recent scandals around youtube-dl, how long would it take for pornhub-dl to appear should PornHub disable downloading?
youtube-dl supports Pornhub.
Requiring the user to download and use a commandline tool is a much higher bar than having a "download" button right there on the page, though; and regardless, the fact that Kristof was wrong on that point does nothing to discredit the rest of his article.
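For illustration only, here is a minimal sketch of how little friction the command-line route actually involves, using youtube-dl's Python API; the URL below is a placeholder rather than any real video, and the options shown are just the basics:

```python
# Minimal sketch: downloading from any site youtube-dl has an extractor for.
# Assumes `pip install youtube-dl`; the URL is a placeholder.
import youtube_dl

options = {
    "format": "best",                 # single best pre-merged format
    "outtmpl": "%(title)s.%(ext)s",   # name the file after the video title
}

with youtube_dl.YoutubeDL(options) as ydl:
    ydl.download(["https://www.example.com/watch?v=PLACEHOLDER"])
```

The same few lines work for every supported site, so removing an on-page download button only deters the least determined users.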
One percent of ten million is ten thousand videos. Is that not a "significant number"?
Here we are specifically dealing with PornHub videos. The desktop front page of PornHub has some 50 thumbnails, and I daresay any unverified video that would appear there would be from the pirate category (and many adult content producers tacitly tolerate some degree of infringement, as it brings new paying customers in the long run). Thus, while there may have been a myriad of problematic videos on PornHub, a user had to know what to type in the search bar to find them.
Also, I'd appreciate a citation for that number. I strongly suspect those are lower bounds, and the vast majority of the 10M unverified videos have never been meaningfully evaluated.
Both the article linked to in the topic and the NYT article linked from there and from one of the comments here claim that PornHub had “thousands” of videos featuring underage persons. I think that if there had been tens of thousands, they would have definitely said so.
People are just mad that certain content creator favorites will just drop away because being “verified” is the last thing they want (they wish to remain supremely anonymous). I get it, it’s frustrating, but I guess it is what it is to stop bigger evils.
Summary: in order to remove alleged “thousands” of problematic videos, PornHub has been forced to remove 10 million of legal ones.
Major Problem 1: Moderation. It is well-known that perfect moderation at scale is not feasible economically. At some point it becomes inevitable that some content (less than 1‰ in this case) on a platform violates rules—whether the local law or just the platform's rules. There are very few options that are better than Notice-and-Take-Down (e.g. ContentID to prevent the same offending video from being re-uploaded), but those options are still very imperfect and often extremely expensive.
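To make the re-upload problem concrete, here is a toy sketch (my own illustration, not anything PornHub or YouTube actually runs) of Notice-and-Take-Down plus a fingerprint check on new uploads; a real ContentID-style system would use perceptual fingerprints that survive re-encoding, which the byte-level hash below does not:

```python
# Toy illustration of Notice-and-Take-Down with re-upload blocking.
# A byte-level hash is defeated by any re-encode or trim, which is exactly
# why real systems need perceptual fingerprinting (and why it is expensive).
import hashlib

takedown_fingerprints = set()  # fingerprints of content removed after a notice

def fingerprint(video_bytes: bytes) -> str:
    """Stand-in for a content fingerprint (real systems use perceptual hashes)."""
    return hashlib.sha256(video_bytes).hexdigest()

def process_takedown_notice(video_bytes: bytes) -> None:
    """Remove the video and remember its fingerprint."""
    takedown_fingerprints.add(fingerprint(video_bytes))

def upload_allowed(video_bytes: bytes) -> bool:
    """Reject uploads that exactly match previously removed content."""
    return fingerprint(video_bytes) not in takedown_fingerprints

# Example: an exact re-upload is caught, a one-byte change slips through.
original = b"fake video payload"
process_takedown_notice(original)
print(upload_allowed(original))            # False
print(upload_allowed(original + b"\x00"))  # True
```

That last line is the whole problem in miniature: trivial transformations evade exact matching, and matching robustly at this scale is the expensive part.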
Major Problem 2: Market Power Abuse. There are several services essential to the Internet, such as cloud services, security certificates, payment processing systems, etc., that are provided by oligopolies or even monopolies (like DNS). I believe such services should be treated like other essential services such as water and electricity. In particular, claims that such services “support illegal activities” are as nonsensical as similar claims with respect to water and electricity suppliers.
Minor Problem: “child porn” and “sex trafficking” are used as leverage against adult content in general. The article cites that
[PornHub] went on to claim its two biggest critics, the National Center on Sexual Exploitation and TraffickingHub, have right-wing ties and are actually focused on abolishing pornography and commercial sex work.
I think it's important that we acknowledge the problem that unverified porn is a huge vehicle for exploitation. Not only for underage and sex trafficking scenarios but also for the much more common revenge porn and other content uploaded without consent. Pretending this criticism is less valid because some of it is being championed by groups acting in bad faith doesn't help anyone -- it's a problem that needs to be addressed. The fact that it took the NYT and strong-arming by CC companies is shameful on the part of Pornhub IMO.
I had a friend who had some nudes of hers uploaded to PH without her permission, and she just wasn't able to get them taken down at all. PH didn't respond to any of her requests.
Why should unverified porn be allowed to be uploaded and distributed?
Slippery slope: why should any unverified content be allowed? What is the real difference between an involuntary nude photo and an involuntary non-nude photo, or a photo of someone's pet without the owner's approval, or a photo of someone's house without the owner's approval (viz. Google Street View in Germany), etc.?
I definitely agree that if there is a valid reason for any content—SFW or NSFW—to be taken down, then it should be taken down. However, in this particular case we are dealing with AuthRight groups using the existence of such content (not the content itself) as leverage against adult entertainment in general.
What is the real difference between an involuntary nude photo and an involuntary non-nude photo
It's really hard to take this question seriously. An involuntary photo of someone walking down the street (or their pet or house) is a lot less likely to cause the person to be stalked, harassed, lose their job, affect their mental state, affect their standing in their community, etc. There's also the legal issue in many locales that one has a legal expectation of privacy in places where they usually disrobe and don't when walking in public.
I'm perhaps approaching this from the opposite perspective than MetArtScroll, but I actually do think the question is reasonable. I am not suggesting that society should not be concerned with consent for videos involving sex or nudity, but rather that it seems confusing and dismaying that society should be so concerned with consent only for such videos, while seeming to move in the opposite direction for everything else.
In fact, society in the last ten years has seemed to move more and more toward celebrating a sort of citizen-panopticon of non-consensual and often confrontational media being posted publicly, often in hurtful ways. If one reads through r/all, for example, it's clear that significant portions of Reddit are devoted to this, including many of the subreddits in a top-100 list by activity, with places like PublicFreakout, or iamatotalpieceofshit, or insanepeoplefacebook, or JusticeServed, to say nothing of the smaller groups that often seem to involve racist and sexist elements. There are also many groups that appear dedicated to videos less aimed at confrontation than embarrassment, like holdmycosmo.
Many of these do involve matters that have exactly the sorts of effects you mention. They often display people in traumatic, and often violent, moments. They often present interactions that were never intended to be public, and often do so in ways intended to be hurtful, often including misrepresentations or simply completely made up stories that, were they coming from traditional media, would be outright defamatory and illegal. "Witch-hunting" is nominally discouraged to avoid the sort of repercussions you mention, but as harassment often seems to be the point, it's extremely frequent to see such posts causing stalking, harassment, and repercussions for employment, reputation, and mental health, and so on. There seems to be no real attempt to stop this: it seems more often to be celebrated. In some cases, these are from private spaces, but even for public spaces, they create a situation where simply going outside creates a risk of involuntary worldwide publicity and harassment. Even for non-confrontational videos, people who are the involuntary subjects of viral videos often end up suffering repercussions.
I am not saying that there is no place for non-consensual videos outside of a sexual context. They can serve important purposes, particularly in exposing the actions of repressive governments, or the abuses of police in countries with profound flaws in law enforcement, like the USA. But I do think there needs to be a balance here, and it should not simply be whether or not sex or nudity was involved. It should also be recognized that they are, most often, non-consensual.
Put shortly: it is perverse that posting a secretly-taken video of someone being brutally beaten unconscious, accompanied with a defamatory story about how they deserved it, can get your post to the front page of one of the most popular sites on the internet, so long as they are wearing clothes. It is perverse that posting a photo of someone without clothes without their consent is seen as something that should be illegal, yet posting a secret recording of them without their consent revealing that they are gay, or an atheist, or that they have had an abortion, all things that could quite easily have much more damaging effects, is seen as completely legal, or, if illegal, results in laws being entirely ignored.
I am not suggesting that society should not be concerned with consent for videos involving sex or nudity, but rather that it seems confusing and dismaying that society should be so concerned with consent only for such videos, while seeming to move in the opposite direction for everything else.
That sounds like a straw man and is not at all what I said. The question I was responding to was:
What is the real difference between an involuntary nude photo and an involuntary non-nude photo, or a photo of someone's pet without the owner's approval, or a photo of someone's house without the owner's approval
That does not immediately sound to me like:
They often display people in traumatic, and often violent, moments. They often present interactions that were never intended to be public, and often do so in ways intended to be hurtful, often including misrepresentations or simply completely made up stories that, were they coming from traditional media, would be outright defamatory and illegal.
Do you see the difference between what they posted (and what I was reacting to) and what you posted? The first one makes no mention of violence or trauma and suggests the billion other emotions that people have on a day-to-day basis and that they generally wouldn't be upset about others seeing.
Your point is valid, but is completely unrelated to the point I was making, which is that there is a difference between everyday activities and sexual activities (and to many people anything showing nudity is sexual, even if you and I don't feel that way). Likewise there is a difference between everyday activities and any other private activity such as traumatic events, violent reactions, and other stuff.
I agree that involuntary content (not just porn) is a problem. However, I am not sure that 10 million items should be removed for the sake of “thousands,” or, if we include all illegal content, tens of thousands of violations.
The fact that it took… strong-arming by CC companies
I stand behind my argument that credit card companies are essential services like water and electricity. If a platform willingly supports illegal content (rather than merely failing to delete it “sufficiently” speedily), then the platform should be sued in court (willing support = Section 230 does not apply) rather than strong-armed by essential service providers.
Minor Problem: “child porn” and “sex trafficking” are used as leverage against adult content in general.
Yeah, that tweet from LailaMickelwait paints a very different picture of the situation. Shouldn't this be straight up disinformation?
@LailaMickelwait: Over 10 million videos infested with rape and trafficking are now gone. This is a good day for victims who have been begging Pornhub to remove their abuse. #Traffickinghub
Most tweets over there seem pretty wild.
Right. So in other words, Pornhub removed all unverified videos (with a great majority of legitimate, but "untrusted" content), and this person's headline is "10 million videos infested with rape and trafficking are now gone".
I don't know about you, but when someone twists numbers and facts like this, I lose trust in everything else they're saying and start questioning their motive.
I don't think she's implying that all or even a significant amount of the 10 million videos involved rape or trafficking. She says those 10 million were "infested" by videos of that type, and as far as I can see it, she's right. If PornHub could have identified which ones were violations, they likely would have utilized a more selective removal process. I think she's pointing out that a significant quantity of legal adult content was fully commingled with rape and trafficking content. In fact, PornHub's blanket removal is a sort of tacit admission that the problem was big enough that it was worth nuking 75% of their content.
"When unverified accounts upload videos, they're bringing rape, child porn … and some, I assume, are legitimate videos." Sorry; I couldn't let the comparison go. You see how it's the same...
"When unverified accounts upload videos, they're bringing rape, child porn … and some, I assume, are legitimate videos."
Sorry; I couldn't let the comparison go. You see how it's the same playbook, right? Obviously the subject is not comparable 1:1 but the use of this tactic makes me doubt their motive right off the bat.
Far too many people out there have an agenda against (legitimate) sex workers… Here in Belgium for example, Brussels used the pandemic as a pretext to ban prostitution unilaterally. This was overturned as illegal later on.
The comparison is certainly resonant, but while I believe that Trump was overstating harms, I believe that conversations about sexual exploitation online tend to understate harms.
There are undoubtedly a lot of genuine, legitimate sex workers out there, and I firmly believe they should be able to operate with dignity and autonomy. On the other hand, there is also a considerable amount of genuine exploitation out there as well. There’s even some overlap between the two. I have a family member who used to work with women who had left sex work, and many of them would be seen as “legitimate” workers now because they are adults, despite the fact that they were coerced/forced into it initially, often in their childhoods. If you want to read more about this, you can see one of my kfwyre-length comments about it here.
PornHub hasn't just been ambivalent to these distinctions in a sort of neutral, "dumb-pipe" sort of way (not that even that would be okay, at least not in my book). The Girls Do Porn lawsuit is probably the most famous indicator of how the company has handled content of this type:
Girls Do Porn was a sex trafficking operation that forced and coerced dozens of women as young as 18 into sex on camera, and lied to them about where and how the videos would be distributed. The women were told by everyone involved, from cast and crew to the owner, that the videos would not appear online. After filming, their videos were uploaded to Girls Do Porn's own site, as well as Pornhub, where the Girls Do Porn monetized its videos as a Pornhub "content partner." Pornhub also promoted Girls Do Porn as a content partner even after women in Girls Do Porn videos came forward about abuse and sued it.
The complaint claims that as early as 2009, "and definitely by fall 2016," Mindgeek knew Girls Do Porn was coercing and intimidating models into having sex on camera. It also places much of the blame for the victims' harm on Mindgeek, presenting several claims from models themselves that Pornhub failed to take down videos, even when they reported the videos to Pornhub and pleaded with the company to remove them.
After a 15-year-old girl went missing in Florida, her mother found her on Pornhub — in 58 sex videos. Sexual assaults on a 14-year-old California girl were posted on Pornhub and were reported to the authorities not by the company but by a classmate who saw the videos. In each case, offenders were arrested for the assaults, but Pornhub escaped responsibility for sharing the videos and profiting from them.
It's precisely their failure to handle content like this that I believe led them to go scorched earth in the first place. The internet will now get mad at all of the genuine content lost and blame it on the people making the complaints, when the fact of the matter is that if PornHub had better vetted and handled its content in the first place then it wouldn't have to resort to effectively pulling the plug on everything. I don't see this outcome as the gross overreach of sex-negative people -- I see this as PornHub flouting its responsibility for safety and accountability in an industry that is rife with exploitation, only to be forced to take extreme measures when their lack of scruples resulted in a direct threat to their revenue stream.
I don't know if it's like this in other countries, but in the US teachers are "mandated reporters". It means that I am legally required to report any suspected instance of child abuse. I am compelled to do this not if I am certain that abuse is occurring, but if I even think that it might be. It's a mandate I take very seriously.
Correspondingly, this line from the previously-linked Times article tells me everything I need to know about where PornHub stands on this same issue:
this year Pornhub began voluntarily reporting illegal material to the National Center for Missing and Exploited Children
This year. 2020. They've been in operation since 2007 and only now are they starting to report on the children that end up on their site -- children whose abuse is directly documented on video and which they share with the public at large on their platform.
Yes, it's true that in one fell swoop PornHub took down a large number of videos that were not harmful and not exploitative, and they would love for you to look at that and see it as a great loss; an injustice; harm done to innocents. I think it's more important that we examine the reason why they had to take this action in the first place, and I believe it to be because of what they didn't do with the videos that were genuinely harmful and exploitative.
If we're using Trump quotes to summarize the situation, I think a more apt one would be this: "You know, it really doesn’t matter what they upload as long as it's got a young and beautiful piece of ass."
Typing that sentence genuinely makes me shudder. But PornHub? It makes them income.
Sexual assaults on a 14-year-old California girl were posted on Pornhub and were reported to the authorities not by the company but by a classmate who saw the videos.
This reminds me of an unexplained tendency in the NY Times article that unsettled me. Throughout, a common feature in many of the stories involving schools was the way that others at the school quickly saw the materials. In at least one instance, this is described as almost instantaneous, with the person going in the next day to a school where everyone had seen the footage on PornHub.
There's never an acknowledgement of how this dissemination process worked, and there seems to be the implication that the material became widespread in the school community merely through availability on the site. But this makes no sense. It would suggest either that PornHub operates in a very different way than I would expect, with profoundly creepy data collection for both videos and viewers, that child abuse videos are front-page material there, or that a large number of school students are constantly searching for their classmates.
The far more plausible scenario is that the uploaders themselves are connected with the advertising of the videos to their classmates, as a way of further hurting the person involved. That, however, changes PornHub's role from causing the problem to simply aiding a larger problem of school communities behaving maliciously. The problem would seem to be only somewhat hindered by this type of ban: similar behaviour would be possible through the students sharing the videos privately but widely amongst their classmates. And in many of the stories, the effect on classmate behaviour at school seems to be one of the largest hurtful effects: it would thus seem that this culture at schools of sharing such videos is perhaps a larger problem looming behind the problem of PornHub's behaviour?
This also explains some of the aspects regarding reports: setting aside the question of whether PornHub should scrutinize videos before they are posted, it would seem quite unsurprising, for example, that a video would be reported to authorities by a classmate first, if the uploaders had immediately advertised the video to classmates.
Yeah, there is definitely more to the problem than just PornHub. It's sadly common in this day and age for images of any sort to get shared without consent and with the express purpose of harming the individual(s) depicted. Your other comment here is a great exploration of that, and of the different lights in which sexual and non-sexual content are viewed.
I do think it important that we not use the existence of the larger problem as an exoneration of PornHub however. They were effectively cashing in on the popularity and traffic this problem generates, rather than attempting to counter or mitigate it. I feel similarly about reddit and how much of its frontpage content is now shallow outrage-bait, for example.
Yeah, it wasn't clear in my original assertions, but I'm not defending pornhub in the slightest. They're gross, and most porn sites tend to be by nature; pornhub is actually one of the more professional ones, with a serious company behind it, so they have a reputation to defend. So this stuff affects them, where a smaller site in the worst case would shut down and rebrand or whatever.
What I am always uncomfortable with is the number of anti-sex activists who will jump on any opportunity to fuck with sex workers. And it very much seems that the person in question above is in that category.
My doubts are real as to whether this person gives a shit about children and/or victims of sex trafficking. Life is easier as a political activist when you're in the "blameless zone": the subjects which shield you from criticism by nature, allowing you to serve your loosely related agenda under pretext. You're most familiar with that technique when it's "for the children" or "against terrorism".
I'm cynical sometimes. Do you truly think it's unwarranted here?
I absolutely get where you’re coming from and have had my share of experience with the types of people you describe. I just don’t see that at all in her tweet, which I feel was fair. The disinformation and bad faith that is obvious to you doesn’t ping for me at all.
I will qualify that I literally know nothing about her beyond that one tweet and her brief mention in the Times article, so perhaps my lack of knowledge of her is influencing my judgment.
I don't think anyone's going to settle this via armchair arguments about what's plausible, what's not, or what arguments have been seen before. It's just disagreeing about priors without looking at the data for this case. (Which we can't, at this point, and even if we could, I wouldn't be looking at it myself either.)
That tweet is indeed straight up disinformation, and this tactic is extremely common from those groups.
N.B. On Reddit, those groups regularly suggest banning all nudity because Reddit does not have age verification for such content, i.e., since it is not possible to verify that an NSFW-nudity post does not contain “child porn,” they claim all NSFW-nudity posts are bad.
One way to make moderation easier is to have a lot less content. Assuming the credit card companies relent, it seems like PornHub would do okay, since I would guess that they still have plenty of porn.
I am wondering who really loses in that case, when the variety of porn goes down. It’s still a lot more variety than was available before the Internet?
If the credit card companies don’t give in, does this help alternative porn sites or cryptocurrency more? Would PornHub have trouble converting large amounts of cryptocurrency to cash or would a company like CoinBase be happy to help?
It’s still a lot more variety than was available before the Internet?
Indeed, let's compare everything with 1491. EVERYTHING would look better today.
Edit: “before the Internet” is not the best benchmark. I do not agree with giving up something the world has achieved today while being consoled that the outcome is still better than what we had a long while ago.
I am wondering who really loses in that case, when the variety of porn goes down.
IMHO, if the variety of anything legal goes down (and the vast majority of the unverified PornHub content was perfectly legal, i.e. voluntary and not involving minors), then everyone loses. Edit: strictly speaking, no one gains and there are some who lose.
and the vast majority of the unverified PornHub content was perfectly legal, i.e. voluntary and not involving minors
I mean this is part of the problem -- that claim is not verifiable whatsoever.
More importantly though, let's say this is true and only 0.05% of videos are illegal. How is increasing the variety of pornographic material morally justifiable when it comes at the expense of the continued exploitation of people?
The way that I think of it is that there are diminishing returns to variety, but it's not that clear when it sets in. This isn't something you can deduce from armchair reasoning.
YouTube has billions of videos. How many videos would you ever watch? On the other hand, if you are interested in something specific, like the recordings of a specific song, you may find that they don't have it if the song is sufficiently obscure. The value of a larger selection isn't that you can consume more content, but that you can find things that are increasingly specific and obscure.
Since I don't ahem keep up with porn, I'm not one to judge. I don't know what level of diversity is important to people. I would guess that variety matters less than, say, videos as a whole, but that's just a guess.
But you seem to think that everyone loses if any video is lost, and I think that's an exaggeration? I can't really take it seriously.
The value of a larger selection isn't that you can consume more content, but that you can find things that are increasingly specific and obscure.
This is exactly my point, which applies to SFW and NSFW alike.
But you seem to think that everyone loses if any video is lost, and I think that's an exaggeration? I can't really take it seriously.
Right, I was not strict enough. If any legal [important!—this word cannot be omitted] video is lost, then no one gains and there are some who lose so the overall outcome is definitely negative.
Yeah, that's better but I'm still not going to agree that deleting legal content is always harmful. In 2020, aren't we beyond that now? I think some is spam, some useless, some actively misleading. At the very least, it takes up space and clutters search results. It seems likely that even a porn website needs to think about what's worth keeping?
Deleting content, i.e., making it unavailable in a particular place, is not the same as content getting lost, i.e., becoming universally unavailable forever, and it is the latter that, IMHO, has been implied in the discussion above.
I think some is spam, some useless, some actively misleading.
What is spam and/or useless in one place can be relevant in another place, and actively misleading content should IMHO be preserved in dedicated places so that people learn to deal with such stuff.
At the very least, it takes up space and clutters search results.
Space is not a problem now in 2020 (unless it is a zettabyte video of white noise), and search algorithms are getting increasingly improved to demote trash to below the 10th page of results.
Space is a problem for video. Very few organizations can host videos for free. Even Google is slowly moving away from free unlimited uploads for everyone.
You also aren't thinking of moderation costs. If you want to actually do a good job of not hosting illegal content, someone is going to have to review it all, or you need really good machine learning, beyond current state of the art. Either that or everyone looks the other way, which only works for so long.
But for a while the costs weren't thought to be as high as they're known to be now, which is how a lot of Internet companies were able to become huge.
In a room sit three great men, a king, a priest, and a rich man with his gold. Between them stands a sellsword, a little man of common birth and no great mind. Each of the great ones bids him slay the other two. ‘Do it,’ says the king, ‘for I am your lawful ruler.’ ‘Do it,’ says the priest, ‘for I command you in the names of the gods.’ ‘Do it,’ says the rich man, ‘and all this gold shall be yours.’ So tell me—who lives and who dies?
Nope. It's that power is based on what people believe. (In this case, it's based on what the sellsword believes.) You can see a longer quote from the book here: …
I find myself very disappointed by the tild.es comments here.
Let's start with this: I argue that all unverified content on Pornhub falls into one of the following three categories:
The argument opponents to this action are presenting, then, is that the vast majority of unverified content falls into the latter bucket, to such an extent that the impact of the second and (to the extent that piracy of pornography is harmful to its participants, which is a multi-level discussion I would like not to have here and stick to the axiom that the harm is greater than zero) first buckets are negligible in comparison.
(My personal observation is that almost all unverified pornographic content, on Pornhub or elsewhere, is in the first bucket.)
I don't see any evidence presented that this is the case, but I do see compelling evidence—linked through TFA—that there are significant numbers of videos in the second bucket, and of course it is obvious at a glance that there are massive quantities in the first.
Also, while the harm from distribution of child, rape and revenge porn is clear and well-documented, the benefit to society of making legit amateur porn easily available over the Internet has gone unstated so far. I'm unlikely to be convinced it's valuable enough to override the harm, even if it does, as assumed, make up the vast majority; these are not major intellectual works, nor do the vast majority of them carry significant artistic value. The major plausible benefit I know of is displacing harm incurred by the professional porn industry, which while definitely non-zero, is definitely also a mixed bag (it's a significant employer, and while acknowledging the harm it does, I'm not interested in excluding the experiences of sex workers who claim to find their work fulfilling or empowering).
The use of the "but what about 'but what about the children'" argument I find especially disappointing, because I think the use of "but what about the children" as an excuse for intrusion into free expression and privacy is utterly abhorrent; but its counterarguments are only weakened when, as here, they are used in situations where in fact it is documented that children are being harmed.
I don't think anyone is saying get rid of amateur porn in favour of professional porn-- from what I can see PH left their verified amateur videos intact -- which is what should be the standard for amateur content.
I'm going to apologize for the bad analogy in advance but it's no different than selling chili. If you want to give some chili to your neighbours that's fine. But as soon as you want to sell it or run a soup kitchen you need to comply with whatever regulations there are about chili as a product, presumably which prevent you selling or giving away 'beef' chili that's actually got mouse meat in it or whatever.
I guess I should have been clearer: I'm only talking about unverified content, since (as far as I can see reported) Pornhub hasn't removed any verified content. I'm contesting the implicit assertion that unverified legit amateur porn provides a benefit that offsets the harm from other unverified content.
Verified amateur porn has been confirmed (theoretically—we are assuming that Pornhub's verification procedures are reliable) to not be causing harm, so it has no need to justify its presence. And indeed, I agree wholeheartedly that verification of some sort is a minimum standard to which all porn should be held.
Hopefully they will update their policy but their verification process was having a picture of "yourself" holding up pornhub.com written on a piece of paper. It does nothing about age, basically nothing about coercion and i believe only covered the account holder (ie you didnt have to get everyone who was in your videos to hold up a piece of paper).
My mistake, my tendency to read at pace missed the double negative in "but what about 'but what about the children'" and it caused me to frame your comment differently.
Agreed in full as well.
[i.e., copyrighted content uploaded illegally]
I would say that this is observer's bias. While I would concur that most of the unverified content appearing on the first pages of “Top” searches in adult entertainment sites indeed does fall into this category, there are literally millions of amateur videos with less than a hundred views that are perfectly legal. As for the argument that such content is not of “significant artistic value” or “not… major intellectual works,” this argument can be as well applied to Facebook, Twitter, Instagram, YouTube, Reddit,… to justify the removal of more or less 90% to 99% of their content—I agree that the content is not objectively extremely valuable but not with the removal of this content for this reason.
[i.e., illegal porn]
less than 1‰ of the total for the stuff featuring minors, hardly more than 1% if involuntary stuff is added. In serious adult entertainment sites like PornHub, such content (as well as pirated content) gets constantly reported and removed by both the community and third parties, and the problem is that it has been soo far too easy to re-upload it.
The NYT article linked from the topic article and form one of the comments below blames PornHub for allowing the users to directly download videos, unlike YouTube. Given the recent scandals around
youtube-dl
, how long would it take forpornhub-dl
to appear should PornHub disable downloading?I'm not stating that amateur porn lacks artistic or intellectual value as a reason for removing it. I'm stating that to argue that it has artistic or intellectual value is not an argument for keeping it (because it lacks such value).
One percent of ten million is ten thousand videos. Is that not a "significant number"?
Also, I'd appreciate a citation for that number. I strongly suspect those are lower bounds, and the vast majority of the 10M unverified videos have never been meaningfully evaluated.
youtube-dl
supports Pornhub.Requiring the user to download and use a commandline tool is a much higher bar than having a "download" button right there on the page, though; and regardless, the fact that Kristof was wrong on that point does nothing to discredit the rest of his article.
Here we are specifically dealing with PornHub videos. The desktop front page of PornHub has some 50 thumbnails, and I daresay any unverified video that would appear there would be from the pirate category (and many adult content producers tacitly tolerate some degree of infringement as this brings new paying customers in the long time). Thus, while there could have been some myriad of problematic videos on PornHub, it was necessary for a user to know what to type in the search bar.
Both the article linked to in the topic and the NYT article linked from there and from one of the comments here claim that PornHub had “thousands” of videos featuring underage persons. I think that if there had been tens of thousands, they would have definitely said so.
People are just mad that certain content creator favorites will just drop away because being “verified” is the last thing they want (they wish to remain supremely anonymous). I get it, it’s frustrating, but I guess it is what it is to stop bigger evils.
Summary: in order to remove alleged “thousands” of problematic videos, PornHub has been forced to remove 10 million of legal ones.
Major Problem 1: Moderation. It is well-known that perfect moderation at scale is not feasible economically. At some point it becomes inevitable that some content (less than 1‰ in this case) on a platform violates rules—whether the local law or just the platform's rules. There are very few options that are better than Notice-and-Take-Down (e.g. ContentID to prevent the same offending video from being re-uploaded), but those options are still very imperfect and often extremely expensive.
Major Problem 2: Market Power Abuse. There are several services essential to the Internet, such as cloud services, security certificates, payment processing systems, etc., that are provided by oligopolies or even monopolies like DNS. I believe such services should be treated like other essential services such as water and electricity. In particular, claims that such service “support illegal activities” are as nonsensical as similar claims with respect to water and electricity suppliers.
Minor Problem: “child porn” and “sex trafficking” are used as a leverage against adult content in general. The article cites that
I think it's important that we acknowledge the problem that unverified porn is a huge vehicle for exploitation. Not only for underage and sex trafficking scenarios but also for the much more common revenge porn and other content uploaded without consent. Pretending this criticism is less valid because some of it is being championed by groups acting in bad faith doesn't help anyone -- it's a problem that needs to be addressed. The fact that it took the NYT and strong-arming by CC companies is shameful on the part of Pornhub IMO.
I had a friend who had some nudes of hers uploaded to PH without her permission, she just wasn't able to get it taken down at all. PH didn't respond to any of her requests.
Slippery slope: why should any unverified content be allowed? What is the real difference between an involuntary nude photo and an involuntary non-nude photo, or a photo of someone's pet without the owner's approval, or a photo of someone's house without the owner's approval (viz. Google Street View in Germany), etc.?
I definitely agree that if there is a valid reason for any content—SFW or NSFW—to be taken down, then it should be taken down. However, it this particular case we deal with AuthRight groups using the existence of such content (not the content itself) as a leverage against adult entertainment in general.
It's really hard to take this question seriously. An involuntary photo of someone walking down the street (or their pet or house) is a lot less likely to cause the person to be stalked, harassed, lose their job, affect their mental state, affect their standing in their community, etc. There's also the legal issue in many locales that one has a legal expectation of privacy in places where they usually disrobe and don't when walking in public.
I'm perhaps approaching this from the opposite perspective than MetArtScroll, but I actually do think the question is reasonable. I am not suggesting that society should not be concerned with consent for videos involving sex or nudity, but rather that it seems confusing and dismaying that society should be so concerned with consent only for such videos, while seeming to move in the opposite direction for everything else.
In fact, society in the last ten years has seemed to move more and more toward celebrating a sort of citizen-panopticon of non-consensual and often confrontational media being posted publicly, often in hurtful ways. If one reads through r/all, for example, it's clear that significant portions of Reddit are devoted to this, including many the subreddits in a top-100 list by activity, with places like PublicFreakout, or iamatotalpieceofshit, or insanepeoplefacebook, or JusticeServed, to say nothing of the smaller groups that often seem to involve racist and sexist elements. There are also many groups that appear dedicated to videos less aimed at confrontation than embarrassment, like holdmycosmo.
Many of these do involve matters that have exactly the sorts of effects you mention. They often display people in traumatic, and often violent, moments. They often present interactions that were never intended to be public, and often do so in ways intended to be hurtful, often including misrepresentations or simply completely made up stories that, were they coming from traditional media, would be outright defamatory and illegal. "Witch-hunting" is nominally discouraged to avoid the sort of repercussions you mention, but as harassment often seems to be the point, it's extremely frequent to see such posts causing stalking, harassment, employment and reputation and mental repercussions, and so on. There seems to be no real attempt to stop this: it seems more often to be celebrated. In some cases, these are from private spaces, but even for public spaces, they create a situation where simply going outside creates a risk of involuntary worldwide publicity and harassment. Even for non-confrontational videos, people who are the involuntary subjects of viral videos often end up suffering repercussions.
I am not saying that there is no place for non-consensual videos outside of a sexual context. They can serve important purposes, particularly in exposing the actions of repressive governments, or the abuses of police in countries with profound flaws in law enforcement, like the USA. But I do think there needs to be a balance here, and it should not simply be whether or not sex or nudity was involved. It should also be recognized that they are, most often, non-consensual.
Put shortly: it is perverse that posting a secretly-taken video of someone being brutally beaten unconscious, accompanied with a defamatory story about how they deserved it, can get your post to the front page of one of the most popular sites on the internet, so long as they are wearing clothes. It is perverse that posting a photo of someone without clothes without their consent is seen as something that should be illegal, yet posting a secret recording of them without their consent revealing that they are gay, or an atheist, or that they have had an abortion, all things that could quite easily have much more damaging effects, is seen as completely legal, or, if illegal, results in laws being entirely ignored.
That sounds like a straw man and is not at all what I said. The question I was responding to was:
That does not immediately sound to me like:
Do you see the difference between what they posted (which is what I was reacting to) and what you posted? The first one makes no mention of violence or trauma; it suggests the billion other emotions that people have on a day-to-day basis and that they generally wouldn't be upset about others seeing.
Your point is valid, but it is completely unrelated to the point I was making, which is that there is a difference between everyday activities and sexual activities (and to many people anything showing nudity is sexual, even if you and I don't feel that way). Likewise, there is a difference between everyday activities and any other private matter such as traumatic events, violent reactions, and other stuff.
I agree that involuntary content (not just porn) is a problem. However, I am not sure that 10 million items should be removed for the sake of “thousands,” or, if we include all illegal content, tens of thousands of violations.
I stand behind my argument that credit card companies are essential services, like water and electricity. If a platform willingly supports illegal content (as opposed to merely failing to delete it “sufficiently” speedily), then the platform should be sued in court (willing support = Section 230 does not apply) rather than strong-armed by essential service providers.
Yeah, that tweet from LailaMickelwait paints a very different picture of the situation. Shouldn't this count as straight-up disinformation?
Most tweets over there seem pretty wild.
Right. So in other words, Pornhub removed all unverified videos (with a great majority of legitimate, but "untrusted" content), and this person's headline is "10 million videos infested with rape and trafficking are now gone".
I don't know about you, but when someone twists numbers and facts like this, I lose trust in everything else they're saying and start questioning their motive.
I don't think she's implying that all, or even a significant share, of the 10 million videos involved rape or trafficking. She says those 10 million were "infested" with videos of that type, and as far as I can see, she's right. If PornHub could have identified which ones were violations, they likely would have used a more selective removal process. I think she's pointing out that a significant quantity of legal adult content was fully commingled with rape and trafficking content. In fact, PornHub's blanket removal is a sort of tacit admission that the problem was big enough to be worth nuking 75% of their content over.
"When unverified accounts upload videos, they're bringing rape, child porn … and some, I assume, are legitimate videos."
Sorry; I couldn't let the comparison go. You see how it's the same playbook, right? Obviously the subject is not comparable 1:1 but the use of this tactic makes me doubt their motive right off the bat.
Far too many people out there have an agenda against (legitimate) sex workers… Here in Belgium, for example, Brussels used the pandemic as a pretext to ban prostitution unilaterally. That ban was later overturned as illegal.
The comparison is certainly resonant, but while I believe that Trump was overstating harms, I believe that conversations about sexual exploitation online tend to understate harms.
There are undoubtedly a lot of genuine, legitimate sex workers out there, and I firmly believe they should be able to operate with dignity and autonomy. On the other hand, there is also a considerable amount of genuine exploitation out there as well. There’s even some overlap between the two. I have a family member who used to work with women who had left sex work, and many of them would be seen as “legitimate” workers now because they are adults, despite the fact that they were coerced/forced into it initially, often in their childhoods. If you want to read more about this, you can see one of my kfwyre-length comments about it here.
PornHub hasn't just been indifferent to these distinctions in a neutral, "dumb-pipe" sort of way (not that even that would be okay, at least not in my book). The Girls Do Porn lawsuit is probably the most famous indicator of how the company has handled content of this type:
The New York Times article that prompted this recent shakeup in the first place is also eye-opening:
It's precisely their failure to handle content like this that I believe led them to go scorched earth in the first place. The internet will now get mad about all of the genuine content lost and blame it on the people making the complaints, when the fact of the matter is that if PornHub had better vetted and handled its content in the first place, it wouldn't have had to resort to effectively pulling the plug on everything. I don't see this outcome as the gross overreach of sex-negative people -- I see this as PornHub shirking its responsibility for safety and accountability in an industry that is rife with exploitation, only to be forced to take extreme measures when their lack of scruples resulted in a direct threat to their revenue stream.
I don't know if it's like this in other countries, but in the US teachers are "mandated reporters". It means that I am legally required to report any suspected instance of child abuse. I am compelled to do this not if I am certain that abuse is occurring, but if I even think that it might be. It's a mandate I take very seriously.
Correspondingly, this line from the previously-linked Times article tells me everything I need to know about where PornHub stands on this same issue:
This year. 2020. They've been in operation since 2007, and only now are they starting to report the children who end up on their site -- children whose abuse is directly documented on video and shared with the public at large on their platform.
Yes, it's true that with one fell swoop PornHub took down a large number of videos that were not harmful and not exploitative, and they would love for you to look at that and see it as a great loss; an injustice; harm done to innocents. I think it's more important that we examine the reason why they had to take this action in the first place, and I believe it to be because of what they didn't do with the videos that were genuinely harmful and exploitative.
If we're using Trump quotes to summarize the situation, I think a more apt one would be this: "You know, it really doesn’t matter what they upload as long as it's got a young and beautiful piece of ass."
Typing that sentence genuinely makes me shudder. But PornHub? It makes them income.
This reminds me of an unexplained tendency in the NY Times article that unsettled me. Throughout, a common feature in many of the stories involving schools was the way that others at the school quickly saw the materials. In at least one instance, this is described as almost instantaneous, with the person going in the next day to a school where everyone had seen the footage on PornHub.
There's never an acknowledgement of how this dissemination process worked, and there seems to be the implication that the material became widespread in the school community merely through availability on the site. But this makes no sense. It would suggest that PornHub operates in a very different way than I would expect (with profoundly creepy data collection on both videos and viewers), that child abuse videos are front-page material there, or that a large number of school students are constantly searching for their classmates.
The far more plausible scenario is that the uploaders themselves are connected with the advertising of the videos to their classmates, as a way of further hurting the person involved. That, however, changes PornHub's role from causing the problem to simply aiding a larger problem of school communities behaving maliciously. The problem would seem to be only somewhat hindered by this type of ban: similar behaviour would be possible through the students sharing the videos privately but widely amongst their classmates. And in many of the stories, the effect on classmate behaviour at school seems to be one of the largest hurtful effects: it would thus seem that this culture at schools of sharing such videos is perhaps a larger problem looming behind the problem of PornHub's behaviour?
This also explains some of the aspects regarding reports: setting aside the question of whether PornHub should scrutinize videos before they are posted, it would seem quite unsurprising, for example, that a video would be reported to authorities by a classmate first, if the uploaders had immediately advertised the video to classmates.
Yeah, there is definitely more to the problem than just PornHub. It's sadly common in this day and age for images of any sort to get shared without consent and with the express purpose of harming the individual(s) depicted. Your other comment here is a great exploration of that, and of the different lights in which sexual and non-sexual content are viewed.
I do think it important that we not use the existence of the larger problem as an exoneration of PornHub however. They were effectively cashing in on the popularity and traffic this problem generates, rather than attempting to counter or mitigate it. I feel similarly about reddit and how much of its frontpage content is now shallow outrage-bait, for example.
Yeah, it wasn't clear in my original assertions, but I'm not defending Pornhub in the slightest. They're gross, as most porn sites tend to be by nature; Pornhub is actually one of the more professional ones, with a serious company behind it, so they have a reputation to defend. So this stuff affects them, whereas a smaller site would, in the worst case, just shut down and rebrand or whatever.
What I am always uncomfortable with is the number of anti-sex activists who will jump on any opportunity to fuck with sex workers. And it very much seems that the person in question above is in that category.
My doubts are real as to whether this person gives a shit about children and/or victims of sex trafficking. Life is easier as a political activist when you operate in the "blameless zone": subjects that by their nature shield you from criticism, allowing you to serve a loosely related agenda under their pretext. You're most familiar with that technique when it's "for the children" or "against terrorism".
I'm cynical sometimes. Do you truly think it's unwarranted here?
I absolutely get where you’re coming from and have had my share of experience with the types of people you describe. I just don’t see that at all in her tweet, which I feel was fair. The disinformation and bad faith that is obvious to you doesn’t ping for me at all.
I will qualify that I literally know nothing about her beyond that one tweet and her brief mention in the Times article, so perhaps my lack of knowledge of her is influencing my judgment.
I don't think anyone's going to settle this via armchair arguments about what's plausible, what's not, or what arguments have been seen before. It's just disagreeing about priors without looking at the data for this case. (Which we can't, at this point, and even if we could, I wouldn't be looking at it myself either.)
That tweet is indeed straight-up disinformation, and this tactic is extremely common from those groups.
N.B. On Reddit, those groups regularly suggest banning all nudity because Reddit does not have age verification for such content, i.e., since it is not possible to verify that an NSFW-nudity post does not contain “child porn,” they claim that all NSFW-nudity posts are bad.
One way to make moderation easier is to have a lot less content. Assuming the credit card companies relent, it seems like PornHub would do okay, since I would guess that they still have plenty of porn.
I am wondering who really loses in that case, when the variety of porn goes down. It’s still a lot more variety than was available before the Internet?
If the credit card companies don’t give in, does this help alternative porn sites or cryptocurrency more? Would PornHub have trouble converting large amounts of cryptocurrency to cash or would a company like CoinBase be happy to help?
Indeed, let's compare everything with 1491. EVERYTHING would look better today.
Edit: “before the Internet” is not the best benchmark. I do not agree with giving up something the world has achieved today while being consoled that the outcome is still better than it was a long while ago.
IMHO, if the variety of anything legal goes down (and the vast majority of the unverified PornHub content was perfectly legal, i.e. voluntary and not involving minors), then everyone loses.
Edit: strictly speaking, no one gains and there are some who lose.
I mean this is part of the problem -- that claim is not verifiable whatsoever.
More importantly though, let's say this is true and only 0.05% of videos are illegal. How is increasing the variety of pornographic material morally justifiable when it comes at the expense of the continued exploitation of people?
The way that I think of it is that there are diminishing returns to variety, but it's not that clear when it sets in. This isn't something you can deduce from armchair reasoning.
YouTube has billions of videos. How many videos would you ever watch? On the other hand, if you are interested in something specific, like the recordings of a specific song, you may find that they don't have it if the song is sufficiently obscure. The value of a larger selection isn't that you can consume more content, but that you can find things that are increasingly specific and obscure.
Since I don't ahem keep up with porn, I'm not one to judge. I don't know what level of diversity is important to people. I would guess that variety matters less than, say, videos as a whole, but that's just a guess.
But you seem to think that everyone loses if any video is lost, and I think that's an exaggeration? I can't really take it seriously.
This is exactly my point, which applies to SFW and NSFW alike.
Right, I was not strict enough. If any legal [important!—this word cannot be omitted] video is lost, then no one gains and there are some who lose, so the overall outcome is definitely negative.
Yeah, that's better but I'm still not going to agree that deleting legal content is always harmful. In 2020, aren't we beyond that now? I think some is spam, some useless, some actively misleading. At the very least, it takes up space and clutters search results. It seems likely that even a porn website needs to think about what's worth keeping?
Deleting content, i.e., making it unavailable in a particular place, is not the same as content getting lost, i.e., becoming universally unavailable forever, and it is the latter that, IMHO, has been implied in the discussion above.
What is spam and/or useless in one place can be relevant in another place, and actively misleading content should IMHO be preserved in dedicated places so that people learn to deal with such stuff.
Space is not a problem now in 2020 (unless it is a zettabyte video of white noise), and search algorithms keep getting better at demoting trash to below the 10th page of results.
Space is a problem for video. Very few organizations can host videos for free. Even Google is slowly moving away from free unlimited uploads for everyone.
You also aren't thinking of moderation costs. If you want to actually do a good job of not hosting illegal content, someone is going to have to review it all, or you need really good machine learning, beyond the current state of the art. Either that, or everyone looks the other way, which only works for so long.
But for a while the costs weren't thought to be as high as they're known to be now, which is how a lot of Internet companies were able to become huge.
This may be behind a paywall, but here is the article that kick-started this whole thing:
I don’t think I get it. Where is this from?
Is the answer that if he refuses they all just pay another sellsword to kill him?
Nope. It's that power is based on what people believe. (In this case, it's based on what the sellsword believes.)
You can see a longer quote from the book here:
https://scifi.stackexchange.com/questions/161997/so-what-was-the-answer-to-varys-riddle