20 votes

It's been twenty-four years since internet companies were declared off the hook for the behavior of their users. That may change, and soon.

37 comments

  1. post_below
    (edited )
    Link

    For those unfamiliar with recent pushes to eliminate or neuter section 230, it's important to understand that without it there's a very real chance sites like Tildes couldn't exist.

    If you're liable for illegal (or even just objectionable) content on your platform, then you need a bulletproof way to moderate it. That's expensive; volunteer moderators are unlikely to provide you with a reasonable legal defense. Even with paid moderators, how long does something need to be online before you become liable? Do you need to pre-moderate instead? What if the moderators miss something? Are you liable for honest mistakes? And so on.

    It wouldn't just impact sites with less grandiose ambitions. The next attempt to disrupt the social media giants with a new platform won't happen if the legal burden is too expensive for a startup.

    The big social media platforms should do a better job with misinformation. I'd support laws targeting that specifically, but killing 230 only benefits special interests (marketers for one) and Trump's attorney general in his goal to have a bigger stick to shake at social media.

    Side note: Don't believe them when they inevitably tell you it's about protecting kids; it's never really about that anymore, just a way to play on your emotions.

    27 votes
  2. [5]
    LukeZaz
    Link

    I'm honestly surprised there's even any bipartisan pressure coming on to change Section 230. By my understanding, it's one of the cornerstones supporting today's open internet. I'm not a free speech absolutist, but limiting 230 at all still seems very concerning at a glance.

    12 votes
    1. [4]
      spacecowboy
      Link Parent

      I highly recommend the blog of Eric Goldman (a professor of law at Santa Clara University) on everything related to Section 230, including the latest bills that want to change it. Here is a recent post:
      https://blog.ericgoldman.org/archives/2020/09/sen-graham-cares-more-about-trolls-than-section-230-comments-on-online-content-policy-modernization-act.htm

      By the way, I'm curious what the situation is in Canada, since Tildes is a Canadian non-profit. Are there protections for Canadian organisations against liability for users' content?

      9 votes
      1. Deimos
        Link Parent

        From what I've read about it, the law in Canada isn't very clear. There also aren't really any major online platforms located in Canada, so it probably hasn't seemed like something important to fully figure out at this point.

        Here are a couple of recent articles that talk about the current law some, if you're interested in looking into it more:

        From my perspective, I mostly just don't ever expect Tildes to be large enough for it to be an issue. Most of the types of posts that have the potential to cause major issues if not taken down (e.g. hate speech, piracy, revenge porn) are things I have no intention of allowing anyway. Defamation is trickier, but honestly, in most realistic cases I'd probably just decide to take it down if it was causing issues. My goal for the site isn't really for it to be a "public square" or anything similar, and I wouldn't want to get into a legal battle over refusing to take down someone's complaints about a restaurant or something.

        4 votes
      2. [2]
        Grzmot
        Link Parent

        As far as I know, the servers hosting the site are still in the US, so we're still in trouble.

        1 vote
        1. spacecowboy
          Link Parent

          The servers are in Canada.

          From:
          https://docs.tildes.net/policies/privacy-policy
          "Tildes runs on dedicated servers located in Canada and rented exclusively to Spectria."

          9 votes
  3. [2]
    pallas
    Link

    This article seems to substantially misrepresent the context and intent of Section 230 in the US, and so it is worth reviewing the history behind it, rather than only reviewing the history of the other, struck-down portions of the Communications Decency Act. The purpose of Section 230 was to allow providers to moderate content. Unless I am mistaken in my interpretation of that history, a repeal of Section 230 without something else to replace it would most likely not result in service providers taking more responsibility for content, but in them abandoning all moderation entirely.

    Prior to Section 230, there were two ways, in the US, that internet forum providers could be considered for legal purposes, applying law that significantly predated the internet: as publishers, in which case they were legally liable for everything that every user posted, or as distributors, in which case they weren't legally liable, but also couldn't perform any moderation or place any restrictions on the content.

    Two companies, CompuServe and Prodigy, both created forums in the early 1990s. CompuServe decided they would have no moderation whatsoever, even of content that was illegal. Prodigy, on the other hand, developed some basic content guidelines, had some moderators that could enforce those guidelines, and filtered out offensive language.

    Both were sued for defamation as a result of posts that were argued to be defamatory. CompuServe was found not to be liable, because it was only a distributor. But Prodigy was found liable for the defamation, because the mere existence of any moderation and content restrictions meant that Prodigy was liable for all content. That meant that the simple act of, say, banning offensive words or even simply banning child abuse images meant that the provider would be liable for non-obvious cases of defamation. The post that Prodigy was sued over, for example, argued that a particular investment bank had committed fraud in connection with an IPO. There would have been no way, without doing an investigation themselves, for Prodigy's moderators to have known that this content was actually defamatory. And yet all they had wanted to do was have moderation to remove obviously problematic content. The legal environment that existed before Section 230 did not allow this. This perverse situation resulted in Section 230, which clarified that providers would not become liable for user-posted content simply because they did some basic and even-handed moderation.

    Thus, Section 230 was not meant to absolve providers of responsibility for user content. They already had a way to avoid liability: just allow all content, with no moderation. The intention was to allow them to have some moderation of user content without that making them liable for everything involving that content.

    Without Section 230, internet forums as we know them today simply could not exist in the US, unless they became complete free-for-alls, and so I have to expect that, were it repealed, rather than shutting down entirely, we'd end up with the other, extreme, alternative.

    12 votes
    1. post_below
      (edited )
      Link Parent

      Section 230 does protect platforms from becoming liable for everything they fail to catch when they do moderate, and indeed some recent proposed legislation wants to modify that aspect of it to do a funny sort of reverse censorship by disallowing certain types of moderation. That's another thread, though.

      Before that bit, there is this:

      No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

      Without 230, platforms would potentially be responsible for everything published, like a newspaper but with less protection.

      1 vote
  4. sjvn
    Link
    "Many marketing leaders are increasingly tired of waiting for the government or industry to act, and have begun to express their displeasure by joining month-long boycotts or abandoning social...

    "Many marketing leaders are increasingly tired of waiting for the government or industry to act, and have begun to express their displeasure by joining month-long boycotts or abandoning social media altogether."

    Really? I don't see that. Companies will boycott advertising at times, but that's usually on old-style mass media. And, abandon social media! Ha! No. Just no. They don't.

    9 votes
  5. [11]
    bloup
    Link

    As a practical matter I don’t think making platforms be liable for their users’ content is a very good idea. But philosophically, if a platform is allowed to be the sole financial beneficiary of their users’ content, why shouldn’t they also have to be liable?

    4 votes
    1. [10]
      skybrian
      Link Parent

      Because philosophically, liability is about holding people responsible for what they did. Whether or not you benefit financially has nothing to do with whether it’s your fault.

      7 votes
      1. [9]
        bloup
        Link Parent

        I don’t really agree with “liability is about holding people responsible for what they did”. Like for example, if you get hurt on the job, your employer is financially responsible for your injury even if they didn’t really “do” anything to cause the injury.

        Lots of things in life incur unavoidable costs. “Liability” is just about having a system of assigning these costs to different entities in the most reasonable way, and there are plenty of situations where figuring out who is responsible is a lot more than “whose fault is it”.

        2 votes
        1. [8]
          skybrian
          Link Parent

          There are two parts here, what does the law say about who is liable, and why is it that the law is set up that way? I’m not a lawyer, but it seems like employers are held responsible for what happens in a work environment because they control working conditions and give orders about the work to be done, so they should have incentive to use that power to keep the workplace safe. (They might contract with another firm to do some work offsite, but then that contractor is responsible for their own working conditions, typically. There are grey areas.)

          It’s not entirely unreasonable that platforms control what is published and should take responsibility for it, but it implies that they actually will have to exercise that control. It means the platform is empowered and users disempowered.

          I do prefer to use moderated platforms, but it seems like we shouldn’t be in too much of a hurry to disempower users.

          In any case, who benefits financially shouldn’t enter into it. Tildes is hosted somewhere and the hosting provider gets paid, but they don’t make the rules here and it’s better that way. The contract isn’t about that, it’s about providing Internet hosting.

          2 votes
          1. [7]
            bloup
            Link Parent

            I don't really see how making it so a platform is on the hook for harmful content could possibly be construed as "empowering" the company. Even without that pressure, they already have full authority and discretion to do whatever they like with any of the content posted to the site, and when there is content that is damaging to the interests of the platform itself (like the presence of a controversial community that is harming the reputation of the platform), the platform will almost always just ban that content. I mean, look at Reddit and the many white supremacy groups that have been banned.

            Now, what about situations where the platform is hosting content that is not necessarily damaging to the platform itself, but still carries some kind of broader social cost? Well, you can either hold some combination of the platform and the actual creator of the harmful content liable, or you can just externalize those costs to the rest of society...

            The reason why "who financially benefits" absolutely should enter into this, is the person who financially benefits the most is typically the one who is most incentivized to create the conditions that allow the harmful behavior to happen. Once more, I turn to the example of worker's compensation.

            3 votes
            1. [6]
              skybrian
              Link Parent

              Power is often delegated or left unused and that's effectively a transfer of power, though it could be taken back. For example, an absentee owner might be out of touch and their manager is making all the day-to-day decisions, and the owner has little idea what's going on. In practice, power is often with the people who actually show up and are paying attention. There are Chinese and Russian proverbs about this, along the lines of "heaven is high and the emperor is far away."

              I think this often happens online, especially with these very large services. Being hands-on is time-consuming and the absentee-owner model, where most power is delegated, is the only thing that works at scale. Until it stops working.

              Making the company liable means they have to be less trusting. They have to be more careful who they delegate power to, give them stricter instructions, and avoid risky edge cases. This means less leeway at the edges.

              People who are rich, technically competent, eloquent, and/or have friends who are will find a way. People that nobody knows or trusts might have a harder time of getting their voices heard. As for anonymous voices - well, being anonymous and untrusted are almost the same thing, right? Until you build up a reputation, anyway.

              So, for example, what is the liability risk of hosting content about police brutality? Suppose it's faked or taken out of context? Maybe better not to host that kind of content if you can't verify it, so you don't get sued for ruining someone's reputation?

              Liability risk puts you at risk of action from people or organizations who have lawyers and are willing to use them, and that power is itself concentrated with the rich and well-connected.

              2 votes
              1. [5]
                bloup
                Link Parent

                But the only reason that this power is "left unused" is because it is financially in the platform's best interest to do nothing, at least currently, because there are basically no consequences for the platform. I can understand the idea that putting some kind of pressure on a platform to take a more active role in moderating the content hosted on the platform could be construed as "disempowering users" because it could potentially lead to situations of overzealous enforcement. But, pressuring a platform to take action against socially harmful content that ordinarily would have made more money for the platform hardly seems like "empowerment" of the platform at all.

                I'm also not convinced that the idea that platforms being liable for the content they host is incompatible with "the absentee-owner model". I mentioned this in another comment, but I can't think of a good reason why most troublesome situations couldn't be resolved through an insurance policy and a slightly more proactive role in moderating the content that appears on the platform.

                1 vote
                1. [4]
                  skybrian
                  Link Parent
                  "Socially harmful" and "legally risky" aren't the same thing. Saying bad things about powerful people is legally risky, because they could sue you. You might want to read about the fight between...

                  "Socially harmful" and "legally risky" aren't the same thing. Saying bad things about powerful people is legally risky, because they could sue you. You might want to read about the fight between Peter Thiel and Gawker.

                  There will be an incentive for social platforms to act, somehow, but what they actually do might not be what you want.

                  If there are clear enough limits on liability that it can be covered by insurance, then this would be a way for people with a good case to go to court and get compensated (or more likely settle out of court), where the money comes from higher prices for advertisers and/or users.

                  In some cases that might even be justice. I'm not sure it's really what people are hoping for, though?

                  1. [3]
                    bloup
                    Link Parent

                    Just so it’s clear, by “socially harmful”, I am simply referring to actions which incur externalized economic costs. Like if someone went on Reddit and wrongfully slandered a famous person so hard that it somehow caused them to suffer demonstrable economic losses, that famous person could absolutely file a lawsuit against that Reddit user for those damages. This would be an example of content that is “socially harmful”.

                    I just find it frustrating that profit driven platforms make so much money in ethically sketchy ways while enjoying basically 0 risk when it comes to their top commodity. Like what other business works like that? I also find that if I try to apply the idea that a profit driven platform should have no responsibility for the content hosted on the platform to some kind of real life analogue to platforms, it leads to strange places. Like, if I leased some office space and I started a business advertising itself as “a platform for new small businesses”, and I wound up renting a few desks to a scam operation that eventually gets busted, should I really not be on the hook at all for that? Of course, on a large platform things will get missed, but the thing with liability is no matter what happened, whoever’s fault it was, or how unfair it is, somebody always has to pay. And if the scam victims want restitution, and can’t get it in full from the scammers, who should have to pay the rest of the damages? The scam victims? Or the person who provided the scammers with the means to carry out their scam, regardless of knowledge or intent (hopefully through their insurance policy, something that any active business should have)?

                    1 vote
                    1. [2]
                      skybrian
                      Link Parent

                      There is a question of privacy and a question of who is going to do the policing. Like, if you have a house and you rent it out, and the tenants do something bad, are you liable?

                      Well, maybe sometimes there is an argument they should be liable, but if the landlord has to police the tenants, then maybe they will install cameras everywhere and create a bunch of rules, and so much for privacy. Also, they won't rent to anyone they think is at all sketchy, because why take the risk? Sucks to be you, sketchy person.

                      We have limits and boundaries on responsibilities because we don't want to police each other all the time. It's expensive and time-consuming. Being the cop sucks.

                      I think there is room for compromise. Maybe some people should be responsible for some things where currently they're not. But, there need to be well-specified limits on what people are responsible for and it should be stuff that's reasonably under their control.

                      2 votes
                      1. bloup
                        Link Parent

                        I definitely appreciate your point, and you definitely are not wrong. This is sort of what I was getting at earlier when I mentioned “overzealous enforcement” as being construed as disempowerment of users.

                        So I know that we spun a pretty long yarn here, but I do agree with you insofar as making it so platforms are liable for the content they host could potentially have unintended and undesirable consequences. But if all we are asking for is just a teensy bit of oversight so that innocent people would actually be better protected against harm by some bad actor that the platform is enabling, but we can’t do that because these platforms might make our lives even worse than it was before in unforeseen ways, can we really say they are compatible with an equitable society?

                        2 votes
  6. [17]
    archevel
    Link

    I am likely in the minority here, but I would like to explore the idea of websites being liable for the content they host. Of course this would impact sites like Tildes, but maybe the pros would outweigh the cons?

    Newspapers can't, AFAIK, just print anything; they are responsible for the texts printed in their papers, even if that is in some kind of "letter to the editor" portion of the paper. That today's websites wouldn't exist if they were liable for user-produced content is not a strong argument. Maybe that is the case, maybe sites would adapt? Maybe having the internet solely contain moderated speech in public forums would improve the overall discourse?

    Also, as mentioned in another comment, whether making sites liable would only benefit large corporations depends on what rules we decide to apply. Perhaps spreading misinformation and hate speech on a large scale (say, something along the lines of Facebook) should be penalised more heavily than the local newspaper publishing something similar to its 500 readers.

    4 votes
    1. [14]
      DrStone
      Link Parent

      The letter-to-the-editor style user content already exists in online publications, where the entity is in control of what gets published, actively curating everything published. The rough equivalent for an open forum (like Tildes or Reddit) would be a human curation team reading every single post and comment before they can be shown at all.

      5 votes
      1. [11]
        archevel
        Link Parent

        The rough equivalent for an open forum (like Tildes or Reddit) would be a human curation team reading every single post and comment before they can be shown at all.

        I realise that this would likely be expensive and maybe make open forums infeasible. In principle I think having a curation team would be great!

     Having one standard for publishing in print, i.e. providing content, and a non-standard of anything-goes for online content is weird in my opinion, especially when the hosts for online content make a bunch of money and become super powerful. They ought to be held to a much higher standard!

        As for sites like tildes, maybe they go away. Maybe they become more selective in who can post. Maybe a peer review of posts is put in place.

     Arp242 mentions that the issue is that while a lot of speech might be objectionable, it isn't illegal. But if that is the case, then what's the problem? Overt hate speech, personal attacks, and misinformation ought to be removed if possible. Putting the responsibility of removing such content on the entity providing it just seems like common sense.

        1. [5]
          RNG
          Link Parent

          I realise that this would likely be expensive and maybe make open forums infeasible. In principle I think having a curation team would be great!

          As for sites like tildes, maybe they go away. Maybe they become more selective in who can post. Maybe a peer review of posts is put in place.

          Advocating for the abolition of open forums is an incredibly radical position, even among those who are critical of the broadness of Section 230 protections. Not to mention that the position that Section 230 should be eliminated is still a minority viewpoint.

          I don't mean this in a "gotcha" sort of way, I am legitimately curious, if you really do support the end of open forums in favor of a handful of "curated" or "peer reviewed" posts, why are you here?

          7 votes
          1. [4]
            archevel
            Link Parent

            I think I didn't express myself well enough. I'm not advocating for closing open forums. I'm advocating for publishers of content (be it newspapers, Facebook, or Tildes) to have a larger responsibility than they currently have to ensure that e.g. hate speech and misinformation isn't being spread on their platforms. Of course there are degrees to this, and there is a judgment call to be made on what constitutes this type of content.

            Is there a reason for content provided by Facebook to be treated differently from e.g. a letter to the editor in a newspaper? If anything, because of Facebook's size, shouldn't they be held to a higher standard than even the largest newspaper publications?

            if you really do support the end of open forums in favor of a handful of "curated" or "peer reviewed" posts, why are you here?

            Even if I wanted to end all open forums (which I don't) I could still enjoy them while they are here :)

            1 vote
            1. [3]
              RNG
              (edited )
              Link Parent

              I think I didn't express myself well enough. I'm not advocating for closing of open forums. I'm advocating for publishers of content (be it newspapers, Facebook or Tildes) to have a larger responsibility than they currently have to ensure eg. hate speech and misinformation isn't being spread on their platforms.

              It's okay if you've changed your position since your last comment, but I'm going to push back on the claim that the consequences of the policy you have discussed so far don't necessarily entail the end of open discussion on the web, for reasons you spelled out better than I ever could in your previous comment.

              There is no hyperbole here: the position you have laid out is the most radical one I have encountered on the issue of Section 230 protections, an area of discourse I have followed for a very long time. I imagine most who oppose Section 230 on the grounds that it is too broad would be horrified at a possible world where even a site like Tildes couldn't exist because it would need to host content that hasn't been explicitly pre-authorized by an editor.

              1 vote
              1. [2]
                archevel
                Link Parent

                Do you mean that an inevitable consequence of making e.g. Facebook take a larger responsibility for the content it spreads is that sites like Tildes would all go away? By being responsible for the content, I do not necessarily mean that they must pre-authorize anything. Nor have I made any conclusions on what the consequence of failing that responsibility should be. Anyway, again, why would increasing Facebook's responsibility for the content it hosts lead to the closing of open forums?

                (Btw, I have not read much about Section 230, so I wholly admit a lot of ignorance here.)

                1. skybrian
                  Link Parent

                  It might mean that Deimos would be wise to buy liability insurance and agree to whatever the insurance company's rules are. Not a complete barrier, but a hoop to jump that raises costs a bit.

                  Maybe some new startup would make a lot of money with an algorithm that rates posts for legal risk, and those that score too high get held back for review by a moderator?
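
                  To make that concrete, here's a rough sketch of what I mean, in Python. Everything in it (the scorer, the threshold, the class names) is made up purely for illustration; it isn't anyone's real product or proposal.

                  ```python
                  # Hypothetical "hold risky posts for review" flow: score each post for
                  # legal risk, publish low-risk posts immediately, and queue the rest
                  # for a human moderator. All names and numbers are invented.
                  from dataclasses import dataclass, field
                  from typing import Callable

                  @dataclass
                  class Post:
                      author: str
                      body: str
                      risk_score: float = 0.0
                      published: bool = False

                  @dataclass
                  class ModerationPipeline:
                      score: Callable[[str], float]   # e.g. an ML model or a keyword heuristic
                      threshold: float = 0.7          # at or above this, a human looks first
                      review_queue: list = field(default_factory=list)
                      live_posts: list = field(default_factory=list)

                      def submit(self, post: Post) -> None:
                          post.risk_score = self.score(post.body)
                          if post.risk_score >= self.threshold:
                              self.review_queue.append(post)   # held back for a moderator
                          else:
                              post.published = True
                              self.live_posts.append(post)     # low risk: goes live immediately

                      def approve(self, post: Post) -> None:
                          self.review_queue.remove(post)
                          post.published = True
                          self.live_posts.append(post)

                  # Deliberately naive scorer: flags posts that accuse someone of wrongdoing.
                  def naive_scorer(text: str) -> float:
                      risky_words = {"fraud", "scam", "defamatory", "stolen"}
                      hits = sum(word in text.lower() for word in risky_words)
                      return min(1.0, 0.5 * hits)

                  pipeline = ModerationPipeline(score=naive_scorer)
                  pipeline.submit(Post("alice", "Lovely weather today"))
                  pipeline.submit(Post("bob", "That bank committed fraud; the IPO was a scam"))
                  print(len(pipeline.live_posts), len(pipeline.review_queue))  # 1 1
                  ```

                  The hard (and expensive) part is the scorer itself, which is exactly where the big companies' in-house advantage would show up.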

                  The big tech companies would be able to do it themselves and self-insure. Smaller places would outsource.

                  2 votes
        2. [2]
          DrStone
          Link Parent

          Having one standard for publishing in print, i.e. providing content, and a non-standard of anything-goes for online content is weird in my opinion

          Regardless of online or off, do you believe there is any meaningful difference between a publisher and a forum?

          1 vote
          1. archevel
            Link Parent

            I really liked this question! I have thought about it for a while now, but I haven't arrived at a fully formed answer. I think I believe there is a difference... but I'm not 100% sure how big a difference it is, or whether the difference is relevant in the context of responsibility for content on a site. Anyway, good question!

        3. [3]
          skybrian
          Link Parent

          I don’t think gossip would go away. It would go dark. It would move to email and text messages and private groups and no longer be visible to search engines, and people would continue to spread rumors and conspiracy theories and memes. You wouldn’t know its source, and what are you going to do, sue your friends for resharing it?

          It seems like Signal would do well?

          1. [2]
            archevel
            Link Parent

            Sure, but I think there is a distinction to be made between a public open forum and private messaging. An analogy would be giving a speech in the town square vs. sending someone a private letter. I think providers should have a larger responsibility w.r.t. text in open forums, but I don't think they should have the same responsibility for e.g. text messages.

            2 votes
            1. skybrian
              Link Parent

              Thinking like an epidemiologist, easy, instant forwarding to small groups of correspondents is enough to get rapid, exponential spread of a message. It could be just as viral as Twitter messages.
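
              A toy branching calculation shows why; the numbers below are pure assumptions, chosen only to illustrate the compounding:

              ```python
              # Toy sketch of message spread through private forwarding.
              # Assumed numbers: each forward reaches a small group of ~4 people,
              # and half of the recipients forward it again, once.
              contacts_per_forward = 4
              share_who_forward = 0.5
              r = contacts_per_forward * share_who_forward  # effective "reproduction number" = 2.0

              reach = 1
              for generation in range(20):
                  reach *= r
              print(int(reach))  # 1048576 people after 20 forwarding generations
              ```

              Anything with an effective reproduction number above 1 compounds the same way, even when every individual share goes to a tiny private group.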

              1 vote
      2. [2]
        bloup
        Link Parent

        I don't really think that making platforms liable for the content they host necessarily means they would have to curate and vet literally everything that appears on the platform prior to publishing. Liability insurance exists precisely because it just is not possible for pretty much any business to completely control its liabilities.

        I mean imagine for a moment that someone creates a public platform and then a malicious group uses it to coordinate some kind of financial fraud. Should the platform really not have any culpability here? And if they were culpable, couldn't they just protect themselves through an insurance policy instead of micromanaging everything that appears on the site? Obviously, in an effort to keep insurance premiums down, the platform would be still incentivized to take a more proactive role in moderating harmful content, but that honestly sounds like it would be a good thing to me, and definitely does not seem like it should be some kind of insurmountable financial obstacle to starting a platform.

        1. DrStone
          Link Parent

          Sorry, I think I was unclear. In this particular comment, I wasn't trying to consider the platform implications of the potential liability changes. I just intended to show how the "letter-to-the-editor" analogy OP used (i.e. externally submitted content still completely curated by a publisher) would look applied to the social media / forum structure. (To me, publishers and platforms are fundamentally different, so "curating" everything on a platform would be absurd.)

          1 vote
    2. [3]
      Comment deleted by author
      Link Parent
      1. [2]
        archevel
        Link Parent

        If I go to the local community space and shout "kill all Jews!" then should the owner of the space be held accountable?

        If they recorded what you said and replayed it any time someone came to the establishment, then I think they share some responsibility in the spreading of hate speech. Is that a fair analogy to what content providers on the web do?

        2 votes
        1. [2]
          Comment deleted by author
          Link Parent
          1. archevel
            Link Parent

            The biggest question is "how much (legal) hate speech or misinformation is acceptable on these platforms?" and that's a very tricky question, and I'm not sure it's a question that can be answered by law either. Personally I'm not happy with this kind of stuff, but I'm also not especially thrilled at the prospect of Twitter, Reddit, and Facebook becoming arbiters of truth and decency. Both options rather suck.

            Good point. But aren't Twitter, Reddit, and Facebook "the arbiters of truth and decency" already today? They curate and select what content to show to whom (often promoting controversial content, since it generates engagement), so in that sense I think they are very similar to newspaper editors, albeit more automated.

            Is it offensive, indecent, and kind of pointless shouting? Sure. But in the right context it can be acceptable, and it's certainly cathartic. Not everything needs to be Very Serious Conversation.

            Agreed. However, in some contexts there are lines that shouldn't be crossed. Your example seems rather tame, but for the sake of argument let's say that it constitutes hate speech. Then I believe it should be removed. The hard part here is making the distinction, and also deciding who gets to decide. Today the content providers get to decide what stays up and what gets removed. I would prefer if that decision wasn't solely up to the big corporations.