20 votes

Florida has passed an unconstitutional law to allow suing and fining social media companies (except ones that also own theme parks) for censoring users or de-platforming politicians

4 comments

  1. JXM
    It’s pretty clear that DeSantis is doing this for show because he wants to run for president in 2024. He knows it is unenforceable but it makes for good headlines.

    14 votes
  2. stu2b50
    Silly, but mostly a move to show off to the anti-tech wing of GOP voters (esp. after the Trump deplatforming, they're very angry). It's going to pretty quickly get thrown out.

    I'd also assume the tech companies will go for the 1st Amendment argument rather than Section 230, since it wouldn't help them to bring more of a spotlight to the latter, whereas, as we well know by now, the constitution is sacred to many voters.

    8 votes
  3. Deimos
    The first legal challenge against the law has been filed now by NetChoice and the Computer and Communications Industry Association. Some analysis and a link to the full complaint at Techdirt: First Legal Challenge To Florida's Unconstitutional Social Media Moderation Law Has Been Filed

    7 votes
  4. DanBC
    Something similar is going to happen in the UK. https://www.gov.uk/government/news/landmark-laws-to-keep-children-safe-stop-racial-hate-and-protect-democracy-online-published

    ---begin quote

    Freedom of expression

    The Bill will ensure people in the UK can express themselves freely online and participate in pluralistic and robust debate.

    All in-scope companies will need to consider and put in place safeguards for freedom of expression when fulfilling their duties. These safeguards will be set out by Ofcom in codes of practice but, for example, might include having human moderators take decisions in complex cases where context is important.

    People using their services will need to have access to effective routes of appeal for content removed without good reason and companies must reinstate that content if it has been removed unfairly. Users will also be able to appeal to Ofcom and these complaints will form an essential part of Ofcom’s horizon-scanning, research and enforcement activity.

    Category 1 services will have additional duties. They will need to conduct and publish up-to-date assessments of their impact on freedom of expression and demonstrate they have taken steps to mitigate any adverse effects.

    These measures remove the risk that online companies adopt restrictive measures or over-remove content in their efforts to meet their new online safety duties. An example of this could be AI moderation technologies falsely flagging innocuous content as harmful, such as satire.

    Democratic content

    Ministers have added new and specific duties to the Bill for Category 1 services to protect content defined as ‘democratically important’. This will include content promoting or opposing government policy or a political party ahead of a vote in Parliament, election or referendum, or campaigning on a live political issue.

    Companies will also be forbidden from discriminating against particular political viewpoints and will need to apply protections equally to a range of political opinions, no matter their affiliation. Policies to protect such content will need to be set out in clear and accessible terms and conditions and firms will need to stick to them or face enforcement action from Ofcom.

    When moderating content, companies will need to take into account the political context around why the content is being shared and give it a high level of protection if it is democratically important.

    For example, a major social media company may choose to prohibit all deadly or graphic violence. A campaign group could release violent footage to raise awareness about violence against a specific group. Given its importance to democratic debate, the company might choose to keep that content up, subject to warnings, but it would need to be upfront about the policy and ensure it is applied consistently.

    ---end quote

    4 votes