9 votes

AI and ethical licensing

8 comments

  1. [3]
    Grendel
    Link

    Full disclosure:
    I read an article that casually mentioned the ethics of the license. I tried to find an existing article focused on that, but when I couldn't find any, I wrote this one myself (rather than make an insanely long post here).

    I'm really curious about your thoughts, especially those of you who may see issues with ethical elements being added to licenses.

    What did they get right?
    What could go wrong?
    How would you do this differently?
    What are your thoughts on AI/Machine Learning in general?

    3 votes
    1. [2]
      FluffyKittens
      Link Parent

      Won’t comment too much on the merits of the licensing angle (I’ll leave that to someone in the legal profession), other than to say it’s a noble intent, but probably not the cure for what ails us.

      Using blackbox AI as anything other than a flaky guessing device is a recipe for disaster. Examples of misapplication include Tesla FSD, Amazon’s worker surveillance tools, and automated proctoring services. Examples of good applications would be industrial ones like visually scanning for defective units on a factory conveyor belt, or research work like scanning satellite imagery/LIDAR measurements to identify probable archaeological sites. In other words, it should only be used in contexts where the impact of false positives is minimal.

      Regarding the ML industry more broadly, we have plenty of “dumber” statistical tools that not only are more explainable than blackbox AI, but are also generally more robust and effective. There’s plenty of greenfield work to be done that doesn’t necessitate the societal harm imposed by the current blackbox boom, and I’m optimistic the industry will move back towards the simple but effective tools long-term.

      For practical examples of what I mean by “dumber” tools, Wikipedia’s Anomaly Detection article has a good list that spans the spectrum of technique complexity pretty well: https://en.wikipedia.org/wiki/Anomaly_detection#Popular_techniques
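
      To make that concrete, here's a minimal sketch of one of the simplest techniques on that list: a median/MAD outlier test (the "modified z-score"). The function name, threshold, and example data are illustrative, not from the article. The point is that every flagged point comes with a one-line statistical explanation, which is exactly what you give up with blackbox models.

      ```python
      # A "dumb" but explainable anomaly detector: the modified z-score
      # (median/MAD) test. Points far from the median, measured in units
      # of the median absolute deviation, get flagged as anomalies.
      import numpy as np

      def mad_anomalies(values, threshold=3.5):
          """Return indices of points whose modified z-score exceeds threshold."""
          values = np.asarray(values, dtype=float)
          median = np.median(values)
          mad = np.median(np.abs(values - median))  # median absolute deviation
          if mad == 0:
              return np.array([], dtype=int)  # no spread, nothing to flag
          modified_z = 0.6745 * np.abs(values - median) / mad
          return np.flatnonzero(modified_z > threshold)

      # Example: measurements from a conveyor-belt scanner with one bad unit.
      readings = [10.1, 9.8, 10.3, 9.9, 10.0, 42.0, 10.2]
      print(mad_anomalies(readings))  # -> [5]
      ```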

      (Obligatory disclaimer that I’m probably about to take a job with a “dumb AI” healthtech company - but that’s because I’m practicing what I preach on this front.)

      3 votes
      1. Grendel
        Link Parent

        That is the sentiment I was really trying to capture. The vast majority of people only know what they see in the news and commercials about AI. They are being sold something dangerous and they have no idea of the reality of that danger.

        I tried to keep it non-technical in the hopes that it makes non-techies at least consider questioning this "AI" movement that we are seeing today.

        3 votes
  2. [5]
    skybrian
    (edited)
    Link

    I don't know of any others, but it's probably not the first restriction of this kind, given that commercial contracts are typically not standardized and can have all sorts of restrictive clauses, depending on what the owners and lawyers come up with.

    One difference, though, is that those are typically contracts rather than licenses. And, although it's not about source code, there are also "terms of service" that can be included by reference in click-through end user licenses. (Here are DALL-E's terms of use, which are pretty restrictive and refer to the OpenAI Terms of Use, which add more restrictions.)

    But you can't rely on legal restrictions such as licenses and terms of use to entirely prevent people from doing bad things with either source code or a service. Users don't necessarily read license agreements; you may not have read some you "agreed" to. Users will do what they decide is okay, based on a fairly self-serving interpretation of what "okay" means. It's not going to stop "piracy."

    They do prevent usage by companies that have lawyers and policies about checking license agreements. Also, I'd expect that simply not being an open source license will keep the code out of package repositories and many Linux distributions. And that may be... fine? Maybe you don't need to contribute your AI code to the world? If you're worried about what people will do with it, you probably don't want people on 4chan to start using it either? Why publish the source code for strangers at all?

    In the bigger picture, I don't think license restrictions are going to do much to prevent machine learning from being used in bad ways. Banks are going to write their own credit scoring systems, or buy commercial software from companies that cater to them. (Never mind what China is doing.) So this is largely about not getting your own hands dirty, because if the code is useful, other teams will probably come up with something similar.

    And I'm reminded of how, before the Ukraine invasion, there were some controversies over military uses of open source software. Also, employees at big tech companies protested contracts with the US military. That seems to have died down, now that supplying weapons to Ukraine is mostly considered good?

    3 votes
    1. [4]
      vord
      Link Parent

      > But you can't rely on legal restrictions such as licenses and terms of use to entirely prevent people from doing bad things with either source code or a service.

      At least in the US, that's not really true. Courts have generally upheld that licenses and terms of use are 100% binding, just like a contract, with some exceptions (for example, if a clause violates another law, and even then often only the violating part is struck).

      It won't stop all the abuse an end user might do, but it certainly gives you a big stick to put a stop to it real quick.

      1 vote
      1. [3]
        skybrian
        Link Parent

        That's if it gets to court. Ask the music industry how well that worked to keep people from copying their music. And even winning court cases isn't enough to keep Sci-Hub offline, though they keep changing their DNS.

        But court cases do work against fat targets (businesses and organizations) and against people who try to keep things legal to avoid getting sued.

        1 vote
        1. [2]
          Grendel
          Link Parent

          I think the big difference here is the users. Music was being copied mostly by a mass of individuals, which is pretty much impossible to go after.

          Average users aren't going to be using AI to make medical decisions. Companies are the main concern for many of these ethical issues, and it's much easier to go after a company.

          Hopefully the license will keep companies from even trying to use it unethically, due to the risk.

          3 votes
          1. skybrian
            Link Parent

            Yes, exactly. Most US and EU companies aren't going to touch it. (I remember when a lot of companies were skeptical about using open source code at all.)

            China may be a different story. I can't find it now, but there was a story about a researcher in China who set up a website where people could upload radiology images to be scanned using machine learning.

            1 vote