23 votes

Top companies ground Microsoft Copilot over data governance concerns

3 comments

  1. skybrian

    I linked to Simon Willison's blog post because I think it explains the issue better than the original:

    The concern here isn’t the usual fear of data leaked to the model or prompt injection security concerns. It’s something much more banal: it turns out many companies don’t have the right privacy controls in place to safely enable these tools.

    ...

    If your document permissions aren’t properly locked down, anyone in the company who asks the chatbot “how much does everyone get paid here?” might get an instant answer!

    This is a fun example of a problem with AI systems caused by them working exactly as advertised.

    16 votes
    1. Kenny

      That’s what is happening at my organization. We’re 3–6 months out from having the appropriate data security and compliance settings in place to use Copilot with any PII or sensitive data.

      7 votes
  2. joshtransient

    This was the exact same argument when Delve was introduced ten years ago. Users don't want to do data or permission cleanup. Orgs don't want to enforce it. Until one of those problems is solved, search (and anything powered by it, like Copilot) is going to be scary — "what if someone sees something they're not supposed to" — and disappointing — "we bought a million-dollar product, turned it on, didn't configure it, and it didn't immediately perform like early 2010s-era Google."

    6 votes
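
The failure mode discussed in this thread can be sketched in a few lines. The point of the quoted post is that the search layer behind a tool like Copilot can enforce document permissions correctly and still leak sensitive data, because the permissions themselves are over-broad. A minimal illustration — all names, groups, and data here are invented, not Microsoft's actual API:

```python
# Hypothetical sketch: the search layer honors document ACLs exactly as
# designed, but an over-broad ACL still exposes the document to everyone.

def search(query, user_groups, documents):
    """Return titles of matching documents the asking user is allowed to read."""
    return [
        d["title"]
        for d in documents
        if query in d["text"] and d["acl"] & user_groups  # ACL is enforced
    ]

# Properly locked down: payroll is restricted to the HR group.
docs_locked_down = [
    {"title": "Payroll 2024", "acl": {"hr"}, "text": "salary data"},
]

# Misconfigured: someone shared payroll with the whole company.
docs_misconfigured = [
    {"title": "Payroll 2024", "acl": {"everyone"}, "text": "salary data"},
]

employee = {"everyone"}  # an ordinary employee's group memberships

print(search("salary", employee, docs_locked_down))    # [] — nothing leaks
print(search("salary", employee, docs_misconfigured))  # ['Payroll 2024']
```

In the second case the system is "working exactly as advertised": the chatbot only surfaces what the user can already open. The cleanup work the commenters describe is fixing the ACLs, not the search.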