8 votes

Why humanitarians are worried about Palantir’s new partnership with the UN

2 comments

  1. patience_limited

    Since I'd brought up Palantir Technologies elsewhere today, I thought I'd look at what that data-devouring corporate scourge of trust and freedom is up to lately.

    This project does not inspire confidence.

    Beyond a few press releases, we still don’t have much information about how the WFP came to this agreement with Palantir or what the full terms are—a bad precedent to set in the humanitarian assistance field, where trust is key. This includes having little to no information about Palantir’s pricing model (which is notoriously opaque) or its algorithmic assessments (also notoriously opaque and, like other algorithms in this space, subject to harmful biases). As political scientist Virginia Eubanks documents in her recent book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, we’ve already seen how some black box algorithms used to decide who gets social aid in the U.S. have hurt already-marginalized and heavily surveilled communities. (For example, a computer system used by the state of Indiana to cut welfare waste flagged minor errors in benefit applications as a “failure to cooperate”—and resulted in over 1 million individuals losing benefits.) While we certainly don’t know if Palantir’s analyses will have similar impacts on the people that WFP serves, we do know that unaccountable automated decision-making has the potential to do a lot of harm.

    I’m also afraid that the WFP is overconfident in its ability to anonymize and protect sensitive data it shares with Palantir. In a recent report, the International Committee of the Red Cross and Privacy International observed that humanitarian organizations often don’t really understand how the private companies they work with collect and analyze data and metadata, making it harder for them to ensure that the companies are doing the right thing. At the same time, it’s getting ever easier to draw potentially damaging inferences about people (both individuals and groups) from data that doesn’t, at first glance, seem revealing. Unfortunately, Palantir’s services revolve around doing just that.
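    To make that last concern concrete, here's a toy sketch (my own illustration, not anything from the article, the WFP, or Palantir; all names, columns, and values are made up) of how a simple linkage attack can re-identify people in a dataset that looks anonymized, just by joining it with a public registry on a few quasi-identifiers:

    ```python
    import pandas as pd

    # Toy "anonymized" beneficiary table: direct identifiers removed,
    # but quasi-identifiers (birth year, gender, district) remain.
    anonymized = pd.DataFrame({
        "birth_year": [1984, 1991, 1984, 1975],
        "gender":     ["F", "M", "F", "M"],
        "district":   ["North", "North", "South", "South"],
        "aid_amount": [120, 80, 95, 60],   # the sensitive attribute
    })

    # A separate public dataset (e.g. a voter roll or leaked registry)
    # that still carries names alongside the same quasi-identifiers.
    public = pd.DataFrame({
        "name":       ["A. Hassan", "B. Osman"],
        "birth_year": [1984, 1975],
        "gender":     ["F", "M"],
        "district":   ["South", "South"],
    })

    # Joining on the quasi-identifiers links names back to "anonymous" rows.
    reidentified = public.merge(anonymized, on=["birth_year", "gender", "district"])
    print(reidentified[["name", "aid_amount"]])
    ```

    Nothing in the "anonymized" table is revealing on its own, but the combination of a few mundane attributes is often unique enough to link records back to named individuals at scale.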

    4 votes
    1. Maven

      I’m also afraid that the WFP is overconfident in its ability to anonymize and protect sensitive data it shares with Palantir.

      Given how much data Palantir probably already has, they might as well just hand over the full, un-anonymized dataset.