11 votes

Transport for London’s AI Tube station experiment

3 comments

  1. skybrian

    From the article:

    At the end of last year, it was widely reported that the transit agency had launched an intriguing trial at Willesden Green station in North West London. The idea seemed to be that AI cameras would be used to spot people jumping or pushing their way through ticket barriers, to fight back against fare evasion.

    The reporting at the time was not especially detailed – with only vague nods to how the software works, or what it would be capable of. As it turns out, the trial was so much more than this.

    That’s why I’m delighted today to bring you the full, jaw-dropping story of TfL’s Willesden AI trial. Thanks to the Freedom of Information Act, I’ve obtained a number of documents that reveal the full extent of what the trial was trying to achieve – and what TfL was able to learn by putting it into practice.

    To make the station “smart”, what TfL did was essentially install some extra hardware and software in the control room to monitor the existing analogue CCTV cameras.

    In total, the system could apparently identify up to 77 different ‘use cases’ – though only eleven were used during the trial. These range from significant incidents, like fare evasion, crime and anti-social behaviour, all the way down to more trivial matters, like spilled drinks or even discarded newspapers.

    For example, in the “safeguarding” bucket of use-cases, the AI was programmed to alert staff if a person was sat on a bench for longer than ten minutes or if they were in the ticket hall for longer than 15 minutes, as it implies they may be lost or require help.

    And if someone is stood over the yellow line on the platform edge for more than 30 seconds, it similarly sends an alert, which prompts the staff to make a tannoy announcement warning passengers to stand back. (There were apparently 2194 alerts like this sent during the trial period – that’s a lot of little incremental safety nudges.)

    There’s a Wired article that’s apparently based on similar documents, but I’ll go with this one since it’s not paywalled.
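
    Just to make the quoted thresholds concrete, here's a minimal sketch of the kind of dwell-time rule the documents seem to describe. Everything in it (zone names, data shapes, the alert wording) is my own guess; only the 10-minute, 15-minute, and 30-second thresholds come from the article.

    ```python
    from dataclasses import dataclass

    # Thresholds taken from the article's description; zone names are
    # hypothetical -- TfL's actual rule engine is not public.
    THRESHOLDS_SECONDS = {
        "bench": 10 * 60,        # seated on a bench for over 10 minutes
        "ticket_hall": 15 * 60,  # in the ticket hall for over 15 minutes
        "yellow_line": 30,       # over the platform yellow line for 30 seconds
    }

    @dataclass
    class TrackedPerson:
        person_id: int
        zone: str               # zone the tracker currently places them in
        entered_zone_at: float  # timestamp (seconds) when they entered it

    def check_dwell_alerts(people: list[TrackedPerson], now: float) -> list[str]:
        """Return an alert for each person exceeding their zone's threshold."""
        alerts = []
        for p in people:
            limit = THRESHOLDS_SECONDS.get(p.zone)
            if limit is not None and now - p.entered_zone_at > limit:
                alerts.append(f"person {p.person_id} in '{p.zone}' "
                              f"for over {limit}s: notify staff")
        return alerts

    # Example: someone has been over the yellow line for 45 seconds.
    people = [TrackedPerson(person_id=7, zone="yellow_line", entered_zone_at=1000.0)]
    print(check_dwell_alerts(people, now=1045.0))
    ```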

    7 votes
  2. Maelstrom

    Super interesting to see an implementation like this that appears genuinely useful without violating the rights, privacy, etc. of the commuters. Gives me a bit of hope for the future, though as the author notes, the system is very capable and could easily be tasked with less benevolent surveillance.

    5 votes
  3. secret_online

    All things considered, it looks like TfL's (Transport for London) use of image recognition in this trial is pretty reasonable. The finding that raised arms signal aggression was interesting. The article acknowledges the potential use-case of staff raising their hands as a quick way to call for help, rather than having to futz about on a tablet in a dangerous situation. That's a cool unintended side effect that possibly has applications elsewhere too.
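
    As an aside, a raised-arms check is simple enough to sketch on top of pose keypoints. This is entirely my guess at a plausible mechanism; the article doesn't describe how TfL's system actually detects the gesture.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Keypoints:
        # Image coordinates: y grows downward, so "above" means a smaller y.
        left_wrist_y: float
        right_wrist_y: float
        left_shoulder_y: float
        right_shoulder_y: float

    def arms_raised(kp: Keypoints) -> bool:
        """Flag a frame where both wrists are above the corresponding shoulders."""
        return (kp.left_wrist_y < kp.left_shoulder_y and
                kp.right_wrist_y < kp.right_shoulder_y)

    # Both wrists above both shoulders -> True
    print(arms_raised(Keypoints(0.2, 0.25, 0.4, 0.4)))
    ```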

    I do wonder what the other, discarded use-cases were, especially the others on the shortlist in the high/medium category.

    The fact that some protected characteristics were being used by the system initially set off alarm bells, but after thinking about it for two seconds it became clear that accessibility needs (of all kinds) are a no-brainer. I do think having an explicit list of which characteristics are being processed would be good for transparency, rather than just a "the list includes these two, but we're not actually going to tell you what else is in there".

    And finally, facial recognition was being used for fare evasion in later parts of the trial, which makes me a little iffy. I acknowledge that it's necessary for what they were trialing, but I hope its use stays limited. I'm not necessarily against its use, but I'd rather see more clarity and controls around how it's used.

    3 votes