• pelespirit@sh.itjust.works · 7 points · 11 months ago

    That’s troublesome right there; it should be an outside commission that gets to see and choose which officers are flagged.

    Truleo’s software allows supervisors to select from a set of specific behaviors to flag,

    • Thorny_Insight@lemm.ee · 3 points · 11 months ago

      We’re probably talking hundreds of thousands of hours of new recordings each week. There’s zero chance a human is going to review all of that.
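
      Quick back-of-envelope math on that (the footage volume and reviewer workload here are assumed numbers, just to show the scale, not figures from the article):

        # Rough estimate of the review workload, using assumed numbers
        hours_of_footage_per_week = 200_000   # "hundreds of thousands of hours" per week
        reviewer_hours_per_week = 40          # one full-time human reviewer
        reviewers_needed = hours_of_footage_per_week / reviewer_hours_per_week
        print(reviewers_needed)               # 5000.0 full-time people just to watch it all once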

      • howrar · 3 points · 11 months ago

        The third party can still use the same kinds of automation tools.

        • pelespirit@sh.itjust.works · 1 point · 11 months ago

          Yes, that’s what I meant. The third party would get to see all of the info and decide which flags to look for. IMO, the third party, the police union, and a civilian commission should then get to vote on whether the officer stays or not. The unions have way too much power.

  • Renegade@infosec.pub · 5 points · 11 months ago

    I would be curious what the article means by AI. For example, this might include some transcription and sentiment analysis. I didn’t see anything too complicated in their description of what the software does.
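
    A rough sketch of what such a pipeline might look like with off-the-shelf tools (the model choices, the sentence splitting, and the 0.9 cutoff are my own illustrative assumptions, not anything from the article or Truleo):

      from transformers import pipeline

      # Transcribe the body-cam audio (Whisper is just an example ASR model here)
      asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")
      transcript = asr("bodycam_clip.wav")["text"]

      # Run a generic sentiment classifier over each sentence and flag strongly negative ones
      sentiment = pipeline("sentiment-analysis")
      for sentence in transcript.split(". "):
          result = sentiment(sentence)[0]
          if result["label"] == "NEGATIVE" and result["score"] > 0.9:
              print("flag for review:", sentence)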

    • keepthepace@slrpnk.net · 2 points · 11 months ago

      Video surveillance also had “violent behavior” models for a while. I am guessing that in 99.9% of the videos, nothing worth noticing happens. If that allows them to flag the remaining 0.1% for human review, that’s already a huge boost.
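
      In code terms that triage step is little more than keeping the top-scoring sliver of clips. A minimal sketch, where score_clip() is a hypothetical stand-in for whatever model the vendor actually runs:

        def score_clip(clip):
            # Hypothetical stand-in: return a 0-1 "worth a human look" score for one clip
            return 0.0

        def triage(clips, review_fraction=0.001):
            """Send only the top-scoring ~0.1% of clips to human review."""
            scored = sorted(clips, key=score_clip, reverse=True)
            cutoff = max(1, int(len(scored) * review_fraction))
            return scored[:cutoff]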