• Pigeon@beehaw.org · 8 months ago

Also, it’s the type of thing that makes me very worried that most of the algorithms used in police facial recognition software, recidivism-prediction software, and the like are proprietary black boxes.

There are, guaranteed, biases in those tools, whether in the algorithms themselves or in the unknown datasets they’re trained on. And neither police nor journalists can actually see the inner workings of the software to learn what those biases are, to counterbalance them, or to recognize when the software is so biased as to be useless.
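To make that concrete, here’s a minimal synthetic sketch of the kind of per-group audit that closed software makes impossible. Everything here is hypothetical: the `match_score` model, the noise levels (standing in for one group being underrepresented in training data), and the threshold. The point is only that computing these error rates requires access the vendor doesn’t grant.

```python
# Synthetic sketch: a "face matcher" whose scores are noisier for an
# underrepresented group, and the per-group false-positive audit that
# a proprietary black box prevents anyone from running.
import random

random.seed(0)

THRESHOLD = 0.6  # the vendor's (opaque) match threshold

def match_score(is_same_person: bool, noise: float) -> float:
    """Simulated similarity score. Higher `noise` stands in for a group
    being underrepresented in the (unknown) training data."""
    base = 0.8 if is_same_person else 0.3
    return base + random.gauss(0, noise)

def false_positive_rate(noise: float, trials: int = 100_000) -> float:
    """Fraction of true non-matches the system wrongly flags as matches."""
    hits = sum(match_score(False, noise) > THRESHOLD for _ in range(trials))
    return hits / trials

# Group A: well represented -> tight scores. Group B: underrepresented
# -> noisier scores, so far more innocent people get flagged.
print(f"Group A false-positive rate: {false_positive_rate(noise=0.10):.3%}")
print(f"Group B false-positive rate: {false_positive_rate(noise=0.25):.3%}")
```

Running this prints a false-positive rate under 1% for Group A and over 10% for Group B, even though both groups pass through the exact same threshold. That gap is invisible from the outside unless you can test the model against labeled data per group, which is exactly what a proprietary system forbids.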