The Rise of 'Algoracism'
May 8, 2024
I had never heard of it: "Algoracism". But in an interview by Tom Grosfeld in the magazine Vrij Nederland, Mark Schuilenburg, a professor of digital surveillance, argues that the dynamics of surveillance have changed. In the past, there was suspicion first, followed by surveillance (Grosfeld, 2024). Now, it is the other way around. Values such as privacy, non-discrimination, transparency, and accountability are losing ground to security and efficiency.
For example, Schuilenburg points out that the police now have so much data that looking back becomes more important than looking forward. Traditionally, crime fighting has focused on people. As a result, there is insufficient awareness of how AI itself will lead to entirely new forms of crime. With the police becoming less visible on the streets, social cohesion is diminishing: the responsibility to watch out for one another is being delegated to technology.
Schuilenburg emphasizes that he is not painting a dystopian picture of the future: this is already happening, but we are not fully aware of it. While Schuilenburg is based in the Netherlands, what he asserts applies equally to Curaçao.
Miguel Goede
Mr. Goede, while doing some research, I found that “algoracism” is a term that refers to the racial biases that can be embedded in algorithms, machine learning, and artificial intelligence, and that this phenomenon often reproduces and reinforces existing racial prejudices and discriminatory practices.
By saying 'there is insufficient awareness of how AI itself will lead to entirely new forms of crime', do you mean that it can lead to 'algoracism'?