The EU’s home affairs chief wants to read your private messages
A new law is threatening the privacy of the European Union’s 447 million inhabitants.
The CSA Regulation, proposed by European Commissioner Ylva Johansson, could undermine the trust we have in secure and confidential processes like sending work emails, communicating with our doctors, and even governments protecting intelligence.
The regulation would require the routine scanning of our private communications online, using AI tools to look for the spread of child sexual abuse material.
Whether or not you are suspected of a crime, this scanning could include everyone — hundreds of millions of law-abiding European residents.
This EU home affairs law does not just propose to scan the words that we type. It also wants to scan the personal pictures on our phones, the documents on our clouds, and the contents of our emails.
All the ways that we live our lives online, including a lot of deeply personal information, could be subject to regular digital searches.
Monitoring anyone’s legitimate conversations will harm everyone, especially children. Experts warn that making the internet less secure will protect no one.
Mass surveillance online does not make us safer; it erodes our democratic rights and freedoms.
Did you know that AI-based tools are fundamentally discriminatory?
Research confirms that AI systems perpetuate discrimination. We see men of colour flagged as suspicious when they’re not doing anything wrong. Women’s and girls’ bodies and LGBTQ+ people are over-censored.
These technologies entrench structural racism, sexism, homophobia and inequality, meaning that certain people are over-targeted whilst others are erased.
How can we trust this inherently faulty technology with such a sensitive issue as our children’s safety online?
Despite what the name suggests, AI tools aren’t even particularly intelligent — at least not in the way that we commonly think of intelligence.
They make mistakes that even a small child would not make. That does not mean that they cannot be useful, but we must be very careful about when it is — and is not — appropriate to use them.