Wednesday, June 1, 2022

Known content detection is not new, but its potential for protecting children is huge

The bold proposals set out by the European Commission to get a grip on the spread of child sexual abuse material (CSAM) on the internet are probably some of the most aggressive measures so far in the battle to protect children online, writes Dan Sexton.

Dan Sexton is the chief technical officer at the Internet Watch Foundation (IWF), a British child safety nonprofit. 

The new legislation would require the tech industry to detect known and unknown images and videos of children suffering sexual abuse, rape, and sexual torture.

It would also mandate them to detect and prevent grooming and to report offending content to a new EU Centre to tackle child sexual exploitation and abuse.

The mandatory detection of this material is fundamentally a good thing. Our latest figures show the issue is not going away, and criminals are still turning to servers in EU states to host some of the most heinous and hurtful material. In 2021, we found that 62% of all known CSAM was traced to servers in an EU member state. (...)
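As the headline notes, known content detection is a mature technique: a service compares the fingerprint of an uploaded file against a curated list of fingerprints of previously verified abuse imagery. The sketch below illustrates the general idea with plain SHA-256 digests and a made-up hash set; real deployments use curated databases such as the IWF Hash List and typically rely on perceptual hashes (e.g. PhotoDNA) rather than cryptographic ones, so this is a simplified assumption, not the actual industry pipeline.

```python
import hashlib

# Hypothetical stand-in for a curated known-content hash list.
# Real lists are maintained by bodies like the IWF and are not public.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def is_known_content(data: bytes) -> bool:
    """Return True if this exact file matches an entry in the hash list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES
```

Note the key limitation this sketch makes visible: a cryptographic hash only matches byte-for-byte copies, so a single re-encode or crop defeats it. That is why production systems favour perceptual hashing, which tolerates such transformations, and why detecting *unknown* material — as the proposed legislation also requires — needs different techniques again, such as machine-learning classifiers.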
