The resolution doesn’t offer many specifics, but it’s the latest sign that AI-generated images of child sexual abuse are becoming a priority for platforms and law enforcement — even if it’s still framed as an “emerging” problem.
The UK announced the agreement ahead of an AI safety summit later this week, and agencies from Italy, Germany, the US, Korea, and Australia signed it, as did OnlyFans, Snapchat, and a variety of child safety nonprofits.