
X, the social media platform owned by Elon Musk, is facing demands from regulators to address the spread of sexualised images of people across the platform, generated on demand by its in-built AI tool Grok.
Reuters reported the European Commission (EC) described the sharing of images of undressed women and children across X as unlawful and appalling, adding to concerns raised by politicians around the world.
The issue relates to users asking the chatbot on X to alter real images to create sexualised depictions without the subject’s consent. Reuters reported X previously described the functionality as “spicy mode”.
EC representative Thomas Regnier declared the Commission is “very aware” X was offering the mode. “This is not spicy. This is illegal. This is appalling. This is disgusting. This is how we see it and this has no place in Europe.”
UK regulator Ofcom also hit out at X over the issue, demanding the social media company explain how Grok is able to produce such images of people and whether it is failing in its legal duty to protect users.
Under the UK’s Online Safety Act, Ofcom said, it is illegal to create or share intimate or sexually explicit images of people without their consent, a provision which also covers AI-generated deepfakes.
Online companies are expected to take the necessary steps to counter such images and remove them as quickly as possible under the law.
Regulators in India, France and Malaysia have also raised concerns over the content.
Source: Mobile World Live
Image Credit: X
