The human rights group Global Witness claims TikTok promotes pornography and sexualised videos to minors. Researchers created fake child accounts, enabled safety settings, and were still shown sexualised search suggestions. These led to clips simulating masturbation and, in the worst cases, explicit pornography. TikTok says it acted immediately once notified and insists it prioritises safe, age-appropriate experiences for young users.
Fake child accounts surface explicit content
In July and August, Global Witness researchers set up four TikTok profiles, posing as 13-year-olds with false birth dates. The platform did not request any further age verification. The investigators activated TikTok’s “restricted mode”, which the company markets as a barrier against sexual or otherwise mature material. Despite this, the accounts received sexualised search suggestions in the “you may like” section. These led to videos of women exposing breasts, flashing underwear, and simulating masturbation. In the most extreme cases, explicit pornography was hidden inside ordinary-looking clips to evade moderation.
Campaign group issues warning
Ava Lee from Global Witness described the findings as a “huge shock”. She said TikTok not only fails to protect children but actively recommends harmful material. Global Witness usually studies how large tech platforms affect democracy, human rights, and climate change. The organisation first encountered TikTok’s explicit content while conducting unrelated research in April.
TikTok defends safety measures
Researchers reported the findings to TikTok earlier this year. TikTok said it removed the flagged content and introduced fixes, but when Global Witness repeated the test in late July, sexualised videos appeared again. TikTok insists it offers more than 50 safety features for teenagers and claims that nine out of ten violating clips are removed before anyone views them. Following the latest report, the company said it had upgraded its search functions and removed additional harmful content.
New rules increase platform responsibility
On 25 July, the Children’s Codes under the UK Online Safety Act took effect. Platforms must now enforce robust age checks and prevent minors from accessing pornography, and their algorithms must block material linked to suicide, self-harm, or eating disorders. Global Witness repeated its research after the codes came into force. Ava Lee urged regulators to intervene, stressing that children’s online safety is now a legal obligation that must be enforced.
Users question sexualised recommendations
During the study, researchers monitored user reactions. Some expressed confusion at sexualised search suggestions. One wrote: “can someone explain to me what is up with my search recs pls?” Another commented: “what’s wrong with this app?”
