DIGITAL LIFE

TikTok recommends sexual content and pornography to children, report alleges
TikTok's algorithm recommends pornography and sexualized content to children's accounts, according to a new report from Global Witness, a human rights organization.
Researchers from the organization created fake profiles of 13-year-olds and enabled safety settings, but the accounts still received sexually explicit search suggestions. These suggestions led to sexual videos, including images of penetration.
TikTok stated that it is committed to providing safe and appropriate experiences for minors. The company also said it took immediate action after being informed of the issue by Global Witness.
The four fake accounts were created between July and August of this year. The researchers registered with false birthdates indicating they were 13 years old and were not asked to provide any other information to verify their age.

Pornography
Global Witness researchers also activated the platform's "restricted mode," which, according to TikTok, prevents users from viewing "adult or complex themes, such as... sexually suggestive content."
Even without performing searches, the researchers found overtly sexual terms being suggested in the app's "you might like" section. The suggestions led to videos of women simulating masturbation, displaying underwear in public places, or showing their breasts.
In extreme cases, the content included explicit pornographic films depicting penetrative sex. These videos were embedded within seemingly innocuous clips, a tactic that successfully evaded the platform's moderation.
Ava Lee of Global Witness told the BBC that the findings were a "huge shock" to the researchers.
According to Lee, "TikTok isn't just failing to prevent children from accessing inappropriate content—it's suggesting it to them as soon as they create an account."
Global Witness is an activist group that investigates how large technology companies influence discussions about human rights, democracy, and climate change.
The organization's researchers identified the problem in April while conducting other studies.
Removed Videos

Global Witness informed TikTok, which said it took immediate action to address the issue. But when the group repeated the test in late July and August of this year, it found that the app still recommended sexual content. TikTok said it has more than 50 features designed to protect teenagers: "We are fully committed to providing safe and age-appropriate experiences." The company also said it removes 9 out of 10 videos that violate its guidelines before they are ever viewed.
After being alerted by Global Witness, TikTok said it took steps to "remove content that violated our policies and roll out improvements to our search suggestion feature."
In the UK, children aged 8 to 17 spend two to five hours a day online, according to Ofcom research. The study shows that almost all children over 12 have cell phones and watch videos on platforms like YouTube and TikTok.
TikTok introduced a 60-minute screen time limit by default for children under 18 in 2023, but this limit can be disabled through settings.
In March 2025, an exposé published by the BBC revealed that TikTok profits from livestreams of sexual content by teenagers. The app retains approximately 70% of the fees.
TikTok knew about the exploitation of children and teenagers in livestreams and conducted an internal investigation in 2022, according to allegations in a lawsuit filed by the US state of Utah. The complaint alleges that the company ignored the issue because it "significantly profited" from the exploitation.
TikTok stated that the lawsuit, currently underway in the United States, disregards the "proactive measures" it took to improve the platform's safety.
Child Protection Legislation

On July 25th of this year, the Children's Codes of the Online Safety Act came into effect in the UK, imposing a legal duty on digital platforms to protect children online. The measure requires some services, particularly pornographic websites, to verify the age of users in the UK.
The law aims to make the internet safer, especially for minors, and is implemented and monitored by Ofcom, the country's media regulator.
Companies must adopt "highly effective age controls" to prevent minors from accessing pornography. They must also adjust their algorithms to block content that encourages self-harm, suicide, or eating disorders.
The change in the law has sparked widespread debate in the UK.
Some experts and activists are advocating for stricter rules and even banning children under 16 from social media. Ian Russell, president of the Molly Rose Foundation, created after the death of his 14-year-old daughter, said he was "dismayed by the lack of ambition" in Ofcom's codes.
The UK's leading children's charity, the National Society for the Prevention of Cruelty to Children (NSPCC), criticized the fact that current legislation fails to guarantee protection on private messaging apps like WhatsApp.
According to the organization, end-to-end encryption "continues to pose an unacceptable risk to children." With this type of encryption, messages are encrypted upon leaving the sender's phone and can only be decoded on the recipient's phone.
Privacy advocates, on the other hand, say the age verification methods adopted by the UK are invasive and ineffective.
Silkie Carlo, director of Big Brother Watch, a British NGO that promotes privacy and civil rights campaigns, stated that these rules could lead to "security breaches, invasions of privacy, digital exclusion, and censorship."
TikTok and Regulation in Brazil

TikTok is one of the most popular social networks in Brazil, installed on 46% of cell phones, behind Instagram (91%) and Facebook (76%), according to a Mobile Time/Opinion Box survey. Unofficial estimates put the app at around 100 million users in the country, which has a population of 213 million.
In June, the Supreme Federal Court (STF) expanded the regulation of digital platforms, establishing that companies can be held liable for criminal content posted by third parties. Serious content, such as anti-democratic messages, child pornography, and encouragement of suicide, must be actively removed, while other content only needs to be deleted after notification.
On September 18th, President Luiz Inácio Lula da Silva signed the Digital Statute for Children and Adolescents (ECA Digital), which establishes the responsibility of technology companies to protect children under 18 from harmful content.
"It's a mistake to believe that big tech will take the initiative to self-regulate. This mistake has already cost the lives of several children and adolescents. Several countries have made progress in creating legal provisions to protect children and adolescents in the digital world," the president stated.
Regulation of the sector will be handled by Brazil's National Data Protection Authority (ANPD).
By Angus Crawford - BBC Investigative Team(www.bbc.com)