TikTok is making a promotional push in Europe and Australia around a bundle of safety-focused features, some of which it announced in the US earlier this month, and which it says are aimed at protecting teenage users from dangerous challenges.
The company remains the target of a major consumer protection complaint in the region — which has led to active monitoring of its policies by the European Commission.
The measures being (re)announced by TikTok include a permanent in-app guide that encourages teens to follow a four-step process (“stop, think, decide, act”) before taking part in online challenges; a dedicated policy category for dangerous acts and challenges in the reporting menu, to make it easier for users to flag problem challenges; and dedicated safety videos from curated creators, pushed to users who are under 18 via their ‘For You’ feeds, to further raise awareness of the risks around challenges.
In a sample video from the #SaferTogether campaign, TikTok creator @maddylucydann sketches a scenario in which a medic in an emergency department is figuring out what to tell a young patient who has been admitted with serious injuries after falling while attempting to imitate parkour moves seen in an online video, without having the necessary skills to pull off the tricks safely. The video puts heavy emphasis on getting kids to think before they attempt something similarly risky.
Also today, in what looks like a new announcement, TikTok said it’s making a financial contribution to Western Sydney University to support further research into online challenges, and sharing research data with the university’s Young and Resilient Research Centre to that end.
It says this data “formed the basis” of an earlier report, authored by Dr. Zoe Hilton and published by Praesidio Safeguarding.
“We believe these two contributions will help the Western Sydney University Young and Resilient Research Centre in their interdisciplinary approach to developing evidence-based policies and practices to strengthen the resilience of young people in the digital age,” TikTok said in a blog post attributed to Alexandra Evans, its head of safety public policy, Europe.
The blog post also quotes Amanda Third, the co-director of the Centre which will be benefitting from the platform’s financial and data-based largess — stating that TikTok’s contribution will “assist it to explore the challenges involved in keeping young people safe online with real world data”; and “help us develop research to inform policies, programs and interventions to minimise the risks and maximise the benefits of the digital age for young people”.
TikTok’s blog post does not specify how much money it is donating to the university, but when asked for the figure the company (and Third) told us it’s AUD 108,420 (around $78,000).
The video-sharing platform has been facing months of scrutiny by regulators in the EU following consumer protection and privacy complaints; and an emergency intervention in Italy last year related to concerns over a ‘blackout challenge’ which local media had linked to the death of a child.
In the latter case, TikTok denied any link with its platform but ended up removing more than half a million accounts in Italy that it had been unable to verify did not belong to children under the age of 13.
We reached out to the Italian data protection authority for an update on its monitoring of the company’s safety measures but at the time of writing it had not responded.
The European consumer protection umbrella association, BEUC, declined to give an assessment of the specific measures TikTok has announced today — saying it prefers to wait for regulators to weigh in on its concerns.
“We prefer to wait for the assessment by the consumer protection authorities who are following up on our complaints that the video sharing platform is breaking consumer law,” Alexandre Biard, team leader for enforcement at BEUC, told us, adding: “We expect the authorities to take measures to make sure the platform respects consumer rights.”
We also contacted the Commission asking for a progress update on its scrutiny of TikTok’s ToS and will update this report with any response.
Attention to online child safety has been ramping up across multiple jurisdictions in recent years.
In the US, tech execs from major platforms have been grilled by lawmakers on the issue — which has led to a number of bills being proposed, including most recently the Kids Online Safety Act.
Elsewhere in Europe, the UK is now enforcing a Children’s Code which aims to regulate platforms’ design choices and defaults by forcing them to prioritize privacy and safety. The country also has much broader Online Safety legislation in the pipeline, again with a major focus on child safety, which will introduce a legal duty of care on platforms towards users.
In Australia, there’s another Online Safety Bill on the slate, introduced at the end of 2021, which similarly puts emphasis on tightening the law to protect children from cyberbullying and other online safety risks.
So there’s a clear, global consensus emerging around regulating platforms under a child protection rubric.