The EU has opened a formal investigation into TikTok over whether the platform adequately safeguards children under the Digital Services Act (DSA).
This marks the second inquiry under the DSA, following a December probe into Elon Musk's X.
The investigation, announced on Monday, centers on allegations that TikTok failed in its duty to protect minors from potential online harms.
The European Commission is scrutinizing TikTok, especially for potentially exposing young users to harmful content through its algorithmic recommendations.
Other key issues under examination include TikTok's age verification processes and its transparency regarding advertising. Researcher access to TikTok's data is also being assessed, following TikTok's response to the Commission's earlier demand for information on its content monitoring and child protection strategies.
The Commission vows to diligently pursue evidence and impose necessary enforcement actions. Thierry Breton, the EU's internal market commissioner, emphasizes the platform's critical responsibility to adhere to the DSA, given its extensive reach to young users within the EU.
With over 142 million monthly users in the EU, TikTok has been urged by Commission executive vice president Margrethe Vestager to carefully evaluate the service's risks. Meanwhile, TikTok asserts its commitment to safeguarding young users, touting its pioneering efforts in age-appropriate features and settings.
There's no set deadline for the investigation's conclusion. Nevertheless, the DSA empowers the EU to impose significant fines, up to six percent of a company's global turnover, and to ban platforms for severe, repeated violations.
The DSA, which has applied to all digital services since February 17, mandates stricter content regulation and enhanced online consumer protection.