Brussels launches a formal investigation into TikTok over lack of protection for minors and lack of transparency | EUROtoday


The European Commission opened formal proceedings this Monday to assess whether TikTok may have infringed the European Digital Services Act with regard to the protection of minors, advertising transparency and data access for researchers, as well as the risk management of addictive design and harmful content.

“Today we open an investigation into TikTok over suspected breaches of transparency and child protection obligations: addictive design and screen time limits, the ‘rabbit hole’ effect, age verification, and default privacy settings. The application of the European regulation is necessary for a safer internet for young people,” said the Commissioner responsible for the Internal Market, Frenchman Thierry Breton, who already has similar proceedings open against other digital giants.

“Based on the preliminary investigation carried out so far, and in particular after analysis of the risk assessment report sent by TikTok in September 2023, as well as the company’s responses to the Commission’s formal requests for information on illegal content, protection of minors and access to data,” the French commissioner’s team has decided to launch this formal procedure.

In a statement published this afternoon, Brussels sets out the points on which the company’s preliminary explanations failed to convince, and which now expose it to a file that could end in sanctions. The EU is concerned about the platform’s algorithm, “which may stimulate addictions or create so-called rabbit hole effects.” An in-depth assessment, it states, “is necessary to counter the potential risks for the exercise of the fundamental right to the physical and mental well-being of the person, respect for the rights of the child, and its impact on radicalization processes.” In other words, it is not only a problem of consumption, but also of political or religious radicalization. “The measures, and in particular the age verification tools used by TikTok to prevent minors from accessing inappropriate content, may not be reasonable, proportionate and effective,” say Community experts.

The other big problem, one of the issues highlighted in the first conversations between the Commission and the technology company, concerns the requirements of the European Digital Services Act to guarantee “a high level of privacy, safety and security for minors,” especially with regard to the privacy settings set “by default for minors as part of the design and functioning of their recommendation systems.”

As almost always happens in these investigations, European officials complain about the lack of access to companies’ data and systems. The opening of a formal procedure does not prejudge the outcome. Companies are sometimes exonerated, but more often the result is specific demands: behavioral modifications, operational changes, or even the breakup of the business structure, under the threat of multimillion-euro sanctions. Brussels also notes that whatever is decided on the aforementioned points does not preclude measures over other infringements, “for example in relation to a provider’s obligations regarding the dissemination of illicit content, such as terrorist content or online sexual abuse of minors, or the notification of suspected crimes.”

A new phase now begins, but unlike in Competition procedures, for example, the Digital Services Act does not establish any legal deadline for concluding formal investigations. “The duration of an in-depth investigation depends on several factors, among them the complexity of the matter, the degree of cooperation of the affected company with the Commission and the exercise of the rights of defense,” the statement warns.