
A landmark new set of EU rules for a safer and more accountable online environment enters into force with the Digital Services Act (DSA). The DSA applies to all digital services that connect consumers to goods, services, or content. It creates comprehensive new obligations for online platforms to reduce harms and counter risks online, introduces strong protections for users' rights online, and places digital platforms under a unique new transparency and accountability framework. Designed as a single, uniform set of rules for the EU, these rules will give users new protections and businesses legal certainty across the whole single market. The DSA is a first-of-its-kind regulatory toolbox globally and sets an international benchmark for a regulatory approach to online intermediaries.

New responsibilities for digital services.       The DSA introduces a comprehensive new set of rules for online intermediary services on how they have to design their services and procedures. The new rules include new responsibilities to limit the spread of illegal content and illegal products online, increase the protection of minors, and give users more choice and better information.

All online intermediaries will have to comply with wide-ranging new transparency obligations to increase accountability and oversight, for example with new flagging mechanisms for illegal content. But a special regime is introduced for platforms with more than 45 million users: for such very large online platforms or search engines, further obligations include wide-ranging annual assessments of the risks of online harms on their services - for example with regard to exposure to illegal goods or content or the dissemination of disinformation. Under the DSA, these platforms will have to put suitable risk mitigation measures in place and will be subject to independent auditing of their services and mitigation measures.

Smaller platforms and start-ups will benefit from a reduced set of obligations, special exemptions from certain rules, and, most crucially, increased legal clarity and certainty for operating across the EU's whole single market.

Enhanced safeguards for fundamental rights online.      The new rules protect users' fundamental rights in the EU in the online environment as well. New protections for the freedom of expression will limit arbitrary content moderation decisions by platforms and offer new ways for users to take informed action against a platform when their content is moderated: for example, users of online platforms will now have multiple means of challenging content moderation decisions, including when these decisions are based on platforms' terms and conditions. Users can complain directly to the platform, choose an out-of-court dispute settlement body, or seek redress before the courts.

New rules also require platforms' terms to be presented in a clear and concise manner and to respect users' fundamental rights.

Very large online platforms and search engines will in addition have to undertake a comprehensive assessment of risks to fundamental rights, including the freedom of expression, the protection of personal data, and freedom and pluralism of the media online as well as the rights of the child.

New supervisory powers for the Commission.       The DSA creates an unprecedented level of public oversight of online platforms across the Union, both at national and EU level. The Commission has powers to directly supervise very large online platforms (VLOPs) and very large online search engines (VLOSEs), services which individually reach more than 10% of the EU population, approximately 45 million people. Additionally, each Member State will have to designate a Digital Services Coordinator, who will supervise other entities in scope of the DSA, as well as VLOPs and VLOSEs for non-systemic issues. The national coordinators and the European Commission will cooperate through a European Board for Digital Services, an EU-wide cooperation mechanism between national regulators and the Commission.

The Commission is setting up a European Centre for Algorithmic Transparency (ECAT) to support its supervisory role with in-house and external multidisciplinary expertise. The Centre will provide support with assessments of whether the functioning of algorithmic systems is in line with the risk management obligations that the DSA establishes for VLOPs and VLOSEs to ensure a safe, predictable and trusted online environment.