Friday, November 22, 2024

Europe’s Big Tech Law Is Approved. Now Comes the Hard Part


The potential gold standard for online content governance in the EU—the Digital Services Act—is now a reality after the European Parliament voted overwhelmingly for the legislation earlier this week. The final hurdle, a mere formality, is for the European Council of Ministers to sign off on the text in September. Asha Allen is Advocacy Director for Europe, Online Expression & Civic Space at the Centre for Democracy & Technology, Europe Office, where she coordinates advocacy engagement on the Digital Services Act and the European Democracy Action Plan.

The good news is that the landmark legislation includes some of the most extensive transparency and platform accountability obligations to date. It will give users real control over, and insight into, the content they engage with and offer protections from some of the most pervasive and harmful aspects of our online spaces. The focus now turns to implementation of the vast law, as the European Commission begins in earnest to develop the enforcement mechanisms.

The proposed regime is a complex structure in which responsibilities are shared between the European Commission and national regulators, in this case known as Digital Services Coordinators (DSCs). It will rely heavily upon the creation of new roles, expansion of existing responsibilities, and seamless cooperation across borders. What’s clear is that, as of now, there simply isn’t the institutional capacity to enact this legislation effectively.

In a “sneak peek,” the Commission has provided a glimpse into how it proposes to overcome some of the more obvious challenges to implementation—such as how it plans to supervise large online platforms, and how it will attempt to avoid the problems that plague the GDPR, like out-of-sync national regulators and selective enforcement—but the proposal only raises new questions. A huge number of new staff will need to be hired, and a new European Centre for Algorithmic Transparency will need to attract world-class data scientists and experts to aid in the enforcement of the expansive new algorithmic transparency and data accessibility obligations. The Commission’s preliminary vision is to organize its regulatory responsibilities by thematic areas, including a societal issues team, which will be tasked with oversight of some of the novel due diligence obligations.

Insufficient resourcing here is a cause for concern and would ultimately risk turning these hard-won obligations into empty tick-box exercises. One critical example is the platforms’ obligation to conduct assessments to address systemic risks on their services. This is a complex process that will need to take into account all the fundamental rights protected under the EU Charter.

In order to do this, the tech companies will have to develop human rights impact assessments (HRIAs)—an evaluation process meant to identify and mitigate potential human rights risks stemming from a service or business, or in this case a platform—something civil society urged them to do throughout the negotiations. It will, however, be up to the Board, made up of the DSCs and chaired by the Commission, to annually assess the most prominent systemic risks identified, and outline best practices for mitigation measures. As someone who has contributed to developing and assessing HRIAs, I know that this will be no easy feat, even with independent auditors and researchers feeding into the process.

If they are to make an impact, the assessments need to establish comprehensive baselines, concrete impact analyses, evaluation procedures, and stakeholder engagement strategies. The very best HRIAs embed a gender-sensitive approach and pay specific attention to systemic risks that will disproportionately impact those from historically marginalized communities. This is the most concrete method for ensuring all potential rights violations are included.

Luckily, the international human rights framework, such as the UN Guiding Principles on Business and Human Rights, offers guidance on how best to develop these assessments. Nonetheless, the success of the provision will depend upon how platforms interpret and invest in these assessments, and even more so on how well the Commission and national regulators enforce these obligations. But at current capacity, the institutions’ ability to develop the guidelines and best practices and to evaluate mitigation strategies is nowhere near the scale the DSA will require.

Given the enormity of these tasks, it seems that the European Commission will have to put in place dedicated professional teams of qualified human rights experts with a deep understanding of human rights impact assessments. These independent teams would need to be supported by a breadth of additional expertise and knowledge to ensure their actions are inclusive and meaningful. As it stands now, no role is foreseen for the European Fundamental Rights Agency to provide such support and the public consultations envisaged in the development of guidelines that will shape these mitigation measures will be limited at best.

The DSA notes the necessity for civil society’s input and expertise throughout the text, more so than any other text of its kind that has preceded it. It is clear that the Commission will need said expertise in order to support the development and evaluation of such assessments. Quite simply, without the meaningful engagement of advocates in the implementation and enforcement of the entire DSA, the potentially groundbreaking provisions we have collectively worked so diligently to obtain in the text won’t come to fruition.

Establishing and formalizing civil society as an implementation partner, along with the European Parliament, will increase accountability and public scrutiny, and ensure that a human rights-centred approach to enforcement is implemented. The European Commission has already established Advisory Committees, or high-level expert bodies and working groups, to aid implementation of legislation in other areas—structures that we could draw inspiration from. These entities are far from perfect and would have to be appropriately redefined for the DSA context, but the wheel would not need to be reinvented in this case, just reimagined.

Enforcement of the DSA is going to be an uphill climb. Look no further than the ineffective and inconsistent cross-border cooperation when it comes to the GDPR. Unfortunately, there’s no mechanism in the DSA to guarantee independence from political influence, and the depth of the challenges that lie ahead may not be fully understood for several years.

But it is not too late to rectify these potential shortcomings. As the EU institutions and national regulators build more substance into their enforcement strategies, they must acknowledge that if the DSA is to be the gold standard for online content governance, they must innovate and be bold in their approach. Their commitment to systematic engagement with civil society has been written into the law; they must realize this vision by building a collaborative approach to the enforcement mechanisms.

WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here, and see our submission guidelines here. Submit an op-ed at opinion@wired.com.


From: wired
URL: https://www.wired.com/story/digital-services-act-regulation/
