Very Large Online Platforms: So, you have 45 million users – what now?

February 17th marked an important milestone for the EU’s landmark Digital Services Act (DSA), as all online platforms operating in Europe were required to publish their average monthly active users (MAU). Online platforms and search engines whose MAU exceed 45 million in the EU will now be designated by the Commission as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).

But what does this mean concretely – and what’s next for these companies? 

Four months until D(SA)-Day

So far, about 20 companies have publicly reported more than 45 million MAU in the EU. A number of other platforms have declared that they do not, at this stage, meet the fateful threshold. This could change, however, for services experiencing significant user growth over the next year or beyond, pushing them into the VLOP category.

Importantly, regardless of how a platform classifies itself, it is the Commission that has the final say on whether it is to be considered a VLOP. A platform could therefore still be designated as a VLOP in the coming months even if it does not meet the threshold today: the Commission may request additional information, and may also designate a service on the basis of other sources if it has credible data indicating that the threshold is met.

Following designation, which will be officially communicated to the platforms by the Commission, the concerned services will have four months to complete their first risk assessment exercise. Given that the Commission is likely to move fast with designations, the first risk assessments will need to be completed as early as August 2023.

July 2023: The summer of risk assessments

As mentioned, the first DSA obligation VLOPs and VLOSEs will need to comply with is undertaking risk assessments (Article 34). Concretely, all services designated by the Commission are required to assess annually the systemic risks stemming from the design, functioning or use of their services, including any actual or foreseeable impact on:

  1. the spread of illegal content;
  2. the exercise of fundamental rights;
  3. democratic processes and civic discourse; and
  4. concerns related to gender-based violence, public health, minors, and serious negative consequences to people’s physical and mental well-being.

Additionally, for all of the above-mentioned risks, platforms need to assess whether they are influenced by any intentional manipulation of their service.
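To make this more tangible, below is a purely illustrative Python sketch of how an internal team might structure a risk register around these four categories and the manipulation dimension. The DSA prescribes no such format or taxonomy; every name, field, and scale here is hypothetical.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

# The four systemic risk categories listed above (shorthand labels are our own;
# the DSA itself does not prescribe any taxonomy or data format).
class RiskCategory(Enum):
    ILLEGAL_CONTENT = "spread of illegal content"
    FUNDAMENTAL_RIGHTS = "exercise of fundamental rights"
    CIVIC_DISCOURSE = "democratic processes and civic discourse"
    SAFETY_AND_WELLBEING = "gender-based violence, public health, minors, well-being"

@dataclass
class RiskEntry:
    category: RiskCategory
    scenario: str                       # concrete risk scenario under assessment
    on_platform: bool                   # True if the effect is observed on the service itself
    manipulation_vector: Optional[str]  # how intentional manipulation could amplify the risk
    severity: int                       # e.g. 1 (low) to 5 (critical); scale is the platform's choice
    likelihood: int                     # same hypothetical 1-5 scale

# Two invented entries showing an on-platform and an off-platform risk side by side.
risk_register = [
    RiskEntry(RiskCategory.ILLEGAL_CONTENT,
              "dissemination of terrorist content via live streaming",
              on_platform=True,
              manipulation_vector="coordinated re-uploads by inauthentic accounts",
              severity=5, likelihood=2),
    RiskEntry(RiskCategory.SAFETY_AND_WELLBEING,
              "recommender loops harming minors' mental well-being",
              on_platform=False,
              manipulation_vector=None,
              severity=4, likelihood=3),
]

for entry in risk_register:
    print(f"[{entry.category.name}] {entry.scenario} "
          f"(severity {entry.severity}, likelihood {entry.likelihood})")
```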

This obligation will present a significant challenge given the novelty of the exercise and the substantial range of risks covered. Some risks will be easier to assess – for example, the dissemination of illegal content, where definitions are more readily available – although even this remains a complex exercise given the differences across jurisdictions. Others, such as negative effects on fundamental rights, will be far more complicated given their broad scope (for example, risks related to freedom of speech, non-discrimination, or children’s rights). The most challenging category is likely to be the assessment of risks whose effects materialise outside the platform, including, for example, impacts on democratic processes or on psychological well-being.

In short, VLOPs and VLOSEs will need to consider risks observed on the platform, such as the spread of terrorist content or hate speech, as well as risks where the impact is seen outside the platform, such as concerns related to the psychological well-being of users. In practice, this will likely also mean constructing multiple risk scenarios to understand the effects of the interaction between platforms and users.

As for assessing how potential intentional manipulations add to these risks, the Code of Practice on Disinformation gives a good indication of what will be expected from VLOPs and VLOSEs when undertaking their DSA risk assessment cycle.

What happens next: Transparency, audits & access to data 

The risk assessment obligation is only the first step. Once a platform has identified and assessed its systemic risks, it will be required to draw up and implement detailed mitigation plans. In addition to this annual self-assessment obligation (risk assessment results and adjustment of mitigation measures), VLOPs will be required to undergo yearly independent audits of their risk assessments and the mitigation measures taken. Where the audit report finds deficiencies, the VLOP (or VLOSE) will have only a month to set out a plan for addressing the gaps identified. Once the audit is completed, platforms will be required to make the audit results public.
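To keep these timelines straight, they can be laid out as a simple compliance calendar. The sketch below is illustrative only: the designation and audit dates are invented, the timing of the first audit is our own assumption, and the intervals (four months to the first risk assessment, yearly cycles, one month to respond to audit deficiencies) are those described in this article.

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Naive month arithmetic; clamps the day to 28 to avoid invalid dates."""
    total = d.year * 12 + (d.month - 1) + months
    year, month = divmod(total, 12)
    return date(year, month + 1, min(d.day, 28))

designation = date(2023, 4, 1)  # hypothetical designation date

milestones = {
    # Four months after designation, per the DSA timeline described above.
    "First risk assessment due": add_months(designation, 4),
    # Assumption: the first audit cycle follows roughly a year after the
    # first risk assessment; the exact timing is not fixed in this article.
    "First independent audit cycle": add_months(designation, 16),
}

for label, deadline in milestones.items():
    print(f"{label}: {deadline:%d %B %Y}")

# If an audit report finds deficiencies, a remediation plan is due one month
# after the report – e.g. for a report received on this (invented) date:
audit_report_received = date(2024, 9, 15)
print(f"Remediation plan due: {add_months(audit_report_received, 1):%d %B %Y}")
```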

The verification mechanisms do not stop here: the DSA’s data access provisions mean that VLOPs and VLOSEs must, under specific conditions, provide access to data to regulators as well as vetted third-party researchers, enabling research into the systemic risks affecting the EU. As such, the risk assessments conducted internally by the platforms, as well as the mitigation measures adopted, are likely to come under significant scrutiny – not only by auditors but also by researchers, who may conduct independent risk assessments based on the data they receive access to.

Below 45 million users but growing fast?

If a platform is on the cusp of reaching 45 million MAU in Europe but is not quite there yet, now is the time to test and prepare in-house, as well as to watch how the existing VLOPs comply, since best practices are likely to emerge. The flexibility of the risk assessment framework and its applicability to a wide variety of services mean that the definition of successful compliance will also evolve. The current VLOPs are likely to set the stage and can serve as a helpful benchmark for those reaching the threshold later.

In the meantime, such platforms need to keep in mind that the MAU reporting requirement was not a one-off obligation: it is a recurring duty, with updated figures to be published at least once every six months. The Commission and national Digital Services Coordinators can also request updated MAU numbers, or explanations of the underlying calculations, at any time.
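As a rough illustration of the arithmetic behind the threshold (the figures below are invented, and the DSA leaves the precise counting methodology to further Commission guidance), the check itself is simply an average of monthly figures compared against the 45 million mark, with designation applying where the average is equal to or higher than the threshold:

```python
# Hypothetical EU-wide monthly active recipient counts over a six-month window.
monthly_active_recipients = {
    "2023-03": 41_200_000,
    "2023-04": 42_900_000,
    "2023-05": 44_100_000,
    "2023-06": 45_800_000,
    "2023-07": 46_500_000,
    "2023-08": 47_200_000,
}

VLOP_THRESHOLD = 45_000_000

average_mau = sum(monthly_active_recipients.values()) / len(monthly_active_recipients)
print(f"Average MAU over the period: {average_mau:,.0f}")

if average_mau >= VLOP_THRESHOLD:
    print("At or above the 45 million mark - designation as a VLOP/VLOSE is likely.")
else:
    print("Below the threshold - but keep publishing updated figures every six months.")
```

In this invented example the average still sits just below 45 million despite the last three months exceeding it, which is exactly the situation of a fast-growing platform that should be preparing for designation now.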

How can Tremau help?

If you are a very large online platform, or are likely to become one in the near future, you are probably already charting the risk areas where an assessment is required under the DSA. However, risk assessments and the necessary mitigation measures can be difficult for internal teams to deploy alone.

Tremau’s expert advisory team can help you carry out these risk assessments and support your internal teams. We also specialize in assessing your existing processes and practices, flagging where current mitigation measures fall short of DSA requirements and industry best practices, and offering best-in-class remediation plans for long-term compliance.

Feel like you need support now? Tremau can help – check out our advisory services to learn more.
