
The AI Act, Innovation and Competition

The recently published AI Act aims to prevent market fragmentation, which is a positive development. However, many of its provisions also hinder innovation in Europe. Here is a brief overview of some of the points discussed by Thibault Schrepel (ALTI) in “Decoding the AI Act: A Critical Guide for Competition Experts”:

  • Lack of practical tech neutrality: Although the AI Act aims for technology neutrality, it fails to differentiate between deterministic and non-deterministic AI systems, imposing the same stringent regulations on both. This approach penalizes safer AI systems that are easier to control and does not adjust regulatory burdens to reflect system design.

  • Disproportionate compliance burden: Similar to the GDPR, the AI Act enforces the same rules on all companies, irrespective of their size. This places a disproportionate burden on smaller companies, which may struggle with the high costs of compliance that larger firms can absorb more easily. For example, the AI Act requires costly compliance measures for high-risk systems, such as extensive documentation, continuous risk management systems, and human oversight. The same logic applies to the provisions on general-purpose AI models (GPAIs). These high costs will deter innovation by SMEs.
  • Unnecessarily ambiguous language: The AI Act relies on vague terms, leading to uncertainty and potential legal disputes. For instance, the definitions and requirements for high-risk AI systems use phrases like “sufficiently representative” and “to the best extent possible”, which make compliance challenging and expensive. Ambiguities such as what counts as “real-time” biometric identification or “materially distorting” behavior are likely to lead to legal challenges. This creates a chilling effect on innovation driven by the fear of non-compliance.
  • Transparency vs. competition: The Act’s transparency requirements, including automatic event recording (logs) and disclosure of how systems operate, can facilitate reverse engineering and the sharing of sensitive information, potentially harming competitive dynamics and even enabling collusion.
  • Standardization and industry dominance: The reliance on standards set by standardization organizations, which may be dominated by large players, can disadvantage smaller companies. This may lead to industry capture, where a few large entities control the standards, reducing overall market competitiveness and innovation.

  • Non-adaptive regulation: The AI Act lacks mechanisms for quick adaptation to new technological developments. While the Commission can amend the list of high-risk AI systems, broader changes require lengthy legislative processes, making it difficult to keep pace with rapid technological advancements.

These factors together create a regulatory environment that may inhibit innovation, especially for smaller companies and startups, and reduce the overall competitiveness of the EU’s AI market. For more (including complete references to the relevant Articles of the AI Act), see “Decoding the AI Act: A Critical Guide for Competition Experts”.
