How can companies prepare for the EU AI Act and AI Literacy?

An Interview with Vonne Laan, Lawyer and AI & Privacy Expert at The Data Lawyers

Since February 2, 2025, the AI Act has required organizations to work on AI literacy. Vonne Laan, a lawyer at The Data Lawyers, shares her insights and key lessons in this interview on the impact of this law on organizations, how it differs from existing regulations, and how companies can prepare while continuing to innovate. This interview follows an event previously organized by Conclusion AI 360 on the EU AI Act and AI literacy.

Key Takeaways:

  • The EU AI Act encourages organizations to innovate responsibly.
  • Involve legal experts from the start to quickly identify risks and prevent delays.
  • Start with the basics: begin small and build step by step.

February 4th, 2025   |   News   |   By: Conclusion AI 360


What exactly is the EU AI Act, and why is this legislation important?

“The EU AI Act is a new European law designed to ensure the responsible use of AI systems,” Vonne begins. “It complements existing regulations such as the GDPR and product safety laws. The EU AI Act is part of the broader EU Data Strategy, which aims to foster innovation in a responsible manner. The law is intended to minimize risks to health, safety, and fundamental rights while promoting innovation.”

How does the EU AI Act differ from existing regulations like the GDPR?

“While the GDPR focuses on privacy and data protection, the EU AI Act has a broader scope. It looks at risks in the context of health, safety, and fundamental rights. Additionally, the EU AI Act is risk-based: systems are classified into risk categories ranging from minimal to high risk. The higher the risk, the more obligations apply. This prevents overregulation,” Vonne explains.

She emphasizes that the EU AI Act supplements the GDPR rather than replacing it. “The goal is not to burden businesses but to fill gaps in existing legislation.”

What obligations does the EU AI Act introduce?

“The EU AI Act establishes different levels of obligations depending on the risk level of an AI system,” Vonne explains. “One of the first obligations is that, from February 2, 2025, companies must train their employees in AI literacy. This means that everyone in the organization must understand what AI is, how it works, and what risks it entails.

For low-risk systems, the primary requirement is transparency: users must be aware when they are interacting with AI, such as with chatbots or deepfakes. For high-risk systems, stricter requirements apply, including data governance, risk management, and human oversight.”

On prohibited applications, Vonne is clear: “Think of social credit scoring by governments. That is not allowed in the EU.”

Key Obligations of the EU AI Act:

  • Training and AI Literacy: From February 2, 2025, organizations must train their employees on AI-related topics, tailored to their roles and backgrounds.
  • Transparency Obligations: Users must be informed when they are interacting with AI, such as with chatbots or deepfakes.
  • High-Risk Systems: Stricter requirements apply for high-risk systems, including data management, cybersecurity, risk management, and human oversight.
  • Prohibited AI Applications: Some AI systems, such as social credit scoring, are completely banned.

How can companies legally demonstrate compliance with the EU AI Act?

"It starts with taking inventory of the AI models you use and conducting a thorough AI assessment," says Vonne. "After an AI inventory, an AI risk assessment is conducted. This assessment determines both the risk qualification of a system and the organization's role within the EU AI Act. For high-risk systems, a Conformity Assessment is mandatory, which provides a structured evaluation of compliance. Additionally, specific impact assessments are required, such as a DPIA (Data Protection Impact Assessment) and an FRIA (Fundamental Rights Impact Assessment)."

What are the potential consequences of non-compliance?

"The fines are substantial," warns Vonne. "While the maximum fine under the GDPR is €20 million or 4% of an organization's global turnover, under the EU AI Act, it is €35 million or 7%. Additionally, damage claims and reputational harm are real risks." She also emphasizes the importance of trust: "If an organization does not comply with the law, it can seriously damage the trust of customers and partners."

What role do legal experts play in AI projects?

"Lawyers are essential to ensuring compliance," says Vonne. "We make sure that actions align with the law and recognize how documentation could later be used against an organization, for example, by a regulator or a competitor. Our advice is tailored to implementation and focused on daily practice. We translate vague legal standards into concrete, applicable policies."

"The statements made by a chatbot can have legal consequences."

She adds: "Consider promises made to a consumer, such as guaranteeing a discount."

Does the AI Act stand in the way of innovation?

“I don’t see innovation and compliance as opposites,” Vonne states. “It’s about responsible innovation. Start small, prioritize, and build from there. But start now. Compliance obligations take time and resources, but they prevent major issues in the long run.”

What are your recommendations for companies preparing for the AI Act?

Vonne concludes with a few practical tips:

  • "Learn from the implementation of the GDPR. Many lessons are also applicable here. Consider tailoring information to the recipient and experiences regarding when Privacy Champions, also known as Privacy Ambassadors, are or are not suitable."
  • "Start with a foundation and build gradually. Strict prioritization and a long-term plan can help with this."
  • "Involve legal expertise early in the process to prevent mistakes and delays."

The AI Act not only presents challenges for companies but also opportunities to differentiate themselves through responsible and inclusive AI applications. As Vonne aptly puts it:

"Responsible innovation will take you a long way toward compliance."

The rapid evolution of AI demands flexibility from organizations: businesses must be able to recognize AI risks while seizing AI opportunities. Start small, and don’t wait until everything is perfectly in place. Pursue compliance and innovation in parallel; waiting another three years is not an option.

Does your organization have questions about the AI Act or need support?

Get in touch with The Data Lawyers or Conclusion AI 360 to ensure your organization is prepared for this new regulation.