
EU AI Act: AI literacy becomes law
The EU AI Act (the “Act”) came into force on 1 August 2024, with its provisions taking effect in stages over the following two years. One of the first provisions to apply is Article 4 on AI literacy, which must be complied with by 2 February 2025. It applies to all organisations involved in the development, import, distribution or use of AI in the EU.
What is AI literacy?
The Act defines AI literacy as “skills, knowledge and understanding that allow providers, deployers and affected persons, taking into account their respective rights and obligations in the context of this Regulation, to make an informed deployment of AI systems, as well as to gain awareness about the opportunities and risks of AI and possible harm it can cause” (Article 3(56) EU AI Act). The definition is deliberately broad and generic. It does not address the level or quality of skills, knowledge and understanding required. Rather, what it arguably emphasises is the phrase “make an informed deployment of AI systems”, which implies that the culmination of AI literacy lies in knowing, in a holistic sense, how best to use AI. It also underlines the importance of ensuring that those who use AI have sufficient knowledge and skills to navigate a complex and ever-changing landscape.
What is the compliance obligation imposed by the Act?
Article 4 of the Act sets out the AI literacy requirement. It states:
“Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.”
The scope of Article 4 is very important. It applies to providers (those developing AI systems, per Article 3(3) of the Act) and deployers (those using AI systems in their organisations, per Article 3(4) of the Act). This means the AI literacy obligations are likely to apply to most organisations that use AI.
Article 4 goes on to state that providers and deployers must ensure, “to their best extent”, that their employees and anyone else who operates their AI systems on their behalf have a “sufficient level” of AI literacy. This obligation includes considering their technical knowledge, experience, education and training, as well as the context in which the AI is to be used. The aim is to ensure AI is used responsibly and to minimise risk and harm. There is currently little guidance on what is meant by “best extent” or “sufficient level”, both of which appear to be contextual and flexible concepts. This is borne out by the remainder of Article 4, which states that training must take into account “…the context the AI systems are to be used in and the persons … on whom the AI systems are to be used.” In addition, EU law is subject to an overriding duty of proportionality under Article 5(4) of the Treaty on European Union, which means a provision should not be applied in a manner that is clearly disproportionate to the aim it is designed to achieve.
How can organisations comply with the AI literacy requirements?
Compliance involves more than just technical knowledge: it extends to broader issues such as ethics, commercial awareness and your organisation’s risk appetite. Most organisations will fall within Article 4 and will therefore need to comply with the Act. Although the wording of Article 4 is somewhat vague, there are steps that can be taken now to meet its requirements. They include:
- Staff training
AI literacy training should be mandatory for all staff involved in the use of AI, whether new joiners or existing employees. Your organisation should assess current AI literacy levels amongst employees and develop training programmes with continuously updated content and regular refresher sessions. Detailed records of all training should be kept to demonstrate compliance. Cripps can assist with the provision of AI literacy training, should you need help with this.
- Governance framework
Your organisation should have robust policies and procedures to ensure compliance with the Act. AI literacy includes being aware of your rights and obligations under the EU AI Act, and a governance framework ensures employees and other staff understand the responsibilities and procedures needed to comply. Policies should cover the responsibilities and procedures for AI compliance, including what AI can and cannot be used for (for example, not inputting sensitive or confidential information), to help mitigate risk. There should also be a dedicated team responsible for compliance, able to deliver ongoing training and updates on new developments.
- Inventory of AI systems
Keeping an inventory of all AI systems used within the organisation is crucial to good AI governance. The inventory should record whether each system was developed internally or procured externally, what it is used for, and what it is designed to do.
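By way of illustration only, the short sketch below (in Python) shows one possible shape for such an inventory record. The field names and the example entry are our own illustrative assumptions and are not prescribed by the Act.

# Illustrative sketch only: one possible shape for an AI system inventory record.
# Field names are illustrative assumptions, not terms prescribed by the Act.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AISystemRecord:
    name: str                    # product or internal project name
    developed_internally: bool   # built in-house or procured externally
    supplier: str                # external provider, or "internal" if built in-house
    intended_purpose: str        # what the system is designed to do
    business_use: str            # how the organisation actually uses it
    users: List[str] = field(default_factory=list)  # teams or roles operating it

# Example entry for a hypothetical externally procured chatbot
inventory = [
    AISystemRecord(
        name="Customer support chatbot",
        developed_internally=False,
        supplier="Example vendor",
        intended_purpose="Answer routine customer queries",
        business_use="First-line customer support triage",
        users=["Customer services team"],
    ),
]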
Compliance with the Act is a continuous journey that will evolve over time. By taking these steps now, your organisation can help to ensure it meets the AI literacy requirements. Without sufficient knowledge and skills, there is a higher risk of AI being misused, leading to potential harm, and organisations may miss out on the full potential of AI to drive innovation. AI literacy is not only a regulatory requirement but also essential for businesses to stay competitive in today’s landscape.
How we can help
If you would like help with complying with the AI literacy requirement, please contact our commercial team, who will be happy to help.