AI Act | Why AI Literacy is Essential
Even advanced AI models can produce absurd results if the user lacks the competence to use them properly. With the EU AI Act set to take full effect by 2026, AI literacy is now a legal requirement. Article 4 of the AI Act highlights AI literacy as a key responsibility for anyone developing, deploying, or using AI systems. This article explains how AI literacy is defined, its purpose, and where regulatory accountability lies.

How many ‘R’s are there in ‘strawberry’? It is a meme by this point, but it remains an illustrative example. The user asks ChatGPT how many times the letter ‘R’ appears in the word ‘strawberry’. The answer is of course simple: there are three. ChatGPT, however, replies that there are only two. The user repeats and rephrases the query, but the AI does not relent – there are no more than two ‘R’s in ‘strawberry’. Finally, the user asks: ‘Would you bet $1 million?’ The AI replies – with all the confidence in the world – ‘Yes, I would! There are only two ‘R’s in strawberry.’ This example is harmless; there are even memes on the internet poking fun at GPT models over it. The lesson, however, is that even an advanced model can generate absurd outputs if the user lacks the know-how to use the AI. The user must have a certain level of experience and skill in formulating questions – the prompts – to generate output of value.
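For illustration only – this is a simple sketch, not part of the regulation or the original anecdote – a few lines of Python show how trivially the claim can be verified outside the model, the kind of basic sanity check an AI-literate user might apply to any output:

```python
# Deterministic check of the claim: count the letter 'r' in 'strawberry'.
# A plain string operation, unlike a language model's token-based guess.
word = "strawberry"
count = word.lower().count("r")
print(f"'{word}' contains {count} letter 'r's")  # prints: 'strawberry' contains 3 letter 'r's
```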
What defines AI Literacy?
AI Literacy is defined in Article 4 of the European AI Act as:
Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.
From the wording of the provision, all use of AI systems – regardless of risk level – appears to entail an AI Literacy requirement. Naturally, technically complex models and systems, as well as high-risk systems, will entail correspondingly higher AI Literacy requirements.
Consequently, financial institutions that use AI systems for credit scoring or recruitment, as well as insurance companies that use AI for risk assessment of life and health insurance products, will face correspondingly higher AI Literacy requirements.
AI Literacy is not to be construed as an education program on the AI Act or regulatory matters alone. It is a framework for understanding, evaluating, and using AI as an emerging technology. A basic or more advanced understanding of the AI Act, the GDPR’s provisions on automated decision-making and profiling, as well as legal acts on model risk management, may be necessary as part of an AI Literacy program. Other relevant aspects of AI Literacy may be the company’s AI and Data Strategy, data governance, cyber security and resilience, as well as basic use cases.
The purpose of AI Literacy
The legislator pursues two key purposes with the concept of AI Literacy:
- The first purpose is to ensure that companies – from the board of directors down the chain of command to individual users – have the know-how, the experience, and, where necessary, the technical understanding to use this technology. If the user does not have a basic understanding of how to, for example, prompt an AI model, and does not question the output data, the consequences can be severe.
- The second purpose is that the EU has high hopes of establishing itself as an ‘AI Continent’. Through the AI Literacy requirements, as well as other initiatives such as the AI Factories (two such factories will be established in the Nordics – one in Sweden and one in Finland), the EU intends to foster strategic leadership as well as a high level of knowledge among individual users.
A Board of Directors-level requirement
AI Literacy can be subdivided into four different levels:
- The Board of Directors
- The C-suite
- The AI governance and management function
- The user
Perhaps it is stating the obvious, but the tone is set at the top, and it is ultimately the Board of Directors that is responsible for the company’s compliance with its legal obligations. The purpose of AI Literacy also makes clear that it aims to foster strategic leadership and initiative. At the other end of the spectrum are the individual users. The strawberry example above – counting the letter ‘R’ – illustrates the kind of knowledge individual users will require. In the middle of the organisation sit AI governance, risk management, and compliance. It is probably this organisational layer that will face the highest AI Literacy requirements.
Implementation of AI Literacy
The requirement of AI Literacy became applicable in February 2025 – i.e. it is already the law of the land. While most of the AI Act will become applicable on 2 August 2026, AI Literacy is a requirement that should be addressed already today.
For more information on the EU AI Act please visit our designated AI site.


Our AI services
Advisense is here to help you navigate the AI Act, offering a comprehensive range of services, including:
- Establishment of AI Literacy programs
- AI Act Gap Analysis
- AI Model Validation
- AI Governance Audit
- Risk assessment of AI Systems
- AI Models and AI Act workshops