An AI Compliance Officer (AICO) within your organisation: a must-have?

From smart algorithms that assess mortgage applications to voice assistants that manage our diaries, artificial intelligence is no longer a distant future but an established part of business practice. More and more organisations use AI applications to optimise processes and reduce costs.

The AI Act requires organisations to critically review their use of AI and to take appropriate measures. In that context, appointing an AI Compliance Officer (AICO) is not only sensible, but in some cases a functional necessity.

Inventorying AI systems

The first step towards AI compliance is to identify the different types of AI used within the organisation. AI comes in many forms and will often meet the definition of an AI system under the AI Act. In practice, however, it is not always immediately obvious that a system qualifies as AI: autonomy and inference may be embedded in ordinary software tools or in complex automation processes.

AI systems must then be classified by risk level, distinguishing between prohibited practices, high-risk AI systems, AI systems with transparency risks, and a residual category. Drawing up an algorithm register in which these systems and their classification are recorded is strongly recommended. Please note that the use of AI systems falling within the prohibited category should already have ceased as of 2 February 2025.
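The classification described above can be captured in a simple algorithm register. The sketch below is a minimal, hypothetical illustration in Python; the field names, categories and example systems are assumptions for illustration only, not a prescribed format under the AI Act.

```python
from dataclasses import dataclass
from enum import Enum

# The four risk levels distinguished by the AI Act, as described above.
class RiskCategory(Enum):
    PROHIBITED = "prohibited practice"
    HIGH_RISK = "high-risk AI system"
    TRANSPARENCY = "transparency risk"
    MINIMAL = "residual category"

# One register entry per identified AI system (illustrative fields).
@dataclass
class RegisterEntry:
    system_name: str
    purpose: str
    risk_category: RiskCategory

def flag_prohibited(register):
    """Return entries whose use should already have ceased as of 2 February 2025."""
    return [e for e in register if e.risk_category is RiskCategory.PROHIBITED]

# Hypothetical example entries:
register = [
    RegisterEntry("CV screener", "rank job applicants", RiskCategory.HIGH_RISK),
    RegisterEntry("Support chatbot", "customer service", RiskCategory.TRANSPARENCY),
]
print([e.system_name for e in flag_prohibited(register)])  # -> []
```

In practice, a register will record far more (legal basis, data sources, responsible owner, assessment dates); the point here is simply that each system is listed together with its risk classification so that prohibited uses surface immediately.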

Organising AI literacy

AI is fundamentally different from traditional software. Some AI models can even train themselves and make autonomous decisions. Small errors in a dataset or in programming code can lead to major consequences in an AI system’s outputs, such as bias or discrimination. The AI Act therefore sets explicit requirements for the level of AI literacy within an organisation. This means employees must understand which types of AI systems are used, how those systems work, and which risks are associated with them. Depending on their role, they must be able to identify, report and, where necessary, mitigate those risks. While basic AI knowledge is sufficient for most roles, IT administrators, lawyers and developers, for example, often require more in-depth expertise. Organisations can promote AI literacy through training courses and workshops. This knowledge can also be acquired externally, for example by attending our CAICO programme.

Does every organisation need an AICO?

Not every organisation needs an AICO, but every organisation that uses or develops AI needs some form of AI compliance. The application areas and risk classification of the AI in use determine the required level of AI compliance and governance. In many cases, it is worthwhile to assign responsibility for AI compliance to someone within the organisation, possibly in a combined role, provided that person has the appropriate expertise. As AI becomes more integral to the organisation, it becomes necessary to assign this responsibility explicitly to a specialist. Appointing an AICO is then not a luxury, but a strategic step towards future-proofing the organisation.

The AICO develops a compliance strategy and draws up a roadmap that enables the organisation to work in a structured manner towards compliance with the AI Act. Based on the identified systems, the AICO carries out impact assessments to map the risks. The AICO translates these insights into concrete policies, ranging from guidelines for the use of tools such as ChatGPT to comprehensive protocols for managing AI systems. The AICO also supports the implementation of appropriate control measures to ensure that compliance is workable and effective. Finally, the AICO provides periodic reports and adjusts where necessary, for example when legislation changes.

An AICO acts as a bridge between technology, legislation and day-to-day practice. Having an AICO not only helps to comply with the AI Act, but also builds trust among customers, partners and regulators. This makes the role distinctive and valuable for any organisation that wants to deploy AI responsibly.

Support with AI compliance

We provide comprehensive support for AI compliance matters. We help organisations share knowledge and build expertise through a range of tailored training programmes, from in-depth CAICO courses and sector-specific study days to basic training on the required level of AI literacy. We also carry out full AI inventories to identify and categorise systems. For organisations that are not yet able to fill the AI compliance officer role internally, we offer flexible solutions. This specialist function can be fulfilled by us, either on a secondment basis or remotely, relieving organisations of the burden of complying with the new AI legislation.
