AI literacy is not optional


AI is probably going to become the most worn-out phrase of 2025. We're all fed up with the hyped promises, the power games of the tech bros, and the AGI circus. However, AI is not going to vanish into thin air like the Metaverse did. If we look beyond the noise of greed and power games, AI is about mathematical algorithms wrapped in software and applied to large amounts of data for the purpose of automation, optimisation, and generation of outputs. I believe that 2025 will be a year of adoption. Algorithms will become part of the majority of digital applications, most of them working quietly behind the scenes, solving real problems. And causing real problems.


A key element in the responsible and value-creating adoption of AI is competence. In fact, Article 4 in Chapter I of the EU AI Act deals with AI literacy:

Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.


In short, this means that companies providing or using AI systems in Europe are obliged to ensure that the relevant people in their organisations have sufficient knowledge about the AI systems in use. This obligation enters into force on February 2nd 2025. I suggest you read that last sentence one more time.

According to a global survey by Upwork, 96% of executives say they expect the use of AI tools to increase their company's overall productivity. Through my work as a keynote speaker, I meet thousands of leaders across industries in Europe every year. When I ask the audience, usually fewer than a quarter of them have tried an AI tool for professional or personal purposes over the last three months. To me, this indicates a situation where top executives have very little knowledge of the technology toolbox they expect their organisations to adopt. AI literacy is a much-discussed topic, and course offerings are abundant: prompting classes, Copilot courses, and ChatGPT first-aid kits. But do we have a conscious understanding of what types of literacy and skills are needed for various roles and purposes in an organisation?

I foresee that a majority of my incoming keynote requests in 2025 will be about the pragmatic use of mathematical algorithms applied to large amounts of data to automate, optimise, and generate outcomes. For leaders to get their arms around the AI beast, they need to be able to see beyond the current fog of noise and hype.

Generative AI has been hailed as the path to AGI. Yet it also has a catastrophic environmental footprint. Leaning thoughtlessly on generative AI as a means to "increase productivity" may be a fast and dirty path to burning up this planet. Leaders need to understand that the AI toolbox is much bigger than ChatGPT, and that there is enormous potential value to reap from using non-generative prediction models for operational problem-solving. Maths is just the natural next step of the digital transformation.

This means that 2025 will be the year when leaders have to learn more about mathematical algorithms applied to large amounts of data. Fulfilling the AI literacy requirements of the EU AI Act needs to start from the top.


PS. If you need help, feel free to reach out.😉
