“Artificial Intelligence” (AI) and “Generative AI” (or “Gen-AI”) have become click-bait for corporate pitches. These days, every tool and platform promises AI-powered capabilities, from boardroom discussions to routine tech updates, and while the potential of AI cannot be denied, this trend is getting a little out of control. Nearly every digital service, solution, and product is marketed as “AI-driven,” while what’s actually running under the hood is ignored or hidden. This is true not just for the IT industry but for other industries as well, where superhero-style expectations of AI are frequently unmet.

Disclaimer: This is an AI-generated image. I know, such an irony, LOL.
AI and Gen-AI are being overused ad nauseam. No doubt, using AI and ML has its upsides: it multiplies productivity, helps you climb the steep learning curve on complex topics, hell, it can even code for you. But the idea will always come from you. Treating AI as a one-size-fits-all solution, especially in something as human-centric as L&D, can lead to utter disappointment.
The Overuse and Misuse of “Gen-AI” in the Digital World
Generative AI refers to algorithms that can create new content, whether that be text, images, or even music. The AI models behind this, such as OpenAI’s GPT (Generative Pre-trained Transformer), can generate human-like responses and outputs based on the vast amounts of data they’ve been trained on. This technology is groundbreaking in fields like natural language processing and image generation, where computers can now produce coherent, relevant output from the data fed to them.
But as with any new technology, hype often outpaces reality. Today, the term “AI” is slapped onto almost every digital product, often as a marketing tactic. Companies tout their tools as “AI-powered,” even when what’s really under the hood is a basic set of automation scripts or simple data analytics—technologies that have been around for years. This AI branding can sometimes give the impression that the technology is more advanced than it actually is, leading to inflated expectations about what AI can do, particularly in areas like HR and L&D.
AI in L&D: Navigating Polanyi’s Paradox
When it comes to Learning & Development, the application of AI is even more fraught with challenges. Yes, AI can assist in personalizing learning paths, automating routine tasks, and generating content for training programs. But there’s a big difference between delivering information and facilitating true learning. AI can churn out structured content—facts, procedures, and knowledge that can be clearly defined—but it struggles with tacit knowledge.
Tacit knowledge refers to the kind of wisdom and skill that can’t easily be written down or codified. Philosopher Michael Polanyi called this “knowing more than we can tell,” which is also known as Polanyi’s Paradox. It describes the way people know how to do things intuitively, even if they can’t always articulate the steps. Think of a master craftsman teaching an apprentice—there’s a lot that happens through observation, practice, and personal insight that can’t simply be transferred through instruction manuals or training videos. This is where AI hits a wall.
AI has all the knowledge in the world, and it is good at task automation. Musk says the cumulative sum of human knowledge has already been exhausted in training AI [1]. But that doesn’t mean it can draw insights and help humans accumulate experiential learning without getting their hands dirty. Nicholas Carr, in his book (it is from 2014, but it still holds true), states that
…the assumption that computers need to be able to reproduce the tacit knowledge humans would apply to perform complicated tasks is itself open to doubt.
The Glass Cage: Automation and Us – Nicholas Carr
In simple terms, AI is not mature enough to articulate the wisdom from its structured knowledge. Not yet.
Another aspect is Learning from Experience and Emotions. In L&D, true development goes beyond structured knowledge transfer. It requires personal insights, human interaction, feedback, and experience-based learning. Attach an emotion to a concept and the learning sticks for years. AI can help with some of the more routine aspects of learning, like suggesting relevant content or monitoring progress, but when it comes to giving personalized feedback or fostering creativity, it’s just not equipped to handle the complexity.
Why AI is Only a Supplement in L&D, Not a Replacement
Generative AI can be a valuable tool in L&D, helping to automate certain tasks and analyze learning data, but it cannot replace the human element that’s crucial to effective learning. AI can deliver structured content, suggest learning paths, and even assess performance to some degree. But true learning, especially in the workplace, goes beyond facts and figures. It involves mentorship, feedback, collaboration, and experience—things that AI cannot provide.
AI lacks the ability to connect on an emotional level. A human coach or mentor brings empathy, insight, and the ability to adapt feedback in ways that machines simply cannot. Learning is a deeply human process, and while AI can assist in many ways, it will never be a substitute for real human interaction and judgment.
My 2 Cents
AI/Gen-AI, with all its might and glory, can only assist in transferring structured knowledge: it can code for you, automate mundane tasks, and make you more productive. But it can’t fully replicate the human intuition that drives true learning and development.
As AI continues to develop, it will undoubtedly play a growing role in L&D. But it’s crucial to approach this technology with a healthy dose of skepticism. AI is not a magic solution; it’s a tool, and like any tool, its effectiveness depends on how we use it. In areas like L&D, AI can help overcome some challenges, particularly in structured knowledge transfer and data analysis. But it will always need to be supplemented by human oversight, empathy, and judgment. It’s important to remember that while AI is powerful, it’s far from perfect. We are not there yet!