Dharmesh Shah on Demystifying AI: Exponential Growth, Linear Learning, and the Power of LLMs
- Unpacking Generative AI's Core Mechanics
- Why Learning AI is More Manageable Than You Think
- The Surprising Truth About Large Language Models and Tokens
Dharmesh Shah, in his Micro Masterclass at INBOUND 2025, provides a refreshing perspective on artificial intelligence, highlighting that while AI's capabilities are expanding exponentially, the journey to understand it is surprisingly linear and accessible.
Shah emphasizes that the perceived complexity of AI often overshadows its underlying simplicity, particularly in the case of generative AI. He clarifies that generative AI, capable of creating new content, is fundamentally built on Large Language Models (LLMs) that perform a single, powerful task: predicting the next 'token' in a sequence.
He further breaks down the concept of 'tokens' as small, efficient pieces of text, akin to how Gen Z compresses language. This foundational understanding reveals that mastering AI isn't about grappling with an ever-expanding, complex system, but rather incrementally building knowledge on core principles, yielding significant value from even a little learning.
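The "predict the next token" idea can be sketched with a toy bigram model. This is purely illustrative: the corpus, the whitespace tokenizer, and the `predict_next` helper below are hypothetical stand-ins, and real LLMs use neural networks trained over subword tokens rather than simple frequency counts.

```python
import random
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for an LLM's training data.
corpus = (
    "ai is growing fast . learning ai is manageable . "
    "ai is a manageable thing ."
)

# Step 1: "tokenize" -- here a naive whitespace split; real LLM
# tokenizers compress frequent character sequences into subword tokens.
tokens = corpus.split()

# Step 2: count which token follows each token, giving a tiny
# next-token distribution (a drastically simplified stand-in for
# what an LLM learns).
next_counts = defaultdict(Counter)
for current, nxt in zip(tokens, tokens[1:]):
    next_counts[current][nxt] += 1

def predict_next(token: str) -> str:
    """Return the most frequent continuation seen after `token`."""
    return next_counts[token].most_common(1)[0][0]

print(predict_next("ai"))  # "is" -- the only token ever seen after "ai"
```

Generating text is then just this prediction applied repeatedly: each predicted token is appended to the sequence and fed back in as context for the next prediction.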
“Although AI is growing exponentially in terms of capability, the learning curve for AI is much more linear. It's a manageable thing.”
- Dharmesh Shah, INBOUND 2025