-
Knowledge versus Language
The biggest limitation of today’s Large Language Models (LLMs)—or just ChatGPT, for many people outside the AI space—is the tight coupling between language, knowledge, and context. I call today’s kind of statistical approximation ‘lossy’ because its representation…
-
Understanding isn’t just memorization
We learn all the time, continuously, regardless of our age. We never stop, but would it surprise you that many scientists propose a model in which learning stops once we leave youth? That model is false, although more research would help…
-
New language model for human conversation!
The breathtaking view from Kobe University looking over Osaka Bay – home to the biannual RRG 2025 conference. Linguistic Conference: RRG 2025 The linguistics conference in Kobe, Japan, has just wrapped up. Expert linguists from around the world gave English…
-
Magic from the Speech Genie
Learning a new language as an adult is difficult, requiring years of work to progress. Or does it? Today’s discussion is with Chris Lonsdale, the creator of Kungfu English, a system designed to mimic his success in learning Mandarin and…
-
What’s missing from AI – Part 1
Holding a brain in its hand, a robot ponders the use of ‘contextual meaning’ to help it emulate humans. But how do we store meaning? Photo by julien Tromeur on Unsplash Background In the 1930s, the American focus on behaviourism turned…
-
Patom Theory Understands the Meaning Behind Language
For most of my life, I have pondered a question that sits at the very center of how our brain works: how do we understand language? The question isn’t how we repeat language, nor how we recognize its surface patterns, but how we…
-
Your Struggles Aren’t Your Fault – Rebooting the Brain’s Natural Language System
For more than forty years I have asked myself a simple question: why do some people learn languages quickly and naturally, while others struggle despite years of study? This question has shaped almost every major decision of my life. It…
