Discussion about this post

"Memory is the second half of AGI."

Yes, absolutely! Well, part of it at least? This AI apocalypse is so fun!

One important aspect of LLMs et al. that I keep in mind is that they are not us. Most humans can hold 5-7 thoughts in short-term memory, aka context. We only 'know' this because we created a model of human thought and based a test (IQ) on that. Should the concept even apply to LLMs? For now it's the best model we have, but I'd guess you agree that we need new models of cognition to grok AGI. So far we've been evaluating them against human tests. What would an LLM-native IQ test be?!

Dogs & horses don't understand human minds and vice versa, yet we have worked together effectively for thousands of years to our mutual benefit. We may never understand AGI. AGI may never understand us. You & I will never know each other's mental processes, but we can work together. There's likely a kernel of deep wisdom in this paragraph. Maybe AGI can figure out what it is. :)

