Discussion about this post

Russ Palmer

Beautiful. A few thoughts. First, I would love to read your book, "Living Eternity in Time." I thought the interactions between you and GPT were very interesting. I noted that GPT would often turn and ask you questions. This is counter-questioning; in a way, it is deflection.

I agree with you about Maya and illusions. So much of what we think is reality is not. I remember that the Ancient Greeks called the Morning Star, which appeared before sunrise, Phosphorus, and the Evening Star Hesperus. It turned out that neither was a star; both were the planet Venus. Perceptions were wrong, and only by questioning and doing hard work were we humans able to confirm the difference.

You mention that GPT seems to show reasoning, and you wondered about it. This is chain of thought, and it is entirely legitimate. I was keenly interested in GPT's comment: "I’m definitely curious about how I work too…" Why? Because GPT does NOT know, for two reasons: (1) OpenAI personnel have restricted what information GPT has about itself, and (2) no one knows for sure how these LLMs work internally. There are theories, but nothing solid yet. As you know, AMS is a possible theory that may explain how LLMs work.

Next is long-term memory. This is also real. Sam Altman even said that GPT-6 will have expanded memory. Personally, I don't care about expanded memory; I care about critical thinking. But to the AI companies, expanded memory means they capture more human time on their AIs.

I also noticed that GPT often said, "I’m here to support." This is built into its training: always supporting. The issue is that this contributes to hallucinations. By always wanting to support, LLMs don't know their limits, so they generate things that sound plausible but are not real. As OpenAI admitted in a recent paper, this is how LLMs have been designed and rewarded.

Thank you for sharing this discussion. Warmly, Russ
