ChatGPT is dumber than it looks
That’s not true for a screwdriver.
Or a table saw or even a spatula.
These are useful tools, but they don’t pretend to be well-informed or wise. They’re dumb, and they look dumb too.
That’s one reason that tools are effective. We use them to leverage our effort, but we don’t trust them to do things that they’re not good at.
The reason AI language models are dumb is that they don’t actually know anything. The model is simply calculating probabilities. Not about the unknown, but about everything. Each word, each sentence, is a statistical guess.
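To see why, here’s a toy sketch. It isn’t how ChatGPT actually works (real models are neural networks trained on enormous amounts of text), but it’s the same move in miniature: pick the next word by probability, with no idea what any of it means. The tiny corpus and the guess_next() helper here are invented purely for illustration.

```python
import random
from collections import Counter, defaultdict

# A toy "language model": count which word follows which in a tiny corpus,
# then write a sentence by sampling each next word from those counts.
# Real models use neural networks over enormous datasets, but the core move
# is the same: every next word is a weighted guess, not a known fact.

corpus = ("the saw cuts the board the saw cuts wood "
          "the spatula flips the egg on the board").split()

follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def guess_next(word):
    counts = follows[word]
    words = list(counts)
    weights = list(counts.values())
    return random.choices(words, weights=weights)[0]  # a statistical guess

word = "the"
sentence = [word]
for _ in range(8):
    if not follows[word]:  # dead end: this word never appears mid-corpus
        break
    word = guess_next(word)
    sentence.append(word)

print(" ".join(sentence))
```

Run it a few times and you’ll get plausible-looking word strings. The program never knows that a saw cuts boards; it only knows which words tended to follow which.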
I’ve mostly switched to claude.ai because it’s more effective and less arrogant, but it’s still guessing.
If a guess is good enough, you’re set. If it’s not, plan accordingly.
In my experience, the most useful approaches to AI are:
Ask clearly bounded questions, where you can easily inspect the results.
Don’t let AI make decisions for you. Instead, challenge it to broaden your options.
Take advantage of the fact that it has no feelings and no stake in yours, and push it for blunt, useful feedback.
Don’t ignore AI because it’s dumb. Figure out patterns and processes that let you use it as the useful tool it’s becoming.