That's great, and I don't begrudge you, but most people want to be able to tell their computer what to do and not need to understand the discrete steps it took to get there.
Taking a completely different type of example: image editing. Let's say you ask your computer to remove a blemish in a photo. A professional could remove it without AI, maybe even better. They know which tools to use, which keys to press, and how to effect the change. Regular people don't give a crap about that; they want to circle the item (or otherwise identify it) and click "remove." When the computer removes the selected item they're happy, and generative AI is working on THAT type of solution.
It's not here yet, so yes you're right that Siri IS maddening to use.
> Regular people don't give a crap about that, they want to circle the item (or otherwise identify it) and click "remove." When the computer removes the selected item they're happy, and generative AI is working on THAT type of solution.
This feels dangerously close to a lack of empathy for the user. I understand that's not your intention, in fact the opposite. But in order to accept the notion that users actually want an intelligent employee instead of a tool I have to believe that everyone truly wants to be a manager instead of an individual contributor. I don't believe it.
Take a simpler case, hammering in a nail. What I want from my hammer is for it to disappear and become an extension of my arm. I just want to hammer in the nail. I don't want to negotiate with the hammer about how it's going to strike the nail, all I want is to hit the nail. There's no amount of "clever" the hammer can be which will help. Cleverness can only hurt my user experience.
In your example, what recourse does the user have if the AI didn't do the job the way they wanted? Removing something from an image implies (probably? or maybe not?) that the void is "backfilled" somehow. What if they're not happy with the backfill job? Do they have to argue with the tool about it? Will the tool take their feedback well or will it become a fight?
I think, generally, giving users tools that scale like hammers is the way to go. A hammer in the hands of a skilled carpenter, blacksmith, or cobbler with 30 years of experience is no different from the same hammer in the hands of a 2-year-old learning to drive their first nail. But that hammer's utility will scale with that child's skill for their entire lifetime. There's no "beginner" vs "advanced" distinction. What makes us (as computer hammer builders) believe that we can distinguish between "beginner" and "advanced" computer hammers? Or "regular" vs "special" users?
EDIT: or maybe we're not building hammers, instead we're building dishwashers. Dishwasher users aren't supposed to be skilled beyond loading and unloading the dishwasher, and hitting the start button. Do "regular users" really want an appliance, or do they want a tool?
EDIT: another way to phrase it -- are computers "bicycles for the mind" or are they just a bus?
That's very true. I dislike how Apple, etc. don't expose the manual controls for things. So when the smart tools stop working it gets frustrating, because there's no manual way to continue.