In the customer support example, he tells the assistant his new phone doesn't work, and it simply starts making things up: that the phone was delivered two days ago, and that there's physically nothing wrong with it, neither of which it actually knows. It's a very impressive tech demo, but it's a bit like pretending we have AGI when we really don't yet.
(Also, they managed to make it sound exactly like an insincere, rambling morning talk show host - I assume this is a solvable problem though.)
It's possible to imagine a call like this working, though, using ChatGPT's memory, or even just by supplying the relevant context in an initial brain dump. So it doesn't feel too far off.