Just a side note: I was showing some friends GPT-3 and asking it some trivial questions, trying to impress them, when one of them asked me: "Is it any different from, like, asking Google for the same information?" From a purely naive and practical point of view, she wasn't that wrong.
I watched Google grow up, and for people whose main experience is with the last decade or so, I’d say they have a very insightful view.
Yes, some have argued that Google can’t make content and from a technological standpoint that’s right, but here’s the fulcrum: from the perspective of “I asked a question and got a result that seems right though I’m not sure”, there’s shockingly little difference between GPT-3 generated content, and low-effort SEO-optimized content found on the first page of Google.
When you use Google, you expect it to help you to find something tangible that already exists. When you prompt GPT-3, it feels more like synthesising something that didn’t exist, even though all of the words and phrases in the response have been used before.
The line is blurry, though. Google uses machine learning too, so it’s possible that nobody on the planet has ever seen your particular search results for a given query, which feels a bit like synthesis, although the building blocks are bigger. And GPT-3 isn’t really synthesising at all, of course, but it’s a convincing enough illusion that synthesis is a helpful mental model.
Yes, of course it's different. Google can point you to a real Wikipedia article, while GPT-3 will generate links to fake ones (or you could ask it to generate a fake article). It's the difference between truth and fiction.
Or at least, a first-level approximation. Wikipedia articles can be wrong.
At the level of abstraction I believe your friend was talking about, yes, this is effectively a condensed model of "The Internet" and the results returned by Google.
For those saying it can generate new things: well, it is interpolating between existing data points in the corpus, just mushed through a bunch of math.
Yes, but also, mixing together different articles on the Internet by different authors results in a complicated mixture of truth and fiction. What you get back depends on the query and some amount of randomness (depending on settings).
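That "some amount of randomness (depending on settings)" usually means the sampling temperature. A minimal sketch of how temperature-based sampling works, with made-up logits (this is the standard softmax-with-temperature idea, not GPT-3's actual internals):

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Pick a token index from raw scores; lower temperature = less random."""
    if temperature <= 0:
        # Greedy decoding: always the highest-scoring token, fully deterministic.
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sample an index according to the softmax probabilities.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

logits = [1.0, 3.0, 2.0]  # toy scores for three candidate tokens
print(sample_with_temperature(logits, temperature=0))    # greedy: index 1 every time
print(sample_with_temperature(logits, temperature=1.5))  # higher temperature: more variety
```

At temperature 0 the same prompt gives the same continuation every time; crank it up and the "mixture of truth and fiction" you get back starts to vary from run to run.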
By not mixing the articles together, a search engine lets you see where the information came from.
These are both useful things to do, but one is more useful for fiction and the other for nonfiction.
I'm not sure "interpolation" is the right word, though, for these mixtures. Transformer output seems more creative than that.
As a creative person, I'd say creativity is overrated. But don't let me speak for everyone.
It is a straight line in a higher-dimensional space. We vibe with the space, or it chooses us by prior experience. Creativity is, in some respect, in the eye of the beholder. I think the creativity we see in ML models is not unlike the creativity we ascribe to humans.
Interpolation is the right word, and the right concept and metaphor: a mixing between two concepts to create a path between them.
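The "straight line between two concepts" picture can be sketched with toy vectors. The embeddings below are made up purely for illustration; real model representations are thousands of dimensions, but the geometry is the same:

```python
def lerp(a, b, t):
    """Linear interpolation: t=0.0 gives a, t=1.0 gives b, values between mix them."""
    return [(1 - t) * x + t * y for x, y in zip(a, b)]

# Hypothetical 3-d "concept embeddings" (invented numbers, not from any real model).
formal_prose = [0.9, 0.1, 0.4]
casual_prose = [0.2, 0.8, 0.6]

# Walking the straight line between the two concepts:
for t in (0.0, 0.5, 1.0):
    print(t, lerp(formal_prose, casual_prose, t))
```

Every point on that line is "between" the two endpoints, which is why interpolated output can feel new without containing anything that wasn't already implied by the corpus.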