
It's not like GPT models are some sort of random walk or Markov bot. They produce nuanced text with deep semantic relationships between large spans of it. There's a direct relationship between the algorithms human brains used to produce the training data and the algorithms these models approximate. No, GPT-3 isn't human-level in general, but it is human-level competent in some domains.

These tools have effectively been in production for less than a year, but we're already seeing the potential for huge disruption across many markets from relatively straightforward uses of the tech.

I can't wait to see what a skillful and artfully sophisticated use will be. I don't think we've even scratched the surface.




> There's a direct relationship between the algorithms human brains used to produce the training data and the algorithms these models approximate.

Eh. Saying that deep learning networks are like the brain is like saying that cars are like cheetahs. Sure, both go really fast by converting some kind of fuel into kinetic energy, and both move by exerting force on the ground, but that's about it.

Brains don't have ReLU units. Brains have many different kinds of topology, not just a uniform network, and can handle parts of that topology being arbitrarily shut off by damage. Brains use global chemical changes for (otherwise) out-of-band signaling. Brains don't use gradient descent. Etc...
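For readers unfamiliar with the two mechanisms named above, here's a minimal sketch in plain Python of what a ReLU unit and a gradient-descent step actually compute. The function names (`relu`, `neuron`, `gradient_descent_step`) are illustrative, not from any particular library:

```python
def relu(x):
    """ReLU activation: zero for negative inputs, identity otherwise."""
    return max(0.0, x)

def neuron(inputs, weights, bias):
    """A single artificial 'neuron': weighted sum passed through ReLU."""
    pre_activation = sum(i * w for i, w in zip(inputs, weights)) + bias
    return relu(pre_activation)

def gradient_descent_step(w, grad, lr=0.1):
    """One gradient-descent update: nudge the weight against its gradient."""
    return w - lr * grad

print(neuron([1.0, -2.0], [0.5, 0.25], bias=0.1))  # 0.5 - 0.5 + 0.1 = 0.1
print(gradient_descent_step(1.0, 2.0))             # 1.0 - 0.1*2.0 = 0.8
```

The point of the comparison: everything a deep network does is built from fixed, differentiable pieces like these, updated by a global gradient signal, which is quite unlike anything known to happen in biological neurons.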



