
I believe those demos involve "priming" GPT-3 by providing a few examples of (text → generated code), then, at inference time, passing in just the text. The model follows the pattern set by the examples and generates a string of code/mockup syntax, which is then evaluated/rendered.
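Roughly, a minimal sketch of that priming pattern against the 2020-era OpenAI completions API (the example pairs and prompt format here are made up for illustration; the actual demos used their own examples and delimiters):

  import openai

  openai.api_key = "YOUR_API_KEY"  # assumes you have GPT-3 API access

  # Few-shot "priming": a handful of (description -> code) pairs, then the
  # new description on its own. GPT-3 continues the established pattern.
  # These pairs are hypothetical, not the demo author's actual prompt.
  prompt = (
      "description: a button that says hello\n"
      "code: <button>hello</button>\n"
      "###\n"
      "description: a red heading that says welcome\n"
      'code: <h1 style="color: red">welcome</h1>\n'
      "###\n"
      "description: a text input next to a submit button\n"
      "code:"
  )

  response = openai.Completion.create(
      engine="davinci",   # the base GPT-3 model from the 2020 beta
      prompt=prompt,
      max_tokens=64,
      temperature=0.0,
      stop=["###"],       # stop before the model starts a new example
  )

  print(response["choices"][0]["text"].strip())  # generated markup to render

The stop sequence matters here: without it the model tends to keep generating further made-up example pairs instead of just the one completion you asked for.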

Edit: here is a tweet (from the author of the GPT-3 layout generator) that seems to show this in practice: https://twitter.com/sharifshameem/status/1282692481608331265



