
There is actually a paper by OpenAI themselves on summarizing long documents: essentially, break the text into smaller chunks and run a multi-stage sequential summarization, where each chunk uses a trailing window of the previous chunk as context, applied recursively. https://arxiv.org/abs/2109.10862

I did a rough implementation myself, and it works well even for articles around 20k tokens. But it's kind of slow (and more costly) because of all the additional overlapping runs required.
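A minimal sketch of the chunk-and-recurse scheme described above, assuming a hypothetical `summarize_fn` standing in for the actual LLM call (the chunk sizes and overlap here are illustrative, not the paper's settings):

```python
# Sketch of recursive chunked summarization with a trailing overlap window.
# `summarize_fn(context, chunk)` is a stand-in for an LLM summarization call;
# it receives the running summary as context for the next chunk.

def chunk_with_overlap(words, chunk_size, overlap):
    """Split a word/token list into chunks; each chunk starts `overlap`
    items before the previous chunk ended."""
    step = chunk_size - overlap
    return [words[i:i + chunk_size]
            for i in range(0, max(len(words) - overlap, 1), step)]

def recursive_summarize(text, summarize_fn, chunk_size=512, overlap=64, max_len=256):
    """Summarize chunks sequentially, then recurse on the concatenated
    summaries until the result fits within `max_len` words.
    Assumes `summarize_fn` actually shrinks its input, else this won't terminate."""
    words = text.split()
    if len(words) <= max_len:
        return text
    summaries = []
    context = ""
    for chunk in chunk_with_overlap(words, chunk_size, overlap):
        context = summarize_fn(context, " ".join(chunk))
        summaries.append(context)
    return recursive_summarize(" ".join(summaries), summarize_fn,
                               chunk_size, overlap, max_len)
```

Each chunk re-reads `overlap` words already covered by the previous chunk, which is where the extra (and costlier) overlapping runs mentioned above come from.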




gpt-index (now LlamaIndex) can do that automatically.



