Hey everyone, I wanted to share a new tool we've created called Jotte (
https://jotte.ai) which we believe can be a game-changer for AI-generated longform writing like novels and research papers.
As you may know, current AI models like ChatGPT and GPT-3 have a context window of around 4,000 tokens, or roughly 3,000 words, which limits their effectiveness on longer writing tasks. With Jotte, we've developed a graph-based approach that summarizes information as it scrolls out of the window, effectively giving the AI "unlimited" memory.
Jotte remembers recent details like the meal a character ate a page ago, while avoiding getting bogged down by irrelevant details like the blue curtains mentioned 5 chapters ago. We've created a proof of concept and would love to hear your thoughts on it.
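To make the idea concrete, here's a minimal sketch of that recency behavior: keep the last few passages verbatim and collapse anything older into short summary nodes. This is purely illustrative; Jotte's actual graph structure isn't public, and the class name, the `window` parameter, and the first-sentence `summarize()` heuristic below are all stand-in assumptions (in practice the summarizer would be another LLM call).

```python
# Illustrative sketch only -- not Jotte's real implementation.
# Recent passages stay verbatim; older ones become coarse summaries.

class SummaryMemory:
    """Keep the last `window` passages verbatim; summarize older ones."""

    def __init__(self, window=3):
        self.window = window
        self.recent = []      # verbatim recent passages
        self.summaries = []   # compressed nodes for older material

    def summarize(self, text):
        # Stand-in for an LLM summarization call: keep the first sentence.
        return text.split(".")[0] + "."

    def add(self, passage):
        self.recent.append(passage)
        if len(self.recent) > self.window:
            oldest = self.recent.pop(0)
            self.summaries.append(self.summarize(oldest))

    def context(self):
        # What gets fed back into the prompt: summaries first, then
        # the verbatim recent text.
        return self.summaries + self.recent


mem = SummaryMemory(window=2)
mem.add("Alice ate porridge for breakfast. She hated it.")
mem.add("The curtains in the study were blue. Nobody noticed.")
mem.add("Alice left for the station. The train was late.")
print(mem.context())
# The oldest passage has been reduced to one sentence; the rest are intact.
```

A graph version would presumably link these summary nodes (character, plot thread, location) so retrieval can be selective rather than strictly chronological.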
Do you think this approach could lead to better longform writing by AI? Let us know in the comments!
I have a running bet with a friend about whether the future is going to be OBM (One Big Model) or LoLM (Lots of Little Models). I'm strongly in the LoLM/graph camp and have been working in that direction as well: https://github.com/Miserlou/Helix