Agreed, that is a good resource for sure. For tooling I like https://promptmetheus.com/ but any pun name gets bonus points from me.
> For in-context learning, I think it is fair to expect 100k to 500k context windows sooner. OpenAI is already at 32k.
It has been interesting to see that window grow so quickly. For LLM context, the biggest constraint is pay-per-token pricing if you don't run your own model, so I have to wonder whether that pricing model will stick around given how this is trending. Since the calls are stateless, throwing the entire context up on every request makes it seem likely that OpenAI will encroach on the stores side as well and offer sessions?
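Back-of-the-envelope version of why that matters (a minimal sketch, not from the thread; the price and per-turn token counts are made-up placeholders):

    # Rough illustration: input cost when every call re-sends the full history
    # under pay-per-token pricing. All numbers below are hypothetical.
    PRICE_PER_1K_INPUT_TOKENS = 0.03  # hypothetical $/1K input tokens
    TOKENS_PER_TURN = 500             # hypothetical tokens added per turn

    def cumulative_input_cost(turns: int) -> float:
        """Total input cost if each call resends the entire history so far."""
        total_tokens = 0
        history = 0
        for _ in range(turns):
            history += TOKENS_PER_TURN   # conversation grows every turn
            total_tokens += history      # and the whole thing is re-sent
        return total_tokens / 1000 * PRICE_PER_1K_INPUT_TOKENS

    # Cost grows roughly quadratically with conversation length,
    # which is what makes server-side sessions/state attractive.
    print(f"${cumulative_input_cost(50):.2f}")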