This is an issue. I haven't experimented to see if there are workarounds, so for now the service checks the length of the article text and, if it's very long, sends only a portion; otherwise we'd exceed the token limit. There's a note on the front page about it: "Limitations: The OpenAI API does not allow submission of large texts, so summarization may only be based on a portion of the whole article."
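
Roughly, the check looks something like the sketch below. The threshold, function name, and character-based cutoff are illustrative assumptions on my part (the real limit is in tokens, not characters, so any character cap is an approximation):

```python
# Hypothetical sketch of the length check; not the service's actual code.
# MAX_CHARS is an assumed cap chosen to stay safely under the model's
# token limit, using a rough characters-per-token ratio.
MAX_CHARS = 12000


def text_for_summary(article_text: str) -> str:
    """Return the article text, truncated if it is too long to submit whole."""
    if len(article_text) > MAX_CHARS:
        # Send only the leading portion; the summary may miss later content.
        return article_text[:MAX_CHARS]
    return article_text
```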