The encoding goes E(T), E(E(T)), etc. Let K be the per-character encoding overhead, so E(T) has length K*T; after N levels the size is O(K^N * T), where N is the number of nesting levels and T is the length of the text at level 0.
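A quick sketch of that blow-up (Python, using plain base64 as a stand-in for however itty.bitty actually packs the page; the 4/3 ratio is just base64's expansion, not a measured value):

```python
import base64

text = b"hello world " * 100          # level-0 text of length T
data, lengths = text, [len(text)]
for level in range(5):                 # apply E repeatedly: E(T), E(E(T)), ...
    data = base64.b64encode(data)
    lengths.append(len(data))

# Each level multiplies the length by roughly K = 4/3, so after N levels
# the size is about (4/3)**N * T: exponential in N, linear in T.
for n, size in enumerate(lengths):
    print(f"level {n}: {size} chars ({size / lengths[0]:.2f}x)")
```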
It seems that, due to the extra text from https://itty.bitty.site, each iteration increases the size of the next link. I wonder how the URLs are generated, and whether the increase is linear or exponential.
My guess is a 1st-degree polynomial: there's a constant addition with each iteration, plus the contents appear to be base64(ish)-encoded into the next URL, which multiplies the length by a constant.
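To make that concrete, here's a tiny recurrence model of "constant addition plus constant multiplier" per iteration (the values of k and c below are placeholders, not measured from itty.bitty):

```python
def url_length(t0: float, k: float, c: float, levels: int) -> float:
    """Length after `levels` rounds of re-encoding, assuming L_{n+1} = k * L_n + c."""
    length = t0
    for _ in range(levels):
        length = k * length + c
    return length

# k = 4/3 is base64's expansion ratio; c = 40 is a made-up fixed overhead
# for the wrapper URL text.
for n in range(6):
    print(n, round(url_length(1000, 4 / 3, 40, n)))
```

With k = 1 the constant addition alone gives linear growth (the 1st-degree-polynomial case); with any multiplier above 1, such as base64's 4/3, the sizes grow geometrically in the number of iterations.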