These regrets sound to me more like "your problems" than problems with this confirmation scheme.
> The problem is that the links become ugly, unwieldy for the users to handle
Why? The hash only needs to be big enough that brute-forcing becomes infeasible. Depending on the type of server-level flood control you're using, you can determine exactly how many bits are required.
Or if you choose not to, a conservative estimate of 48 bits (8 base64 characters) should be more than enough. Remember that offline brute-forcing is not possible here, and at a rate of, say, 1M tries per second you'd be brute-forcing yourself silly for years: 2^48 guesses at 10^6 per second is roughly nine years to exhaust the space.
If you can make tighter estimates about flood control, here are some rough numbers (still conservative by a factor of 4): at a limit of 15k tries per second you can get by with 7 base64 characters (6 bits each, so 42 bits). At a limit of 200 tries per second, just 6 characters (36 bits). Without proper flood control a large botnet could make that last one somewhat feasible, maybe.
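If you want to sanity-check the arithmetic, here's a minimal Python sketch; the function, its name and the one-year validity window are my own assumptions, as is the x4 safety factor carried over from above:

    import math

    def required_token_bits(guesses_per_second, validity_seconds, safety_factor=4.0):
        # An online attacker needs ~2^(n-1) guesses on average to hit one
        # specific n-bit token, so solve 2^(n-1) >= safety * rate * validity.
        guesses_needed = safety_factor * guesses_per_second * validity_seconds
        return math.ceil(math.log2(guesses_needed)) + 1

    year = 365 * 24 * 3600
    print(required_token_bits(1_000_000, year))  # 48 bits -> 8 base64 chars
    print(required_token_bits(15_000, year))     # 42 bits -> 7 chars
    print(required_token_bits(200, year))        # 36 bits -> 6 chars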
... unless you mean to imply that the string "reset?user=John&expiry=987654321" is the ugly, unwieldy part :)
> For example the base64 encoding I was using for the authentication token included '.' as one of the characters. Well, it turns out that the linkification code in gmail will ignore a '.' at the end of the link (which is kind of sensible). So 1/64 of my validation links ended up being invalid.
Using the wrong encoding for your URLs has nothing to do with this technique. In fact, I don't think the article even suggested a particular type of encoding.
Did you not know what characters your base64 routine would output, or something? The period isn't the only character you don't want a URL to end with, because of ambiguity with how people use URLs in a sentence: closing parens, commas, exclamation marks, asterisks, etc. Regardless of what one particular webmail provider will linkify or not, it's just common sense, if you think about it.
Closing parens are always a big problem with many Wikipedia links; it's not like this is some obscure "gotcha".
That is what the %XX URL-encoding was made for.
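A quick Python illustration (the token string is made up): percent-encoding turns the base64 specials into something every linkifier treats as part of the URL.

    from urllib.parse import quote

    raw_token = "xK+3/z=="            # plain base64: '+', '/' and '=' are URL-hostile
    print(quote(raw_token, safe=""))  # xK%2B3%2Fz%3D%3D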
Or why not just use what YouTube does? They seem to have it figured out pretty well: A-Z, a-z, 0-9, _ and -
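That alphabet is just URL-safe base64 ('-' and '_' instead of '+' and '/'), and e.g. Python's standard library will hand you such a token directly; a small sketch, with a made-up reset URL:

    import secrets

    # 6 random bytes = 48 bits, encoded as 8 characters drawn from
    # A-Z, a-z, 0-9, '-' and '_' -- nothing a mail client will chop off.
    token = secrets.token_urlsafe(6)
    print("https://example.com/reset?user=John&token=" + token)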