I can see how Twitter's length limit led to external link shorteners, but once they took it in-house, why expose it? Why not just say that a link (of any length) "costs" n characters, and handle the shortening/expansion on the backend? What is the benefit of exposing the mechanism?
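The backend-only model suggested above could be as simple as a flat per-link charge against the character limit. A minimal sketch, assuming a hypothetical flat cost of 23 characters per link (roughly the length t.co links ended up at; the exact value doesn't matter here):

```python
import re

# Hypothetical cost model: any URL, regardless of its real length,
# counts as LINK_COST characters toward the 140-character limit.
LINK_COST = 23  # illustrative; roughly the eventual t.co link length

URL_RE = re.compile(r'https?://\S+')

def effective_length(text: str) -> int:
    """Length of a tweet when every URL is charged a flat LINK_COST."""
    without_links = URL_RE.sub('', text)
    n_links = len(URL_RE.findall(text))
    return len(without_links) + n_links * LINK_COST
```

Under this scheme the stored tweet keeps the full URL, and the length check never depends on what the shortened form looks like.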
Redirects of all types are especially bad on mobile devices since they require a pointless new round trip, which anecdotally seems to double the amount of time it takes for a webpage to start loading.
Since Facebook and many other services do this on their mobile apps by default, I am surprised I don't hear more complaints about it. Twice through the cell tower is too long.
I couldn't agree more. A short while ago I looked for a Firefox extension that checks whether the URL in the body of an anchor matches its href attribute and alerts you if they differ. I couldn't find one, nor could I be bothered to write one, but I was bemused by the relative lack of info on the topic.
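For what it's worth, the check such an extension would need is straightforward: compare each anchor's link-looking text against its href. A rough sketch of that check using Python's stdlib HTML parser (class and function names are made up for illustration):

```python
from html.parser import HTMLParser

class LinkMismatchFinder(HTMLParser):
    """Flags <a> tags whose URL-like visible text differs from their href —
    the check the hypothetical extension would run on every anchor."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.mismatches = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            self._href = dict(attrs).get('href')
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == 'a' and self._href is not None:
            text = ''.join(self._text).strip()
            # Only compare when the anchor text itself looks like a URL.
            if text.startswith(('http://', 'https://')) and text != self._href:
                self.mismatches.append((text, self._href))
            self._href = None

def find_mismatches(html: str):
    finder = LinkMismatchFinder()
    finder.feed(html)
    return finder.mismatches
```

Run against the t.co example discussed later in this thread, it would flag the anchor whose visible text is the mlkshk URL but whose href points at t.co.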
My last straw with Google Chrome was when I noticed that the browser's "could not load webpage" screen (e.g. when you don't have an Internet connection) redirects through Google when you click "try reloading the page". I don't know if they still do it, but that crossed the line for me.
Thanks to Google's recent shenanigans, I've switched to StartPage, which, at least for now, is Google without the Big Brother tracking. Although I'm sure that if it gets popular, Google will stop them.
If he was not misled by the false "on hover" representation of the link in his address bar, then it's possible he means:
1. That Twitter uses a "short link" where Google uses some kind of (I trust) token-secured "open redirect"
or
2. That Google uses one kind of redirect from {Javascript, 301, ...} where Twitter uses another kind from the same set.
Depending on which User-Agent I sent, I got Twitter to return variously a Javascript or a 301 Moved Permanently response. I could only get Google to return a Javascript response to cURL, but I did not try hard, and I would not rule out Google employing different redirect methods, particularly on their search results page. Google is notorious for falling back to different methods depending on the particulars of the client. See:
So at the least they both use the Javascript method. In any event, if you mod out the content of the hyperlink after the domain, and mod out the content of the text-based HTTP response (!), which is fair here, then the methods are all equivalent, and all generate the same client/server diagrams.
In this case, there is not much difference between the browser parsing plaintext in HTTP headers that tells it to go to a different site and the browser parsing plaintext in an HTTP body that tells it to go to a different site once executed in a Javascript engine.
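To make that equivalence concrete: a client that wants the redirect target has to parse plaintext either way, whether it arrives in a Location header or in a script in the body. A toy sketch (the JS pattern matched here is purely illustrative; real pages use many variants):

```python
import re

def redirect_target(status: int, headers: dict, body: str):
    """Extract the redirect destination whether the server used an HTTP
    3xx Location header or a JavaScript location assignment in the body.
    Illustrative only: a real client would handle many more JS forms."""
    if 300 <= status < 400 and 'Location' in headers:
        return headers['Location']
    m = re.search(r'(?:window\.)?location(?:\.href)?\s*=\s*["\']([^"\']+)["\']', body)
    return m.group(1) if m else None
```

Either branch yields the same destination and costs the client the same extra round trip.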
How does the fake link hover work? With Javascript turned off, the first time I hover over a link I see the Google redirect URL, but the second time I hover I see the fake URL.
Well, that sucks; I stand corrected. I was looking at the browser's link hover and hadn't noticed the redirect. I still don't understand why the browser isn't telling me what I'm clicking on; that's what bothers me.
"Right click - copy link" is what's most annoying about this URL redirection. You don't want to send an URL of 4 lines in an email or IM, so then you have to go visit the link and get the real address from there.
Can't they do something where the href is left intact, but an additional onclick handler is added to track clicks?
If you're a regular user it's not exposed. Links are shown with their full URL on the site, and the user is usually redirected before they have a chance to notice they're being sent through t.co.
It could be as simple as a flag on the API: do you want shortened links and a guarantee that the message is under 140 characters, or the full link, with the risk that it will "spill over"? As most (all?) clients expand links anyway, they'd choose the latter.
> Am I supposed to be mad at twitter for forcing me to use their unreliable link shortener?
Yes.
> How is this any different from the countless times twitter.com was down?
This is different because if I read a tweet cached in my client or archived somewhere, I can't reach the external resource when the middleman is down.
> It's a centralized service, things happen.
The WWW is decentralized by design. Link shorteners, and reliance on them, make it obnoxiously centralized. The tweet itself should contain the full link and shorten it only on display [0] [1].
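Display-time shortening, as suggested in [0] and [1], could be as little as truncating a link's visible form while the stored tweet keeps the full href intact. A minimal sketch (function name and length cutoff are arbitrary):

```python
from urllib.parse import urlsplit

def display_form(url: str, max_len: int = 30) -> str:
    """Shorten a full URL for display only; the stored tweet keeps the
    original link, so no redirector has to be up for it to resolve."""
    parts = urlsplit(url)
    bare = parts.netloc + parts.path  # drop the scheme for display
    return bare if len(bare) <= max_len else bare[:max_len - 1] + '…'
```

The client renders the truncated form, but copying or clicking the link uses the original URL, so nothing breaks if a middleman goes down.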
So, why is (apparently) everyone — including Twitter — sending us through t.co?
Example here[0] where you can see Twitter displays "mlkshk.com/r/K29L.jpeg" in the tweet, and downright lies to you when hovering over it with "http://mlkshk.com/r/K29L.jpeg" while the real href is http://t.co/fqxnG84t.
"If you are an authoritative administrator of t.co and you have solved the abuse issues you can write to us at dbl-mmxi@spamhaus.org from either abuse@t.co or postmaster@t.co and inform us of the actions you have taken to clean up the current spammer URLs. Please also inform us of any steps you have taken to prevent future abuse of your shortener/redirector. We will review your request and, at our discretion, remove the listing or respond to your request."
I find my Twitter useless now (although I can still copy links). It's amazing to see how Twitter has become my information/discovery network. I use Facebook just to browse pictures and other casual stuff shared by my friends/family.
Twitter can kill Flipboard and tons of other "discovery engines" with one flip. I think Twitter figured this out a couple of years back, and that's what they are doing: a real-time discovery network. It is more powerful than any of the existing old media, be it newspapers or TV networks.
EDIT: It also makes for an extremely powerful advertising medium.
I'm sure someone is having an interesting conversation about this right now. Twitter needs to stop having these embarrassing outages if it wants to be a permanent player in the tech field.
They've had them all along. It didn't stop them from getting to where they are. Why should they start caring now? From a cost/benefit perspective, it is probably not worth it to them to do whatever is needed to achieve six-nines uptime.