"And they're right- a boy who enjoys t-shirts and sports is at a tremendous social advantage over one prefers dresses and ballet"
Perhaps in some very conservative societies.
Where I live I'm confident that a boy who enthusiastically engages in things which actually interest him is at a social advantage over one who halfheartedly pretends to enjoy whatever his parents think is normal.
Ballet, for sure. Maybe not dresses so much, at least not if he is doing the wearing, but having an interest in the clothes of his female peers certainly isn't any kind of social problem.
For small messages it will be slightly slower in terms of CPU time because the JSON is parsed in JS rather than by the browser's native parser. But with modern JS engines the time taken for small messages should be less than a monitor refresh.
For small messages on fast networks there really isn't much time to save, so in this worst case for optimisation I think it is best to aim for no significant performance impact. The best improvements come where the messages are very large, the network is slow, or the server can write out a stream.
The XHR2 spec says: "While the download is progressing, queue a task to fire a progress event named progress about every 50ms or for every byte received, whichever is least frequent".
Because of "whichever is least frequent", for messages that take less than 50ms from first byte to last, Oboe.js will get only one callback from the browser and will fire all of its own callbacks inside the same frame of Javascript execution. This is exactly what we want, since it avoids the browser rendering between the callbacks, which would be very bad for overall download time.
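As a rough sketch, hooking into those progress events looks something like this (parseChunk and parseComplete are hypothetical stand-ins for an incremental parser like the one inside Oboe.js, and the URL is invented):

    var xhr = new XMLHttpRequest();
    var seenBytes = 0;

    xhr.onprogress = function () {
        // responseText grows as bytes arrive; hand only the new part on
        var newText = xhr.responseText.substring(seenBytes);
        seenBytes = xhr.responseText.length;
        parseChunk(newText);    // hypothetical incremental parse
    };

    xhr.onload = function () {
        // for small, fast responses the browser may fire just one progress
        // event, so everything above runs in a single frame of JS execution
        parseComplete();        // hypothetical
    };

    xhr.open('GET', '/api/resource');
    xhr.send();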
I'm in two minds whether Oboe should be dropped in to replace all JSON AJAX calls, since everything will be accessed over a slow network one day, or whether it should be used only for very large, streamed REST responses.
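For the streamed case, usage is along these lines (the URL, the 'results.*' pattern and addResultToPage are invented for illustration):

    oboe('/api/search?q=oboe')
        .node('results.*', function (result) {
            // fires as soon as each matching object is complete in the
            // stream, long before the whole response has downloaded
            addResultToPage(result);
        })
        .done(function (wholeResponse) {
            // the fully parsed JSON, same as a normal AJAX success callback
        });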
Two different user agent strings for one version of one browser? That can change at any moment based on a list the site authors don't control? There's no way that'd ever confuse things!
Which is precisely why good web developers never trust user agents: they confuse things and cause bugs like this.
It's lazy/incompetent practice, which shouldn't really be excusable for such a supposedly massive advocate of web standards. MS are no shining example, but it's the pot calling the kettle black.
There are several reasons why UA detection is relevant where feature detection fails. The first is that you can serve prebuilt js bundles that contain only the logic relevant to the requesting browser: strictly smaller js files for every user, since you aren't serving irrelevant js.
Some things aren't feature detectable, like actual weird bugs in behavior that need browser specific workarounds.
In this case it looks like MS has the domain listed for their fallback mode, which doesn't just change the UA but actually changes the browser behavior (IE9 in IE8 mode has no canvas support and will fail the feature detection for canvas appropriately).
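For reference, canvas feature detection is roughly this (drawWithCanvas and drawFallback are made-up names):

    function supportsCanvas() {
        var el = document.createElement('canvas');
        return !!(el.getContext && el.getContext('2d'));
    }

    if (supportsCanvas()) {
        drawWithCanvas();    // made up: the normal canvas code path
    } else {
        drawFallback();      // made up: e.g. excanvas/VML or server-side images
    }

IE9 forced into IE8 mode genuinely has no getContext, so this test fails there, as it should.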
Well, you could have JSON which was valid at the start and invalid at the end. It'd parse the first bit OK and only raise an error when it got to the invalid bit.
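With Oboe that looks something like this (the URL, pattern and handler names are invented):

    oboe('/api/maybe-broken.json')
        .node('items.*', function (item) {
            handleItem(item);    // invented; still fires for the valid prefix
        })
        .fail(function (errorReport) {
            // the parser hit the invalid part of the response
            showParseError(errorReport);    // invented
        });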
Nothing gets parsed more than once. SAX parsers already parse streams; they're just not used very much because they're a pain to program with.
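As a rough illustration of the pain, loosely based on the API of the sax npm module: you get low-level events and have to track all the context yourself.

    var sax = require('sax');
    var parser = sax.parser(true);    // strict mode

    var stack = [];                   // manual bookkeeping of where we are
    var titles = [];

    parser.onopentag = function (node) { stack.push(node.name); };
    parser.onclosetag = function () { stack.pop(); };
    parser.ontext = function (text) {
        // we only care about text inside <title>, but have to work that
        // out ourselves from the stack maintained above
        if (stack[stack.length - 1] === 'title') {
            titles.push(text);
        }
    };

    parser.write('<feed><title>Streaming</title></feed>').close();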
It should make most calls faster. The exceptions are small JSON files, or networks fast enough that there is no streaming effect (the whole file arrives very quickly).
For most sites there'll be some users for whom it will be faster (mobile, slow internet) and others for whom it'll be about the same. If the network is unreliable it should help as well, because when the connection drops you don't lose what you've already downloaded.
As the sun rises in the UK, this means half the business is shut down during business hours.