
Definitely. I don't know if developers for e.g. TV apps get much choice in the matter, but it's like native vs. webapps: the Amazon app feels like a webapp, while Netflix feels like a native app (this is on LG's webOS).

And I know Apple is a weird one there. On the Apple TV they offer pretty much a version of iOS. There are multiple options for building your UI, but iirc you can build it fully native if you want to.
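
To make that concrete, here's a rough sketch of what the native route can look like with SwiftUI on tvOS. The view and titles below are made up for illustration, and iirc the other main option Apple offered was TVMLKit (markup + JS), but the point is that this UI is drawn by the system toolkit rather than a web view:

    import SwiftUI

    // Minimal native tvOS row of focusable "cards"; names and data are placeholders.
    struct CatalogRow: View {
        let titles = ["Show A", "Show B", "Show C"]

        var body: some View {
            ScrollView(.horizontal) {
                HStack(spacing: 40) {
                    ForEach(titles, id: \.self) { title in
                        Button(title) {
                            // start playback here
                        }
                        .buttonStyle(CardButtonStyle()) // tvOS-specific card/focus styling
                    }
                }
                .padding()
            }
        }
    }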

And this has been Apple's differentiator: they were FAST. App code compiled down to native, as opposed to the many Java-based phones at the time (and later Android).

I've always maintained that Apple had a 5-year head start on Android when it comes to performance (as well as UX, even in their skeuomorphic designs), and that the catching up after those 5 years came mainly from Android handset makers throwing more hardware at the problem rather than from the Android OS or its apps getting faster. It was Android phones that went for quad-core (and beyond) processors first, while Apple was just fine with a single core and later, almost reluctantly, a dual-core, simply because their earlier technology choices made their stuff so much faster and more efficient.

I'm so glad Apple didn't go ahead and make web technology the main development path, as they initially planned (or so I gathered).




Yeah, it definitely feels that way. I reckon it has a lot to do with the servers too, though. Even netflix.com is far superior to the Prime Video / Apple TV+ browser versions; in fact it feels virtually identical to the app version.


For Netflix it has a lot to do with how they integrate with the TVs. They tend to work directly with the chipset vendor and then ship their own SDK that the TV vendors integrate. Everyone else is relegated to terrible, shitty webapp-like development with no real debugging capability. So for smart TVs at least, Netflix is on a whole different level than everyone else.



