
That's not the point, though. The example given in the article was Google Docs, which has the same UI paradigm as Word. Under the hood it's obviously massively different, with real-time collaboration and constantly up-to-date syncing.

So the reasoning is that the UI is fundamentally the same as (or worse than, if not done right) native UI from the '90s, yet it hasn't seen a massive speed increase, which seems wasteful.

But the modern UI in Office is only an evolution of what was there in the '90s and hasn't changed fundamentally either, yet it doesn't feel any faster.

The UI is only a small part of an app; a well-designed app performs most of its work outside the UI thread and shouldn't feel any slower than a native implementation. My view is that rendering speed isn't the issue; application design is.




> But the modern UI in Office is only an evolution of what was there in the '90s and hasn't changed fundamentally either, yet it doesn't feel any faster.

Sure, and Office in the '90s didn't feel any faster than the word processing I was doing on an Apple II+ in middle school. This is because the people buying (and building) software care about things other than processor efficiency. If it's generally fast enough for their normal use, they won't switch to a competitor.

The notion of "wasteful" here is in terms of something like RAM usage or processor instructions. But the correct measure is user time, including the number of hours of labor a user needs to buy the device. The original Apple II cost 564 hours of minimum-wage labor, and you were up over 1,000 hours if you wanted a floppy drive and a decent amount of RAM. Today, a low-end netbook costs 28 hours of minimum-wage labor.
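For reference, the arithmetic behind those figures (my assumptions: the $1,298 Apple II launch price, the 1977 federal minimum wage of $2.30/hour, and a ~$200 netbook at today's $7.25/hour):

    $1,298 / $2.30 ≈ 564 hours
      $203 / $7.25 =  28 hours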

Suppose you managed to put something with the efficiency of Apple Writer or Office 4.0 on that netbook. Would anything be better? No, because the spare cycles and RAM would go unused. They would be just as wasted. No significant number of user hours would be saved. Alternatively, the in-theory cheaper computer they could then buy would save them very few working hours.

As long as the user experience is as good, the hardware notion of "wasteful" is a theoretical, aesthetic value, not a practical one.


You are ignoring battery life, which is a real consideration on laptops, which appear to make up the majority of PCs.

You are also ignoring that a user may want to run a variety of apps without closing any of them or having them swapped out, and pretending the hit on performance, resources, and battery life isn't cumulative.


I'm not ignoring them. I just didn't mention them in this comment. They fit in the same rubric.

A user can run a few things even on the low-end netbook. Tabs are cheap. And if they hit the limits of their machine, they can either pay a reasonable number of user-minutes to actively manage resources or a modest number of labor-hours to get something beefier.

I personally would like to see things better optimized. After all, I started programming on a computer with 4K of RAM. But I recognize that there is very little economic incentive to do so.


Isn't it kind of offensive to suppose that billions of users should pay more money so that hundreds of developers can use less efficient tools to build apps?

Isn't this backwards?


If those are the only factors and the numbers fall in particular ranges, sure. Otherwise, no.

Try doing the math here. How much cheaper would a netbook get if every single developer coordinated to reduce RAM and CPU usage? $5? Maybe $10? Looking at market prices, old RAM and CPUs are cheap. They consume basically the same physical resources as new RAM and CPUs, so price competition for not-the-best hardware is fierce.

Now ask those people if they'd pay $5 or $10 more for assorted new software features. Any features they can think of. And keep in mind that in that price range, people are paying $10 more to pick the color of their computer.

So sure, it offends me a little, because I like optimizing the things I pay attention to, like RAM usage. But if instead I optimize for the sorts of things users care about, especially as reflected by what they'll actually pay for, it becomes pretty clear: users don't care about the things I do.

So then the moral question for me becomes: who am I to impose my aesthetic choices on the people I'm trying to serve?


Trivializing bad software that runs slower on devices that are orders of magnitude faster, by equating it to netbook prices, is a particularly poor basis for comparison.

This is especially true as people are promoting moving everyone to a platform that is substantially worse.

How about getting more performance and battery life out of the same machine, which affects more than just netbook users?


I am not trivializing anything. I don't like bad software any more than you. However.

You may have noticed that we are in the technology industry. That means the final measure of our work is economic. The final judges of our work are our customers.

If you believe that X is better in our industry, you must be able to demonstrate that betterness in terms of user economics, in terms of user experience. You haven't yet, and you seem unwilling to even grapple with my argument in those terms. Are you planning on trying?


I dunno. When I was overseas I had a Kindle which lasted for something like two weeks between charges; that was awesome. Much better than my laptop which I had to charge every day for hours.

I wouldn't mind a true low-power laptop which only needed a charge twice a month.


E-ink displays draw essentially no power except when changing the page; between page turns the battery only loses what it inherently self-discharges. If you read 500 screens of text in a month, the display consumed roughly a trickle of battery x 500. A laptop screen, by contrast, consumes power every second it's on, and you're also asking it to do much more than render text.

What you propose is interesting nonetheless: how much battery life can reasonably be packed into a device that is modest but still useful?


Sure, but that won't come from people programming differently. The laptop backlight alone is a few watts. If your battery is 40 watt-hours, you're not going to get to 2 weeks of usage no matter how little the CPU gets used.
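To put rough numbers on it (assuming a ~3 W backlight, a figure I'm supplying for illustration):

    40 Wh / 3 W ≈ 13 hours of screen-on time

Two weeks at even 4 hours/day is 56 hours, so the display alone puts you off by ~4x before the CPU runs a single instruction.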


Yes, so it's pointless for the author to say that a problem with web apps is that they're slower than native apps. That point is moot nowadays: a well-designed web app using modern techniques should not feel any slower to an end user than a desktop app. In fact, with the advanced rendering engines in modern web browsers, it can feel more responsive and more usable than native.


I feel like this advice is coming from some alternate universe where this is actually so.


> "But modern UI in Office is only an evolution of what was there in the 90s and hasn't changed fundamentally either yet it doesn't feel any faster."

Evolution of a UI isn't as important as evolution of the features the UI exposes. As for whether it feels any faster, that depends on what you're doing. To give an example, Excel functions can be calculated using multiple CPU cores, which AFAIK wasn't a feature of Excel in the 1990s. You'll only see that speedup if you're working with a large enough volume of formulas. Measuring speed by UI speed alone doesn't get you very far.

All that being said, you won't find me disagreeing that desktop apps are bloated (web apps even more so). I've experienced responsive desktop apps running on a 7.14 MHz CPU. The fact that we've thrown away most of the hardware improvements since the 1980s should be clear to anyone paying attention.


That's precisely the point. The author of the article was complaining that web applications are slow, comparing them to Windows 95.

And my point is that web apps have a lot of features that didn't exist back then, and that because of feature additions, Office and other native applications don't exactly feel snappy either.


> "That's precisely the point"

That was the general point, but I was responding to a side comment that I disagreed with.

> "because of feature additions"

Adding features does not require slowing an application down. The reason modern apps (desktop and web) are slow has to do with inefficient use of computing resources, which has very little to do with available features.


That's why I said:

> The UI is only a small part of an app; a well-designed app performs most of its work outside the UI thread and shouldn't feel any slower than a native implementation. My view is that rendering speed isn't the issue; application design is.

at the start. :) So, we're in agreement.


Can you run web apps in a multithreaded environment? The UI remains the largest overhead in a web app, in my opinion.

Or: how much speedup would you estimate if we ported all of Google Docs' functionality into Word 97? I'd estimate 1000x. :) Or perhaps the computation power for drawing the cursor alone would far exceed that of the whole of Word 97.


> Can you run web apps in a multithreaded environment? The UI remains the largest overhead in a web app, in my opinion.

Yes, you have web workers for multithreaded development. They're basically independent applications which run on different threads, and you pass messages (which are simply objects) between them. The browsers themselves are also moving their layout and rendering engines to be multithreaded.

A well-designed app would do very little on the UI thread; it would pass messages between the UI thread and the web workers, and spin up workers on demand to offload work. It's not as easy to develop in as some environments, but it's fairly straightforward once you make the effort.
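For illustration, a minimal sketch of that pattern (the file names and the bigArrayOfNumbers variable are hypothetical):

    // main.js -- keep the UI thread free; offload the heavy work
    const worker = new Worker('worker.js');

    worker.onmessage = (e) => {
      // only a cheap DOM update happens on the UI thread
      document.querySelector('#result').textContent = e.data.total;
    };

    // hand the computation off as a plain message object
    worker.postMessage({ cmd: 'sum', numbers: bigArrayOfNumbers });

    // worker.js -- runs on its own thread, no DOM access
    self.onmessage = (e) => {
      if (e.data.cmd === 'sum') {
        const total = e.data.numbers.reduce((a, b) => a + b, 0);
        self.postMessage({ total });
      }
    };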

If I were designing React, for instance, I'd have all the virtual DOM / diffing work handled by a web worker and would only pass the updates through to the UI thread once the computation completed.

> Or: how much speedup would you estimate if we ported all of Google Docs' functionality into Word 97? I'd estimate 1000x. :) Or perhaps the computation power for drawing the cursor alone would far exceed that of the whole of Word 97.

Whatever the speedup would be, users would likely not notice it, or would only notice a slight improvement.

And yes, drawing the cursor as a 1px-wide div is computationally intensive; I guess you're referring to the article posted on HN a while back about VS Code using 13% of the CPU just to render the cursor? :) Doing things outside of contenteditable is not ideal for writing applications, as you lose a lot of system settings (keyboard mappings, cursor blink speed, etc.) that the browser automatically applies to the built-in cursor.
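A minimal illustration of that tradeoff (the markup is mine, not from the article): the first editor gets the native caret, and all the OS settings that come with it, for free; the second has to fake everything itself:

    <!-- native caret: blink rate, keymaps, IME handled by browser/OS -->
    <div contenteditable="true">Edit me</div>

    <!-- custom caret: the app must position, draw, and blink this div -->
    <div class="editor">Edit me<span class="fake-caret"></span></div>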


> Yes, you have web workers for multithreaded development. They're basically independent applications which run on different threads, and you pass messages (which are simply objects) between them. The browsers themselves are also moving their layout and rendering engines to be multithreaded.

Yes, I'm actually referring to this: the programming model. Workers are great if you can divide and conquer the problem and offload it (exactly what you mentioned). But the messaging payload can be high in some circumstances, when you have to repeatedly copy a lot of data to a worker. I don't have hands-on experience with web workers, but I suspect the messaging overhead can't be solved without introducing channels/threads. Workers are more like processes, and currently they don't have copy-on-write. Of course we may see improvements over time, but that amounts to gradually reinventing every wheel an operating system already has, just to be as performant as an OS.
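For what it's worth, postMessage does have one escape hatch for the copying problem: transferable objects, which move ownership of an ArrayBuffer to the worker instead of cloning it. A minimal sketch (the worker file name is hypothetical):

    const worker = new Worker('worker.js');
    const buffer = new Float64Array(1e7).buffer; // ~80 MB of data

    // listing the buffer as a transferable detaches it from this
    // thread and hands it to the worker: no structured-clone copy
    worker.postMessage({ buffer }, [buffer]);

    console.log(buffer.byteLength); // 0 -- ownership has moved

It only works for binary buffers, though, so it doesn't change the general point about plain message objects being copied.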

> A well-designed app would do very little on the UI thread

I partially agree. It may do little, but the consequence of that little can be huge, because the DOM is not a zero-cost abstraction over a UI. It does not understand what the programmer really wants if, say, he/she is constantly ramping the transparency of a 1px div. Too much happens before the cursor blink is reflected onto a framebuffer, compared to a "native" GUI program. I think it would be very helpful if the DOM could be declarative as in XAML, where you can really say <ComplexGUIElement ... /> without eventually translating it into bare-bones primitives. Developers are paying too much (the consequence) to customize this representation.
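To make the contrast concrete (my own sketch, not from the thread): driving the blink imperatively forces style work on the main thread on every tick, while declaring it lets the engine run the animation on the compositor:

    // imperative: a style mutation on the main thread every 530 ms
    const cursor = document.querySelector('.cursor');
    setInterval(() => {
      cursor.style.opacity = cursor.style.opacity === '0' ? '1' : '0';
    }, 530);

    /* declarative: the engine knows it's a blink and can composite
       the opacity change without touching the main thread */
    .cursor { animation: blink 1.06s step-end infinite; }
    @keyframes blink { 50% { opacity: 0; } }

Even the declarative version still pays the general DOM/CSS machinery cost you're describing, though; a native caret is cheaper still.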

> Whatever the speedup would be, users would likely not notice it, or would only notice a slight improvement.

There won't be a faster-than-light word processor, but I really want it to:

1. Start immediately (say 10ms instead of 1000ms) when I call it up.

2. Respond immediately (say 1ms instead of 100ms) when I interact.

3. Reduce visual distractions until we get a full 120fps; don't do animations if we don't have 120fps.

4. Be able to satisfy the above requirements, always, by upgrading to a better computer.

The speedup would guarantee 4) and make the performance scalable. But currently web apps lag no matter whether I use a cellphone or a flagship workstation. This clearly indicates that the performance of a web app does not scale linearly with computation power, and it is not about how much JavaScript is executed (that part scales, I believe).


> But the modern UI in Office is only an evolution of what was there in the '90s and hasn't changed fundamentally either, yet it doesn't feel any faster.

Sorry, but this is absolutely untrue. The Ribbon UI introduced in Office 2007 was a massive change, functionally and visually. You went from a static toolbar that would just show and hide buttons to live categories which not only resize but change their options and layout as you customize or resize the window. There are now dropdowns, built-in input fields, live previews in the document as you hover over tools and options, and more.

Same for the Backstage UI introduced in Office 2010 for saving files, viewing recents, and other file and option operations. You have full-screen animations and interactions.

Hell, Microsoft even made the text cursor fade in and out instead of blinking, which needs more processing power.

Could Microsoft have optimized it more? Yes. But they have definitely added tons to it since the '90s, and even the mid-'00s, which goes some way toward justifying why it's slower.


But the original article was saying that the UI paradigms are the same while the interface is slower. The UI paradigm on the web is as far removed from '90s Windows as modern Windows is, if not more so.

All these points are no different from how web tech is evolving UI, so they should be discounted in the same way web technology is.



