My sense is that GUI evolution has stagnated because the driving force behind Engelbart's (and others') thinking, the idea of the computer as an augmentation device that empowers individuals, has been essentially abandoned.
Peak "augmentation" seems to have been (believe it or not) the Microsoft era, complete with its "productivity suite" (except productivity has gone missing) and killer apps like the spreadsheet, which is still a computing interface out of this world.
The demise of the MS monopoly came about through the exploitation of the opportunities offered by interconnected computers, with zero regard for lofty empowerment ideals. The role of the computing device has become to offer just enough attraction to extract value from the individual using it. Achieving that at scale has required the dumbification of the UI, removing any cognitive stress. Except: no pain, no gain.
If the hypothesis is correct, a next generation of important (G)UIs may come about if and when the much-delayed harnessing of the networked universe to augment individuals starts taking place. It might be driven by the challenges of the stage we are now in: think of concepts for addressing information overload or managing online personas. Or it might facilitate emerging possibilities, e.g. interacting with local algorithmic agents.
> and killer apps like the spreadsheet, which is still a computing interface out of this world.
I wish the expressiveness of tabular data representations were more obvious to more developers. All you have to do is sprinkle in a little bit of relational calculus and you can model literally anything as an xlsx document. Then, unlike virtually every other programming environment on earth, you can hand that document to any non-technical human in the same business and they will immediately understand what you are trying to show.
9/10 times the spreadsheet will give the business what it wants, assuming you invest enough time in developing a decent one. The only reason to add code on top of that is if you identify additional value in persistence, validations, systems integration, concurrent access, fancy UIs, etc.
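To make that concrete, here's a minimal sketch of the "relational" style in a spreadsheet: two sheets acting as tables, joined with a per-row lookup. The sheet and column names (Orders, Customers, etc.) are mine, and openpyxl is just one convenient way to emit an .xlsx from Python.

```python
# Sketch: two sheets as relational tables, joined via a lookup formula.
from openpyxl import Workbook

wb = Workbook()
orders = wb.active
orders.title = "Orders"
orders.append(["order_id", "customer_id", "amount"])
orders.append([1, "C01", 120])
orders.append([2, "C02", 80])

customers = wb.create_sheet("Customers")
customers.append(["customer_id", "name"])
customers.append(["C01", "Ada"])
customers.append(["C02", "Grace"])

# "Join" Orders to Customers the spreadsheet way: one lookup per row.
orders["D1"] = "customer_name"
for row in range(2, orders.max_row + 1):
    orders.cell(row=row, column=4).value = (
        f"=VLOOKUP(B{row},Customers!A:B,2,FALSE)"
    )

wb.save("relational_demo.xlsx")  # illustrative file name
```

Hand that file to anyone in the business and the join is immediately legible in a way a SQL script rarely is.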
> I wish the expressiveness of tabular data representations were more obvious to more developers.
On the other hand, I wish spreadsheets exposed a more developer-friendly feature set. It wouldn't take much: a sane language for formulas, some form of procedure/subroutine functionality with document or sheet scope, and separation of semantic content from graphics.
Part of what's infuriating about Excel is how it's a local optimum not too far away from an even better local optimum. I could see myself actually using spreadsheets more, even professionally, but those poor things are so badly misused it's nauseating.
A mostly forgotten aspect of Office these days is that you can access pretty much everything in it using COM, including both extending and embedding Excel, which gives you full control over the data inside the spreadsheet.
And even with just VBA, I've seen some impressive spreadsheet applications (in fact, some were even versioned and went through an official SDLC process at some companies).
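For anyone who hasn't seen the COM side: here's a minimal sketch of driving Excel from outside, via Python's pywin32 (the same automation surface VBA scripts against). The file path and cell contents are illustrative, and it assumes Excel is installed on a Windows machine.

```python
# Sketch: automate Excel over COM with pywin32.
import win32com.client

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = True                   # watch it happen, or leave False
wb = excel.Workbooks.Add()
ws = wb.Worksheets(1)

ws.Cells(1, 1).Value = "augmentation"  # write data directly into cells
ws.Range("A2").Formula = "=LEN(A1)"    # or inject formulas as strings

wb.SaveAs(r"C:\temp\com_demo.xlsx")    # illustrative path
excel.Quit()
```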
For me, the ultimate enhancement to Excel would be to integrate some SQL dialect directly into the application.
Assume you have a few worksheets and want to produce a new one based on some projection of those. You could press a hotkey ("New Worksheet from Query..."), then type in the SELECT you want to use for the projection. The column headings could optionally be specified with the appropriate SQL syntax. The schema for the internal SQL dialect could be generated dynamically by inferring the table name from the sheet name and the column names from the first row of each sheet. You could even have this re-evaluate in real time, so any change to a base sheet would instantly update the projected sheets.
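You can already get a rough, one-shot approximation of that "New Worksheet from Query..." idea (minus the live re-evaluation) by round-tripping sheets through an in-memory SQL engine. Everything below (file name, sheet names, the query) is illustrative; pandas and sqlite3 just stand in for the hypothetical built-in dialect.

```python
# One-shot approximation of "New Worksheet from Query...":
# each sheet becomes a table (name from the sheet, columns from the
# header row), and a SELECT produces the projected sheet.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")

# sheet_name=None loads every worksheet into a dict of DataFrames.
for sheet, df in pd.read_excel("book.xlsx", sheet_name=None).items():
    df.to_sql(sheet, conn, index=False)

projected = pd.read_sql_query(
    """
    SELECT o.order_id, c.name AS customer_name, o.amount
    FROM Orders AS o
    JOIN Customers AS c ON c.customer_id = o.customer_id
    WHERE o.amount > 100
    """,
    conn,
)

# Write the projection back as a new worksheet.
with pd.ExcelWriter("book.xlsx", mode="a", engine="openpyxl") as writer:
    projected.to_excel(writer, sheet_name="BigOrders", index=False)
```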
What I'm suggesting is the opposite of VBA. There should be no difference between the language you use to script the worksheet and the language you use to write formulas in the worksheet itself.
VBA, COM and things like that rest upon the assumption that the worksheet is just a complicated data structure like the DOM.
Want to take Excel seriously? Let's commit to that. Take sheets as the computational model and build from there. A natural operational semantics for them is a graph rewriting system, same as ML and Miranda.
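For what it's worth, here's a toy version of that computational model in Python: cells form a dependency graph, and a change propagates by re-evaluating dependents in topological order. All the names (Sheet, define, etc.) are mine; it's a sketch of the semantics, not a claim about how Excel actually recalculates.

```python
# Toy spreadsheet-as-graph: cells are nodes, formula dependencies are
# edges, and a change re-evaluates downstream cells in dependency order.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

class Sheet:
    def __init__(self):
        self.values = {}    # cell -> current value
        self.formulas = {}  # cell -> (function, dependency list)

    def set(self, cell, value):
        self.values[cell] = value
        self._recalc()

    def define(self, cell, fn, deps):
        self.formulas[cell] = (fn, deps)
        self._recalc()

    def _recalc(self):
        # Map each formula cell to its predecessors, then evaluate in
        # an order where every dependency is computed first.
        graph = {c: set(deps) for c, (_, deps) in self.formulas.items()}
        for cell in TopologicalSorter(graph).static_order():
            if cell in self.formulas:
                fn, deps = self.formulas[cell]
                self.values[cell] = fn(*(self.values[d] for d in deps))

s = Sheet()
s.set("A1", 2)
s.define("B1", lambda a: a * 10, ["A1"])
s.set("A1", 3)          # B1 recomputes automatically
print(s.values["B1"])   # 30
```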
What you describe as "productivity" vs. "dumbification" seems to me more like the distinction between a professional environment and an entertainment medium, respectively. There is no point in comparing Windows 95 tools to iPhone apps without taking into consideration their opposite target audiences.
I'd argue that GUI evolution has been anything but stagnant: the shift from specialized tools for individuals to mass adoption brought us many new ways of interacting with our devices. The adoption of personal computing devices in the last several years has been unprecedented. We'd still be reading paper newspapers, watching TV and using the landline network for communication if it were not for the recent GUI evolution.
The main problem I see is the trend to remove power-user features altogether instead of hiding them in the settings. Ease of use and customizability can get along just fine, but the combination is rarely found. So your statement about augmentation is right anyway.
Edit: I think that having different OSes and different devices for different target audiences is not an actual problem. The uses of a computer can be so vast. The trend to integrate smart devices has only begun and requires different usability approaches. We should not compare Excel UX to Tinder UX.
On second thought, it is true that there has been one fairly major UI breakthrough since Engelbart: the touchscreen. It is arguably still underutilized, serving mostly as an ersatz mouse. What would be the paradigm-shifting multitouch application? My guitar, with zero AI and a UI going back some millennia, is far more sophisticated in its adaptation to ten digits (yes, I know Django could do with two :-)
You've sort of encapsulated a large part of my underlying philosophy there... that computing evolution has stagnated because of a loss of motivation to take on the hard problem of making complexity accessible. I like to think that everything I write sort of converges on that idea, but I don't tend to say it outright because it's difficult to fully explain and last time I edged too close to it I got more angry emails than when I spoke ill of RMS.
There are still vast amounts of money to be made using the scale-power of web/data tech developed in the 2000s to colonize various legacy markets still stuck on even older tech, like banking (Stripe, Nubank). Next-gen UI will come when someone proves out a business model that drives the investment needed to develop it (I think we are in agreement here). Also, the current macro climate is not really conducive to R&D; you'd have to be an idiot like me to invest in advanced PL research startups when GOOG did a 21% CAGR over the last 10 years.
I don't think of augmentation specifically in terms of productivity, but more in terms of an external brain.
I'd say peak augmentation came with having Wikipedia available in my pocket or from a smart speaker. There's still so much more that could be done. If anything, word processors and spreadsheets are a local maximum, and I think we are probably stuck on another one now.