
This is great to see, but also really frustrating. I see Apple pushing to make iPadOS feel more like a 'real computer', but these sorts of basic Unixy workflows still feel very hacky, particularly when it comes time to save things to the 'file system', as it were. I look forward to the day when I can actually use tools like XeLaTeX and Pandoc on an iPad to accomplish my actual work, but for now I find myself editing Markdown and TeX on the iPad and compiling later. Not great, but good enough.
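(For what it's worth, the "compile later" half of that workflow is easy to script on the desktop side. A minimal sketch, assuming pandoc and a XeLaTeX install on the compiling machine; the file names and helper functions here are just placeholders, not anything Apple or Pandoc ships:)

```python
# Sketch of the edit-now/compile-later workflow: Markdown -> PDF via
# pandoc with the XeLaTeX engine. File names are hypothetical.
import shutil
import subprocess

def pandoc_cmd(src: str, out: str) -> list[str]:
    """Build the pandoc invocation that renders Markdown to PDF via XeLaTeX."""
    return ["pandoc", src, "-o", out, "--pdf-engine=xelatex"]

def compile_later(src: str, out: str) -> bool:
    """Run the compile step only on a machine that actually has pandoc."""
    if shutil.which("pandoc") is None:
        return False  # still on the iPad: edit now, compile later
    return subprocess.run(pandoc_cmd(src, out)).returncode == 0
```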

I've often wondered whether Apple could do well by doing something similar to Crostini on ChromeOS, to allow these things to actually work as intended, but without impacting security. But that also probably doesn't sell software as effectively, as free software doesn't pay Apple's cut.




They could very well steal the WSL idea from Microsoft and make something similar. Currently there is iSH which runs certain Linux utilities unmodified, but I'd love to see first-party support.


A shell based on [WASI] would be a pretty cool option to provide.

[WASI]: https://github.com/WebAssembly/WASI


Apple doesn’t care about minuscule lost revenue because some people would want to run containers of free software. Drop in the bucket.


The number of iPads bought by those users might be just a drop in the bucket, but looking only at their numbers is too shortsighted. Those users are the ones who might push development of more advanced apps on the iPad, be it as developers themselves or because they would attract new app development to the iPad. Engineers and scientists might be a small market, but I think it was vital for OS X that, early on, Apple laptops started to appear at conferences, slowly growing in share.


I said the lost revenue if the feature were offered wasn’t important. I didn’t say anything about the value of the users.


But you cannot reasonably look at the one without considering the other.


Clearly I’m not explaining myself well.

The point I objected to comes at the end of this quote:

> I've often wondered whether Apple could do well by doing something similar to Crostini on ChromeOS, to allow these things to actually work as intended, but without impacting security. But that also probably doesn't sell software as effectively, as free software doesn't pay Apple's cut.

The implication is that Apple wouldn’t develop such a feature because it might cut into their profits. I don’t buy that reasoning: Apple wouldn’t lose much if any money by offering that.

Update: and I also don’t believe that Apple would deprioritize it because it would cost them some money. Far more likely they’d decide it isn’t sufficiently useful, or would cause other problems, be too confusing, etc.


If those users are so unimportant, why did Apple bother to introduce the Hypervisor framework on macOS? If there's a big, important market for virtualisation on macOS, surely it would be worthwhile to also address that market on iPadOS? Especially since Apple is trying hard to convince all kinds of professional users to adopt the iPad.


I think Apple has a different vision for the needs of people who use Macs and iPads for work.

For example, the new Sidecar feature (using an iPad as a screen) supports the Pencil but not touch. Apple believes that touch on a macOS interface is not a good experience. I suppose they feel that text entry and chaining CLI tools on an iPad isn't one either.


Apple has been trying to pull people away from writing kernel extensions for a while, so I don't see the introduction of Hypervisor.framework (and with it, another user-space way to do something that previously required working in the kernel) as noteworthy for anything other than that.




The file manager improves, but not by a lot.

I think iPadOS will only be usable as an actual computer when they finally ship a Terminal app on the iPad and a much better Files app.


In the meantime, you can take matters into your own hands and jailbreak, if possible.


But why not use some of those fancy tablet devices that aren't locked down in the first place? Many more options there. Sure, Apple hardware is good, but there are alternatives. Not suggesting switching to Chrome OS or Android here.


So all Macs before OS X weren't usable, it seems.


While I never used a Mac before OS X, they probably were quite usable, and your question was both rhetorical and snarky. There is more than one way to use a "computer". Incidentally, I only switched to the Mac after they added a shell with the move to OS X. And one of the reasons I disliked Windows was the crummy shell and the lack of a terminal. Also, it is 2019 and it is no problem to get a full Linux installation on a smartphone, while Apple sets up artificial limitations to prevent that.

There are certainly other ways to make the iPad a computing device (like the Macs before OS X), but Apple steps in the way of that in many places too. An app like Termux would not only make a lot of potential users happy, it is also something that could be provided in a way that is nicely sandboxed within iPadOS/iOS.


Naturally it was snarky, because the idea that a computer needs a CLI to be usable is not something one can generalize, and the market has largely proven that consumers don't give a damn about it.

Which phone is sold with a full Linux distribution, with any kind of market relevance?

Android certainly doesn't count: regular Linux APIs aren't part of the NDK's stable API list, which is why Termux actually has to work within the constraints of ISO C, ISO C++ and the NDK APIs.


I still don't understand why you felt it necessary to make a snarky comment. Yes, there are tons of users who never use a command line on their computers. I guess no small number of Mac users don't know what "Terminal" is for. But do I really have to argue the value of a command line for an advanced user here, of all places, to the Hacker News audience?


Maybe because REPLs are more powerful than a plain old command line, and the HN audience especially should be aware of it, given that the site is built on top of a Lisp variant?

Being a developer is not a synonym for being stuck with a PDP-11 concept of how a computer is supposed to be used.


The commenter you replied to was talking about a terminal app for the iPad in general; how do you get to REPLs from that? And if you want to talk about REPLs, why didn't you do that in your original comment? You should look at the Hacker News guidelines, whose very first item covers your post specifically.

And you don't need to teach me about REPLs, I am a full-time Lisp programmer :).


Because something like Swift Playgrounds, but in the context of iPad automation, is from my point of view the ultimate goal. Let's call it the Dynabook Smalltalk transcript, not just a replica of a green-phosphor VT100.


I am not saying that something like that wouldn't be even better.


This comment is hilarious. Apple switches from Classic Mac OS to OS X, the defining difference being that OS X has Unix underpinnings.

Apple then goes from almost going out of business to being one of richest companies in the world (sometimes the richest).

And this is supposed to be evidence that the command line isn't important?


The original comment argues a different claim: a terminal is not required for a computer to be considered usable.

As for success, I'd say that what brought Apple back to making money was iTunes and iPods, followed by iPhones.


Sure, as if Steve Jobs had nothing to do with it.


What defined a usable computer in 1999 is very different to what defines a usable computer in 2019.


Having a CLI surely isn't part of it for 99% of consumers, neither in 1999 nor in 2019.


I always think this is such an odd sentiment. That 1% builds 100% of the software for the other 99%. Doesn't that make them disproportionately more important?


No, because many developers are part of that 99%.

Being a developer is not synonymous with being enamored with a UNIX CLI.

In fact, a graphical REPL is much more powerful.

So no, that 1% does not target the other 99%.


Liking to use something is a bit different from having to use something. Are you saying you don't think the majority of developers need to use the command line for things like git, npm/general dependency management, running local web servers, compiling apps, continuous integration, etc.?


Yes, that is a tiny portion of what being a developer is all about.

Not everyone is writing UNIX daemons, stuck in UNIX workflows.

As I said, a graphical REPL for scripting languages is much more powerful.


If you're talking about web developers, then the majority literally are writing Unix daemons [1].

[1]: https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...

Again, I'm not talking about what's more powerful, and I'm not saying that everyone works this way, but I am suggesting the majority of developers do need the command line to do their job.

Unfortunately, I'm not aware of any usage statistics for the command line in isolation. But if you look at the most popular technologies, e.g., Node[2], then you can most likely extrapolate that the majority of developers are working with the command line.

[2]: https://insights.stackoverflow.com/survey/2019#technology-_-...


Since we are speaking about the Apple ecosystem here, most devs live in Xcode.

Just like we used to live on MPW and Metrowerks before.

Scripting is perfectly doable from macros and IDE callable scripts.


CocoaPods, Carthage, Fastlane, git: all require the command line, and most teams use some combination of those. Not to mention that Xcode is literally just a wrapper around Unix processes like `xcodebuild` and `SourceKitService`. Then there's a whole host of other supporting command-line utilities for Cocoa development, like `codesign`, `xcrun`, and `xcode-select`, off the top of my head.

Again, I wasn't making an argument about where developers spend most of their time. Just that they need the command line as part of their workflow.


How many processes an IDE uses is an implementation detail.

The large majority of the Apple and Microsoft developer communities live in their IDEs as part of their workflow.

Scripting and automation can easily be done from the IDE as well, thanks to macros, REPLs and build-integration points.

Dropping down into a UNIX like CLI is the exception, not the norm for a large community.

Ah, but what about WSL? Well, Microsoft saw a market opportunity to win over developers who buy Apple computers to do Linux work, instead of buying laptops from the likes of Dell/Asus/Tuxedo/System76 to start with.

So they are making UNIX devs comfortable on a foreign platform, just like NeXT was built on top of UNIX to take a piece of the pie from the UNIX workstation market led by Sun.

Using the shell was never a thing in the NeXTSTEP or classic Mac OS development workflow.


You seem to be saying that because developers spend 90% of their time in an editor or IDE and 10% on the command line, they don't need the command line? I think that's what you mean by "dropping down into a UNIX-like CLI is the exception, not the norm for a large community". I'm not aware of any "large community" of developers that doesn't use the command line at all, so the exception is when they actually have to use it? But using something a small percentage of the time doesn't mean it isn't important (humans spend a small percentage of their time eating, and if we don't, we die).

Regarding Xcode being implemented by managing processes, my point is that IDEs and text editors have moved to a model of using external processes to implement features, e.g., the language server protocol and linters. That iOS bans this type of application is why iOS is a wasteland for programming. The tragedy of that fact is my whole point in this thread.


We happen to live in different developer communities, it seems.


Yeah, but I'm not arguing about whether developers exist who don't use the command line; of course they do. The question is simply whether the majority of programmers use the command line. I'm proposing that they do; unfortunately there's no available data on this point specifically, so we end up falling back on anecdote. But I am still super curious about your perspective on this. From what I've observed anecdotally, and from the most relevant data I've looked at, I'd guess the percentage of professional developers who rely on the command line is at least 80% (I am giving a lot of leeway for Microsoft-stack developers, which I'm not really familiar with; excluding developers using Microsoft technologies, I'd say well over 90% of professional developers rely on the command line). Do you disagree with these estimates? And if you don't mind answering, where would you put the percentage? (I'd also be super curious which technologies the developers who don't use the command line are on.)


The thing you are missing is that what UNIX devs rely on the command line for is easily achievable by scripting, REPLs, and IDE integration.

For example I can do everything I need from Python or PowerShell from inside Visual Studio, just like Xerox PARC devs used to do with their REPLs on Lisp, Smalltalk and Mesa/Cedar workstations.

Naturally I am forced to drop down to an old-style command line every now and then, but that is forced on me by tools I don't create, and most of the time they are ports of UNIX tooling.
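To make the point concrete, here is a toy example (my own, not something from Visual Studio or PARC) of a classic shell-pipeline task done as plain Python, runnable from any IDE scripting console or REPL:

```python
# The shell pipeline `grep needle access.log | sort | uniq -c`
# expressed as plain Python, runnable from an IDE REPL.
from collections import Counter

def count_matching(lines, needle):
    """Tally each distinct line containing `needle`."""
    return Counter(line for line in lines if needle in line)

log = ["GET /home", "GET /about", "GET /home", "POST /login"]
print(count_matching(log, "/home"))  # Counter({'GET /home': 2})
```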


Got it, yeah, as long as we agree most programmers do use the command line. I actually agree that there are hypothetical ways the command line could almost always be replaced, but to me the important part is that most developers choose not to replace it. (And I’d argue that dependence on the command line is actually increasing over time, with package management, version control, continuous integration, and modern editor features being implemented on top of it.)

The question I pose then is: if these things are possible to do in other ways, and those other ways are better, why don't professional developers choose to use them? For example, why isn’t programming on an iPad popular, i.e., on a device where these other methods are literally the only way to program? Using SSH to program entirely in a terminal is almost certainly more popular among professional developers than any other approach to programming on an iPad (judging by blog posts about how professional developers develop on an iPad).


Because those blog posts are from UNIX devs trying to use iPads for UNIX like work.

This is how Apple devs actually do development on iPads.

- Swift Playgrounds

- Pythonista

- Continuous for C# and F#

- Lisping

- Raskell

- GLSL Studio


I’m only interested in whether the majority of professional developers are able to ship software to their users. Is your stance that you can do that just fine with just the tools you listed and therefore professional developers don’t need the command line on iOS?


I guess that I am not a professional developer, apparently.


Haha, not sure about that, but you most certainly aren't the "majority of professional developers". Neither am I, that's a whole bunch of people who couldn't care less about how you or I want to work.

It may be worth noting here that iOS 13 adds support for external USB sticks and network Samba shares to the Files app.



