Unix as IDE (2012) (sanctum.geek.nz)
111 points by pabs3 8 months ago | hide | past | favorite | 69 comments



IMO tmux + a scriptable terminal editor (Vim, Neovim, Emacs, etc) turns Unix into a very nice IDE — but it's pretty dependent on becoming deeply familiar with those two tools: a multiplexer and an editor.

I wouldn't say Unix is an IDE, really; but you can build a very nice IDE for yourself using Unix tools, as long as you have a sufficiently configurable terminal multiplexer and a sufficiently configurable terminal editor.

One pretty major difference between "Unix is an IDE" vs "you can build an IDE around Unix" is that, well, the terminal editor is actually pretty close to an IDE — you can have language server based autocomplete, refactoring, linting; UI for git; etc. So, it's not really that Unix is your IDE; your editor is. But the nice thing about running it in a multiplexer in a terminal, though, is you have instant and easy access to just about any other command-line tool just a keyboard shortcut away, and it can be organized visually wherever you like — a pane on the side, beneath, zoomed in temporarily to encompass the screen, hidden in a tab, etc.

(Sufficiently advanced terminal emulators could also fill in for tmux here.)

I think you get many of the same benefits using a tiling window manager... But well, the best ones require running Linux (or one of the OSS BSDs); the options for your company-provided Macbook are fewer and less good. Tmux + terminal editor is pretty nice as something that works in any Unix environment.


I don't know about MacBooks, but if the company-provided laptop runs Windows, tmux + editor is nowadays very easy using WSL2.


Works great on MacBooks. Some would say better, since copy/paste on a Mac, even to and from a terminal in tmux, is much less of a mess than on WSL or Linux.


Native copy/paste integration is pretty out-of-the-box these days on WSL as well, FWIW.


True. It's kind of funny that Microsoft did a better job implementing Linux than they did implementing Windows.


Windows Terminal has support for copy and paste that is integrated into the OS natively. CTRL+C/V works perfectly.


Agreed, though I prefer built-in multiplexing like WezTerm's to tmux on macOS, for performance reasons.


The nice thing about tmux/screen is that they're more widely available by being separate from the terminal emulator itself. I can use them on any *nix system and, with one modification (I use a non-default prefix with tmux and screen; I'm too used to Emacs keybindings to let them have C-b and C-a), have the same experience on all of them.
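For reference, moving the prefix is only a few lines of `~/.tmux.conf`; the choice of C-z below is just an illustration, not necessarily what the commenter uses:

```shell
# ~/.tmux.conf: move the prefix off C-b so Emacs keybindings stay usable
# (C-z here is an arbitrary example choice)
set -g prefix C-z
unbind C-b
bind C-z send-prefix
```

For GNU screen, the equivalent is an `escape` directive in `~/.screenrc`.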

Another positive is that they survive if, for some reason, you close your terminal. I can completely kill the terminal emulator and my tmux session (or screen, but I haven't used it much in years) will still be around and can be reattached.


I've used tmux and vim/nvim on macOS and never experienced performance issues but I'm really curious about your experience, could you share more details?


I have no measurements or anything, but if I open a file in Neovim within tmux, I get more lag than if I open the file directly, without tmux in between. I've seen people on GitHub having similar issues.


tmux's defaults don't play nicely with the Apple Terminal. Resizing split panes, for example, collides with switching workspaces in macOS.


Unix is a DE, no I about it.

This isn’t a ding against Unix, the integration provided by IDEs is great for some types of work, but it’s also more hindrance than help for other types of work.

Also, the toolset that you get from a typical Unix environment is more or less language agnostic, so you have your toolset available to you even when you're working on something obscure with poor dedicated tooling support.

That is not a replacement for IntelliJ knowing about my DI bindings, though.


Yeah, the article very conveniently ignores the "integrated" part of the IDE acronym. That was one of the main selling points: you don't have to cobble various small pieces of functionality together by yourself; it's been figured out and done for you by the IDE developers once, and now you can just write your programs.


I was super excited about customizing every little detail of my setup as a kid, but as an adult I'm pretty much using Ubuntu LTS and VS Code, because I can't be bothered to read a half-assed repo on configuring my LSP.


> Yeah, the article very conveniently ignores the "integrated" part of the IDE acronym.

How's that? In Unix, (mostly) everything is a file, which in turn is just a stream of bytes that you can pass to any Unix-y application, the output of which can be passed to any other Unix-y application. That may not be your cup of tea or preferred environment, but it's very much integrated.
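A trivial concrete instance of that composition, with three tools that know nothing about each other:

```shell
# bytes out of one program are bytes into the next; no shared API needed
echo 'hello unix' | tr 'a-z' 'A-Z' | rev
# prints: XINU OLLEH
```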


"Easily" integrable perhaps, then.


What’s an example of a type of work where an IDE is more of a hindrance?


An example from my work experience: I was converting about 15 million lines of C++ code from pre-ANSI to ANSI-conformant compilers across 3 platforms. So step one was setting up a parallel build with each new compiler, and step two was fixing the tens of thousands of compile errors. There's no way you can tackle this in any IDE that I am aware of. You want to be able to use an overnight build, and you want to be able to load the compiler output as a quickfix list[1]. Super easy in Vim (and probably also in Emacs); completely infeasible in any kind of normal IDE, as far as I know.

A lot of the time the error fix would boil down to some general rule, so after I had fixed the error a bunch of times I would realise: "ok, when I see an error like this, I need to change every 'class' in a template definition in this project into a 'typename', and every typedef of that class that looks like `typedef foo::<A,B>::iterator iterator` to `typedef typename foo::<A,B>::iterator iterator`"[2]. Then, once I realised the pattern, I would write some `find | perl -pi -e` type thing to walk down the codebase from a certain point and fix all remaining instances of that particular error. These were typically more complex and subtle than the sort of autofixes/auto-refactorings that I have seen available from IDEs, although admittedly I'm sure the state of the art has improved beyond all recognition since then.

[1] Each of the errors was of the type described by Alexandrescu in "Debugging the error novel", in that they were probably about 10 pages of ASCII template hell per error, times tens of thousands of errors. So I actually made two copies of the output. One went unadulterated into my quickfix list, and one went into a separate buffer that I ran a bunch of regexes over to get it into a state where I could understand the error message. Then I could work down the qf list one error at a time while also paring down my "working error file" in parallel.

[2] or something. It’s been a while.


Really big code bases, where your IDE grinds to a halt, for example.


Embedded systems/firmware, where the same code may need to be built for multiple HW platforms/CPU architectures.


Unix shell utilities as an idea/philosophy are a really good IDE, but in practice a lot of them need major UX overhauls. `find` is awful, for example, if only because it uses single hyphens followed by full words for its flags (different from almost all other conventions...)

Doing anything which involves more than one pipe or xargs is just painful. Bash as a language is not ergonomic at all.

I don't know if there's a nice alternative that keeps the same spirit. My feeling is something like emacs + eshell (a shell which can execute elisp expressions)? Elisp is not a very ergonomic language either, of course... but something like this.

VSCode was a revolution in providing an IDE with a good UI, every feature one could want, and fairly good performance. In 2023, I feel like everyone's on containerized/distributable workflows, so there is a big need to actually return to "Unix as IDE" type work, and now would be a great time for some new CLI tools that improve on the past.


Many (most?) of them have been overhauled with success. For find there is fd[1]. There's bat (cat), exa (ls), ripgrep, fzf, atuin (history), delta (diff), and many more.

Most are both backwards compatible and fresh and friendly. Your hard-won muscle memory is still of good use, but there are sane flags and defaults too. They're faster, more colorful (if you wish), and better integrated with one another (e.g. exa/eza being aware of git modifications). And, in my case, they often have features I never knew I needed (atuin sync!, ripgrep using .gitignore).

1 https://github.com/sharkdp/fd


There are modern alternatives to many of the classic utils e.g. `fd` for find, with better CLI args and other improvements

ripgrep for grep also comes to mind


I've totally switched to the newer, dev-orientated variants for my work.

fd, sd, ag/rg, fzf, bat, xh/httpie all work how I'd expect them to and I don't need any archaic knowledge. Plus, fish's autocomplete does fill in blanks when I need it.

I'd kind of consider tmux to be my IDE rather than Unix. That's where I combine everything above (along with nvim), and it's as integrated as I need it to be.


> and it's as integrated as I need it to be

Same for me. The "pipe streams of text" is so very powerful, I doubt anything can replace it.

Obviously, the "streams of text" need to make sense and have a somewhat consistent structure - which most older tools don't really provide - some mix tabs and spaces, others offer very little consistency between e.g. --verbose and normal output. And modern ones often forgo plain text entirely. I hate that docker pumps out giant JSON data structures. jq helps a lot there. Or Ansible, with its ginormous blobs of JSON, mixed with YAML, placed inside - I kid you not - cowsay output. What I really want is just text that I can trim with flags, delimited with spaces and newlines, so that grep, cut, sed, sort, uniq, and, if all else fails, awk can map/reduce me the data I need.
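For what it's worth, jq can usually flatten those JSON blobs back into exactly that kind of space-and-newline text (assuming jq is installed; the JSON below is made-up sample data):

```shell
# turn a JSON array into space-delimited lines the classic tools can chew on
echo '[{"name":"web","state":"running"},{"name":"db","state":"exited"}]' \
  | jq -r '.[] | "\(.name) \(.state)"'
# prints:
# web running
# db exited
```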

Simple text streams. If we can have that, we can bind everything together and "build" IDE features that no IDE can offer me without severe programming, plugins, or other shenanigans.

Some things that I did this month, that I have no idea if VS code or Jetbrains can do:

    git log --format=format: --name-only | grep -v '^$' | sort | uniq -c | sort -nr | head -n 10

Give me the 10 files with the highest churn - the most often edited, so potentially the most problematic ones.

    grep "WARN:" dev.log | cut -d " " -f 2- | sort | uniq -c

List all warnings, most frequently occurring at the top.

    git log --reverse --pretty=format:"%H" | while read commit_sha; do git checkout "$commit_sha" -- .; ./lint.sh 2>> linting.log; done

Loop through all commits to record all linting violations, so as to find commits with exceptional amounts of linting violations.
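The shape of the second pipeline (filter, strip the timestamp, count, rank) works on anything line-oriented; here it is run self-contained on a few fabricated log lines:

```shell
printf '%s\n' \
  '12:00 WARN: disk slow' \
  '12:01 WARN: disk slow' \
  '12:02 WARN: net flaky' \
  | grep 'WARN:' | cut -d ' ' -f 2- | sort | uniq -c | sort -nr
# first line of output: '2 WARN: disk slow' (the most frequent warning)
```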


Cool, didn’t know about sd before. The missing part is a more modern awk somehow.


Unix was probably a great IDE by the standards of 1975, because literally every design decision about Unix was made based on the criterion of convenience for a 1970s programmer. Those decisions haven't aged well: fork() sucks, lack of completion-based async I/O sucks, ill-thought-out file, socket, and process APIs riddled with TOCTOU bugs suck, two-letter YAFIYGI commands suck, untyped pipes suck, and don't get me started on C or its standard library.

It turns out, however, that we haven't really strayed far from the "Unix as IDE" approach, as in order to turn Vim or Emacs into an IDE the typical approach is to just use Unix primitives and make the editor into a sort of super-shell that orchestrates pipelines of processes consisting of Unix command line tools. And that's fine.

Time has marched on, however, and we have been the lucky recipients of not only better IDEs than Unix, but also things that are better than Unix at what Unix was thought to do well. PowerShell, for instance, is a vast improvement over Unix command-line environments with its ability to operate over typed pipelines of objects. We should be expanding our horizons beyond what 1970s programmers found convenient to implement and embrace new ways of thinking about, writing, testing, and debugging code.


I agree that PowerShell is theoretically better since it can deal with structured data.

But in practice, I could never get as efficient with it as I could with "the Unix way" (everything is a string).

If I was writing a "production" script, then it's better. But when I'm just trying to quickly do something on the command line, the "everything is a string" approach is good enough.


It's a very interesting topic, since counting the brittle sed/grep/awk done by everybody using *nix on earth probably requires 128-bit numbers, yet structured interaction does slow you down due to idiosyncrasies. That said, I wonder how quickly things like PowerShell reward you. It's a bit like having a unit standard.


The thing that we'd ideally want to happen, but probably never will unless a mad person does it, would be to have something that wraps around a POSIX shell and extends it in a sort of non-compatible way. Let's call this new tool $NEWSH.

Larry Wall basically tried to do this with Perl but I guess his vision was too lax for such a pursuit, Perl tried to be everything, all at once.

What I mean by this new approach is that this new tool would implement everything in POSIX shells as a sort of v1 layer, and then everything else, new and modern (typed, etc.), as a sort of v2 layer. There would be simple ways to call back and forth between v1 and v2, but inside a layer you couldn't mix and match (maybe you could do it with a construct such as "unsafe").

Long term the idea would be that this tool becomes pervasive, installed and available everywhere and everyone can safely and reliably use just the v2 layer.

The v2 layer would be extensible by design, so it can keep up with modern practices.

You'd want to be able to go $NEWSH my-bash-script.sh and it should just work. $NEWSH my-newsh-script.nsh should also work, obviously.

The reason this would never happen, IMO, is because it's a thankless job that needs to be done for at least 1 decade. This new shell would need to:

- implement full POSIX shell compatibility - the v1 layer (done before, doable)

- implement the new, designed from the ground up, language - the v2 layer (sort of done before by alternative shells, doable)

- design a clever, simple-to-use, safe interop + unsafe construct for v2 (I imagine this would be hard)

- package and promote this new thing, once it's stabilized, so that it's picked up by: Debian/Ubuntu/Mint, Arch, Fedora, FreeBSD, etc, and wait for literal decades for distribution to happen; Debian Stable and RHEL are especially egregious, since for ubiquity you'd want the new tool to be available in the 2 latest LTSes, so DevOps people can rely on its availability... so as I said, count 10 years for that (crazy hard and super boring)

I said only a mad person would do this, because with this kind of volume and intensity of work, you can probably start a successful company or an open source project with much higher impact.

Oh, and this project will be absolutely HATED (think "getting death threats" type of hate) by large parts of the community. See systemd.


This resembles OSH/YSH, especially the v1/v2 layer things.

http://www.oilshell.org/


Yeah, that would be the only thing that sort of resembles what I'm saying, but I'm afraid it's biting off more than it can chew: the long-term goal is to have some sort of distributed language, plus it's using its own custom Python runtime of sorts, and it's implemented in a convoluted way.

Also, right now the project is a 1-man-show, for something of this magnitude to move fast enough it needs to become a full blown community project.

Oh, Oil also needs to mature. OSH IMHO needs to be fully usable as an interactive shell (I don't think it is), and it needs to be packaged for everything.

It could be that Oil gets there but it's going to take a looong time. Oil itself was started in 2016, that was 7 years ago.

I'm keeping an eye on it for sure and hoping it succeeds.


I think it would sort of be like GNU. Stallman created the tools before the kernel, because without tools it has no value.

You need the programs to make this hypothetical shell useful. That’s structuring the output of pretty much every program as data and passing it through as data conserving types. That’s arguably a very large barrier to entry.


(author here) Yeah, you just described what https://oilshell.org/ is

> You'd want to be able to go $NEWSH my-bash-script.sh and it should just work. $NEWSH my-newsh-script.nsh should also work, obviously.

    osh my-bash-script.sh  # works, it's the most bash-compatible shell by a mile
   
    ysh my-new-script.ysh  # works
YSH looks like this - https://www.oilshell.org/release/latest/doc/ysh-tour.html (everything in this doc works, you can try it now)

And then you do

    shopt -s ysh:upgrade
to opt into new behavior.

https://www.oilshell.org/release/0.19.0/doc/upgrade-breakage...

YSH isn't done yet, but it's coming along nicely -- see the latest status update - https://www.oilshell.org/blog/2023/11/status-update.html

---

The project isn't a solo project anymore! There have been 6 people funded by two NLNet grants, and we just got a third one, mentioned in the post.

If you want it to succeed, then you should try it and report bugs.

One way to view it is that Oils has a 7 year head start on any other project that wants to upgrade shell. Like you say, it's a huge amount of work, and it's the ONLY project that's compatible with bash (which, BTW, is a much bigger job than being compatible with POSIX sh).

Shell became the #6 fastest growing language on Github in 2020, so all those shell scripts people have written in the last few years make the Oils project more valuable. That is, people are writing scripts compatible with OSH at a greater rate than any other new shell (e.g. new shells == ones with precise error messages)

The new command line tools like fd and bat are ALSO compatible with OSH and YSH because they use the Unix process interface :) There have been several shells that want to bundle everything into the shell, but I disagree with that philosophy, because it limits growth to being "inside" the project.

(although YSH has a lot more functionality than bash built-in, like JSON support)

---

People who have contributed are acknowledged in the release announcements.

Testing is a good contribution. Writing weird HN comments isn't a good contribution :-P

There has been some misunderstanding of the project, but very little "hate" ... Mostly encouragement!


I'm going to throw a curveball and say: can I run Oil on Windows/Cygwin? :-)

Also, how's the interactive mode? I don't want to just run scripts with it.


It runs on WSL, not sure about Cygwin

Interactive shell screencasts was the last blog post: https://www.oilshell.org/blog/2023/12/screencasts.html

Real Unix users download and build tarballs, they don't wait for the packaging gods to bless them with software :-P


Real Unix users aren't computer users, then, 'cause computer users are lazy :-p

I'll see if it doesn't take a huge chunk of time to get it going in Cygwin.


To clarify, I love the great news and I love the idea of Oil, I hope it takes off. Linux badly needs it.


> Those decisions haven't aged well: fork() sucks, lack of completion-based async I/O sucks, ill-thought-out file, socket, and process APIs riddled with TOCTOU bugs suck, two-letter YAFIYGI commands suck, untyped pipes suck, and don't get me started on C or its standard library.

Of those, only

> two-letter YAFIYGI commands suck, untyped pipes suck

sound like problems with unix as an IDE; most of what the article discusses is unaffected by underlying APIs.

I also feel like I should question whether you're attacking a strawman; modern Unix-likes are continuations of Unix, but they're not stuck in the 70s. For example, AIUI everyone agrees that fork() has shortcomings... which is why we have vfork (and I think others?) now.


vfork is a lot worse than fork, don't use it


"We should be expanding our horizons beyond what 1970s programmers found convenient …"

You're right; we should be going back to the Lisp and Smalltalk machines of the 1980's.

In that sense, I've found PowerShell to be something of a letdown. I was hoping for something that was an ergonomic abstraction over NT's primitives (or Linux's) and over the CLR's primitives at the same time, in much the same way that Lisp worked on Lisp machines, but PowerShell is only frustratingly alright at both.

With the deprecation of the ISE, the debugging story is also something that's pulled into question. Is the VS Code extension the place to go now?

I wind up spending more time in F# because the whip-it-upitude story is a little better there, especially with the Jupyter notebook support, and the choice of using the same debugging and development tools I use for the rest of my tech stack is an obvious one.


I still find that PowerShell is the best we have of the Smalltalk/Lisp Machine kind of ideas, even if the execution could be better.

Yes, the VSCode extension is the official ISE replacement.

What keeps VSCode around as the only Electron app I tolerate on my own computers is the way Microsoft is pushing it for certain workloads.


"… even if the execution could be better."

The execution could be so, so much better.

I also think a lot about how some parts of PowerShell were inspired by CL on IBM i, and I find that I'm similarly frustrated when I can't just whack F4 in the middle of muddling through a command's parameters in order to be presented with a structured form interface for interactively filling out said parameters.

As much as I'd really like having Genera's hoverable and clickable command listener in PowerShell (like, imagine being able to do a `Get-ChildItem` and getting back an object display that allows you to left-click on an item to run `Get-Content` on it), I'd absolutely settle for the level of interactivity and hypertextuality offered by IBM i.


see, that's what i thought in 01998, but you may have noticed that google, facebook, aws, openai, github, android, ios, docker, instagram, slack, whatsapp, fastly, tesla, etc., all run on unix for some reason

so i guess fork and epoll and the racy unix file and socket and process apis are adequate, or anyway in some way not as fatally flawed as windows. a lot of those successful tech things of the last 25 years do involve some java or golang or erlang

stack overflow runs on windows tho

almost everything in my list runs on linux, mostly ubuntu. the exceptions are that ios runs on darwin and whatsapp runs on freebsd


I'd argue that it's Worse is Better (https://en.wikipedia.org/wiki/Worse_is_better) in action, together with the lack of cost.

The entry point is super low, both for companies and for individuals, and once a somewhat acceptable base is in place, people just build on top of it.

The quality bar is: "it shouldn't fall over on its own when left unattended", it's not super high.


Completely agree.

Also, now with an IDE, code completion is table stakes and we are moving toward LLM integration where the IDE will assist in coding.

The Unix pipeline model is not suitable for this, as you need persistent processes and low-latency two-way communication.

It is interesting to note that the Language Server Protocol, which is the biggest innovation in terms of scaling language completion across multiple languages and editors, came from Microsoft.

We are moving into a world where, although the code is stored as text, our programming tools understand and manipulate it at a deeper level than just a sequence of bytes, which is what the Unix tools do. A rename that uses the AST is much safer and more powerful than sed. Context-aware search is better than grep.


Expanding horizons = introducing longer pipelines of learning.

At this point you'll need 10 years just to get the basics down much less be competent across the stack.


People routinely become immediately productive in Visual Studio Code.

Contrast this with the potentially multi-decade journey of becoming proficient in something like vim or Emacs plus command line tools.

It is possible to improve on the old ways, from a productivity and ease of learning standpoint as well as from a standpoint of what's possible.


Your argument might hold water, if the standard for productivity was simply churning out code. The Linux/Unix world, already has tools, like nano, for such users.

The software world, might be better served, however, if programmers slowed down, and engaged their brains, before bashing keys. I’ve solved most of the difficult problems, in my programming life, in the bath - where I can allow my mind to wander, and come up with the best solution.


Oh, I don't mean around just IDEs, but really I have a ton of new guys on my team who can understand how APIs work, but not anything further down. Now, 90% of the time it doesn't matter, but anytime the problem is one layer lower, it falls apart. There are so many layers, when we should be reducing layers and simplifying.


This is how Mitchell Hashimoto (HashiCorp cofounder) develops.

His “IDE” is NixOS + Neovim.

https://sourcegraph.com/blog/dev-tool-time-mitchell-hashimot...


Related:

Unix as IDE (2012) - https://news.ycombinator.com/item?id=22438730 - Feb 2020 (103 comments)

Using Unix as an IDE - https://news.ycombinator.com/item?id=22392220 - Feb 2020 (2 comments)

Using Unix as an IDE (2012) - https://news.ycombinator.com/item?id=17359533 - June 2018 (54 comments)

Unix as IDE - https://news.ycombinator.com/item?id=12653028 - Oct 2016 (209 comments)

Using unix as your IDE - https://news.ycombinator.com/item?id=4105768 - June 2012 (127 comments)

Unix as IDE - https://news.ycombinator.com/item?id=3594098 - Feb 2012 (104 comments)

Unix as IDE: Files - https://news.ycombinator.com/item?id=3575565 - Feb 2012 (1 comment)


Programmer's Workbench (PWB/UNIX)

https://en.wikipedia.org/wiki/PWB/UNIX

While looking at the old Unix archives, it's easy to get the impression that Unix really is the ultimate development environment, by and for programmers. At least at its inception, before it started seeing more general use.


Tangent:

I've been programming for many years, and the old vi vs. Emacs insider-marker debate about which one "real" programmers use has been a constant for the entire duration.

(Because it is only one of those two. Any other is out of the question.)

Anyway, how much of this, quite frankly, non-issue-turned-important-geek-status-marker is *not* about cargo culting? The programmers back in the day, real or fake, had to use some editor, right? I'm sure they would have been happy to use any editor, as long as it was free.

The programming profession is, in my not so humble opinion, a commodity profession by now.

What I mean by commodity profession: Exactly what you need to know to be a web dev (to take a prominent example) is well-defined since long.

Caring about small things like this just corroborates my point.

I think it is time for all status-chasers (?) aged 20-something to stop caring about ridiculous geek markers such as debating terminal IDEs.


I can't remember where I heard this, but the idea is that every hobby X is secretly more like four sub-hobbies:

1. People who like to do X

2. People who like to tinker with the stuff around X

3. People who like to talk about doing X

4. People who like to talk about the stuff around X

For X you can substitute biking, cars, whatever, but for this discussion, X is "programming".

1. Some people like to program

2. Some people like to tinker with shells and editors and languages and frameworks

3. Some people like to talk about programming

4. Some people like to talk about the tools of programming

Of course people don't fall neatly into one of these buckets for a given hobby, but we certainly tend towards some subset of them. Your "status-chasers" may just be folks who tend towards bucket #4 when it comes to software stuff.


> I think it is time for all status-chasers

You're more or less just telling people they should socialize less.

Get back to me once you've convinced people to stop arguing Ford vs. Chevy or Dodgers vs. Giants or Makita vs. Dewalt or....


No, I'm saying that caring about VI vs Emacs is nothing else than high brow gatekeeping/in-group signaling.


You're right that it is in-group signaling[1]. But it is also socializing. Comparative-opinion jawboning passes time and serves as a social learning mechanism as well as an assortative signifier. That sort of thing is also, by volume, a huge fraction of interpersonal communication - much like gossip, it is a double-edged mechanism hard-wired into human behavior.

Like I said, let me know when your utopia has eliminated assortative signaling about sports, tools, accents, clothes, art, food, housing, hairstyles, schools, employers, sex, shoes, religion, cars, text-messaging background colors, music, politics... all of which is far more pervasive, and some of which has historically escalated into literal wars.

[1] I also personally agree that it is boring. I couldn't care less what editor people use unless they've configured it wrong in a way that messes up a repo I have to use, and even then only to the extent that I probably have to explain to them how to fix it.


> You're more or less just telling people they should socialize less.

Which is in my opinion not a bad idea ;-), but for many people, getting rid of the habit of socializing is about as hard as quitting smoking or alcohol.


Still the best IDE.


I've joked that I have a disintegrated development environment.


My IDE is a tiling window manager, neovim, and several terminals. There is a browser somewhere for reading documentation.


Since the author mentions RHIDE, it reminded me of Motor. I haven't used it, but it might be interesting for anyone interested in a 90s-style TUI IDE on Linux. It's written in C++ and uses classic ncurses.

https://blog.damnsoft.org/developing-on-a-pocket-chip-with-m...

Looking for new maintainer.

https://github.com/rofl0r/motor


I agree, UNIX is a great IDE, and even more, it is a good documentation system. I believe it was originally used for documentation purposes when it was first designed.


VS code is most importantly a window manager, actually three window managers - sidebar, panels and editor groups.


People love their integrated IDEs. And I don't want to spoil that happy feeling. But perhaps we should actually focus on making programming languages more human-friendly, so we do not need complicated tooling around them to make us feel productive.


OK. How?


I'm sure the answers to that will show that what your parent-comment "demands" is impossible to generalize.

We've once done a roundtable at our local developer meetup with this question. There were 20 people there. There were 20 different must-haves, requirements and ideas. We concluded that there's hardly a common denominator that isn't already present in most languages. And that all the other trade-offs in existing languages are warranted. Basically "it depends" :)


I strongly agree, and the fact that the poster made no suggestions suggests he had none to make, it just being a nebulous wish for 'better things'. Nothing terrible about that, but hardly actionable. Then again, if I was wrong I might've learned something useful from him. Very occasionally you do strike gold.



