xuejie's comments

Just providing a not-related-at-all but IMHO still interesting case: I used to have a Kioxia CM6 U.2 SSD drive that would pass all sorts of benchmarks the reseller was willing to run, but whenever I tried to clean-compile Rust on it, the drive would almost certainly fail somewhere in the build process. While there are configurations where you can compile Rust against a pre-built LLVM, in my tests I was compiling LLVM along the way. So I can agree with the comment here that there might be some unique property to multi-core compilations, though my tests point to a potentially faulty drive, while the above comment is about an Intel CPU.


I've been using micro as my main code editor (well, I do use vscode for writing coq, but that's the only exception) after 10+ years with emacs. I simply treat micro as the modern, compromised version of acme. It has almost all the features needed to support the core idea of acme, and I have written a plugin to explore this direction: https://github.com/xxuejie/micro-acme. So far it has been working perfectly for me.


I keep trying stripped down terminal editors as alternatives to emacs -- because these days I use like 1% of emacs anyways -- but I always severely miss the common emacs major modes for C/C++ etc and especially its approach to indentation there (tab to force correct indent). I really wish other editors would follow this convention.


> I really wish other editors would follow this convention.

In editors with configurable keyboard shortcuts, you can bind this to tab yourself. For example, in Vim, the normal-mode command `==` would correctly reindent the current line. (Here, `=` is an operator, so you can also apply it to other regions like `=i{` to reindent correctly everything inside braces.)

To bind tab to reindent correctly in normal and visual mode, you can put this in your vimrc:

    nnoremap <tab> ==
    vnoremap <tab> =
You can also remap it in insert mode if you want, but then you might want to throw some if-statements in there to determine when to indent and when to reindent :)


Knowing nothing about micro, is there a reason why development has slowed down over the last 1-2 years?

https://github.com/zyedidia/micro/tags


A big reason is that I started working towards a PhD recently, and so I've been more focused on that. I think micro has also reached a relatively stable spot, where it would only be significantly improved by some large new features. It is serving its purpose well as a simple/familiar terminal-based editor. I have plans to give it more love and release a version 3, but no timeframe.


Like other comments here, I just want to say thank you for this amazing editor! It has been my primary code editor for the past few years. I do use a few different editors for specific purposes (vscode for interactively proving coq code, joe/nvi for editing super large files), but micro provides what I need in 90% of cases :P


I recognize that username :) Thank you so much for micro, it's singlehandedly the best terminal editor I've ever used, and I use it every day!


Thank you so much for micro! I've been using it every day for a year now and I'm a fan. I'm really glad you managed to bring it to a stable spot. Congratulations also on keeping it that way and not falling into 'a vim mode could be great' as some users requested.


As a faithful user of micro for the last couple of years, just know that it's one of the first things I install on a new system, along with the fish shell, especially on servers I have to manage via console. Thank you so much for your work!


Hi! Great editor!

It could use a fix for saving hidden files on Windows :-(


Because it's a text editor that's already fully capable of editing text. Not everything needs to constantly evolve and add features at a break-neck pace.


Emacs, which is ~40 years old, still regularly gets multiple updates per year.

https://www.gnu.org/software/emacs/history.html


You can order pizza and get a quote for a mortgage within emacs


Oh yeah! Good ol' C-x M-x M-pepperoni-fixed-rate...


Same here. Went from Emacs to Acme, while using mg in the terminal. Now I’ve customised Micro to get that Acme feel.


Also a fan of structural regular expressions, and in the exact same shoes: every time I try to use them, it's an unpleasant experience. Maybe I'm just not good enough at it.

That being said, I've recently been employing comby (https://comby.dev/) in my workflow, which solves a similar problem but understands certain languages, simplifying the usage.


> The Raspberry Pi 4 is THE cheapest 2Gflops/W computer ever made and probably that will ever be made in the future too!

Can you expand on this one? I was curious why you think there probably will not be a cheaper one in the future with similar or better specs.


Reminds me of the first version of Redis in tcl: https://gist.github.com/antirez/6ca04dd191bdb82aad9fb241013e...


That is really cool to read! I love seeing the early seeds of big projects!


Man, that's so cool thanks for sharing !


IMO such a category has existed for quite a long time; the JVM and .NET both fall into it. Yes, WASM has its fame from becoming a browser standard, but fundamentally there is very little difference between WASM and the JVM.


The JVM has objects as the basic building block. For WASM it's linear memory (or memories), so it's lower level, and it was created from the start to support languages like C/C++. .NET indeed looks quite similar, given that it supports C++ compilation, though I'm not familiar enough with its compilation model to draw comparisons.


Personally, I already pulled the trigger 2 years ago when I first started writing Rust professionally. A laptop just becomes far tooooo noisy compiling Rust code. These days I use 3 machines:

* A fanless Chromebook with decent screen for travel use

* An Intel NUC that is hooked to a big monitor, which is also the device I'm typing this on

* A beefy Ryzen desktop that sits in a corner of my balcony, which I usually connect to via ssh to perform all the heavy tasks

To me I'm getting all the benefits of each computer, and the combined cost is still less than a so-called macbook pro :)


> I feel like the plethora of (partially incompatible) extensions make the language very complicated and messy. There is no single Haskell. Each file can be GHC Haskell with OverloadedStrings or GADTs or ....

Don't macros, especially proc macros in Rust these days, have the same effect? Personally I feel like this is a tradeoff every language has to play with: you either limit yourself to one fixed way of writing code, or add some sort of ad-hoc system that enables rewriting syntax and, to a degree, even semantics.


Anecdotally, proc macros just aren't that common. Almost every Haskell tutorial I read introduces language extensions, and it seems like many users have a set of extensions that they always enable by default. I don't think proc macros are really comparable in that sense, although maybe they will be in the future?


Overly powerful proc-macros aren't in common use; most common proc-macros either derive a trait automatically, or serve as attributes on methods or functions and perform some transformation of the source code (without introducing a DSL).
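As a quick illustration (my own example, not from the thread): the overwhelmingly common proc-macro usage is just deriving standard traits mechanically, with no new syntax introduced:

```rust
// Typical proc-macro usage in Rust: `Debug`, `Clone` and `PartialEq`
// here are derive macros, expanded at compile time into ordinary
// trait impls. No DSL, no syntax rewriting.
#[derive(Debug, Clone, PartialEq)]
struct Point {
    x: i32,
    y: i32,
}

fn main() {
    let p = Point { x: 1, y: 2 };
    let q = p.clone();
    // The derived Debug impl gives a structured printout.
    println!("{:?}", q); // Point { x: 1, y: 2 }
    assert_eq!(p, q);
}
```

Compare this with a Haskell `{-# LANGUAGE ... #-}` pragma, which can change how an entire file is parsed or type-checked: the derive macro's effect is local to the annotated item.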


The author first goes with:

> Idris 1 is implemented in Haskell, but that has little (if anything) to do with the difference.

But later they also go on to say:

> Idris 2 benefits from a robust, well-engineered and optimised run time system, by compiling to Chez Scheme.

I must say I'm slightly confused here. Yes, a rewrite might enable avoiding all the legacy parts that might slow down the code, but it is also possible that a new language and a new runtime could enable optimizations that were not possible before. The author did mention that Chez's profiling tools helped a lot in the rewrite. So I was curious: is it really true that we cannot attribute some part of the speedup to language differences?

I was also interested in the rationale behind using Scheme to replace Haskell, but I failed to find the reasoning behind this; can anyone shed some light on it?


That's not relevant at all. I think you have misunderstood the situation here. In Idris 1 the compiler and code generator are in Haskell, but the runtime is in C. So Scheme isn't exactly replacing Haskell but replacing C, for the runtime. (What replaced Haskell was Idris itself, by virtue of being self-hosting.)

The author even says that it's difficult to write C that deals well with the functional style of lots of little functions, and this is a problem Scheme also has and has solved. That's enough rationale to switch to Scheme:

> Generating good C code corresponding to a functional program which might have lots of small (and higher order) definitions is extremely challenging, so it is better to take advantage of someone else's work here.


Scheme isn't used to replace Haskell. Idris 1 is written in Haskell and compiled by default to C (it comes with a JavaScript backend too, and it has an API for making others). Idris 2 is written in Idris 2 and so far compiles to Scheme (Chez by default; there is also support for Racket and Gambit). It's not tied to Scheme, but there is no pluggable backend functionality yet.

So Scheme is replacing C, and Idris is replacing Haskell. The goal was self-hosting by writing the compiler in Idris. Chez Scheme is a nice compilation target because it's performant and Scheme is a sugared lambda calculus that goes well together with FP languages.


Thanks, guys, for the explanation! I had truly misunderstood this post.


I used to have a similar opinion, that it requires a special language to leverage such power. But recently I've discovered that Rust, with the latest advancements in its async/await design together with swappable runtimes, can provide a decent story for building deterministic tests like those described in the video. Yes, of course one still needs to mock all the IO as well as the clock, but the deterministic core is already there, and with a carefully designed runtime and surrounding libraries, we might have pretty good support in Rust for this.

Disclaimer: I'm no Rust zealot. I do understand Rust has its tradeoffs, and it's not a panacea for every problem. In fact I do have side projects that are written in Go and perfectly suited to it. I'm just saying Rust's design turns out to be suitable for such a deterministic testing structure, and systems programming, Rust's target domain, can also benefit A LOT from this style of testing.
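To make the "deterministic core" point concrete, here is a minimal sketch (my own, std-only, not taken from any particular runtime): a single-threaded executor that polls a future to completion with no threads, no timers, and no I/O, so every run takes exactly the same path. A real deterministic-testing runtime would add a mock clock and mock I/O on top of this idea.

```rust
use std::future::Future;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// A waker that does nothing: with a single task polled in a loop,
// there is nothing to wake up.
fn noop_waker() -> Waker {
    fn raw() -> RawWaker {
        fn clone(_: *const ()) -> RawWaker {
            raw()
        }
        fn no_op(_: *const ()) {}
        static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, no_op, no_op, no_op);
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    // SAFETY: the vtable functions never dereference the data pointer.
    unsafe { Waker::from_raw(raw()) }
}

// Drive a future to completion on the current thread. Since no other
// threads, clocks, or I/O sources are involved, execution order is
// fully deterministic across runs.
fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = Box::pin(fut);
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);
    loop {
        if let Poll::Ready(out) = fut.as_mut().poll(&mut cx) {
            return out;
        }
    }
}

async fn add(a: u64, b: u64) -> u64 {
    a + b
}

fn main() {
    // Every run of this "async" program is identical.
    let sum = block_on(async { add(1, 2).await + add(3, 4).await });
    println!("{}", sum); // 10
}
```

The swappable-runtime point is exactly this: because Rust futures are inert state machines driven by whatever executor polls them, the same application code can run under tokio in production and under a deterministic single-threaded executor like the one above in tests.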

And also a shameless plug: I have some initial work exploring this area: https://github.com/xxuejie/diviner. It's still quite rough and a lot of work is needed, but I do believe this is something worth exploring.

