Hacker News new | past | comments | ask | show | jobs | submit | tikej's comments login

What kind of Emacs distribution is in the screenshot?


It's a new theme and/or distribution I'm working on under the working title `hyper-modern`. It clearly takes inspiration from things like Doom (and I'm using their modeline, which is hard to improve on), but it's mostly ground-up and AI-first.

It's heavily integrated with my custom model server and such, and I'm slowly getting it integrated with other leading tools (VS Code, Neovim, and so on).

I plan to MIT it all once it's at a reasonable RC. If I get there it will be available at `https://hyper-modern.ai` and `https://github.com/hyper-modern-ai`.

Thanks for asking!


The 128-bit arithmetic part is really interesting. I wonder whether they used libquadmath and whether they needed to implement any extra functions beyond what's available there.


My favorite thing on the TI SR-50 was to calculate 69! ... It was the largest number the two-digit exponent could hold and also the slowest calculation. I also discovered the one hidden digit of precision (13 digits) by subtracting known digits of pi from the pi constant to reveal the invisible digit in scientific notation ...
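The overflow boundary is easy to check: 69! is the last factorial whose decimal exponent still fits in two digits, while 70! crosses into e100. A quick sketch in Python:

```python
import math

# 69! is about 1.7e98, the largest factorial whose scientific-notation
# exponent fits in two decimal digits; 70! needs a three-digit exponent.
f69 = math.factorial(69)
f70 = math.factorial(70)

exp69 = len(str(f69)) - 1  # decimal exponent of the number in scientific notation
exp70 = len(str(f70)) - 1

print(exp69)  # 98  -> fits a 2-digit-exponent display
print(exp70)  # 100 -> overflows it
```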


In the process of “shut up and calculate” (SUAC) you often come upon much deeper insights about the involved processes and quantities than you ever could with wordy and “understandable” explanations. Of course simple, explainable models are great, but they have limited ability to describe the world, which is much more complicated. I don’t see how successfully handling this complexity (and getting the successful predictions) could take away from understanding.

It’s not like those SUAC disciplines don’t have simple, explainable models. In my opinion it’s just that, in order to describe phenomena really accurately, or when the phenomena are complicated enough, the only way to get reliable and improvable predictions is via complicated SUAC-type calculations.


My point is not that there are no insights coming from SUAC, but rather that explanations are not a required part of science: predictions work without them, and it is the accuracy of predictions, and the making of new ones, that moves science forward, with or without explanations.


I don’t have a full explanation of the connection that would be satisfying. However, I think it mainly comes down to the symmetry group of spinors and their 720-degree symmetry. It is called the plate trick for a reason, after all: https://en.m.wikipedia.org/wiki/Plate_trick
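The 720-degree symmetry can be made concrete with unit quaternions, the simplest spinor-like representation of rotations: one full 360° turn maps the quaternion to its negative, and only a 720° turn returns it to the identity. A minimal stdlib sketch (the function names here are my own):

```python
import math

def qmul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rot_z(deg):
    # unit quaternion for a rotation by `deg` degrees about the z axis;
    # note the half angle: this is exactly where the double cover comes from
    half = math.radians(deg) / 2
    return (math.cos(half), 0.0, 0.0, math.sin(half))

full = rot_z(360)          # one full turn
double = qmul(full, full)  # two full turns

print(full)    # ~(-1, 0, 0, 0): NOT the identity quaternion
print(double)  # ~(+1, 0, 0, 0): identity again only after 720 degrees
```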


I find Julia’s Pluto to be a reasonably close alternative.

Of course it lacks many features, but the plugin system and overall architecture give hope that it can be extended to include many of the desired features.


This is of course very true, but it doesn’t take into account the invention of the computer, which is relatively recent.

Since it in principle never makes mistakes (in practice there are of course bugs, but they are usually different in nature from human errors), it changes what is possible and most convenient. You no longer have to optimise so heavily for simplicity, for example. On the other hand, computers basically can’t deal with ambiguity, so the rules and statements have to be stated very simply and clearly.

EDIT: One example that comes to mind is indices in functions. Usually they are just additional arguments that differ somehow from the “main” arguments, for example by often being non-negative integers. For humans it is easier to think about and operate on indices separately from the rest of the arguments. But for the computer it’s all the same: all arguments are treated just as arguments (depending on the implementation, of course), and there is no need to treat indices separately, since every argument is equally “special”.
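The index example can be sketched in code: what a human might typeset as a_n(x), with the index n set apart from the “main” argument x, is to a computer just a two-argument function. The particular family chosen here is hypothetical, purely for illustration:

```python
# A human-oriented notation might write a_n(x): the index n is typeset
# apart from the argument x. To a computer both are plain arguments.
def a(n, x):
    # hypothetical indexed family of functions: the n-th power scaled by n
    return n * x ** n

# "a_3(2)" and "a(3, 2)" are the same call; neither argument is special.
print(a(3, 2))  # 24
```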

I believe computers can change the landscape of what the best notation is. This is an interesting, interdisciplinary topic to explore.


I think APL was originally created as a fix for this problem: a completely revamped math notation made to fit computers as a medium, instead of pen and paper.


If it was created thus then it could hardly be considered successful. To all but the dedicated and obsessive cognoscenti APL is nothing but gobbledygook.


Chess programming is fascinating, but it seems to me that it has somewhat stalled in creativity when it comes to classical (non-DNN) engines. They all seem to use improved versions of minimax/alpha-beta, which results in computer players that are extremely powerful but also quite dull, because they assume that the opponent will play a perfect game at the full strength of the engine.

Basically, there is no point in playing against the engines at full strength as a human, since they will beat almost any human, even beyond grandmaster level. And that’s the weaker engines – the medium-tier ones will regularly beat grandmasters, and the top ones around the Stockfish tier will probably not lose one game in a hundred.

With that in mind, I’ve wondered what would be possible if you dropped the assumption of perfect play by the opponent: some probabilistic model of the opponent, or a different way of rating the whole tree of moves in context, instead of the position only. For example, considering the alternative moves and how difficult it would be for the opponent to find the perfect continuation, it might be better to play a combination with a lot of traps, accepting the possibility of losing some position if the opponent plays perfectly. Current engines wouldn’t play that kind of move, because they’d assume (via minimax/alpha-beta) that the opponent has the perfect answer. Of course, if the opponent is an engine it will, but against humans it would make things much more interesting.
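One way to sketch the idea of dropping the perfect-play assumption: score a move not by its minimax value but by its expected value against an opponent who finds the best reply only with some probability p, and otherwise plays like a random mover. Everything here is a toy model of my own, not how any real engine works; the game tree is just nested lists of leaf scores:

```python
# Toy comparison of pure minimax vs an "imperfect opponent" evaluation.
# A node is either a numeric leaf score (from our point of view) or a
# list of child nodes with the opponent to move. All names hypothetical.

def minimax_value(node):
    if isinstance(node, (int, float)):
        return node
    # perfect opponent: always picks the reply worst for us
    return min(minimax_value(child) for child in node)

def expected_value(node, p):
    if isinstance(node, (int, float)):
        return node
    values = [expected_value(child, p) for child in node]
    best_for_opponent = min(values)
    avg = sum(values) / len(values)
    # with probability p the opponent finds the refutation,
    # otherwise they pick a uniformly random reply
    return p * best_for_opponent + (1 - p) * avg

# A "trappy" move: bad if refuted (-1), winning otherwise (+9, +9).
trappy = [-1, 9, 9]
# A "safe" move: solid against every reply.
safe = [2, 3, 2]

# Minimax prefers the safe move; against an opponent who refutes the
# trap only 30% of the time, the trappy move scores higher on average.
print(minimax_value(trappy), minimax_value(safe))
print(expected_value(trappy, p=0.3), expected_value(safe, p=0.3))
```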


You might find https://www.youtube.com/watch?v=DpXy041BIlA interesting. He creates a tournament of "imperfect" engines.


https://boristrapsky.com (https://lichess.org/@/Boris-Trapsky) plays against a typical, rather than perfect-playing, opponent but I could not find any source code.


Mathpix is a great product that perfectly fills its niche and I’m very happy it exists.

I wish an open source alternative existed. Even a much less accurate one would be quite useful. I know latexify, but it does only single symbols.

Maybe it will be implemented as an extra language in tesseract someday.


This is an open source alternative that works quite well https://github.com/lukas-blecher/LaTeX-OCR


Wow, thanks! I had no idea it existed. I searched for something like it not long ago.


I absolutely admire the creativity and skill it takes to pursue the development of a novel branch/framework of mathematics like this Super Calculus. I think such original thinking at the fundamentals of the mathematical formulations that underlie physical theories is required to overcome at least some of the difficulties modern physics struggles with. The problem is that it's nearly impossible to tell which one (or even which combination) is the right one.

The process of creating one is so long and difficult that there often isn't enough time to pursue and find applications in which these novel mathematics could prove superior to the existing ones. This creates a chicken-and-egg problem: there are no arguments strong enough for practitioners of physics to switch to these new formulations (as they take a lot of practice and time, with no guarantee of being any better than classical mathematics), and, on the other hand, the creators of these theories don't have enough time/audience/manpower/practice in practical calculations to get as far as where the problems of modern physics lie.

I've been thinking about this problem for some time, and I think the right kind of design for symbolic manipulation and calculation software could be of great help. Designing such software is certainly not an easy task, but hopefully somewhere along the road I, and hopefully others, will find the time and creativity to get started on it.


I second this thinking. You may be interested in reading about Geometric Algebra[1] and Infinitesimal Calculus[2], as two other alternative frameworks for the symbolic tooling used in the physical sciences. Together with symbolic manipulation and calculation software, I also think alternative, simpler symbolic frameworks may help solve what I call the "receding shoulders climbing problem". There's a phrase attributed to Newton: "If I have seen further it is by standing on the shoulders of Giants". The problem today is that the giants have become so big that one can spend many years studying and still die before getting anywhere near their shoulders.

[1] https://arxiv.org/abs/1205.5935
[2] https://en.wikipedia.org/wiki/Nonstandard_calculus
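A taste of why geometric algebra is appealing as a unifying framework: in just two dimensions, the geometric product already contains the complex numbers, since the unit bivector e1e2 squares to -1, and multiplying by it rotates vectors by a quarter turn. A minimal toy implementation (my own sketch, not any library's API), representing a multivector as (scalar, e1, e2, e12) with e1² = e2² = 1 and e1e2 = -e2e1:

```python
def gp(a, b):
    # geometric product of 2D multivectors (scalar, e1, e2, e12),
    # using e1*e1 = e2*e2 = 1, e1*e2 = e12 = -e2*e1
    s, x, y, B = a
    t, u, v, C = b
    return (s*t + x*u + y*v - B*C,
            s*u + x*t - y*C + B*v,
            s*v + y*t + x*C - B*u,
            s*C + B*t + x*v - y*u)

e1  = (0, 1, 0, 0)
e12 = (0, 0, 0, 1)

print(gp(e12, e12))  # (-1, 0, 0, 0): the unit bivector squares to -1
print(gp(e1, e12))   # (0, 0, 1, 0):  e1 times e12 is e2, a quarter turn
```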


Yes! Those are exactly some of the alternative frameworks I had in mind. Also, things like using differential algebra instead of classical epsilon-delta analysis, or using Liouville's theorem to calculate integrals, give similar vibes.

Large parts of constructive mathematics also seem well aligned with physics and engineering (using only things that can be explicitly constructed) and could hopefully lead to interesting mathematical physics results. The same goes for, for example, quaternion analysis, which is very rarely used (in contrast to complex analysis) due to the difficulties of operating on quaternion-valued functions, or computable analysis with its surprising results about the differentiation of real and complex functions. I get similar feelings about non-classical logics. Somewhat more fringe examples that come to mind, but are also interesting, are non-Diophantine arithmetics and the theory of holors (tensor generalisations).

Thanks for the "receding shoulders climbing problem": it's a very nice way to frame the problem. I agree there is a lot to be done to make the shoulder-climbing process faster, easier and more widely available. It should also be possible to make it easier by, for example, storing not only the results but also the derivations of many mathematical relations and computations. E.g. nowadays there is no reason to put only the results of integration in integration tables; the whole derivations should be available as supplementary materials (preferably even in some form of symbolic computing code; this is partly realised with RUBI, the rule-based integration package). The size of a paper book is no longer the constraint, so such “interactive derivation catalogues” should be extended to as many branches of mathematics, physics and engineering as possible.
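A tiny illustration of the idea of machine-checkable tables: store each integration-table entry together with its claimed antiderivative, so every entry can be verified mechanically (here by a numerical derivative check) instead of being trusted on faith. The table structure and names are hypothetical, stdlib only:

```python
import math

# Hypothetical "checkable integration table": each entry carries the
# integrand together with its claimed antiderivative F, so the claim
# F' = f can be verified mechanically.
TABLE = [
    {"integrand": lambda x: math.cos(x), "antiderivative": lambda x: math.sin(x)},
    {"integrand": lambda x: 1.0 / x,     "antiderivative": lambda x: math.log(x)},
    {"integrand": lambda x: 2.0 * x,     "antiderivative": lambda x: x * x},
]

def check_entry(entry, x, h=1e-6):
    # central finite difference of the antiderivative vs the integrand
    F, f = entry["antiderivative"], entry["integrand"]
    approx = (F(x + h) - F(x - h)) / (2 * h)
    return abs(approx - f(x)) < 1e-4

results = [check_entry(e, x=1.5) for e in TABLE]
print(results)  # expect every entry to verify at the sample point
```

A real catalogue would of course store symbolic derivation steps rather than a numeric spot check, but the principle is the same: entries that carry their own evidence.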


Your comment reminds me of an old math professor of mine who told an anecdote about some famous mathematician who used to have a very promising student. After a while the student left, and a colleague of the famous mathematician asked about him. The mathematician said: "He lacked creativity, so he left mathematics and became a writer."


That “some famous mathematician” was David Hilbert. When he heard that one of his students had dropped out to study poetry, he said, “Good, he did not have enough imagination to become a mathematician.”


In Europe this is required for European grants, but also by many country-specific funding agencies.

In 2018, 11 European research-funding organizations formed what is known as cOAlition S, with the primary objective of ensuring full, immediate, open access to all publications containing research data obtained in projects funded by its member agencies. The chief premises of cOAlition S were laid down in Plan S, which is scheduled for implementation in June 2020. So nowadays, when someone wins a grant from these agencies, they're required to publish their results in open access.

It’s not ideal, because many people reserve grant money for open access fees, but AFAIK the policy also allows publishing behind a paywall as long as you provide a publicly available copy in a repository (institutional, or public like arXiv). In my opinion the second option is much better (as it saves money), but not all journals allow putting a copy of the article in public repos (sometimes they require that the public version predate the editor’s revisions, or they put a time embargo, like six months or a year, on publishing the final version of the article).

