Hacker News
Reverse.jar (tmorris.net)
50 points by henning on Nov 7, 2010 | 39 comments



[only tangentially related to the sentiment here but:]

There is this tendency (which I can sympathise with, but only to an extent) in mathematics and, to a lesser degree, in computer science.

"Let me solve the problem once, in as general and as abstract terms as possible. Leave the lesser minds to prove the corollaries, to apply the work. Let them take on the cognitive load of rephrasing their problems into my abstracted vocubulary in order to benefit from my vast insights and my generalised theorems."

But, crucially in software development, formal systems are designed for the human brain -- and not just individual brains but whole teams of them. Programs become as much a medium of communication between humans and other humans (of varying skillsets) as between humans and computers.

You can't escape UI work and UI considerations, I guess is the point :)


You can’t escape UI work in math, either. The comprehensibility of the notation system is crucial to forward progress. Which makes it unfortunate that a lot of mathematical notation ends up being unnecessarily arcane, taking ideas that make perfectly clear sense graphically or intuitively and mangling them into an impenetrable tangle. And I don’t think there are enough serious challenges to bad notation: if a notation has worked for a few people trying to prove things, others too often accept it and move on.

Some of the complexity may be necessary in the name of precision, but a lot of the time it's simply a design failure. Part of the problem may be that mathematicians don't seem to get any real training in notation design or abstraction communication. Theorem-proving exercises in school almost always stress rigor over clarity of explanation: proofs in homework are either correct or not, and what they test is students' comprehension of existing notation and previously developed abstractions; there is no exercise that tests students' ability to invent good new abstractions or notations.


I think this line of reasoning is why the C# team avoided the M word and introduced LINQ for a range of things (sequence comprehensions, reactive programming, etc.) and is introducing the async syntax for more stuff in C# 5, even though they could all be implemented with the same more general model.

[edit] the M word is Monad


I don't think there's anything inherently wrong with monads, it's just that nobody's figured out how to explain them yet :) This is somewhat compounded by the awful name and Haskell's less than practical character. My guess is that they'll turn up in a new language in a few years and everyone will wonder what they did without them.


I would agree. Also, the LINQ interface is the monadic interface, with Bind / >>= renamed to SelectMany and return renamed to ToIdentity. You can use these directly or with syntactic sugar, so I would suggest that they have already turned up in a language.

Anecdotally, I think Microsoft flubbed the introduction by overemphasizing LINQ to SQL. This left a lot of developers I know thinking it was a one-trick pony and never learning (for instance) LINQ over IEnumerable. Those who have since taken the time to learn the non-SQL LINQ features do wonder how they lived without them.
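
[A minimal Haskell sketch of the correspondence described above, using the list monad; selectMany and toIdentity are illustrative names here, not real library functions:]

    -- For lists, (>>=) is flip concatMap, which is what LINQ calls SelectMany,
    -- and return wraps a value in a singleton list (the "ToIdentity" above).
    selectMany :: [a] -> (a -> [b]) -> [b]
    selectMany xs f = xs >>= f          -- same as concatMap f xs

    toIdentity :: a -> [a]
    toIdentity x = return x             -- same as [x]

    pairs :: [(Int, Char)]
    pairs = [1, 2] `selectMany` \n ->
            "ab"   `selectMany` \c ->
            toIdentity (n, c)           -- [(1,'a'),(1,'b'),(2,'a'),(2,'b')]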


Don't people argue that jQuery is DOM manipulation in monads? I sure ask myself what I did before jQuery :)

http://importantshock.wordpress.com/2009/01/18/jquery-is-a-m...

HN discussion: http://news.ycombinator.com/item?id=439116


People argue this, but not people who know what a monad is.


Care to clarify, or do you prefer just telling people they're wrong? The linked article is convincing on the surface, but I have never touched Haskell.


I've spent more time explaining monads to HN than I have anything else. It comes up at least twice a day and I am tired of writing an explanation twice a day.

Go read the Typeclassopedia.


I think there's also the issue that they're a lot less convenient without built-in syntax (or macros). Haskell's do-notation makes >> look like a newline and >>= look like an assignment. Without this syntactic sugar, monadic code rarely looks like an elegant solution.
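
[For reference, a rough sketch of the desugaring being described, using Maybe purely as an example monad:]

    -- do-notation: >> reads like a line break, the left arrow like assignment
    greet :: Maybe String
    greet = do
        name <- Just "world"
        Just ()                          -- some intermediate effect
        return ("hello " ++ name)

    -- roughly what the compiler desugars it to
    greet' :: Maybe String
    greet' =
        Just "world" >>= \name ->
        Just ()      >>
        return ("hello " ++ name)

Both evaluate to Just "hello world".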


> Without this syntactic sugar, monadic code rarely looks like an elegant solution.

I'm not sure this is widely agreed-upon. Here's what a program composed of pure functions looks like:

    quux = baz . bar . foo
This is a function of one argument that filters the input through foo, then bar, then baz. Just like UNIX pipes, but backwards.

Now let's try it with monads:

    quux = baz <=< bar <=< foo
Same thing; this is a function of one argument, one that passes that argument through foo, then bar, then baz (<=< is Kleisli composition, from Control.Monad).

>>= notation is quite elegant. do notation just makes functional programming look imperative.
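
[A concrete, purely illustrative example with Maybe, where each step can fail; the function names are made up:]

    import Control.Monad ((<=<))
    import Text.Read (readMaybe)

    parse :: String -> Maybe Int
    parse = readMaybe

    halve :: Int -> Maybe Int
    halve n = if even n then Just (n `div` 2) else Nothing

    describe :: Int -> Maybe String
    describe n = Just ("half is " ++ show n)

    quux :: String -> Maybe String
    quux = describe <=< halve <=< parse   -- quux "12" == Just "half is 6"
                                          -- quux "7"  == Nothing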


A monad is simply a monoid in the category of Haskell endofunctors.


The site seems to be overwhelmed. Google cache: http://webcache.googleusercontent.com/search?q=cache:16XHnrP...


Text-only version of cache (Full version also isn't loading) http://webcache.googleusercontent.com/search?q=cache:16XHnrP...


This happens, but sometimes the academic doesn't make clear the breadth of immediate utility of what they intend.


Part of the problem is that people using said languages may not see that those problems have anything in common. When somebody comes along and says, "Dude, you can automatically handle all of those edge cases at once, with one general rule!" it sounds like a bunch of mathematical jibber-jabber.

(The fact that it often is mathematical jibber-jabber doesn't help their cause, though.)


Early implementations in research languages often don't actually simplify programming, even if in principle they're a good idea, which may contribute to resistance. A and B might really both be instances of some more general abstraction C, but that does you no good if specifying abstraction C is a pain in the ass, and instantiating it to make A and B each happen is tedious and complex as well.


IMHO the difficult part is seeing the commonality of those edge cases. That requires firstly knowing about the edge cases, and secondly seeing the problem (the commonality of those cases). The final step is a solution, which is where the maths comes in (or might).

Anecdotally, it seems fairly common that maths is independently re-invented by people applying it - famously for relativity, IIRC. That's because the maths guys don't actually know about the applications.

It's a truism for our industry that if a new approach really is significantly better (e.g. 10x) in practice, it will be adopted. You don't need to convince people; you just beat them. OTOH, there's a common wish to over-automate: to spend a week saving a second, only for it to turn out not to handle the very next case. So some people don't like to use frameworks because they are too constricting (they don't handle all the cases in practice); and some (Alan Kay) even say that if you can build your own infrastructure, you should.

Mathematical ideas usually only work on their own assumptions - a difficult part is matching those assumptions to an application. Though maybe this isn't a problem for the generics example.

There are also incidental practical issues, like the need to ship and then to preserve backward compatibility, resulting in a current Java implementation that can't express List<Circle|Rect> (you need an explicit Shape interface/superclass). The ML family has proper algebraic data types, but does C# do it properly? I don't know.
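
[For comparison, a sketch of the algebraic-data-type version in Haskell; Circle and Rect are just illustrative constructors:]

    -- a closed sum type: a list of shapes needs no separate Shape interface
    data Shape = Circle Double          -- radius
               | Rect   Double Double   -- width, height

    area :: Shape -> Double
    area (Circle r) = pi * r * r
    area (Rect w h) = w * h

    shapes :: [Shape]
    shapes = [Circle 1.0, Rect 2.0 3.0]   -- the List<Circle|Rect> above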


> That's because the maths guys don't actually know about the applications.

That's often a big part of it: Someone who isn't a day-to-day user of something sees an issue in it and recognizes that it could be done in a cleaner way, but explains the solution in their own terms rather than in the local language.


The article seems almost a direct response to the "A statically typed language I'd actually want to use" article: http://news.ycombinator.com/item?id=1872501

I agree wholeheartedly with the general sentiment, even if it is a bit outdated: type polymorphism (generics) is now well accepted.


It's a balance. With overly simplistic languages, one ends up with too much code and repetition. However, when languages get more, well, 'advanced', it gets harder to hire people who can work with them. Sure, you need fewer people, but they're harder to find.


In reality, the obstacle is in realizing the need for a list.reverse library, rather than in making it decent.


You know, a lot of non-academic languages are pretty crappy... like Ruby, or PHP. But I'm not impressed with languages out of academia either. I don't care about your -morphisms or fancy type systems...


You know, creationism and demon possession are both quite crappy. But I'm not impressed with biology out of academia either. I don't care about your evolution or fancy germ theory...


Your argument is so compelling. Now I see how wrong I've been. I'll just accept everything the ivory tower says from now on, on any topic, unconditionally. Brilliant.


Reminds me of something I read on http://skeptoid.com/episodes/4217 :

Appeal to Lack of Authority

Authority has a reputation for being corrupt and inflexible, and this stereotype has been leveraged by some who assert that their own lack of authority somehow makes them a better authority.

Starling might say of the 9/11 attacks: "Every reputable structural engineer understands how fire caused the Twin Towers to collapse."

Bombo can reply: "I'm not an expert in engineering or anything, I'm just a regular guy asking questions."

Starling: "We should listen to what the people who know what they're talking about have to say."

Bombo: "Someone needs to stand up to these experts."

The idea that not knowing what you're talking about somehow makes you heroic or more reliable is incorrect. More likely, your lack of expertise simply makes you wrong.


> The idea that not knowing what you're talking about somehow makes you heroic or more reliable is incorrect

That's a really good definition of anti-intellectualism. It's easy to sneer at academic languages (and easy to hate them; just observe a college frosh/soph struggling through Scheme or Haskell), but things we now take for granted (e.g., garbage collection, virtual machines, IDEs, object orientation, templates/generics) were all, even recently, considered academic.


The only struggle with Scheme is falling asleep.


Maybe next year, if you keep up the good work, you'll be reading about Appeal to Some Website.


If you've taken the time to learn Lisp and it hasn't impressed you even a little bit, you're made of more stubborn stuff than most programmers.

But most of the "non-academic" languages wind up cribbing from the academic languages anyway. For example, Python is very much a Working Man's language, but it pervasively uses not only garbage collection (from Lisp), but also list comprehensions borrowed directly from Haskell!
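
[For reference, a Haskell list comprehension of the sort Python's was modelled on; just an illustrative snippet:]

    -- squares of the even numbers from 1 to 10
    evenSquares :: [Int]
    evenSquares = [ x * x | x <- [1 .. 10], even x ]   -- [4,16,36,64,100]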


I'm impressed with lambda calculus as a mathematical formalism and with its equivalence to TMs, and I am impressed but not overly enamored with homoiconicity & metaprogramming/macros. I like conditionals. Not a fan of marketing minimal syntax while having special forms. Not a fan of cons as a building block for everything. Not a big believer in garbage collection. And certainly not a believer that Lisp is a practical programming language for much of anything these days.

But don't let me stop you from caaaaadring away!


> Not a big believer in garbage collection.

This makes me seriously wonder whether you're trolling. Out of purest and most innocent curiosity, what sort of programming do you do?


I am mostly doing programming in languages with GC, which in this case is fine because it's OK for it to go, you know, kind of slow.


Often, manual allocation is even slower.

http://lambda-the-ultimate.org/node/2552


> With only three times as much memory, it runs on average 17% slower than explicit memory management. However, with only twice as much memory, garbage collection degrades performance by nearly 70%. When physical memory is scarce, paging causes garbage collection to run an order of magnitude slower than explicit memory management.

I don't see how this article is helping your point; that's a pretty massive hit. Even worse, this is a GC-style program modified to use explicit malloc/free. Programming with manual memory management encourages a very different style, where you try to use malloc/free as little as possible and keep memory for various things in contiguous chunks.

EDIT:

This comment is interesting: http://lambda-the-ultimate.org/node/2552#comment-38915


They may have had Appel's "Garbage Collection Can Be Faster Than Stack Allocation" (http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.39.8...) in mind.

The technique described in that comment (forking processes with finite lifetimes, allocating but not freeing, terminating and letting the OS clean up all at once) is also what Erlang does, and it seems to work pretty well in practice. Each process has its own arena, for allocation purposes.


I think it's the discussion that we're meant to read, not just the OP. Most of the comments are pointing out questionable assumptions in the study (e.g. even though they admit that actually changing the program to use manual memory management would be nearly intractable, they assume that essentially running an AOT garbage collector is equivalent).


No no, he's an expert programmer that can do repetitive tasks better than a computer.


I disagree. It's true that a lot of programming language research is impractical in the real world, but occasionally it produces gems. I'd rather have the gems than nothing at all.

The real danger is not the languages, but their proponents. Academia can't make you use Haskell, but other people can.



