Hacker News new | past | comments | ask | show | jobs | submit login

This is where the Ruby and Python communities fundamentally disagree. The Ruby community is great at moving fast and replacing bad things, while the Python community takes a much slower approach to the whole thing.

I wouldn't say either approach is better or worse; it has to fit your personal style. I like the Ruby approach, others love the Python approach. This means that Ruby breeds a lot of interesting stuff, but with fewer stability guarantees.

At the same time, Rails still builds on the ideas of Rails 1, which is an achievement in itself. So Rubyists are not totally indifferent to backwards compatibility - they just tend to throw out bad ideas quickly.

On the other hand, the quick pace allows the MRI team to build improved interpreters with a reasonable chance of quick adoption.

This is a much more fundamental difference, from a user's perspective, than any snickering about the main interpreter's garbage collector.




Differences in userbases could be part of it. Scientific computing is an increasingly important part of the Python community, for example, and they tend to be averse to backwards-incompatible changes. In part that's because you have many good but very lightly maintained libraries that stick around forever, so people prefer if they stay working when nobody touches them, rather than bitrotting and needing constant updating by thinly stretched maintainers. Heck, old Fortran stuff is still in regular use, some of which hasn't been touched in years.


It seems that communities that mainly use programming languages for "getting stuff done" value backwards compatibility the most; they would rather not spend time "upgrading" things that used to work perfectly fine, and would rather use that time on something more useful and related to their ultimate goals. Personally I think backwards compatibility gets less attention than it deserves, and that software should evolve slowly and gradually rather than in sudden huge leaps, because all these breaking changes start to look like they're just creating unnecessary, wasteful work. Imagine if things like mains plugs changed every few months, with appliance makers all adopting the newest backwards-incompatible version. In some ways software is easier to change than hardware, but we must remember that effort has to be expended to make these changes too, effort that could be used in other ways, and often there is a lot of interconnectedness in the systems we are trying to change.


"Imagine if [...] plugs changed every few months, with appliance makers all adopting the newest backwards-incompatible version"

Like Apple?


iPod/iPhone

October 2001 - Original iPod - FireWire

April 2003 - Third-Generation iPod - 30-pin Dock Connector

September 2012 - iPhone 5 - Lightning Connector

Apple Laptops

1998 - PowerBook G3 - Unnamed Circular Power Connector

2001 - PowerBook G4 - Smaller Unnamed Circular Power Connector

2006 - MagSafe

2012 - MagSafe 2


Similar to Innovators/Visionaries vs Pragmatists (http://www.chasminstitute.com/METHODOLOGY/TechnologyAdoption...)


Furthermore, with scientific computing, it's important that code you publish in a journal (fossilizing it) can still be at least near-usable to others over time horizons of years to decades.


That's definitely one important aspect. And even stuff not formally fossilized often becomes de-facto fossilized due to the way funding for development works. Things are often very bursty: a large library or piece of software may be written over a period of 2-5 years of concentrated effort either by a PhD student, or by programmers/research-scientists/post-docs hired on an NSF/DARPA/EU-funded research project. But then the PhD student graduates, or the project ends (and therefore its funding for programmers), and the software goes into much lower-staffing maintenance mode. In that mode there aren't resources available for anything but minor fixes. There are some very high-profile projects that are an exception to that pattern, because they're seen as important enough that they manage to string together continuous development for years or decades, either through a series of PhD students or a series of grants. But lots are more or less write-and-then-maintain. Despite being lightly maintained, if the initial work was solid and produced a reasonably "complete" output, it might still be useful to other researchers for years into the future, if it doesn't bitrot. Some of the R packages are a good example: plenty of stuff hasn't been touched in 10+ years but is still in daily use.


Differences in userbases are definitely part of it. E.g. Ruby is very often used in the whole devops space - where constant forward change is usual, because of the mindset of "never being done".

Handling constant change is a thing that doesn't get enough mindshare either. Not training your team to handle change is probably as bad as having programmers who don't care about backwards compatibility.

To be quite clear: I think it is very beautiful to have two languages in the same space, evolving in different directions as communities.


There's plenty of rapid iteration in the Python world, and as far as I know, there is no fundamental disagreement with the concepts behind rapid iteration. Development still occurs in the open, but releases are well-planned and well-versioned such that backward incompatibilities are clearly flagged and rarely introduced unless necessary. Pyramid is an exemplary Python project that has very rapid development but still respects compatibility concerns.

My opinion is that Python projects are simply more likely to have more mature release processes because Python is more likely to be used by mature engineers than Ruby.


I didn't talk about release processes. There are lots of projects with mature release processes in the Ruby world. What I meant is that aggressively replacing parts that became obsolete or turned out to be problematic is much more accepted.

> Python is more likely to be used by mature engineers than Ruby.

It is very sad that you waste a good post for such an ungrounded attack.


It's not an attack, it's my opinion that more mature hackers eventually converge on Python, and I think the community reflects that. Maybe it's simply that Python has never had a project with the same sex appeal as Rails, and thus has avoided an "Eternal September"-esque influx.

For the record, the last time I wrote Ruby code was about 4 days ago, and the last time I wrote Python code was yesterday afternoon. I'm a part of both communities and I think that there are a lot of people who are. From my experiences in both communities, I think that the Python community is much more mature and professional.


I believe cookiecaper is right for the foreseeable future: there is more demand for Python data scientists than Ruby developers.

But we're ignoring one important thing here, and that is the lesson! People need to learn something from the Python story!

Moving forward with an evolving concept requires the (mathematical) coherence of all its ideas. You cannot invent a spoken language, then break it and declare that we now speak a different dialect because it saves two words. Nobody will adopt it - not because reducing verbosity and increasing expressiveness is actually bad, but because the new concept branches off without integrating into what exists; technically speaking, it's "noise". You can throw a second motor into a car, but without integrating that second motor, you will have no benefit. Python has had to learn this the hard way. Dennis Ritchie and Ken Thompson made their decisions for the C programming language about 40 years ago, and C code written back then can still be compiled, albeit some things have changed and require minor changes. But that is something you can introduce over a timespan of 40 years. You cannot hand out a new present every 5 years and say that many of the old presents you've given have to be returned when the new present is accepted. Coherence and evolution are forces whose use should be unified, diversifying only when required or requested by the diffusion of the technology into the userbase.


I think the same about old people, by the way. We branch their value out into an "old" value that is not compatible with the values in our current system. Oh boy, we do that so wrong; it's laughable and very sad at the same time how our society treats old knowledge, old people, etc.

I think HN is the community that would most loudly agree that new != better, but that's exactly what we get wrong. Holy cow, I can't explain how much value we have at our disposal that we throw away + pay to keep away comfortably. New businesses don't integrate old people because they don't really know how to make value out of them. That's a simple equation, if you see it this way. It's not that old people cannot contribute to the development of IT, startups and the hacker scene. We just have no business model, not even a concept, that considers these elder men and women.


I think this is well said. It seems to me that Python iteration happens in the space ahead of the current release (x.x.dev).

I can't really say how Ruby does it because I haven't used Ruby much.


The problem with the fast-evolving Ruby approach is the cost of staying up to date. I was intimately involved with several projects to upgrade nontrivial codebases from Rails 2/Ruby 1.8 to Rails 3/Ruby 1.9, and these consumed serious amounts of engineering time and introduced lots of obscure bugs, with the primary benefit simply being not getting pwned when the next exploit comes trundling along. Fortunately our management understood and was willing to put up with the pain, but many (most?) would not.

Now, I work primarily with Python 2.7, with migration to 3 being a distant pipe dream. After years of Rubying, I find it a bit old-fashioned, awkward and occasionally infuriating (strings default to ASCII? srsly?) -- but I do appreciate knowing that I will not have to rewrite everything anytime soon.
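(For readers unfamiliar with the "strings default to ASCII" complaint: in Python 2, the default `str` type is a byte string that gets implicitly decoded as ASCII when mixed with Unicode text. A minimal Python 3 sketch, purely as illustration of what changed, not drawn from the comment itself:)

```python
# Python 3: str is Unicode by default, bytes are explicit.
s = "café"                     # Unicode text
b = s.encode("utf-8")          # explicit conversion to bytes
print(len(s))                  # 4 characters
print(len(b))                  # 5 bytes ("é" is two bytes in UTF-8)
print(b.decode("utf-8") == s)  # True

# In Python 2, "café" was a byte string, and mixing it with unicode
# text triggered implicit decoding with the default ASCII codec --
# the source of countless UnicodeDecodeErrors.
```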


The 1.8/1.9 switch was problematic and can be seen in parallel with the Python 2.7/3.x switch. Rails 2/Rails 3 was a similar big jump that reengineered the whole framework. Rails 4 was a much tamer release in that regard.

But one has to realize that 1.8 was the series of Ruby that was started _before_ Ruby even got popular (I started Ruby using 1.6, which was even more problematic). So being aggressive in breaking stuff with the next iteration (basically, redoing things properly, with more manpower and data to work with) also opened up a lot - for example, Ruby 1.9 to 2.0 is a much simpler switch, and I have many clients running 2.0 now and testing on 2.1.
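(A sketch of two of the best-known 1.8/1.9 differences, chosen here as illustrations; the comment itself doesn't name specific changes:)

```ruby
# Ruby 1.9+: strings carry a per-object encoding; 1.8 treated them as raw bytes.
s = "café"
puts s.encoding    # UTF-8 (String#encoding didn't exist in 1.8)
puts s.length      # 4 characters; 1.8's length counted bytes (5)

# 1.9 also introduced the shorthand hash-literal syntax:
h = { name: "Ruby", version: "1.9" }  # same as { :name => "Ruby", ... }
puts h[:name]      # Ruby
```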

The fear I always have with Python 2.7 is the distant future. It just irritates me to have a huge migration in front of me. The Rubyist in me wants to keep up with the current state of things.


1.8/1.9 and 2.x/3.x were handled very differently by the respective communities. That's evident even from the version numbers.


That's because until well into the development of Ruby 1.9, Ruby was still using the odd (dev)/even (release) approach. 1.9 introduced things that were different, and it wasn't stable enough for production use until 1.9.2, at which point work on 2.0 started.



