This is a very eloquent distillation of the thought process underlying the evolution of my toolset over the last 5 years.
After jumping into Rails and shifting overnight from BBEdit to TextMate as my primary editor back in 2005, when TextMate started withering on the vine I became disillusioned that I had put so much effort into pursuing such a short-lived tool.
Reflecting on my history with a UNIX shell going back to the late 80s, I realized that things I had learned 10 or 20 years ago from the UNIX world were still relevant today. I committed myself to getting serious about vim because I want to optimize for A) learning many programming languages and B) not using verbose Java-like languages that require IDEs for all the boilerplate and rote refactoring.
While this kind of toolset will never provide quite the bang for the buck of a contextual IDE in a specific language, it's a phenomenal hedge against all the career risks I face in terms of Ruby becoming irrelevant, the web becoming irrelevant, Apple nerfing OS X, or any other probable sea change. No matter what happens, I feel like vim + bash will bring me an immediate level of productivity in any new task I face. Even if I start flattening out before I reach the Eclipse or Visual Studio level of wizardry, I don't expect any one thing to last long enough in this industry for such optimizations to pay off.
Very similar to my view of Emacs. It doesn't matter whether I'm on my Mac, my work Ubuntu, or SSHed into a Debian server, my development environment is the same. With Cygwin, I can even make it work on Windows (heaven forbid! ;). It is nice to have the biggest context change be "where is my source here?"
Of course, but a reasonable chunk of my development work happens in shells. It continues to shrink over time, but there are a lot of things pipes can handle better than lisp. :) I would be lost without it.
> not using verbose Java-like languages that require IDEs for all the boilerplate and rote refactoring.
You probably need to revisit your opinion on IDEs; they haven't been used for boilerplate code since the late '90s with Visual Studio.
Java IDEs make you more productive and they help you keep your code base in a healthy state with very little technical debt. Not using them would be like preferring a screwdriver over a drill.
My understanding of dasil003's point is that it is not a complaint about IDEs, but rather about languages that are designed in such a way as to require IDEs. For example, in Java, if you want to sort a List by a given field (using the built-in Collections.sort method), you would need to do something like:
    Collections.sort(myList, new Comparator<E>() {
        public int compare(E o1, E o2) {
            return o1.val - o2.val;
        }
    });
(You could also make your class implement Comparable, but then you need to own the class, have only one ordering used in the entire program, and not mind making the ordering a property of the class.)
If you are in an IDE, then writing all that out is not too bad because of autocomplete. If you are stuck in a plain-text editor, you would much prefer something like:
Collections.sort(myList, lambda<E->int> x->x.val);
Java 8 does include lambdas, so maybe this particular case has been solved. But languages with a strong CLI presence in the community tend to need a lot less IDE than languages with a strong IDE presence in the community.
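(For reference, the Java 8 version of the earlier snippet looks roughly like this; it assumes, as above, that val is an int field on the list's element type:)

    // Java 8: no anonymous Comparator class needed (java.util.Comparator)
    myList.sort(Comparator.comparingInt(x -> x.val));

    // or, keeping the Collections.sort call:
    Collections.sort(myList, (a, b) -> Integer.compare(a.val, b.val));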
Another way of looking at this is how much code you have to read. This is one of the reasons I hate wizard generated code in a way that I don't hate InterfaceBuilder or similar serialized object trees. If I'm gonna have to read the code eventually, the fact that your IDE makes it easy to draw a button is cold comfort to me when it takes 100 lines of cryptic C to do it (actually pretty common in early Windows programs or Xaw hacks, but mercifully diminished in frequency in these more enlightened days).
Code generation can cause people to neglect library design; the reason the MFC wizards spat out volumes of code instead of calling a library method is that you might need to rewrite some or all of that. You are not expected to modify InterfaceBuilder output, nor should you have to any more than you would have to regularly rewrite bits of Xlib to customize them appropriately.
This is correct; I just don't like Java. IDEs are really beside the point, but the fact is there is no IDE that supports the breadth of languages that vim or emacs support. And there's certainly no IDE that I would want to run over SSH to machines spanning oceans, whereas vim copes incredibly well.
I don't use Java but the equivalent in C# is literally `thing.OrderBy(p => p.Whatever);`. So even simpler than your example.
The boilerplate code has been disappearing in C# a lot faster than in Java, but I believe Java is catching up.
So needing an IDE to actually edit code to relieve the tedium of boilerplate is not the case these days, although you still need it to manage things like references; I'd not want to manage that myself.
I used to be in the same camp about not using languages that require an IDE. But I have recently started using something akin to a Haskell IDE, and I have to say, things like displaying the inferred types of expressions in a tooltip and jump-to-definition are nice to have.
> Java IDEs make you more productive and they help you keep your code base in a healthy state with very little technical debt. Not using them would be like preferring a screwdriver over a drill.
All analogies suck, but let me just refine this one by saying that the unix philosophy is like preferring a toolbox with a screwdriver, a wrench, a saw, a hammer and other simple tools vs a power drill: the power drill is great for driving screws, but it's completely useless for other tasks, and it won't necessarily fit into tight spaces that a small screwdriver would.
I think much of the Java programming we are talking about today is basically using the IDE.
Opening a file and reading from it takes tens of lines in Java, and that's just a trivial task. Stuff like that is better left auto-generated.
You don't learn Java these days, you just learn Eclipse. Much of the magic is happening in autocomplete. I am not sure who picks up a book to learn Java these days.
For your particular example, the API has been better since Java 5:
    import java.util.Scanner;
    import java.io.*;

    public class ScannerTest {
        public static void main(String[] args) throws FileNotFoundException {
            Scanner in = new Scanner(new File("some_file"));
            while (in.hasNextLine()) {
                System.out.println(in.nextLine());
            }
        }
    }
> You don't learn Java these days, you just learn Eclipse. Much of the magic is happening in autocomplete.
I code in vim with eclim. There is nothing to be gained by manually writing the code for getters and setters, or find-replacing an identifier, or writing placeholders for the n methods of an interface...
> I am not sure who picks up a book to learn Java these days.
Also, what good would Eclipse do for someone who doesn't know what to write? Consider my example above. Unless you know how to read a file, how can Eclipse generate the code for you?
Or consider generics. How will Eclipse help you understand what <T extends Comparable<? super T>> means? Eclipse is an aid. Unless you understand the language well, it doesn't help.
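To make that concrete, here is a small sketch (class names are invented) of what that bound actually buys you; no amount of autocomplete explains this for you:

    import java.util.*;

    // Collections.sort is declared as:
    //   static <T extends Comparable<? super T>> void sort(List<T> list)
    class Animal implements Comparable<Animal> {
        int weight;
        public int compareTo(Animal other) { return Integer.compare(weight, other.weight); }
    }
    class Dog extends Animal {}

    class Demo {
        public static void main(String[] args) {
            List<Dog> dogs = new ArrayList<>();
            // Dog only inherits Comparable<Animal>; it does not implement Comparable<Dog>.
            // The "? super T" in the bound is exactly what lets this compile; a stricter
            // <T extends Comparable<T>> would reject it.
            Collections.sort(dogs);
        }
    }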
> I code in vim with eclim. There is nothing to be gained by manually writing the code for getters and setters, or find-replacing an identifier, or writing placeholders for the n methods of an interface...
None of which is necessary in better languages. I believe that was the parent's point.
It isn't necessary in any language, but it sure is useful.
I don't know what you mean by better languages, but I program comfortably in a variety of languages (Ruby, Python, C, C++, Java, Clojure, Lua, Racket, Go, JS, Perl...) and haven't found a single language in which context-aware autocomplete, assisted refactoring, inline documentation lookup, etc. aren't useful.
"Generating get/set and abstract method bodies is only necessary in Java"
In Smalltalk instance variables are private, so if you want to get/set their values you need a getter/setter.
You can argue that this is bad OO design, but it's not Java-exclusive.
I'm reminded of The IDE Divide (http://osteele.com/posts/2004/11/ides), which describes the differences between a "language maven" (which I'll admit to being) and a "tool maven".
Is switching tools really so catastrophic? I have emacs keybindings burned into my muscle memory but just about every tool I use supports them.
But if I'd insisted on sticking with Emacs instead of moving in the mainstream (Xcode & IntelliJ for mobile) I would have ultimately been much less productive than I've been after spending a week getting my head around more specialized tools.
My brain-stem knows what to type when I want to go to the end of the line. If the tool doesn't do what I expect when I hit that keystroke, I update the keymapping. In about 1 hour of coding/retraining, the keymap works quite well.
Thank you for this "when TextMate started withering on the vine I became disillusioned that I had put so much effort into pursuing such a short-lived tool."
It's a trade-off, right? As most things in software are.
Recently, I inherited a project that had tens of thousands of lines of Ant scripts. Resource-constrained, I didn't have anyone to maintain that build process. So I took a few hours and converted those projects to several hundred lines of Maven 'scripts'. Now it works way better than the previous system, because I can have a person focused on coding features rather than maintaining process.
Maven was what I knew at the time; and, it fit well enough. Maybe Grails is the right answer, though. Or, something else.
Anyway, ultimately, it comes down to what trade-offs we make and what things we prioritize with our limited knowledge that we have available. And, that is the art of software development.
The poster probably means Gradle. I am not pointing this out to be pedantic, but because it's not a given that every reader will know their way around the universe of build tools.
Gradle doesn't have the library of plugins Maven has.
Do you really need stupidly concise code in a build process that gets updated 0.01% of the time compared to the rest of a project? I'd rather have the verbosity and formality (a Maven plugin vs. customized Groovy).
But he is absolutely correct about the key observation: a basic tool with "plugins" is simply NOT the way to create a build tool. A build tool should allow (when needed) the full flexibility of a Turing complete programming language. Because eventually your project is probably going to need it.
So...the way that you do that in Maven is to create a plugin, which is very easy to do. There's your full flexibility, as soon as you want it.
Your plugin has to play by the rules, which is a good thing.
Maven's useful for more than the beginning of a project. If you're working on real-world software, you're probably not the only person who has to understand your bespoke build, if that's the way that you've chosen to go.
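To give a sense of scale: a minimal mojo is a single annotated class, plus the maven-plugin packaging and the plugin-annotations dependency in its own pom. A sketch from memory (package, class, and goal names are made up), for something as small as copying a file:

    package com.example.buildplugins;            // made-up package

    import java.io.File;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.StandardCopyOption;

    import org.apache.maven.plugin.AbstractMojo;
    import org.apache.maven.plugin.MojoExecutionException;
    import org.apache.maven.plugins.annotations.Mojo;
    import org.apache.maven.plugins.annotations.Parameter;

    // Bound to a goal name; the consuming project's pom decides which phase runs it.
    @Mojo(name = "copy-file")
    public class CopyFileMojo extends AbstractMojo {

        @Parameter(required = true)
        private File source;

        @Parameter(required = true)
        private File target;

        public void execute() throws MojoExecutionException {
            try {
                getLog().info("Copying " + source + " to " + target);
                Files.copy(source.toPath(), target.toPath(), StandardCopyOption.REPLACE_EXISTING);
            } catch (IOException e) {
                throw new MojoExecutionException("Could not copy " + source + " to " + target, e);
            }
        }
    }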
The problem is that there is a large overhead associated with writing a plugin. Sometimes the things you need to do are as small as copying a file from one directory to another. Having to write a plugin to do something so small is overkill.
Does this already exist? Based on the top result I get from a Google search for 'maven copy file' [1] this seems to be not very well solved. The top response in that Stack Overflow page recommends calling out to a completely different build tool. It seems like part of the problem is that frequently you will have several plugins that each solve part of a problem but it is difficult to combine the plugins so they solve the complete problem. This seems to result in a lot more 'reinventing the wheel' because something that should be easy involves a lot of hoop jumping.
The stack overflow question is about copying files to a remote server during a build, not just copying a file to a directory. Use http://maven.apache.org/plugins/maven-resources-plugin/examp... — if your build copies files to a remote server you have other problems.
As far as I can tell, that question doesn't say anything about a remote (as in on another machine) server. It sounds like the question is just about copying files to a directory outside of the Maven project.
I know that Maven is ultimately capable of being used to copy a file. However, there are multiple ways to do this because all of the options have limitations and are somewhat awkward to use. For instance, I don't think the resources plugin you linked to supports renaming as part of the copy.
Why is using your build to push files to a remote server a problem? A common use would be to deploy local code to an environment that simulates production. If that cannot be done as part of the build then it will live in scripts outside of the build, which means there will probably be a lot more 'reinventing the wheel'.
Ah, like some of the commenters, I interpreted this line:
"I have config files and various documents that I want to copy from the dev environment to the dev-server directory using Maven2."
To mean a remote dev-server and then maven-upload was mentioned in the list of solutions. Let me step back a bit.
Maven is primarily interested in the process of building. There are functions that allow deployment, but generally that is not Maven's domain. Using Maven as part of your deployment process is definitely going to be at odds with a smooth experience. I think of it like making non-default choices in Ruby on Rails: you are asking for pain.
As to your point about "reinventing the wheel" for deployment using Maven, I would suggest that there are many tools designed for deployment, and I would use those instead, like cap.
I can understand making that interpretation at first glance.
I agree that the process of building should be the primary focus of a build tool. I also think that the secondary purpose is to make development easier. That is why you do 'rake rails' instead of having to use cap to deploy your changes locally. It's not that hard to come across a scenario that falls into one of these goals and requires custom logic (like copying a file, or something similar). Some of this should go in custom plugins so it can be reused. However, some of it is small and non-generic and makes no sense as a plugin. The solution you pointed to for copying a file involves substantially more code to include and configure the plugin than it would take in other JVM build tools (such as SBT or Gradle) to simply specify that you want to copy a file.
I agree that deployment of final builds or to shared environments is better served by a different tool.
Agree fully. If you are doing something remotely common, chances are there's already a Maven plugin that does it. It might have bugs, but chances are the plugin is also open source, so fixing the bug(s) you encounter is no worse than writing it yourself.
The only reason I don't like Maven is not its rigidity but its network dependence. To use Maven, you really need to run your own repository server, and that comes with huge overheads. I'm warming up to the idea of always mandating offline mode in Maven, checking in to the SCM a preloaded repository which includes all the libs and plugins for your project, and never having to worry about external repos or network connectivity.
I used to be really annoyed by that too, but I'm not any more. It's _really important_ in commercial software development to understand where your code comes from. Having a local repository act as a gatekeeper does a nice job of that. It's a good idea to always have builds run where maven settings force use of a local repository. You can then easily track everything that's going into builds, and scale that up over larger numbers of projects.
You can also push libraries into the repository that aren't open source, and know that you've done so.
Running a maven repository doesn't take much. Sonatype's basic offerings are fine and quite effective.
I think that's an interesting concern, but you might be in a different environment than I am.
If "network dependence" is an issue for you, you could check in your ~/.m2 directory if you don't want to run a repository server and you're deploying to many machines that have access to source control (which you probably do if you're running Maven on that machine) but don't have access to the internet. Then it's almost like an Ant 'lib' directory type of build, but with version information, docs, and sources all available if you want them and easily refreshed when you do have network access.
I thought that too, then I installed Nexus, let it proxy the remote repositories, and really haven't looked back. It takes some time to set up (a few hours), but it makes life so much easier that it's well worth the effort, especially since with Java you WILL have a dependency on a 3rd-party jar from Oracle that for whatever reason won't ever show up in Maven Central, or you'll want to include some other 3rd-party repository.
"A build tool should allow (when needed) the full flexibility of a Turing complete programming language. Because eventually your project is probably going to need it."
I've never seen a situation like this - but perhaps I've lived a sheltered life. Can anyone give me an example of when this would be a good design decision?
You're generating configuration files from a database as part of your build.
OR
You're reading a file and using the details in that file to control your build.
OR
You need to loop through a set of properties and apply a specific update to each value before the build.
OR
You need to invoke complex build logic based on the platform you're building for (nested ifs).
etc.
There are lots of odd situations you end up in with real projects.
The flexibility this provides comes at the cost of build complexity though.
Obscure syntax and odd custom DSLs (>_> cmake, make, powershell) are bad because they introduce complexity and destroy the readability and maintainability of the build.
You basically need a test suite for your build code to make sure it's building correctly. Terrible.
...but, necessary. I'm not a java guy, but writing C and C++? You need to do this stuff all of the time. scons or cmake really make life a lot easier than trying to force Makefiles to do things with their obscure lambda syntax.
>You basically need a test suite for your build code to make sure it's building correctly
Absolutely, which is one of the best things about Maven - because all your build steps are plugins written in standard Java, you can test them the same way you test your regular code.
You should take a look at http://www.gradle.org/ if you haven't already.
I worked on an ant project and decided we needed to convert to a better build system. I started down the path of Maven, but Gradle changed my mind because of the conciseness.
It uses the Groovy language, so hopefully it doesn't fall into the same trap as described in the post.
Gradle has given me a lot of headaches for small projects. I keep going back to it now and then and run into performance issues with it where builds that take 10 seconds in Maven take 10 min in Gradle.
It's something about the dependency resolution mechanism. Something is broken, but I haven't been able to pin-point it yet.
It's frustrating because I love Groovy. I write a lot of Groovy and I'd love to use a Groovy-based build tool, but I can't justify wasting a lot of time on Gradle when I know how to do what I need to do in Maven.
I've used Gradle on a handful of small projects and am very surprised that you're seeing an order of magnitude difference between a Maven build and a Gradle one. Can you provide a little more detail about what the script was doing for 10 minutes?
Last time I used it, which was right around when 1.3 was released, I turned on debug logging while the project built so I could see what was causing the issues. It was hanging for long periods of time around a task I had for making a fat jar. If I remember correctly it was having some real trouble with locks on Ivy caches... I tried clearing out the Gradle caches and that did not resolve the problem. Since that was a small project I was able to replace the build file with a POM and assembly descriptor and be done with it.
If I get some more time to play around I might go back and try to collect some more info for a bug report... I have had this issue since the late Gradle betas; each time I go back to using Gradle I hope the problem is resolved.
Well, 1.4 claims to have improvements to dependency resolution, so you should be able to take another kick at the football soon. I'm growing to like Gradle more and more and haven't seen any problems like this. My major desire is for more/better plugins, but I do think this will become less of an issue. (I'm even hoping to write 1 or 2.)
I should also mention that GVMTool is worth checking out if you're on a UNIX-based system. An easy way to install and switch between versions of Gradle and other tools: http://gvmtool.net
I think developers are dogmatic and stick to a particular religion. I don't mind Maven, but I hate when I see an example piece of code and the only way to get it to work is a complicated Maven (pom) script. Then I have to spend an hour trying to figure out what is in the Maven script. And you end up with only 2 or 3 jars (log4j, commons-logging, junit, and some other jar). Why don't you just give me the version I need in a bundled jar so I can go about my coding?
My only fear with Maven is when things break, or somehow you have the wrong version of Maven that the script requires, or downloads fail, or something else. If the script fails you can't do anything. And who knows, maybe the only thing that was required was just compiling the Java source with minimal dependencies.
If you look at practical real-world environments, Maven beats out most other approaches for large projects. I think you have three or four different build setups. I prefer Maven for large projects and Ant for small projects. Upvotes for the person who makes working, maintainable, easy-to-read-and-modify Maven scripts (not you, Spring or JBoss).
Anyway, you have a few levels of builds.
- Complex Perl scripts, bash/unix scripts: these people are insane and generally have degrees in physics and mathematics. They write scripts that are unreadable, unmodifiable, and cryptic, and I imagine that even the authors can't read them. I sometimes see these in old Linux programs.
- Complex Make, Autoconf, bash/unix scripts: a little better than the Perl build scripts but still complex, because you have to understand each command used. I am not a fan and never was a C/C++ developer. I guess these are still used in the non-Java world. I am looking at you, 'OpenJDK/Sun JDK'.
- Complex Ant scripts: these aren't bad, easy to read, a little bit difficult to maintain over time. I think most developers prefer Maven but don't mind an Ant script if the project is small.
- Maven scripts: I think people use Maven because not much else exists. And the problem of compiling code is not as serious as the actual application development.
I've had the opposite experience. I've converted 2-3 complex build systems implemented in different flavors of make to ant. It's tedious but it's doable. And once you get to ant, it's a lot easier to maintain compared to make.
I looked at Maven a couple of times but could never figure out how to convert a make-based build to Maven because there are always a bunch of weird hacks lurking in the makefile. However, I do see how Maven would be fine for a new project.
Honestly, make-based projects are generally infinitely easier to parse and manipulate than Maven. Maven helps by providing a fixed organizational structure that is immediately familiar (and this is the advantage of being opinionated that I do think the article understates), but you can keep the structure and use make and be much better off for it.
"These abstractions apply to tools and frameworks as well, particularly tools that must scale in their power and sophistication along with projects, like build tools. By hard-won lesson, composable build tools scale (in time, complexity, and usefulness) better than contextual ones."
And then the person that wrote them leaves and we're fucked. If maven doesn't do what you need to do, then chances are you don't need to do it. You may think you need to do it, you probably don't. Maybe what you need is a more powerful deployment system, or something that takes over after your java/scala/whatever is built.
If you're building perl, C, C++, sure, maven isn't going to do it for you. If you're building java, you're sorted.
My favorite is when this sequence happens (I have seen it first hand):
1. Maven is this external system that we have no control over.
2. We will write our own system that will be awesome.
3. Wow, look, every project uses our Build System!
4. Oh, look, someone on project X changed B.S. and now Projects A-N don't work!
5. We will lock down B.S. so only the one person who wrote it can change it!
6. ...
7. B.S. is this internal system that we have no control over!
8. ...
9. Can we add support for merging war files?
10. No that'll take weeks, and Bob isn't available anyway.
I'm sure you can fuck that up just as well using Ruby. Go for it.
I work in a java shop currently. There's got to be a hundred of us. Every single one uses an IDE.
When I use Maven, my IDE just works. Eclipse, IntelliJ, and NetBeans all support Maven. Want to put a bunch of files in a common war to share? Maven has a merge-war plugin, and IntelliJ and Eclipse know what it means: they both build projects that copy the files just like Maven would do. Better yet, as you edit those files, no matter where they are, both Eclipse and IntelliJ update the files in the deploy target and they are available when you refresh your browser. There is a huge amount of investment in making this stuff work. This is convention over configuration.
No tool can do this with Ant (or Ruby) unless it runs the Ant and the Ruby, and then you've sidestepped the IDE and reduced it to a dumb text editor. With an Ant project, the IDE can't know that when I edit file A, it is transmogrified into file Z. (If it could, it could solve the halting problem.) With Maven, the IDE implements convention, and so it knows that file A is compiled to file B and then must be copied to location Z because it's part of a merged war. Add JRebel to the equation and pretty much any change to the code is immediately runnable. It's dangerously easy to spike away instead of writing tests!
If what you need to do can be done in Maven you don't need to do anything very difficult. Ant scales poorly; dependencies are a particular headache. Maven scales astoundingly poorly.
Here's a very short list of things that are massively obnoxious to do with Maven, but are perfectly reasonable:
- submitting code to a code review site like Gerrit.
- generating code (for example, a parsed SNMP MIB that you want as a Java class so you can refer to it easily).
- integration with a tool like Sonar (yes, there's a Maven plugin. You ever used it?).
- code coverage analysis (yes, there's a Maven plugin. You ever used it?).
- FindBugs style analysis (yes, there's a Maven plugin. You ever used it?).
- interesting dependencies on external libraries (to pick an example from Ruby, the json gem was horribly broken at 1.4.2, and generations of projects have varying requirements for json < 1.4.2, json > 1.4.2, and many other worse things).
- C code (through JNI or anything else, a perfectly reasonable thing to want to do).
- deployment.
From bitter personal experience it is possible to get so wrapped up in this that you think you have achieved something amazing when you finally finish, when it could have been done in a few hours in make or rake.
"Oh, but you can integrate it with your IDE!" The only thing your IDE actually needs to get from the Maven POM is to understand where your source code is, what you depend on and where it is, and how to run your tests. Everything else they do with command line calls, just like it was Ant.
Maven is particularly good at generating code for later compilation and processing. Your custom code generator just needs to put code into src/generated and it will be picked up.
Given that Maven is used to organize and build some pretty large projects out there in the real world, you might want to amend your statement from "Maven scales astoundingly poorly" to "I am astoundingly poor at scaling Maven".
It's not all wine and roses in the land of Maven, but it gets a whole lot of the job done, and done well.
You can add pretty much anything to do with CORBA to your list.
Yes there are plugins for it, but they only work in the most trivial of cases.
The original article was helpful in clarifying for me why I have such a low opinion of Maven - the project I was working on when Maven came out was a large complex codebase and we'd built lots of interesting things into our Ant build files. I could not work out how to ever do those sorts of things in Maven.
Badly written programs are far from an inevitable outcome with contextual tools. That is more a symptom of Maven's choice of a rather narrow context that has limited its appeal (that and all the ways it sucks ;-).
>- submitting code to a code review site like Gerrit.
Why would you want your build tool to do that?
>- generating code (for example, a parsed SNMP MIB that you want as a Java class so you can refer to it easily).
There are plenty of plugins to do that. If you want one that doesn't exist, write your own. It's not hard, and means this operation will be encapsulated in a structured, testable, reusable way. If you're generating code in an ad-hoc, specific-to-a-single-project way, you're doing it wrong.
>- integration with a tool like Sonar (yes, there's a Maven plugin. You ever used it?).
Yep, I've used it. It works fine.
>- code coverage analysis (yes, there's a Maven plugin. You ever used it?).
Yep, I've used it. It works fine.
>- FindBugs style analysis (yes, there's a Maven plugin. You ever used it?).
Yep, I've used it, it works fine.
>- interesting dependencies on external libraries (to pick an example from Ruby, the json gem was horribly broken at 1.4.2 and generations of projects have varying requirements for json < 1.4.2, json > 1.4.2, and many other worse things.
Huh? You can depend on a specific version, a range of versions, or a range with gaps in. You can't depend on two versions of the same library because it's impossible to set the classpath up like that, but that's a limitation of Java, not maven.
>- C code (through JNI or anything else, a perfectly reasonable thing to want to do).
There are ways to do this, I'll agree it's not pretty. If you want to use another build tool for those projects that include C code, that's fair - it should be a minority of your projects.
- cross-platform C/C++ code and JNI libraries work just fine using the NAR plugin (was developed at CERN for this use case, IIRC). I've used this extensively over the last year to integrate legacy C++ libraries with our Hadoop jobs.
- code coverage / findbugs etc. are better done using something like Sonar. Again, this works just fine for us. You can also set up Sonar & Jenkins on your desktop in minutes and have a working analysis suite + web interface without centralising your builds.
And I assert that you're wrong on the IDE integration - Intellij appears to shell out to Maven for some things, but M2E in Eclipse is a completely different beast (and has some rough edges as a result, but can work well).
"interesting dependencies on external libraries (to pick an example from Ruby, the json gem was horribly broken at 1.4.2 and generations of projects have varying requirements for json < 1.4.2, json > 1.4.2, and many other worse things."
This seems like a java problem more than a maven problem. In fact, I don't really know what you could do about this. Maybe OSGi has a solution.
"The only thing your IDE actually needs to get from the Maven POM is to understand where your source code is, what you depend on and where it is, and how to run your tests."
And how does it get that from ant? That seems such a small thing. "Everything else they do...." must be so important. I don't think it is. I think the most important thing is that when developing in the IDE I have a reasonable certainty that when it builds on the CI server that its going to do the same thing. The next most important thing is that our team can develop using the IDE each prefers (or vim if they want). Maven is the only "project description" that is understood by every IDE and also runs from the command line.
"Everything else they do with command line calls, just like Ant".
Not so. For example, both Eclipse and IntelliJ, when faced with a merge-war project, will set up a project definition that provides the same behavior as running Maven, but without running Maven. Modifying a resource in the common war project causes that file to be deployed to any running targets. It's instantaneous and automatic. It's the difference between an Integrated Development Environment and a text editor.
Everything else, don't use maven. Maven is a tool for building java.
Our tools for deployment make maven look like "hello world". I wouldn't use maven to deploy. Likewise for submitting code to anything. I use maven to build deployable targets from java and to upload them to a repo. End of story.
Basically, if it hurts when you do that, don't do that. I use bash on my build server. Somewhere in the middle, bash runs maven.
So how would I handle generating some code from another format? Depends what it is. One fellow said reading from a database, for example. Well, before I ran Maven, I'd run the program that generated the Java files, and then I'd check those files into source control so that we know exactly what got built. Then I'd run Maven. If it was generating Java from a text file, I'd probably have it as an Ant task in my IDE, and whatever got generated, I'd check that in too. Sometimes I've gone as far as to write an IDE plugin that builds the file automatically.
Checking in generated files? Isn't that an excuse? Well, no, not if you want to guarantee being able to build it in 18 months' time. I've worked at a place where they couldn't even build an 18-month-old product to support a customer because the build system itself wasn't versioned!
Thank you for espousing the Maven opinion. I would take your argument a step further and explain that mvn fits into common development workflows much the same way tr fits into the word-counting example. Mvn is composable.
Let me illustrate what we do at work to explain. We use Jenkins as a build server and a hosted git repository. Every accepted pull request to mainline-dev is picked up by Jenkins; Jenkins runs mvn test on the commit; if the build passes, Jenkins integrates the change into mainline-stable and runs mvn deploy to get the latest build onto a mainline-stable server.
Setting up this workflow required little customization because mvn's output is well-defined and designed to be composed within other environments, and Jenkins is one common place that understands it. So are the IDEs at your work. So is the next person to tweak the environment, who can look at docs to figure out what's going on. It's no different than how sed knows how to interpret tr's text output.
Does this workflow work for everyone and every language? No, not at all. Does mvn the tool work for everyone and every language? No. But I see it as an example how mvn, a build tool, fits into larger process and is a composable tool the author lauds.
P.S. The biggest benefit for our shop is twofold:
1) Dependencies get resolved automatically in all environments. Unlike my last gig, developers run the same version of libraries as the prod machines do.
2) Developers run the same test cases locally as jenkins does. We've only had one build break.
And then the person that wrote them leaves and we're fucked.
Where in the article does he say in-house tools are better? His point is about tool composability. Just because you haven't run into the wall he describes Maven having yet doesn't mean it doesn't exist. He says as much in fact. He recommends starting with maven till you outgrow it which seems perfectly reasonable.
I think you missed the point. The problem with composability is that you have the freedom to compose things however you want to. It's not exactly uncommon for "composable" build systems to be all but inscrutable to the next guy.
The consistency of an opinionated build tool does provide some not insignificant benefit.
This is so right that I feel like I live in a different universe from someone that wants to write code in their build system. If you do write your own build system, make sure it outputs a pom.xml file so I can still use maven.
Sounds like they've cocked up the jars they've produced; you should never have to use exclusions, and it's possible to have a dependency that can be satisfied multiple ways with a default, so you shouldn't need to choose the driver separately if you don't want to. Much of that xml file is what already exists for your project. So the fair comparison is with the section:
    <dependency>
        <groupId>org.seleniumhq.selenium</groupId>
        <artifactId>selenium-java</artifactId>
        <version>2.28.0</version>
    </dependency>
So: you need a groupId; we've learnt through painful experience that this is a good idea to prevent collisions rather than just using the name. That said, your tool should autocomplete it if there's only one option.
You need to specify a version; this is 1000% the Right Thing. Again, your tool will give you a list to choose from.
And yeah, it's a bit of a verbose way to specify those three pieces of information; the point is for it to be easy to automatedly manipulate the pom file.
Exclusions happen when you need to make do with libraries that have different release schedules or are poorly organized but critical for your current priorities, needs and time constraints. One should see exclusions pretty much as technical debt.
Maven becomes brilliant when you start dealing with a large number of modules and profiles, and becomes more tedious when you pass a certain complexity threshold, usually when you have multiple layers of compile-time code generation going on top of each other.
Until you want to do it on a system where you just want binaries.
Or until you need to guarantee the version of the library you're bundling.
Or you want to run on something other than a MAC-in-crap, like Linux.
RVM has the same problems.
In my opinion, one is for play and one is for work. If you just want to fuck around and spew code in one long controller that looks like spaghetti, and you don't want to worry about reproducibility or quality guarantees or modularity, then Ruby build tools are fine to use.
It's incredibly frustrating to hear about people complaining about how tool X doesn't do task Y. They could just STFU and write the plugin.
There is really only one type of plugin that wouldn't work well in Maven, and it's when for some reason your task doesn't fit into its "life cycle" pattern. For example, you want something running in the background when you edit SASS, LESS, or TypeScript files and you want some processor to just run and deploy in the background continuously. In this scenario, you could keep the Mojo API, but you would have to lose the Maven framework.
I've worked on two build systems (and used many more). The conclusion I have arrived at is you don't want a declarative-only system (like Ant). They can be convenient but as your needs grow more complex they start becoming a hindrance.
Instead, you want a general-purpose language that builds the build DAG using the build system as a library. This gets you control and customisability for free, and you can still get reuse through code libraries.
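As a toy illustration of what I mean (names are invented; real tools like Rake or Gradle are essentially richer versions of this), the "build system" is just a library for declaring a task DAG, and ordinary code wires it together:

    import java.util.*;

    // A build "library": tasks are plain objects forming a DAG.
    class Task {
        final String name;
        final List<Task> deps;
        final Runnable action;
        Task(String name, List<Task> deps, Runnable action) {
            this.name = name; this.deps = deps; this.action = action;
        }
    }

    class Build {
        private final Set<Task> done = new HashSet<>();

        // Depth-first: run dependencies before the task itself, each at most once.
        void run(Task task) {
            if (!done.add(task)) return;
            for (Task dep : task.deps) run(dep);
            System.out.println("[" + task.name + "]");
            task.action.run();
        }

        public static void main(String[] args) {
            Task compile = new Task("compile", Collections.<Task>emptyList(), () -> {});
            Task test    = new Task("test",    Arrays.asList(compile), () -> {});
            Task jar     = new Task("jar",     Arrays.asList(compile), () -> {});
            // The full generality of the host language is available here: loops,
            // conditionals, helper methods, external config, whatever the build needs.
            Task all     = new Task("all",     Arrays.asList(test, jar), () -> {});
            new Build().run(all);
        }
    }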
Last year I replaced a Maven shitfest with an old-fashioned Makefile which calls javac directly, with manual dependencies. The whole port took about two hours.
Integration and test run time has gone from 7 minutes (to drag all the maven plugins and dependencies down) to just over two minutes.
This saves us a fortune on build agent capacity as well (we use TeamCity).
That's my favorite. "Maven is taking too long. It must be shit. I'll write my own!"
Question: Maven is taking too long and downloading plugins and dependencies everytime. What should I do?
Answer:
1. Its shit! Write my own!
2. Well, that seems suboptimal. Would the smart people who wrote it have made it download all this stuff every time it builds? Surely that is wrong. Maybe I could google that?
Yes, we have disabled the local cache. Our build machines are not stateful - the environment is wiped before each build intentionally. We have had problems where Maven caches were poisoned and the builds started failing randomly, which cost us a lot of time halfway through a major release.
As part of your build were you wiping out your local repo ( ~/.m2 ) for some reason? It would only need to check if you don't already have the version specified in your pom in your local repo.
Yes - all our builds are done from zero state, i.e. an empty directory. We even deploy a new JVM each time (as we have to test against 3 different JVMs).
Either make your 3rd-party libraries part of an OS snapshot, or make them part of your source repository. The flaw seems to be in your build as opposed to the build system. I'm all for wiping away intermediate work products, but you don't need to test the ability to download 3rd-party libraries each time. You could use the md5 or sha1 files that get published to the standard Maven repo, and that would be a minuscule download; once you have it, you could just compare it to the existing files.
I wholeheartedly disagree with this post, though I do like his conceptual framework of composable and contextual abstractions.
I've worked on countless Maven builds and have done everything under the sun. Maven's plugin system allows you to ultimately achieve whatever you need out of your build - in the 10% case where you have a need that isn't already covered by the plethora of plugins out there. You can even call out to Ant using the antrun plugin.
I've found so much simplicity, reliability, and continued payoff in my use of Maven. I've heard complaints, but usually from people who refuse to understand how it functions declaratively.
Declarative systems like Maven ask that you learn something first, and then reward you infinitely if you do; imperative systems let you do something immediately without learning a model, but then make you infinitely pay for it.
I think the article title is a bit provocative and is drawing attention away from the main point: composable tools are better (in the long run) than contextual ones. Maven (and Ant) are contextual tools and they give a lot of benefits early on. Extending their use beyond simple things tends to become difficult and makes for very messy build/deploy code.
Tools like Rake and Gradle (and Buildr) are more like thin wrappers on a Turing-complete language and are able to stretch and bend to fit the problem. Sure, that makes them harder to get started with, but it provides much more runway when the build gets complicated.
I've used Maven in the past on some projects and it starts off very nicely. However, without fail, I end up with some part of the build that doesn't work quite right. Or throws unresolvable errors. Or just fails, some of the time. I guess I (and everyone I've ever worked with) sucks at Maven and just can't make it work. If so, that tells me much more about Maven than my team's abilities.
P.S. I (mostly) blame XML for my Maven chagrin. Over the years I've grown to hate XML and everything associated with it. It should burn in a fiery pit of lava. But that's just me. :)
Oh wow. I just raged at my manager for exactly this type of problem.
I'm not sure if it's due to someone's lack of technical ability, or if they get distracted by other externalities, but we always tend towards something that seems really simple at first but instead incurs a lot of technical and process debt later on.
My experience is that this is caused by a lack of refactoring/rewriting code when it can no longer easily support modification into the desired functionality.
I enjoyed the article but I feel it overlooks one of the biggest reasons to use something like maven:
It gives us the language to efficiently reason about and discuss build concepts with colleagues.
This is a big thing: think about the most recent new-starter situation you experienced, and how communication of these common build tasks was expressed. Chances are, if it was with a more composable tool, the discussions were fairly detailed and the words 'just like' frequently appeared. If it was with a more rigid tool, the conversation would have been in terms that tool commonly defines.
The compromise of composability is that if you don't already understand the problem domain, you don't know how to structure your code. Frameworks are brilliant for paving a golden path for us to follow - we get to reuse someone else's knowledge and experience, codified into the framework.
A great example is the present state of browser development. Google "backbone sucks site:news.ycombinator.com" and you find tons of comments from people who struggle with Backbone - a supercomposable library, not a framework, by design.
Composability pays off hugely in the long run, but you need experienced leadership to wield it.
There's also a happy medium in things like Play Framework, a functional Scala framework designed almost as a collection of composable libraries: you can lean on Play's opinions for most things, but since it is designed to be composable (compare "pluggable" or "extensible") you can swap different pieces out if you like. That is my experience, at least.
I've found buildr (http://buildr.apache.org) to be a great replacement for maven. It provides many of the capabilities that maven does, but it's also got all of the power of ruby/rake for taking care of those things that come up that the plugin/declarative approach just cannot handle reasonably. We converted our maven builds in a day and never looked back.
Doug McIlroy's solution is genius, and I hesitate to try to improve it, but it seems to me that if you reverse the order of the 'tr' commands, then you can slightly simplify the one that turns non-word characters into newlines, like so:
tr A-Z a-z | tr -cs a-z '\n' | ...
Since you first turn upper-case characters into lower-case ones, you then get to replace only things that aren't in a-z with newlines, rather than things that aren't in A-Za-z. It saves a whopping three characters!
"Once you start fighting with Maven, it’ll never return to the rosy days when your relationship was young."
The key is, you don't start fighting with Maven.
"The philosophy of Tai Chi Chuan is that if one uses hardness to resist violent force, then both sides are certain to be injured at least to some degree. Such injury, according to tai chi theory, is a natural consequence of meeting brute force with brute force. Instead, students are taught not to directly fight or resist an incoming force, but to meet it in softness and follow its motion while remaining in physical contact until the incoming force of attack exhausts itself or can be safely redirected, meeting yang with yin." (http:// en.wikipedia.org/wiki/Tai_chi_chuan)
If you fight with Maven, you will lose. If you meet it in softness and follow its motion you will attain build enlightenment...
After about a year of experience with Maven, I find that:
* The "surefire" plugin is named ironically. The build often fails without a message about what unit test failed.
* Maven is extremely slow. Eclipse builds the source in 30 seconds max; Maven takes 5 minutes.
* Maven's output is extremely verbose, but if anything goes wrong, it "helpfully" tells you to rerun it with -X -e to get the actual error message and stack trace. You can run it with -X -e all the time if you want, but I think it generates so much output that it actually slows down the build. Yes, really.
* There are a lot of things you just CANNOT DO in Maven.
One example is bundling a Maven plugin together with the rest of your Maven source and using that same plugin elsewhere in the project. It fails during the early phases of the default lifecycle because the plugin isn't installed yet. But only by getting to the later phases of the lifecycle can you install the plugin. But you can't do that because the early phases fail. An infinite cycle of despair is right.
* Even the most trivial project takes a few screenfuls of XML to set up in Maven.
* Maven's dependency management is poor, too. If you have some library A that pulls in B, and another library C that pulls in a slightly different version of B, you can get misbuilds. It's your job to identify and fix these problems. They will show up at runtime. And let's not even talk about SNAPSHOT and how horribly that can go wrong.
* Maven is also known for pulling in tons of unnecessary dependencies. But hey, you don't mind 50 more megabytes of jars on your CLASSPATH, do you?
I was honestly expecting the article to address some of these points, instead of vague philosophical ramblings. Seriously, Maven is horrible. This article gives you no inkling of just how bad it is.
I was surprised to see him include ant with maven. I always thought of ant as more a set of composable tools than a maven-like context, and for that reason I preferred it.
That said, I recently had to use ant after many years doing Ruby, and I was surprised to find how many old annoyances were still there, like scarce/painful support for conditional execution. I wanted a task to `cp` in development and `scp/rsync` in production, and I wound up having to duplicate a lot of code. Also I'd have thought that someone would have added the copy task's filter support to scp by now. (If anyone knows a clean way to do any of this, I'd love to hear it.)
Ant and a lot of the build tools that grew up around Ant don't support what make calls pattern rules very well. It's a lot like the difference between using Prolog and C to do the same task. A good example of how this is painful is to try to imagine how difficult it would be to handle the dependencies between various Java source files automatically if <javac> didn't do it for you.
To pick an SNMP related example, since I've done a bunch of that, suppose you have a MIB for your organization (using a hacked private OID that you haven't told anybody about for development, because it's easy) and then a lot of other MIBs that depend on that. And then one bright day you get your IETF approved SNMP OID for your organization, and you need to retarget your MIBs to be under that. It could be as simple as changing the base MIB to the new, official, OID and then letting your build tool rebuild the generated files in the correct dependency order--but that's hard to do with Ant. Usually you'll just rebuild all the generated files and then rebuild all the downstream code and then ... It's not trivial in make but it's not especially difficult--make for example has no real knowledge of C code, but with cc -M it doesn't need any.
I love the Access analogy. It burned me quite a few times in prior lives, when every project started with, "Let's build an access database." Ultimately it's a tradeoff in abstraction and usability.
I actually doubt the problems with Access are really that bad - sure, you outgrow it eventually, but you'd be surprised at what it's capable of.
Often the real underlying cause is that a business owner with little to no programming knowledge makes something which works OK but is architecturally wrong. For example, bad table structure/relationships. Or after a while they want to 'put their app online' and ask a poor intern to convert it.
After 3 years of SBT for Scala development, which plainly does not work, b/c SBT developers do not like simplicity (the irony!) and are more interested in doing cool stuff than stuff that works, I'm only a small push away from going back to Maven/Zinc, which favors 'just works and you can't shoot yourself in the knee with an arrow' over 'powerful'.
The usual backstory is that the old thing was cobbled together over a period of time while the requirements shifted about underneath it. The new thing appears to be "more awesome++" because the requirements are fixed at that point in time. Inevitably, the new thing is forced to cope with changing requirements and the cycle repeats.
What are the biggest differences between Rake, Gradle, and Leiningen? All 3 allow their respective language (i.e. Ruby, Groovy, Clojure) to be used within the build script, but besides the pros/cons and popularity of those languages, what are the differences between the build tools themselves?
Maven just works, and if you stay within its intent, it does its job.
We use Maven with Bamboo for continuous integration and it just works. We play with the idea of moving to Gradle b/c it's more powerful, but the effort is non-trivial and the real benefits - besides 'but it's programmable' - are not clear.
All that's not always necessary, but if you know the thing you're building has a shelf life of at least 2 years then ruby/rake are pretty shitty tools.
XML is a great language since it has the power of schemas and DTDs. It makes it easy to validate the data you expect in each of the fields.
And fuck am I tired of hearing about how saving 10% of your keystrokes during authorship is such a great thing. Most of the time spent on software is in maintenance, so you want readability over terseness. CoffeeScript, YAML, etc. are far inferior solutions for most of the uses they get put towards.
>XML is a great language since it has the power of schemas and DTDs. It makes it easy to validate the data you expect in each of the fields.
Quite the opposite. XML Schema is an awful mess - it's not even representable in XML; it's not a good language for expressing validation constraints, it's neither simple enough to be easy, nor powerful enough to express all your requirements. And the need to support schema leads to DTDs and namespaces, which are responsible for most of the screwups of using XML in the real world.
If XML were just tags, parsers didn't try to connect to the internet to validate everything, and tools didn't bother with namespaces, it would be usable enough that we probably wouldn't have needed to invent JSON.
The author didn't provide any background about the original Knuth "Literate Programming" paper and the subsequent review by McIlroy. Knuth was using that code to demonstrate the literate programming style. While the quote/anecdote was taken a bit out of context, I agree with the fundamental idea: code reuse.
I totally agree with the article. I love some things about Maven, but XML is a pain to work with.
I am thinking that a build system can be built on top of Java. But Java for shell scripting has some drawbacks that can be alleviated. I am working on this little tool to use regular Java for shell scripting; it is auto-compilable and it has better abstractions for file manipulation. https://github.com/huherto/jashi
I now use bash instead of ant. It's really fast, makes sense, does whatever I want.
But it has the lisp problem (and now ruby problem) of everyone making their own tooling, without standardisation. (where a "standard" is procrustean: it doesn't exactly fit any situation, but the situation is forced to fit it). Pre-chasm innovators/early-adopters love it; everyone else hates it.
BTW: I don't think "contextual" is a very descriptive term ("composable" is great though)
I'm really wondering why XML is buried so deep down in the comments. I would argue that Maven is just reflecting a methodology implied by XML; some like it, but well...
I think most of it is verbosity, both that inherent in XML and extra caused by crummy schemas, that makes people hate XML. It turned into an "enterprise" thing, with all the philosophy that entails.
IMHO Maven, despite its wide range of plugins and extension mechanisms, is quite "convention over configuration" (cough cough) and really is made for projects that can adapt to its conventions and to the popular Java artifact layouts.
The problem is, when it does not suit you, the competition is not really well known. Gradle, maybe?
I can't understand the Maven hate. People prefer their own half baked ant tasks which don't handle dependencies? IMHO Maven takes some getting used to, but it brings a lot to the table and can make the build process very repeatable and extensible.
Good article. I like the idea of what Maven tries to achieve, but I prefer monkeying with Ant because I can get things done. If Maven or its successor ever gets to where things can't get into a mucked-up state, I'm in.
Strange; by default Maven is much more strict about things, so it's hard to muck up. Perhaps you weren't creating the Maven project in the format it wants you to create it in (which, IMO, is a good one). Have you tried creating new projects with the archetype plugin (mvn archetype:generate)?
>> Generates a new project from an archetype, or updates the actual project if using a partial archetype. If the project is fully generated, it is generated in a directory corresponding to its artifactId. If the project is updated with a partial archetype, it is done in the current directory.
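If it helps, a typical non-interactive run looks roughly like this (the group and artifact IDs are placeholders):

  mvn archetype:generate -DgroupId=com.example -DartifactId=my-app \
      -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false

It lays out src/main/java, src/test/java and a pom.xml in the shape Maven expects.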
I don't recall trying that. However, sorry, that whole description you quoted just doesn't make sense to me. In general, when I try to use Maven, Eclipse ends up complaining. Then I fix it so Eclipse is happy, and Maven complains.
Maybe the answer for me would really just be to get the Maven and the Eclipse people to agree on how to create and manage projects with the Eclipse wizards.
> Donald Knuth was asked to write a program to solve this text handling problem: read a file of text, determine the n most frequently used words, and print out a sorted list of those words along with their frequencies. He wrote a program consisting of more than ten pages of Pascal, designing (and documenting) a new algorithm along the way. Then, Doug McIlroy demonstrated a shell script that would easily fit within a Twitter post that solved the problem more simply, elegantly, and understandably (if you understand shell commands)
Fine anecdote, but anyone want to take a wager as to the relative time and space complexity of the two solutions? I imagine this is sort of comparing apples and oranges.
I guess part of the idea is that you can ignore that. The shell script is _good enough_ for 80% (I made this up) of use cases. Knuth's solution is probably faster and requires fewer resources, but if you want to reverse the filter order or change something else, you are way faster using the shell script. It's also probably not so difficult to look up how each of these tools works internally to make an informed guess about time and space complexity, which will come in handy the next time you use them. And Unix tools aren't so slow: I wrote some C++ code using mmap()ed files to filter out some chars, and it was only marginally faster than a pipe using tr.
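For reference, McIlroy's pipeline is usually reproduced as something close to the following; I'm quoting it from memory, so treat the exact flags as approximate (n is the number of words to print, the script's argument in the original):

  # split into words, lowercase, count duplicates, sort by count, keep the top n
  tr -cs A-Za-z '\n' < input.txt |
    tr A-Z a-z |
    sort |
    uniq -c |
    sort -rn |
    sed "${n}q"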
To be more precise, I think the biggest problem with this anecdote is the notion that this guy somehow "one-upped" Knuth by coming up with a pragmatic rather than a theoretically optimal solution. Knuth is a mathematician, after all, and if someone asks him to write a program they're implicitly asking for a mathematical result, not an unremarkable engineering tool.
So no, taking this anecdote at face value, this guy didn't "one-up" Knuth; they were effectively answering two different questions. There is more to computer science than finding the simplest shell script to solve the 80% case.
I'll take rigid maven any day. Much preferable to some cobbled together build system that is undocumented and hard to reliably reproduce.
I'm curious what people are doing with Maven that gets them into such a fight with it. Although to be fair, due to issues in its dependency resolution mechanics, it can pull in incorrectly versioned artifacts without ever telling you anything.
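When that bites, the quickest way I know to see what actually got resolved is the dependency plugin; a sketch:

  # print the resolved dependency graph; verbose mode also shows versions omitted due to conflicts
  mvn dependency:tree -Dverbose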
It certainly isn't easy to get running if you don't use it regularly. I have only had passing reasons to use Maven, and every time it has been a huge timesink just to get it to spit out a jar file.
There are more options besides a completely rigid build system and a cobbled-together build system; there is room for something in between. I think the build tool should encourage you to use a common structure but allow you to deviate from it if necessary. Sometimes Maven makes it very difficult to accomplish small tasks without writing an entire plugin.
That and don't forget:
1. The tools that can work with it [Hudson, etc]
2. The support for SVN if you want it
3. Plugins aren't the worst thing in the world. Instead of fighting around with Tomcat to get a dev server going and deploying the artifact manually [or getting the IDE to do it], you just use mvn tomcat7:run ... easy.
I once did an Ant-to-Maven conversion job on around 10 mature projects I didn't code, so that a large enterprise could move over to Maven and reap the productivity benefits.
Various Maven modules were restricted, including Antrun. Because the projects were old and tied to their directory structure of resources, it was more practical to script in Maven rather than follow conventions. Scripting in Maven was quite... challenging.
In this case, the problems arising from Maven also stem from the organization's context.
> the problems arising from Maven also stem from the organization's context.
That's just a euphemism for not following Maven's conventions, which is where most people's complaints about Maven stem from. You either do things the Maven way, or the highway. Twisting Maven to do what _you_ want, rather than how Maven likes it, is just asking for disaster.
Ding ding ding. "Scripting Maven" sounds like a nightmare. And rightly so. (And if I'm scripting a Java build process, I'll take a bash script over Ant any day of the week.)
Building Java libraries isn't really all that difficult. Nor is packaging them. Damned if it doesn't take 30 lines of XML to do it in Maven, but these kinds of see-spot-run situations are pretty easy.
What is extremely difficult in Maven, and five minutes' work in make or Ant, is something like "running LaTeX to generate a PDF of some of your documentation, because it is math-laden and HTML isn't suitable". This is not theoretical; I had to do this at the place that was using Maven, and we depended on some semi-obscure mathematical formalisms like the stable distribution.
Intermediate between these two is driving things with Maven plugins, which is either easy (because you're doing something trivial) or fiendishly difficult, because the debugging output is terrible, the docs are worse, and the only thing left to you is to Use The Source, Luke. Every nontrivial Maven plugin I've ever used has tripped over this at some point, be it loading config files, including the dependencies, or something else.
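To make the LaTeX case concrete: outside Maven the whole job is a couple of shell lines (a sketch; the file name is made up, and a real document may also need bibtex or extra passes):

  # run twice so cross-references resolve
  pdflatex -interaction=nonstopmode manual.tex
  pdflatex -interaction=nonstopmode manual.tex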
Or you could use org.codehaus.mojo:latex-maven-plugin, stick your latex and graphics in src/main/latex/<doc>, and watch it 'just work', like I do.
I agree that not every maven plugin does exactly what you want it to, but a lot of them get pretty damn close, and as the authors iterate them, they gain a lot of flexibility.
I don't have too many issues getting LaTeX running either, but I do mostly agree, despite my humorous aside. If I had a quid for every extra hour that Maven added to my work above and beyond the coding, I could take you all to lunch. Tools should at least attempt to get out of the way. Maven is a Swiss-army knife with a blade as the handle.
I managed to get through my entire Java career by dodging Maven, but I'm now learning Clojure and I really like the language. Sadly, I'm not fluent enough in Clojure to do everything "by hand", so I'm using Leiningen...
Leiningen doesn't use Maven under the hood. It understands the Maven repository format, and imports a few classes from Maven for searching Maven repos, but that's pretty much it.
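For what it's worth, the day-to-day ceremony with Leiningen is small; a sketch of the usual workflow, with the project name made up:

  # create a skeleton application, then build a standalone jar of it
  lein new app my-app
  cd my-app
  lein uberjar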