Yes, Maven and Gradle are weirdly complex, but they are workable. And .jar files are just downright amazing compared to most other things (I'm looking at you madman Python dependencies!)
Hint: maybe you could come up with something like npm for Java? Maybe it's waiting to happen?
And Javadocs are mundane but still better than 'nodocs', which is the standard on Node.js and really needs to change; it could start with some standardization and leadership from somewhere ... maybe the TypeScript team?
Oh, and also, since everything is JSON, and that blends nicely with JS ... my god man that's so much nicer than JSON/XML handling in Java, which is often unwieldy.
It's not just Java culture, though, it's also Java. For instance, you can't have "npm for Java," because that's not how Java works at a very fundamental level. Java gets gigantic monolithic framework libraries because Java's class loader works fundamentally differently from Node's CommonJS module system.
Wha? I don't think so. An npm-like framework could work for Java. The difference between classloading and the Node.js 'require' module loader is not super relevant.
Frankly, the reason that Maven and Gradle are more sophisticated is that many Java projects are massive compared to most JS projects; the amount of complexity (various kinds of packaging, testing, etc.) necessitates some inherent complexity in the tooling. Java modules tend to be a lot bigger as well.
npm works because you can have lodash v3 and lodash v4 existing at the same time, at runtime. Node sees two pieces of code at different locations on disk, and loads them twice, considering them to have no relationship.
This means that a library can turn its API completely inside out and not cause dependency hell. Your dependencies can update their sub-dependencies in their own time. This means there's no pressure against releasing new updates, which leads to the beautiful and very productive chaos that is npm.
The same class, loaded by different classloaders, is a different class.
Thus, you can load different versions of the same library at the same time by loading them with different classloaders. Even the same version.
Now, there's necessarily the problem of which one of them a particular piece of code wants to use, which is determined by which classloader loaded that piece of code. You also can (and routinely do) have a path of nested classloaders that load different things.
Java was designed this way, right from the very beginning.
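A minimal sketch of that behavior (the class and loader names here are mine, not from any framework): defining the same class bytes through two different loaders yields two distinct Class objects with the same name.

```java
import java.io.InputStream;

// The same class, loaded by different classloaders, is a different class.
public class ClassLoaderDemo {
    public static class Greeter { }

    // A loader that defines Greeter itself instead of delegating to its parent.
    static class IsolatingLoader extends ClassLoader {
        @Override
        protected Class<?> loadClass(String name, boolean resolve)
                throws ClassNotFoundException {
            if (name.equals(Greeter.class.getName())) {
                try (InputStream in = ClassLoaderDemo.class.getResourceAsStream(
                        "/" + name.replace('.', '/') + ".class")) {
                    byte[] bytes = in.readAllBytes();
                    return defineClass(name, bytes, 0, bytes.length);
                } catch (Exception e) {
                    throw new ClassNotFoundException(name, e);
                }
            }
            // Everything else (java.lang.Object etc.) delegates as usual.
            return super.loadClass(name, resolve);
        }
    }

    public static void main(String[] args) throws Exception {
        Class<?> a = new IsolatingLoader().loadClass(Greeter.class.getName());
        Class<?> b = new IsolatingLoader().loadClass(Greeter.class.getName());
        System.out.println(a == b);                          // false: distinct classes
        System.out.println(a.getName().equals(b.getName())); // true: same name
    }
}
```

Casting an instance of one to the other throws a ClassCastException, which is the classic OSGi-flavored surprise.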
I don't get how anyone can call it beautiful and productive. Ever take over/inherit some 'older' Node code and try to update it, or even get it to work? It is hell. Even worse if it is from the time when there were competing forks of Node.js, so you have to try to figure out which one it prefers. And maybe even then it won't run. (Probably because npm's version notation can be unpredictable, so npm install still installs a version that breaks something even though it shouldn't.)
npm allows multiple versions of the same package - not ideal, but it works. To do this in Java, you need multiple classloaders etc. (basically what OSGi provides), but you are now in a different kind of hell.
edit: I forgot to add that Java modules may address this, but I have been out of the Java ecosystem for quite some time.
Dependencies in Python have evolved a lot in the last 2 years. You can use a simple setup.cfg to declare them, and you can use a wheel like a jar if you want.
That's not so much the issue - as Python package/dependency problems are not so much a problem of 'how it works' - it's an issue because nobody is on the same page. It's byzantine.
If python dependencies boiled down to 'point to a directory' well then fine, everything would work that way and we'd build tools around it, docs would be clear etc. etc..
But as someone who uses Python seldom and loves the syntax, I generally stay away from it mostly for that reason.
And lack of typing which is obviously another discussion.
Actually, "point to a directory" works. Any directory can be added using sys.path.append() and its Python content becomes importable. No need to do anything more.
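For example (the directory and module name here are invented for the demo):

```python
# Make an arbitrary directory's .py files importable by appending it to sys.path.
import os
import sys
import tempfile

# Stand-in for "some directory with Python code in it".
plugin_dir = tempfile.mkdtemp()
with open(os.path.join(plugin_dir, "myhelper.py"), "w") as f:
    f.write("def greet():\n    return 'hello from myhelper'\n")

sys.path.append(plugin_dir)   # "point to a directory"
import myhelper               # now importable like any installed package

print(myhelper.greet())       # hello from myhelper
```

This is the whole mechanism; packaging tools are layered on top of exactly this import path behavior.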
You are right, "almost always the case" was a baseless claim and binary wheels are better than nothing.
However, how many projects still have "psycopg2" in their setup.py (or is it requirements.txt? ;) instead of "psycopg2-binary"? I just found out that the latter exists when I checked PyPI.
I'm not pretending all is fine in the kingdom. But a lot of big Python issues (2 vs 3, encoding, deps) are in a fantastic state compared to years ago, and I think it's important that people don't stay stuck on a bad impression from past mistakes.
It's also for me a way to acknowledge the huge work the community has been putting into fixing what's been asked.
I'm kind of sad that someone would downvote this without really asking why I might think Python isn't suitable for serious software development.
The main reason is that it isn't a language that is nice to users. It shifts the entire burden of setting up an environment to run the Python application onto the user. Which means that installing software tends to be a messy affair at the best of times, and a complete mystery to most non-programmers.
The difference between a good professional software developer and someone who is just a hack is empathy with the user. Having an obvious way to package applications allows developers to be nice to their users. Python never provided that, so rather than producing nice binaries (or something akin to Jar files) it just dumps the problem of assembling the bits in the user's lap. That's a dick move.
I teach JavaScript to students coming from Java and Python and, while I do agree npm could be better, students find it so much more intuitive than having some IDE handle packages (in the case of Java) or having them stored somewhere on your system path (with Python). Being able to see the dependency within node_modules, and make the connection between saying "I want to install this" and it appearing in your project file system, makes a bunch of sense for beginners.
Just so you know, 95% of the few hundred Java devs I've worked with in the last few years used maven or gradle on the command line to handle packages, including junior developers. I think the Java = IDE concept is outdated (although IntelliJ IDEA is an amazingly well-integrated IDE with Java).
I honestly don't get why I was down voted but mkvirtualenv, workon xyz, pip install -r requirements.txt isn't exactly a huge barrier to entry with a python project. Personally I always saw the node_modules directory as something to be ignored. It's a crowded bunch of junk.
I admit I haven't used Python that much, but I'd say about 90% of the libraries and projects I had interest in didn't have a requirements.txt.
I've used Node a LOT, and I have yet to have a library or project -not- have a package.json.
I'll readily admit that Python has the -ability- to be a well maintained environment. The fact it doesn't default to and practically enforce it has led to a culture where it's not often used. And half the mindshare uses conda environments instead.
I didn't downvote you. However, I do disagree with what you are saying. In Python, the absolute most trivial case is already a big pain. In order to get a project started, you need to:

1) Create a virtual environment (which you can do with virtualenv, python -m venv, virtualenvwrapper, pyenv-virtualenv, Pipenv, poetry, or Conda - but let's ignore Conda from here on out).

2) Activate the virtual environment - but it's easy to forget to do this, or to activate the wrong one. And depending on how you created the virtual environment, you have to do it differently.

3) Once activated, install your dependencies - possibly using pip, or maybe using Pipenv or poetry.

4) Depending on the type of project you are setting up, you may also need to create a setup.py file, otherwise you won't be able to install the new project you are working on into a virtual environment.

5) Configure your IDE to use your virtual environment - depending on how you created it, your IDE may pick it up automatically, but it probably didn't.

Then, you can get down to work.
But that's the easy case - the more painful case is when you want to either deploy your project or update it. If you didn't use Pipenv or poetry, you're going to need to create a requirements file - probably with pip freeze. You can then go to a different virtual environment and do a pip install -r to install the requirements from that file. Of course, when developing your code, you may have installed modules like py.test that you don't want to install on your production system - but pip doesn't know the difference between a development and a runtime dependency, so you either need to edit the requirements file generated by pip freeze by hand, or just live with deploying code you don't want to production.

If you used Pipenv or poetry, at least then you can keep development and runtime dependencies separate. However, both of these tools are less available than pip, so this generally means you have to install them on your production system - which, given that they are newer, tends to be awkward since it may involve pulling down code from GitHub directly. Alternatively, you can do a pip freeze to create a requirements file, but then you are back to pulling in dependencies you may not want.
The next thing you're going to want to do is update some dependencies. If all you have is a requirements.txt file, well, you are pretty much out of luck. If it was created by pip freeze, it's going to include all of your transitive dependencies - good luck remembering which ones you use directly and which ones you don't. Maybe you didn't use pip freeze to create it, and you wrote it by hand instead. Well, now you'll know which dependencies you are actually using, since you only put those in the file - however, the problem then becomes that since you didn't list your transitive dependencies, whenever you install the requirements you could get a different set of transitive dependencies - and if you accidentally started using one of them without realizing it, this could break your production system. So, maybe you listed all of your dependencies in your setup.py file - if so, you can always delete your virtual environment, reinstall everything from your setup.py file, and then re-generate your requirements file. However, doing that is a massive, massive pain since it involves a number of commands. If you try to do this, odds are that your setup.py and your requirements files start to fall out of sync and you give up on one or the other of them.
Pipenv helps - a bit. It's more of a replacement for the requirements file than for the setup.py file - which leads to the odd problem of not knowing if you should list your requirements in both places or try to have one include the other. What's made more fun is that Pipenv's interface includes a bunch of options that don't make much sense (pipenv install includes the options "--selective-upgrade", "--keep-outdated", "--skip-lock", and "--ignore-pipfile", and it's not really all that clear what they are supposed to do). What I'd like to be able to do is update either a single dependency OR all of them, at my discretion. I assume that some combination of its arcane options is supposed to allow you to update a single dependency without updating all of them - however, if so, it's not clear which one is supposed to do that, as it seems like both "--selective-upgrade" and "--keep-outdated" might. However, worse than not knowing which option you should use, it seems like neither of them actually works: https://github.com/pypa/pipenv/issues/966 has been open for a while and has been dismissed by the maintainers as not a problem, then "fixed", then acknowledged that it didn't actually work, and then they went dark. So, as it stands, if you try to update any dependency, Pipenv is probably going to insist on updating everything - so, have fun testing that.
Poetry is probably the strongest contender for making this whole mess sane. But, for reasons that seem to completely defy logic, Pipenv is getting most of the attention in this space. Poetry appears to be mostly a one-person project - and so tying a project to it feels risky. Despite all that, it does work pretty well, but there are still a lot of features that would be great to see, and it would be really great to see it get some more attention and manpower.
I can understand your frustrations in some of those cases, but it almost seems like the issues are being overcomplicated. I have worked with complex code bases and three commands got me running. Occasionally requirements.txt might fail me, so yes, I might have to install a package manually. But for most projects I am up and running in 2-3 minutes. I don't bother with pipenv or poetry. I use the KISS method.
How do you create the requirements.txt file - by hand, or by pip freeze? If by hand, how do you make sure to lock the versions of your transitive dependencies? If by pip freeze, how do you keep track of what you actually depend on as opposed to what your dependencies depend on?
How do you update your dependencies? Do you modify requirements.txt directly? If so, how do you keep it in sync with setup.py? How do you find updated versions of your dependencies - do you search pypi by hand and then update the file? Or, do you leave the versions of your dependencies unlocked in requirements.txt and ask pip to re-install everything to pull in the updated ones? And if its the latter case, how do you then lock them down again so that if you do multiple deployments you'll always get the same set of dependencies installed?
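To illustrate the difference these questions point at (package names and pins invented for the example): a hand-written requirements file records intent, while a pip freeze one records the entire resolved environment, with direct and transitive dependencies mixed together.

```text
# requirements.txt, written by hand: direct dependencies only, with ranges
requests>=2.20,<3

# requirements.txt, generated by `pip freeze`: everything in the venv, pinned,
# with no indication of which entries you actually asked for
certifi==2019.3.9
chardet==3.0.4
idna==2.8
requests==2.21.0
urllib3==1.24.1
```

The first form is easy to update but not repeatable; the second is repeatable but opaque - which is the tension being described.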
I'm really not trying to say that there is anything wrong with the way you work - for some types of work, these aren't big issues. And, if these issues don't matter for your use case, well, that's awesome and keep on rocking it. However, for use cases where these issues do matter, the current Python dependency landscape is a bit of a dumpster fire - there are 50 ways to do everything and none of them work well. And worse, none of them really seem designed to solve the problem. And it's not like it's the UNIX philosophy at play, where tools are designed to solve one problem at a time and you can solve complex problems by composing them. I've spent a ton of time trying to make a reasonable workflow that handles the update and deploy lifecycle well, and no combination of tools seems to do it. (Except poetry - that tool, while not perfect, seems to actually be trying to fix this.)
Pip freeze is dead simple and I upgrade packages as needed. I've rarely manually updated requirements.txt. I literally make a venv, pip install what I need, and I am off and running. My editor/IDE automatically recognizes the environment. I can't imagine how much simpler it could be. I've been doing it for years without any problems. Dependencies, versioning, etc. are incredibly simple. A couple commands and I am up and running, boom, done. I probably don't even have time to get a cup of coffee in that workflow. I also can deploy to QA and production when necessary fairly fast, although that might take a few more minutes.
Having a proper requirements set of files for production is a 30 second trivial task and can be automated easily.
In my case, it doesn't. I'm maintaining libraries and also applications that use those libraries.
For libraries, you need your setup.py to be kept up to date, as requirements.txt doesn't do anything when you pip install a library package. Of course, requirements.txt is necessary when you want to run the library's tests, since few things are as frustrating as having the tests broken by some random dependency having a new version come out. But then you have to keep setup.py and requirements.txt kinda synced - only kinda, because in setup.py you will list your test and dev dependencies separately, but in requirements.txt they all get mixed up together. In theory it's possible to script keeping requirements.txt up to date - in practice, when working with a big team, it's a tremendous pain. The first option is to tell everyone not to mess it up, but that doesn't work. The second option is to develop a bunch of scripts to do it, but then you have to get everyone to install and use them, and that's quite frustrating since it's not at all clear to me why the standard tools don't do it already.
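To make the split concrete, here's a sketch of the library side (the names are hypothetical); this is the config fragment whose loose ranges then have to be kept in sync, by hand or by script, with the pinned requirements.txt used for testing:

```python
# setup.py (library) - abstract, ranged dependencies for whoever installs you
from setuptools import setup

setup(
    name="mylib",                            # hypothetical package name
    version="1.0.0",
    packages=["mylib"],
    install_requires=["requests>=2.20,<3"],  # what the library needs, loosely
    extras_require={"dev": ["pytest>=4"]},   # test/dev deps, kept separate here
)
```

The pinned requirements.txt for CI then has to be regenerated from this whenever it changes, which is exactly the sync burden described above.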
We also have applications. And those applications depend on some libraries we wrote. Those libraries have their own dependencies. When a library is updated, it might gain or lose dependencies. So, when an application is updated to use a new version of that library, its requirements.txt should be updated accordingly. pip installing the new version of the library will bring in its new dependencies - but won't get rid of the old ones from the virtual environment. A subsequent pip freeze will freeze a bunch of dependencies that aren't needed anymore - and that get harder and harder to find and eliminate as the number of unused dependencies grows. Again, this could be scripted, but it's a pain to do, and the tools should support it.
What's so frustrating is that these aren't unsolved problems in computer science. There are solutions. NPM/yarn does an OK job (I have minimal experience). Rust's cargo is fantastic. I've heard that Ruby's bundler is great. I fully appreciate that solving these problems probably requires volunteers - and I'm not volunteering, so maybe there is only so much I can do to complain. But, looking at most of the work going into the ecosystem, it seems to be ignoring these problems.
Anyway, maybe I was a bit too harsh in my initial response. If your workflow is working for you, that is great. What I would suggest, however, is that if you find a need to add additional requirements to that workflow (such as an easy way to update a single dependency and its transitive dependencies - both adding and removing them), you'll find that the available Python options quickly disappoint you. For your sake, I hope that doesn't happen, since it's unpleasant to deal with.
You seem fixated on setup.py. I have literally never cared about that file in 20 years of Python development. Keep the workflow simple and problems tend to disappear.
You can't create a Python library without a setup.py file. Some of the things that I develop are Python libraries, so, having a setup.py file is a must. It would seem that you aren't developing the same types of things as I am - which is totally fine. But, I've been trying to describe my particular use cases, and for those use cases, the Python tooling isn't great. My use cases aren't that niche, and, they aren't invalid. They are different than yours, but, that doesn't mean that I'm not keeping it as simple as I can.
Why would I care if a requirement is direct or a dependency? If it's a dependency it's a requirement. Period.
Speaking of dependencies, NPM generally has a lot more and they are deeper and all too often for little shit that never should have been an import in the first place IMO. More parts == more stuff to go wrong and all too often it seems to.
Pip isn't perfect and granted I use it more than NPM. And once in a blue moon I do run into troubles and wind up installing a package manually or editing requirements.txt by hand. But it mostly works. NPM on the other hand seems to blow up quite often. At least for me. Versioning, this is different than it was a week ago, oh, this dependency only works on Macs, etc. etc. Plus it just feels about as trustworthy as gas station sushi. Only my opinion and experience, but I've spent a lot more time fighting with NPM than pip.
> Why would I care if a requirement is direct or a dependency? If it's a dependency it's a requirement. Period.
Let's say I need package A. And, package A depends on package B, but, I'm not otherwise using package B.
1. I want to say that I need package A, >=1.0 and <2. I want to be able to tell my tools to go find the most recent version of package A and install it, as long as it meets my requirements. I do not care at all which version of package B is installed, as long as it doesn't conflict with any of my other requirements. I don't want to see package B at all - it's just an implementation detail of package A.
2. When I deploy to production, I want to make sure that the deploy is repeatable. If the last time I deployed, I had package A==1.2.3 and B==4.5.6, then, when I deploy again, I want those exact same versions.
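For what it's worth, this pair of wants is the two-file split that tools like pip-tools implement: you hand-write an intent file, and `pip-compile` generates the repeatable pin set. A sketch reusing the A/B example (versions made up):

```text
# requirements.in - intent: I need package A, any 1.x
A>=1.0,<2

# requirements.txt - generated by `pip-compile`: the full, repeatable solution
A==1.2.3
B==4.5.6    # via A
```

If I remember right, pip-compile can also upgrade a single package (`pip-compile --upgrade-package A`) while leaving everything else pinned, which addresses point 1 without sacrificing point 2.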
Ok, but still, why should I care if a package listed in requirements is a dependency? What difference does it make in the real world?
And `pip freeze > requirements.txt` writes out the versions of the packages that `pip install` later installs.
Very, very seldom has this ever caused problems, and I've been doing Python for over 8 years. I just really don't get the "problems" you see here; they seem pedantic and theoretical rather than real-world issues.
Again, not to say pip is perfect, it isn't. Just that it's way more reliable than NPM.
You don't care, and that's fine. I, however, do care.
I prefer to keep the list of packages that I'm installing limited to just those that I actually need - maybe someone will argue this is unnecessary, but I think that limiting what you install to what you are actually going to use, as much as possible, is simply good behavior when you are installing onto your production servers. If I have a flat list of packages in a requirements.txt, it's super hard to keep track of which ones I'm actually using as opposed to those that some other dependency is using. And when some dependency stops using them, they tend to get stuck in requirements.txt for no reason.
It's hard to keep track of what the actual version requirements for those packages are - which means it's hard to keep them updated. As much as possible, I want to keep the versions of the packages I'm using up to date, or at least have the option to do regular updates on some schedule that works for me and my team. But when I have this flat list of locked packages generated by pip freeze, that becomes a big giant pain. It should just be a single command and then some testing.
These aren't niche or pedantic use cases. Just because my use case isn't the same as yours doesn't make mine invalid, or even uncommon. I'm sure your use case works for you, and that's great - and I'm not going to describe yours pejoratively just because it's different from mine.
I haven't used NPM much, but, NPM at least does attempt to address some of these issues. I can't speak to how well it does. I do know that package managers for other languages, such as Rust's cargo, do address these types of issues and make addressing these types of issues first class concerns. So, its not like this is some giant unsolved problem in computer science - its just that most of the Python tooling doesn't.
The argument is not really whether it's easy to get up and running with some trivial app. Clearly it is easy in Python. Parent's point is about the longer-term development cycle, when you have more dependencies, more project complexity, and more potential for small errors to ripple and affect your production systems.
The argument is invalid; if a project has that many issues, someone did a terrible job or chose a terrible ecosystem. pip/venv/python is dead simple even for incredibly complex projects. If someone screws that up routinely they might need to consider a different career.
A language is more than syntax. These things matter a lot, and scaling Python can be a huge pain ... despite how much most of us love Python syntax, it has problems at a project level.
Isn’t this just what occurs when you move beyond a single language? I wouldn’t say he discovered how much better nodejs was over java, so much as he discovered that java isn’t the only way to do things; he would have probably had a similar epiphany with go, rust, smalltalk, clojure, haskell or pretty much any other not-java language.
Imo this is more an argument for expanding beyond a single language/ecosystem; hopefully the author doesn't make the same mistake with Node.js, nestling in and never moving beyond it for 10 years until finally realizing there might just be another way.
I don't really care what this guy's credentials are; he claims npm and yarn are better than Maven and Gradle but doesn't bother explaining why (and it's evident he hasn't used Gradle).
Nor has he moved beyond a Java 6 mindset considering the things he complains about in terms of modularity, verbosity and even strict type checking.
The fact that he thinks that the IDE race is between Eclipse and Netbeans is telling.
As far as boilerplate and unneeded code goes, he is correct. Spring, Hibernate and Spring Boot remove code, and result in some arcane problems that you shouldn't need to learn about. Now, let's discuss the React ecosystem: explain to me the purity of intent behind an application that requires:
- action classes/files
- reducer classes/files
- possibly route classes/files
- a "store"
- a component
to set a variable, complete with the half dozen packages required to enable this behaviour. Not everyone has to do this, but when creating a React application with the kind of scope that a Java application often requires, this is an inevitability, and a LOT of work for a supposedly simple task. If there is an error, you are similarly going through a stack trace of weird abstract classes, except they are in a bundle file which is equally impossible to parse.
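A toy illustration of the moving parts listed above - to "set a variable" through a Redux-style store you write an action, a reducer, and a store. (This is a hand-rolled miniature for the demo, not the real Redux library, though the real API has the same shape.)

```javascript
// 1. The action (a type constant plus an action creator).
const SET_NAME = 'SET_NAME';
const setName = name => ({ type: SET_NAME, payload: name });

// 2. The reducer (a pure function from state + action to new state).
function nameReducer(state = { name: '' }, action) {
  switch (action.type) {
    case SET_NAME: return { ...state, name: action.payload };
    default:       return state;
  }
}

// 3. The store (a minimal stand-in for Redux's createStore).
function createStore(reducer) {
  let state = reducer(undefined, { type: '@@INIT' });
  return {
    getState: () => state,
    dispatch: action => { state = reducer(state, action); },
  };
}

const store = createStore(nameReducer);
store.dispatch(setName('Ada'));      // all of the above, to set one variable
console.log(store.getState().name);  // Ada
```

Compare that with `this.setState({ name: 'Ada' })` in a plain component, and the boilerplate complaint is easy to see - though, as replies below note, the indirection buys predictability in large apps.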
Being nitpicky here: you mention this somewhat in your comment, but I just want to stress that redux is absolutely not necessary to your react application (though a lot of beginners think it is a "must know" because every blog article says you need to know it). React in and of itself allows you to achieve the same functionality of redux (and a fairly complex app) without the need for a bunch of boilerplate code. Of course, if you deeply understand redux and its purpose, it can make creating large apps less of a hassle, but there's nothing in react that says you must use redux to make a complex app.
Indeed, we’ve built pretty big and solid apps without redux (when it did not even exist). After that we investigated “flux” and eventually we decided to adopt redux, which was easy for new hires.
You say React but then complain about Redux. I get it I'm not a fan of Redux either, so why not direct your criticism to Redux instead of React? Also I would like to add that Redux is not needed to build a React app.
> There's a reason people created Kotlin, there's clearly shortcomings in the Java language which did not move fast enough.
Kotlin was created mainly because IntelliJ couldn't move on from Java 6. It's a huge improvement, but you've just used the same type of mindset stuck at Java 6 that's being criticised.
Wasn't Kotlin introduced because Android was still stuck on Java 6? IntelliJ is currently running on JRE 9 (a more or less LTS version they package with their installer), afaik, while Android supports Java 7 and only some Java 8 features.
I know it's not the main point but you're talking about Redux there, not React. I've worked with React for years without touching the stuff you're talking about.
I'm a Redux maintainer. FWIW, I'm always open to suggestions on how we can make using Redux easier and better for everyone (example prior discussion: https://github.com/reactjs/redux/issues/2295 ).
We're currently working on a "starter kit" package that's intended to address some of the most common concerns about using Redux, and hope to make it an official Redux-branded package in the near future. I'd appreciate any feedback or ideas you have for improving it:
Also, since you mention stack traces: if anyone out there has a bit of free time on their hands, I'd _love_ to see the Redux DevTools core and extension updated to show a stack trace for where each action was dispatched:
So on the off chance that anyone happens to look at this thread in the future: last night I decided to try implementing that stack trace display idea, and I got it working! Still needs better formatting, but the important thing is the data is there and viewable.
Take a look at what, specifically? If you want him to invest his time helping your competing tool, then respect that time and direct him to specific places where his input might be appreciated.
I’m an experienced native app dev (swift/kotlin). I learned react/redux in a few months.
For me, stressing immutability via Immutable.js/flow would go a long way. The combination is terrific but I had to piece together my own best practice.
However, the "starter kit" I linked does use the Immer library internally, which uses ES6 Proxies to let you write "mutative" code, but then applies the updates immutably. The goal here is to simplify your immutable update logic. (My only concern with using it is that there's no obvious indication in your own code that a given reducer function is updating immutably, because the code as-is really _is_ mutative, and that might confuse people down the road. Still, hopefully it will be beneficial in the long run.)
Seems to be stuck at 2000-2010 java. And hasn’t spent a whole lot of time with npm or tried to maintain a project for a significant time, wrestling the issues that the fluid npm ecosystem brings.
Maven gets a lot of flak for its rigidity, but the fact that one can take a ten-year-old complex project and build it reliably today is simply amazing.
Exactly, I deal with npm on a daily basis and I really wish deterministic builds were more of a concern in the javascript/npm universe. I can't count how many times a project broke because a "minor" release came out and broke existing code. Also, the majority of projects have a caret in their package.json package versions, so you automatically get the updates.
I'm well aware of package locks. However, npm has a horrible implementation of package locking. Running npm install will regenerate your package lock unless you removed prefixes from all your packages or you install using --package-lock-only[1].
So my question to you is, have you been using npm package locks correctly? Or do you let npm install generate a new package lock every time it's run?
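For reference, the caret prefix mentioned above is what opts you into those automatic updates; a minimal package.json sketch (package name and version are hypothetical):

```json
{
  "dependencies": {
    "left-pad": "^1.2.3"
  }
}
```

`^1.2.3` means "any 1.x at or above 1.2.3", so a fresh install can resolve something newer than what you tested against. If you want installs that strictly follow the lockfile (and fail if it disagrees with package.json), newer npm versions provide `npm ci` for exactly that.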
> It is highly recommended you commit the generated package lock to source control
Fixed the Javascript way, layers of workarounds.
And if you indeed need to update a dependency, you need to regenerate the lock.
But the main difference is that when that happens, the onus to verify that all packages' transitive unpinned dependencies still result in a working combination (of which there is no guarantee) is on the receiving developer, not on the package maintainer.
This is untrue. If you need to update one dependency; npm/yarn will update the lockfile with the minimal changes needed to update that one dependency. You do not need to regenerate the entire lockfile.
If the goal of this article is to compare Node.js vs Java...then why are you bringing up React? Shouldn't the conversation be restricted to the backend? In that case, wouldn't you want to compare Express to Spring?
> It’s not like I’ve completely divorced myself of Java. I have written a significant amount of Java/Spring/Hibernate code in the last 3 years (...) I worked in the Solar Industry doing deeply soul-fulfilling things like writing database queries about kiloWatt-hours
This person definitely needs to try out some JVM languages like Scala, Kotlin or Groovy. They solve nearly everything he talks about while keeping the good parts of the JVM ecosystem and avoiding a lot of the bad parts of NodeJS.
For a long time I worked with Java and tried to make the move to Groovy since I could appreciate what I read about it, having a background in Smalltalk. But somehow using Java for some large projects just seemed more straightforward and safe, because of type-checking which supports large-scale refactorings.
But since I moved to mainly work on Node.js I feel that Node.js is what Groovy should have been. A simple but actually powerful language since EcmaScript version 6.
And it's open. JavaScript feels open because JavaScript is everywhere, both on the client and, with Node.js, now on the server too. Everybody knows it. It cannot be called a "niche language" the way Groovy perhaps could. There is safety in numbers.
What I'm looking forward to is Node.js running on GraalVM. That should make it easy to call Java libraries from within Node.js. Use JavaScript as "glue for Java" like it was originally intended.
I agree with you about ES6 - very much in many ways it's captured the best of the syntax from languages like groovy without relaxing so much that things become ambiguous etc.
Nonetheless, I find JavaScript still simply isn't structured enough and doesn't have the ability to tap into the underlying strengths of the JVM ecosystem. Incidentally, Groovy's static compilation has improved enormously, which to me means I almost never drop into Java any more, where I used to code 50/50 in Groovy / Java.
Mind you, if all you want is ES6 on the JVM, you already have a pretty good subset of it in Nashorn / JDK9. I don't know how well it can use Node libraries though ...
> Use JavaScript as "glue for Java" like it was originally intended.
Then we would have come full circle. JavaScript was originally intended as glue for Java applets, but applets never became popular. Instead, Java was used for standalone and server-side codebases, and Beanshell came along as glue code for those. Later, James Strachan added closures to Beanshell's basic functionality, and Apache Groovy was born. Afterwards, Scala became popular and showed that a statically-compiled language could be used dynamically as well, and Strachan said if he'd known about Scala at the time, he'd never have created Groovy. Later, Kotlin showed you could even have builders in statically-compiled code. Now Graal is finally enabling Javascript on the JVM, perhaps selling the idea to developers better than Rhino or Nashorn did.
I went from Java to (dabbling in) Scala to TypeScript. While my heart is still with Scala, the ecosystem around Node is incredibly persuasive. The speed with which you can accomplish practical tasks, like building a desktop app with Electron, beats anything I've experienced in Javaland. (Of course that greatly depends on what kind of projects fill your time.) I think you could argue that the way Node hops domains -- web front-end, back-end, and desktop -- fulfills a broader version of Java's original vision: to be the cross-platform language.
I can't speak for the author, but I did a similar shift around 2010 to Node.js and JavaScript from many years in the Java ecosystem. I did try Groovy and quite like(d) Scala before moving to JS, and while Scala at least was a big improvement at the language level, I eventually shifted further and further into Node.js because it simply was more productive and easier to make full-stack webapps.
I can see developers' points if they are just "backend developers", but doing anything with a web UI is a vastly superior and simpler experience in Node.js.
Oh, for sure! I just mentioned the three that first came to mind (which is biased by the fact I personally use them). But yes, I know lots of people who love Clojure.
Seems like this guy has not touched Java for the last 10 years; Java 10 is much more modern, and there are many robust and lightweight frameworks now that make web development (Java EE) a breeze. You can even write Node.js code within Java using GraalVM.
I’m not so sure about that. I see node getting more and more traction these days. In 2012 it was a hipster tech, in 2018 it’s actually starting to see its way into production. At least around here, my own place included.
We’re a .net shop, like half the country I live in, but I could’ve written a somewhat similar story as the author. Not exactly similar, I don’t think NPM is really better than maven or nuget and coming from .net, the IDEs for JS are certainly weaker than they are for .Net.
The reason we’re doing more and more JS, and using node, is because of how fast it produces stuff that works.
It’s simply faster to write the same functionality with JS and node than it is in .net, and it gives us the advantage of only using one language for back-end, front-end and mobile. Which is extremely useful for a small team in an enterprise setup.
It also works everywhere. We’re a .net shop like I said, that means we run on IIS, which can be done with other techs than .net through stuff like fastcgi, but have you ever tried setting that up and keeping it up-to-date? I have, and it really showed me how little of an operations guy I am, because it was horrible. IIS-node, on the other hand, worked out of the box.
It’s a little ironic, but it’s now easier for us to deploy Node apps than .net apps because .net apps often need different configuration on the IIS than it does on your development box and Node works exactly the same.
It’s mostly the easy, fast part that does it for us though, and not so much the language or node itself. We’re simply more productive when we’re using Node.
Of course you need a lot of governance with the JS stack, but you honestly need that for any tech, and we already had pretty strict rules in place for how we write things in .net.
Disclaimer, it’s entirely possible that we are not as good at .net as we think. It’s also possible that I’m noticing Node in more places than I did in 2012, or 2016 for that matter, because I’m more focused on it now that we use it in production. Node is a rising trend throughout my network though, with more and more people picking it up for stuff that isn’t just weekend projects.
We have, I haven’t personally done any of it in production because I didn’t really see a reason to use it over node. My coworkers did a few PoCs though, and initially had to spend a lot of time actually getting it to work with deployment. I have no idea why that was, you’d expect it to just work considering it’s a complete Microsoft setup, but it didn’t work out.
So it was a terribly expensive PoC, because our most expensive resource is the time developers/operations spend. I mean, core is more performant than node technically, but as far as cost-efficiency goes, time to market is just much more important than technical performance in our organization. We have a lot of iron, and we don’t have a lot of users. We’re a municipality of average size after all, the most simultaneous visitors we’ve ever had across every system we operate is in the tens of thousands.
As far as development goes, I don’t think there is much difference between core and standard, but I’ll admit that I haven’t looked into it since 1.0.
We’re in this weird spot where we’re using more and more powershell and azure orchestration for back-end services that used to be powered by C#. A lot of that has also moved from development to operations, so that development is mainly focused on building APIs, working with data-sets or building various minor web/mobile apps, and Node + JS is just working out better than everything else.
We’ve tried core, we’ve tried Django and flask, we’ve tried a bunch of other stuff, but for getting an app up and running the same hour you start building it? Well, node just works for us.
I’m not saying node or JS are better than .Net though, I don’t think they are, and I’d still prefer Java or C# if I was building something major.
I develop a very complex, 500K+ LoC Java application. It implements a natural language sentence parser that is integrated with a specialized lossless data compressor. I use:
- No IDE, just a simple text editor
- No ORM, hand-written SQL code
- My build tool is a Python script that marshals a javac call
- I have just a few JAR file dependencies, which are checked into the repo
- My own simple unit test and integration test framework
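As an aside, such a harness really can be tiny. A minimal sketch in plain JDK Java (all names here are mine, not the commenter's actual framework):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A minimal home-grown test harness: register named tests, run them all,
// count failures. No framework dependency, just the JDK.
class MiniTest {
    private final Map<String, Runnable> tests = new LinkedHashMap<>();

    void add(String name, Runnable body) {
        tests.put(name, body);
    }

    // Runs every registered test; returns the number of failures.
    int run() {
        int failures = 0;
        for (Map.Entry<String, Runnable> t : tests.entrySet()) {
            try {
                t.getValue().run();
                System.out.println("PASS " + t.getKey());
            } catch (Throwable e) {
                failures++;
                System.out.println("FAIL " + t.getKey() + ": " + e);
            }
        }
        return failures;
    }
}
```

A real version would add integration-test setup/teardown, but the core loop is this small.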
For me, Java Just Works. It's literally been years since I've encountered a problem with the actual language. Yeah, there can be a lot of boilerplate. Yeah, there are some issues with complex type manipulation. But those limitations almost never cause me any real pain.
Yours is a very niche case where you built an application that is perhaps used by other applications downstream.
The author's gripes are mostly with the day-to-day java web application development that is being done in large firms (which I could even believe is the majority of Java development work).
I only introduce new dependencies into my system when I see them as strictly necessary. I am completely satisfied with my alternative implementations of those components (this is in contrast to the IDE question, where I at least see the benefit of tools like autocomplete that the IDE provides, but choose not to use it because the benefit of simplicity is greater). Given the rage and anger the OP directs against Maven, I think I am completely justified in that choice.
I can understand where you are coming from, and I often follow a rather similar approach where I will implement things myself if the alternative is depending on some unwieldy beast of a dependency.
(I did write my own Servlet engine for a project once, and it was faster than Jetty, and had better GC characteristics at the time, but in the end I ditched it for Jetty as I didn't want to maintain a complete HTTP implementation - that wasn't key to the project. The extra performance and the dependencies I got rid of were just not worth the extra effort).
However the reason I always chose to use JUnit was because I wanted the unit tests to be familiar to other developers since this is often where people go to understand how a piece of code works. In a sense it was a part of Java at the time. And I chose Maven for similar reasons. Plus the fact that Maven wasn't something you could really escape from in Java. Sooner or later it rears its head and I thought I would cause less pain if I just accepted that.
I like the idea behind Maven: convention over configuration, but I've never been a fan of how that idea has been executed. It could have been better if someone more experienced (and obsessive) had designed it way back when.
I still think Gradle is worse than Maven though. I'm not a fan of scriptable build systems, they should be declarative (and most of the time when people think they need to script things they just haven't bothered to learn how to use the tools). And even if you use Gradle, you can't really escape dealing with Maven.
(I ditched Java a couple of years back, so I haven't done Java for a while. I'm not overly fond of Oracle, so I had promised myself that I would stop using Java as soon as I found a replacement language that suited me. It took a few years, but I eventually found a different language I liked even better: Go)
My observation is that if you use Java mainly through large frameworks, you won't mind as much switching to JavaScript. When using large frameworks, most of your day is essentially spent doing what I think of as configuration work -- making things fit together, rather than actual programming.
That's not a value judgement -- I'm just describing very different jobs.
If you are a configuration programmer then it really doesn't matter much what language you use. In fact, the simple ergonomics of the tools for a given set of tasks may be more important than the language itself.
It sounds more like the author doesn't know shit about the modern Java ecosystem. HTTP servers are embeddable and configured in 5 lines of code. Connection pools work just fine with zero config 'cause they have sane defaults. Spring Boot allows you to have a proper web app by listing a couple of dependencies and one line of code - the one that actually starts it.
Tests work out of the box, sane distinction between integration tests and unit tests, plugins for Docker and NPM so one command runs your unit tests, your integration tests, builds your Docker images, runs databases, everything. Your CI is 'mvn clean verify' now, without arcane shit in bash.
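The "embeddable HTTP server in a few lines" point holds even without a framework: the JDK ships one. A minimal sketch (the class name and port handling are mine):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

class TinyServer {
    static String greeting() {
        return "hello";
    }

    // Starts the JDK's built-in HTTP server on the given port (0 = any free port).
    static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/", exchange -> {
            byte[] body = greeting().getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

Spring Boot adds routing, JSON, and dependency injection on top, but the "it's just a few lines" claim is true down to the bare JDK.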
Then those people come to me and say that I need to redefine "require" to be able to mock stuff (and don't forget to do it in the right order). No, thank you, I'd rather stick with my DI and write actually testable code.
Whenever I try to step away from Java I always am in trouble:
- How do I mock stuff - oh, you simply hack the core mechanisms in $language - no thanks, we've moved on past that in Java.
- How do I build stuff - well, simply write this bunch of supporting scripts in JS/bash/make whatever - no thank you, the Maven plugin ecosystem does it in a declarative way with no support needed on my part.
- How do I integrate it with IDE - well you can't because you've written a bunch of custom scripts to build stuff.
- How do I package resources and dependencies - you build a docker image that has all of it (or a zip file and a freaking installation script) - thank you very much, my jar files can do it just fine without my intervention.
Whenever I step away from Java ecosystem I feel like I'm back coding in freaking PHP.
Yep - once I got beyond 'EJBs' and other Java EE stuff, and eventually to Kotlin, life has been a joy. Maven can be bewildering, but to me that's the price of flexibility and I've learnt to deal with it.
For all the complexity that you can have with Maven, a large majority of projects can get away with nothing but a <dependencies> section and maybe a plugin or two. The one thing it does right that no other build tool I've used has managed is making projects adhere to a fairly rigorous standard: once you understand the various phases of a Maven build and the basics of a POM, you can pick up any project and get started fairly quickly.
The author compares Spring and Java EE vs. Node.js. But these three things can do very different things. It would be better to compare Node.js with something like Reactor or RxJava and a reactive Web framework like WebFlux.
> Dahl’s experience was that using threads made for a heavyweight complex system. He sought something different, and spent a couple years honing and refining a set of core ideals arriving at Node.js. The result was a light-weight system, a single execution thread, an ingenious use of JavaScript anonymous functions for asynchronous callbacks, and a runtime library that ingeniously implemented asynchronicity.
He cured the symptom with an oversimplified solution. Having only one thread per process is a severe limitation. It would have been better to create good APIs (like Reactor or Kotlin coroutines) or language-level solutions (like Rust's) to handle multi-threading.
> In Java. Listeners require creating a concrete instance of an abstract interface class.
That is no longer true: since Java 8 (2014), event handlers can be written as lambda expressions instead of concrete instances of an interface.
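To illustrate, here is the same listener registered both ways, against a hypothetical event source of my own built on java.util.function.Consumer:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// A tiny event source whose listeners are plain Consumer<String> callbacks.
class Clicks {
    private final List<Consumer<String>> listeners = new ArrayList<>();

    void addListener(Consumer<String> l) {
        listeners.add(l);
    }

    void fire(String event) {
        listeners.forEach(l -> l.accept(event));
    }
}

class ListenerDemo {
    static final List<String> received = new ArrayList<>();

    static void register(Clicks clicks) {
        // Pre-Java-8 style: a concrete instance of the interface.
        clicks.addListener(new Consumer<String>() {
            @Override
            public void accept(String event) {
                received.add("anon: " + event);
            }
        });
        // Java 8+ style: the same listener as a lambda.
        clicks.addListener(event -> received.add("lambda: " + event));
    }
}
```

Both forms compile to the same kind of callback; the lambda just drops the ceremony the article complains about.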
> Another learning: Most programming languages obscure the programmers intent, making it harder to understand the code.
The author mixes Java the language, Java the platform and libraries/frameworks, so it is worth pointing out that the Java ecosystem is not limited to the Java language. Using Kotlin or Scala would remove his reasons to complain.
> JavaScript, by contrast, has loosey-goosey typing. The theory is obvious: The programmer has no certainty what kind of object they’ve received, so how can the programmer know what to do?
You have to make certain assumptions to handle incoming data, but it would be good for robustness and security to make these assumptions explicit. And, really, converting a JSON string into a Java object is a single line calling Jackson, plus maybe a few annotations.
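The Jackson one-liner looks like this, assuming jackson-databind is on the classpath (the DTO and its fields are my own example, not from the article):

```java
import com.fasterxml.jackson.databind.ObjectMapper;

class JsonDemo {
    // Plain DTO: Jackson binds public fields by default, no annotations needed.
    static class User {
        public long id;
        public String name;
    }

    static User parse(String json) throws Exception {
        // The one-liner: JSON string in, typed Java object out.
        return new ObjectMapper().readValue(json, User.class);
    }
}
```

Unknown fields and type mismatches fail loudly, which is exactly the "make your assumptions explicit" robustness the comment asks for.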
The section about modules could be applied to Java 9+ as well.
Obviously Java's type-checking compiler is a benefit for large projects. How could JavaScript ever address its lack of that? Well, JavaScript is a dynamic language; it can check types at runtime. Combined with unit testing, that can go a long way without requiring that "everything must always be type-checked". For instance, it makes sense to declare the types of the public interfaces of a class, but maybe not of every variable used in their implementation.
Cisf.js is one attempt at providing simple run-time assertions to serve the purpose of (optional) dynamic type-checking.
> What’s it mean when you get a Hibernate PersistentObjectException about a “detached entity passed to persist”? That took several days to figure out — at the risk of oversimplification — it meant the JSON arriving at the REST endpoint had ID fields with values. Hibernate, oversimplying, wants control of the ID values, and throws this confusing exception.