Hacker News new | past | comments | ask | show | jobs | submit login
Ask HN: How is your company using Node.js?
19 points by chrisabrams on Feb 11, 2014 | 16 comments
I am curious about how different companies are using Node.js in their project(s). I am working on an open-source report which I hope to publish online soon. If you're in NYC I'll be talking about this at a future NYC Node.js meetup.

1. What are you using Node.js for? Is this for a new or existing project?

2. Why choose Node.js for the project?

3. What performance gains/benefits did you see, if any?

4. Any lessons learned or gotchas?




1. Node.js powers the data collection for W3Counter's real-time dashboards

http://www.w3counter.com/stats/pulse/1

2. I got tired of fussing with kernel network config tuning, and I'm more confident with writing servers in node than C/C++.

3. A reduction in the number of TCP sockets needed to handle a few thousand concurrent connections, from several tens of thousands down to just the few thousand. The request/response model of doing it in something like PHP meant there were connections from the user to nginx, from nginx to php-fpm, and from php-fpm to the database for each user -- and the total grew at more than just 3x the number of users, since TCP connections stick around for a while after they're finished, waiting for any out-of-order packets and such.

A reduction in time-per-request from ~70ms to ~1ms. With no initialization/db-connect/teardown per request, node can do its thing and dispose of the request much faster than PHP, the language this was originally written in some years ago.

http://i.imgur.com/ipQXcjJ.png


That's nice! I guess the response time was measured on the same network, right? If not, who is your server provider? How far are you from the server? What is your ISP?


The 70ms and 1ms are the time spent in the application handling each request, not response times at the client. Those are determined by the latency between the client and the servers, which are hosted in SoftLayer's Washington, DC data center. 19ms from Philadelphia, if you're curious.


Oh wow, that's an impressive reduction in time-per-request! Thanks!


1. https://www.datalanche.com

2. The correct, proper answer is very, very long -- too long for an HN comment. The short version is that the product is I/O heavy since it mostly just queries a database and returns the results to the user over HTTPS. Node is primarily designed for this use case.

3. Per-query memory overhead is lower than with a threaded approach. JavaScript is one of the fastest scripting languages when it comes to string manipulation, which is what the API server does besides I/O. Node itself is written in C/C++, so it is quite fast too.

4a. Being single-threaded, Node is very sensitive to CPU-bound tasks. I am actually refactoring some code to support streaming because there is an edge case that is CPU-bound and stalls the whole process.
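As a minimal sketch of that kind of fix (not Datalanche's actual code), a CPU-bound loop can be broken into bounded chunks so the event loop gets control back between batches:

```javascript
// Process a large array in fixed-size chunks, yielding to the event
// loop with setImmediate between chunks so other requests aren't
// starved while the computation runs.
function sumInChunks(items, chunkSize, done) {
  let total = 0, i = 0;
  function next() {
    const end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) total += items[i];   // bounded work per tick
    if (i < items.length) setImmediate(next); // yield, then resume
    else done(null, total);
  }
  next();
}
```

The same shape works for any per-item transformation; the key point is that no single turn of the event loop does unbounded work.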

4b. A big downside for my use case is handling 64-bit integers and numerics, since JavaScript does not natively support these data types, but they are very important to support in a database product. We have worked around it using strings. The API server does little computation itself, so it isn't that big a deal.
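A quick illustration of why the string workaround is needed: JavaScript integers are exact only up to 2^53 - 1, and larger literals silently lose precision.

```javascript
// JavaScript numbers are IEEE-754 doubles: integers are exact only up
// to Number.MAX_SAFE_INTEGER (2^53 - 1 = 9007199254740991).
const unsafe = 9007199254740993;   // 2^53 + 1: not representable
// unsafe is now 9007199254740992 -- the literal was silently rounded.

// Carrying the value as a string preserves every digit, which is fine
// when the server only passes the value through rather than doing
// arithmetic on it.
const exact = '9007199254740993';
```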

4c. If you are doing a lot of CPU-bound work or computation with large numbers, then I strongly recommend not using Node. Otherwise it is probably okay even if your use case is not what Node is primarily designed for.

I should also point out that I have been developing software for a long time and have learned to be skeptical of the latest language, framework, or tool. I have seen so many of them come and go. It took some convincing for me to decide on Node for Datalanche. I am a big believer in picking the right tool for the job rather than the latest trend, and I will be the first to say one should pick Node only when it is the best fit for the use case.


Thank you for the fantastic insight! This is one of the better, more concise summaries of the pros and cons of Node.


You are welcome. :-)


API servers seem to be the most popular use case that I've seen.

That's interesting about using strings to work with 64-bit integers.

Well said on picking the right tool for the job.


I run a web development company in Sydney called Thinkmill (http://www.thinkmill.com.au).

We use Node.js for everything we do, from mobile app backends to websites to web apps. We've been using it for nearly two years and have found it fantastic - compared to technologies we've used previously, it's faster to develop in, faster to run, and easy to deploy (especially with platforms like Heroku), and the value of npm and its ecosystem of high-quality open source packages is hard to overstate.

We have also developed an open source Node.js model-driven CMS / web app framework called KeystoneJS (http://keystonejs.com) which is built on Express and MongoDB and has been getting some great feedback :)

The biggest lesson / gotcha has been getting into the different mindset - especially when we started, there wasn't a lot of 'hand holding' available compared to other languages and frameworks, and there were few established best practices to follow. I think it's easier these days (there's certainly more out there), but I'm very familiar with it now, so it's hard to gauge from a beginner's point of view.

It's easy to fall into traps with callback patterns and error handling, for example - so you really need to get your head around what's out there (like the async library) and how to structure your application and code.
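One of those traps is swallowing errors mid-chain. The usual defense is the error-first callback convention: check `err` at every step and pass it up rather than throwing. A small sketch with made-up function names:

```javascript
// Error-first callbacks: the callback's first argument is always the
// error (or null), and every layer checks it before using the result.
function loadUser(id, cb) {
  setImmediate(() => {
    if (id == null) return cb(new Error('missing id'));
    cb(null, { id: id, name: 'demo' });   // stub data for illustration
  });
}

function loadProfile(id, cb) {
  loadUser(id, (err, user) => {
    if (err) return cb(err);              // propagate, don't swallow
    cb(null, { user: user, theme: 'dark' });
  });
}
```

Forgetting the `if (err) return cb(err)` line is exactly how errors vanish silently; libraries like async exist largely to keep this bookkeeping out of your way.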


Keystone.js looks pretty sweet, I'm gonna try it out this weekend!

Before using Node.js had y'all done a lot of work with async code-bases?


I'm not a company, but that said I'm using node.js in a few toy applications.

For example, I have an HTTP server in node that accepts POST requests and sends their bodies over to an XMPP server.

For a bigger project http://blogspam.net/ aims to classify blog-comments as spam/ham in real-time, and the entire server is written with node.js. (It really just receives JSON objects, scans them with a series of plugins, and sends back a reply saying "OK" or not. Conceptually simple.)
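The plugin-chain idea can be sketched in a few lines (the plugin rules here are invented for illustration, not blogspam.net's actual plugins):

```javascript
// Run each plugin in turn over the submitted comment; the first plugin
// to return a reason vetoes the comment as spam, otherwise it's OK.
function classify(comment, plugins) {
  for (const plugin of plugins) {
    const reason = plugin(comment);
    if (reason) return 'SPAM:' + reason;
  }
  return 'OK';
}

// Two toy plugins: reject empty bodies and link-stuffed comments.
const plugins = [
  (c) => (!c.comment || !c.comment.trim()) ? 'empty body' : null,
  (c) => ((c.comment.match(/https?:\/\//g) || []).length > 3) ? 'too many links' : null
];
```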

Finally, my email stack is powered by node.js, handling 30-ish virtual domains with recipient testing and anti-spam at SMTP time. This is built around Haraka, which is a reimplementation of the ideas found in qpsmtpd. (A server I once built a small company around.)

Generally I've chosen node because it was fast to get "a scalable server" up and running with. Both the blogspam service and the mail handler were initially Perl, and both received 20-40% speedups for free (well, if you don't factor in my developer time).


We use Node.js for a few internal services. Most of them are HTTP APIs which serve data from MongoDB and Redis. Two applications are gateways to external services (SMPP, email) which maintain a database of delivery attempts and results. Two others: a tracker which collects activity data from our logged-in users on our website, and a queue manager which throttles load on external services that can only take a limited number of requests at a time.
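A queue manager like that boils down to a concurrency limiter. A minimal sketch (not the production code): run at most `limit` tasks at once and hold the rest in a FIFO queue.

```javascript
// Returns an enqueue function. Each task receives a done() callback it
// must call when finished; at most `limit` tasks run concurrently.
function makeLimiter(limit) {
  let active = 0;
  const waiting = [];
  function runNext() {
    if (active >= limit || waiting.length === 0) return;
    active++;
    const task = waiting.shift();
    task(() => { active--; runNext(); });  // free the slot, start the next
  }
  return function enqueue(task) {
    waiting.push(task);
    runNext();
  };
}
```

Calling each external service through such a limiter caps the in-flight requests without the callers needing to coordinate.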

Node.js has proven to give us good performance benefits and ease of development, maintenance and scalable deployment compared to PHP and .net, the other two platforms we usually work with.

I am not aware of the old metrics, but according to my seniors we have seen a very good performance improvement on the API services by using node. The rest of the node projects were new projects.


That's great to hear; your story is very similar to that of many companies out in NY.


Exponential.io - Exponential is a code generator designed to help developers build apps faster with less effort.

We use Node.js for our entire client-facing stack including an API server for code generation, a command line client for pulling updated code from the server, plus we use it for our public website.

We're currently evaluating using Node WebKit (https://github.com/rogerwang/node-webkit) for building a cross-platform desktop UI for our CLI tools.


We are using Node as the backend to a multipart uploader interfacing with Amazon S3. Our application is built in Symfony2, but we found that Node was much easier to write parallel upload scripts in. It took less than 3 days and works perfectly every time; no regrets there.
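The core of such an uploader is splitting the payload into parts and firing the part uploads concurrently. A sketch with `uploadPart` as a hypothetical stand-in for the real S3 UploadPart call:

```javascript
// Split a buffer into fixed-size parts, S3-multipart style.
function splitParts(buf, partSize) {
  const parts = [];
  for (let off = 0; off < buf.length; off += partSize) {
    parts.push(buf.slice(off, off + partSize));
  }
  return parts;
}

// Fire all part uploads at once; S3 reassembles them by part number
// when the multipart upload is completed. `uploadPart(partNumber, data)`
// should return a promise for that part's result.
function uploadAll(buf, partSize, uploadPart) {
  const parts = splitParts(buf, partSize);
  return Promise.all(parts.map((part, i) => uploadPart(i + 1, part)));
}
```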


What is being uploaded? Large images/videos, small text files, etc.?



