Ace - Sinatra for Node with Fibers (github.com/maccman)
100 points by maccman on Feb 14, 2012 | 37 comments



Might not be the best name for a JS lib/app since ACE is already a popular in-browser editor: http://ace.ajax.org/.

Also, encouraging the use of node for synchronous processing (even when using coroutines) is unlikely to perform well when scaled up. If you have work to do, it'd be best to defer to another service and let node act as a router.


Not to mention the ACE C++ system, which has been around for years and is widely known: http://www.cs.wustl.edu/~schmidt/ACE.html


I've run benchmarks comparing plain Node to Common Node (my library that uses fibers) and the version with fibers uses marginally more memory and on average has 75% of the throughput of the purely asynchronous code.

You can find the benchmarks and results in the project README: https://github.com/olegp/common-node#readme

Here are the graphs: http://www.slideshare.net/olegp/server-side-javascript-going...


+1. ACE Editor and Ace (Sinatra for Node) might be confusing.

Maybe name it "Frank" (Frank Sinatra) or "Ray" (Ray Charles). :-)


Well, the creators of ace (maccman and mjackson) are lucky that ACE is almost always used as an acronym in the Ajax.org Cloud Editor's marketing and documentation. Because of this and the sheer number of JavaScript programmers out there, I think they'll most likely be able to keep the name.


Very true; especially since node is single-threaded, you might block other requests, etc.


Regarding coroutines. The very reason Node.js is awesome is because it forces you to do evented programming and makes it apparent what is blocking and what isn't. That's the beauty of Node. The event loop is not a bug, it's a feature. Why program in JavaScript if you don't want to do evented programming? Why use Node.js if you don't want to see the difference between blocking and non-blocking code?

On the flip-side: This looks like a solid implementation of what lots of people have been waiting for (fibers in node).


"Why use Node.js if you don't want to see the difference between blocking and non-blocking code?"

This is a persistent fallacy in the Node community, but there is nothing that requires a "sleep(1000)" in a language to block all execution in other contexts in that language. That is a weakness of your chosen base language, not an immutable truth. Better languages have been doing this for many years now.

I actually sort of agree with you, but you've got it backwards; if you're going to program in a reasonable paradigm instead of manually chopping your code up to suit a weak language and weak runtime, why not use a language actually meant to do this? "Fibers" (as they are termed here) are the right way to go, but they should be designed in from the beginning, not layered on the fourth or fifth layer of the code. You'll never really be able to get them right up there, because they belong much lower.


This is evented programming. The event loop is still being used. Fibers give the impression the code is synchronous, but it's still asynchronous behind the scenes.


Coroutines don't make the code any less asynchronous; they just help abstract away that detail.
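
For anyone who hasn't seen node-fibers before, here's a minimal sketch of the idea (the sleep helper is illustrative, not part of Ace):

    var Fiber = require('fibers');

    // Wrap a callback-style delay so it *reads* as blocking inside a fiber.
    // The event loop keeps running; only this fiber is suspended.
    function sleep(ms) {
      var fiber = Fiber.current;
      setTimeout(function () {
        fiber.run();      // resume the suspended fiber
      }, ms);
      Fiber.yield();      // suspend until run() is called
    }

    Fiber(function () {
      console.log('before');
      sleep(1000);        // looks synchronous, but yields to the event loop
      console.log('one second later');
    }).run();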


This project looks really nice, though I really wish there was an easy way to segment routes into multiple files since it seems like an application w/ any kind of complexity would get out of hand quickly.

I didn't see a way by just skimming the docs - is there an easy way?


Yes, the 'app' global will be available in every file you require in.
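
If so, splitting routes into files would be as simple as requiring them from your main file. A rough sketch, assuming a Sinatra-style app.get helper (I haven't checked Ace's exact route API):

    // app.js
    require('./routes/users');
    require('./routes/posts');

    // routes/users.js -- no require('ace') needed if `app` really is a global
    app.get('/users', function () {
      // handler body elided
    });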


Here's an example of my usual pattern, using express: https://github.com/ryancole/node-simple-upload/tree/master/n...

The routes are not namespaced the way they are with something like Python's Flask, but it still lets me group all my similar routes.


this works for me:

Put the snippet below in routes/index.js, and everything in your routes dir will be exposed when you do var routes = require('./routes'):

    var fs = require('fs');
    var _ = require('underscore');

    // Pull in every other file in the routes directory and re-export its routes.
    var files = fs.readdirSync('./routes');
    files.forEach(function (file) {
      if (file != 'index.js') {
        var exp = require('./' + file);
        _.extend(exports, exp);
      }
    });


just write a library that traverses a directory and include the files...


Alex, nice to see you working with fibers.

I've recently ported Stick from RingoJS to Common Node. It seems to be similar to Ace (also using fibers etc.): https://github.com/olegp/stick

In fact, some of the libraries I've made to work with it, like mongo-sync (https://github.com/olegp/mongo-sync), should work with Ace as is.


This reminds me of Zappa.js (http://zappajs.org/), one of my favorite CoffeeScript implementations of a web server on Node (although it's mostly just a wrapper for Express). I'd be curious, though, what advantages there are in creating a totally new web server (this approach) versus Zappa's approach of staying backwards compatible with Express.js.


This appears only to support one-at-a-time synchronous control flow. Giving up parallel tasks is a funny proposition in node, as they're one of its greatest strengths. So this framework still seems limited in its potential uses, but maybe it will grow over time.

I wrote a library based on fibers that can provide similar benefits to any node application. It also gives simple patterns for error handling, makes debugging easier by maintaining stack traces across async boundaries, and provides really easy ways to do series and parallel flow control.

Take a look at https://github.com/scriby/asyncblock.

It would actually be pretty cool to integrate asyncblock into this project as it could provide the flow control portion for free.
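
For context, a rough sketch of what asyncblock usage looks like, going by my reading of its README (details may differ):

    var asyncblock = require('asyncblock');
    var fs = require('fs');

    asyncblock(function (flow) {
      // Kick off two reads in parallel; flow.add('key') hands out callbacks.
      fs.readFile('a.txt', 'utf8', flow.add('first'));
      fs.readFile('b.txt', 'utf8', flow.add('second'));

      var results = flow.wait();   // suspends this fiber until both complete
      console.log(results.first, results.second);
    });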


No, this still supports parallel tasks and evented programming. Fibers let you program in a synchronous manner, but it's still asynchronous behind the scenes.


I'm saying the framework doesn't have support for parallel tasks within a single request. For example, the case of kicking off two database queries in parallel, then doing something when both are finished. I understand that each request executes in parallel.

It's possible I missed how the framework supports parallel tasks, can you explain? The only thing I saw was wait(), which gives you "series" instead of "parallel". It's true people could still use callbacks/libraries to implement parallel tasks, but then we haven't gained much.
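
To make the distinction concrete, here's roughly what "two queries in parallel, continue when both finish" looks like with plain callbacks and a counter (queryA/queryB are hypothetical async functions, nothing Ace-specific):

    // Run two callback-style queries concurrently and join on completion.
    function runBoth(queryA, queryB, done) {
      var results = {};
      var pending = 2;

      function finish(key) {
        return function (err, value) {
          if (pending <= 0) return;                   // already failed/finished
          if (err) { pending = 0; return done(err); }
          results[key] = value;
          if (--pending === 0) done(null, results);   // both have returned
        };
      }

      queryA(finish('a'));   // both start immediately...
      queryB(finish('b'));   // ...and run concurrently
    }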


Ah, I see. Yes, you could use promises to implement something like that - and it's something that I could definitely add. However, I would say that multiple async calls are pretty unusual.

The usual CRUD scenario is that you do a DB lookup, then perhaps alter the object, save it back to the DB, and return to the client. All tasks are performed sequentially. Unless multiple async calls are a common scenario, I think I'll leave them out. Are there more multiple-async examples you can think of?
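
For what it's worth, that sequential CRUD flow is exactly where the fiber style reads nicely. A sketch with a hypothetical fiber-aware collection wrapper (something like the mongo-sync mentioned elsewhere in the thread; the app.get and users names are assumptions, not Ace's documented API):

    // Hypothetical handler: each call looks blocking but yields under the hood.
    app.get('/users/:id/activate', function (id) {
      var user = users.findOne({_id: id});   // 1. DB lookup
      user.active = true;                    // 2. alter the object
      users.save(user);                      // 3. save it back
      return JSON.stringify(user);           // 4. return to the client
    });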


I think in the real world there are a variety of cases that benefit from parallel operations:

- Anything dealing with disk I/O, like copying or reading files
- Sending / receiving from S3 or another file repository
- Making a request to external web services as part of a request
- Working with document DBs like Mongo, where instead of a join you can do one query per table in parallel

I would love it if you could check out asyncblock and see if you like how it helps manage control flow with fibers. It would be easy to integrate with what you have already. We use asyncblock in a very large node based application, and it's really helpful for keeping business logic straightforward and simple.

Another possibility would be using node-sync (https://github.com/0ctave/node-sync) if you prefer their approach.


Also don't forget V8 stack sizes; this isn't Erlang, we don't have super light-weight stacks for coroutines.


Looks very similar to connect and express. Can anyone explain how this is different/better?


The use of node-fibers is how it's different. Up to you on whether or not you think that makes it better.


Quoted from the documentation: "Every request is wrapped in a Node Fiber".

So...essentially, this behaves like a traditional web server? A process listens for incoming connections, then spins up a new process/thread/fiber to handle the request?


No - a Fiber is completely different to a process/thread. The server is still evented. https://github.com/laverdet/node-fibers


each "fiber" still consumes a relatively large stack, so in effect you're right


This is pretty cool. I personally like comb's executeInOrder http://pollenware.github.com/comb/symbols/comb.html#.execute...


Sounds great =D I've never used fibers in Node, but I'm quite experienced with Fibers and EventMachine aka EM-Synchrony. How do Fibers affect 1. debugging & stack traces and 2. performance & garbage collection? Thx


Debugging is a lot easier since you get meaningful stack traces. Performance is slightly lower due to context switching - check out the Common Node benchmarks I mention in another comment.


This looks very promising and is different from the other node.js web frameworks. Very well thought out.


I think it's much more similar to Goliath than to Sinatra.


If you have to use synchronous code in JavaScript you're doing it wrong.


I'm not saying I like it, but this isn't synchronous code, which I think the readme does a pretty good job of explaining.


I still don't see the draw of using CoffeeScript server-side. Why bother using the language (JavaScript on Node.js) if you're just gonna turn it into something else? Wouldn't Ruby suit your needs better?

To some degree I kinda feel like many shops are doing it just so they can say they use node. Maybe that isn't the case.


There's a benefit to running the same language on the client and the server, even if that language isn't JavaScript.



