I built a CRM/CMS application where every single controller action call is event sourced.
The whole application lives in memory as a single object aggregate, which gets rebuilt on startup. I started off writing JSON to the file system, moved on to compressing and appending to a log file, and then to Azure cloud tables.
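To make that concrete, here's a minimal sketch of the pattern (all names are hypothetical, not my actual code): a controller action is captured as an event, appended to durable storage, and applied to the single in-memory model; startup just replays the whole log.

```csharp
using System;
using System.Collections.Generic;

public interface IEvent { }

public class ContactRenamed : IEvent
{
    public Guid ContactId;
    public string NewName;
}

public interface IEventLog
{
    void Append(IEvent e);          // file, compressed log, or Azure tables
    IEnumerable<IEvent> ReadAll();  // replayed in original order
}

public class DomainModel
{
    public Dictionary<Guid, string> ContactNames = new Dictionary<Guid, string>();

    public void Apply(IEvent e)
    {
        if (e is ContactRenamed r) ContactNames[r.ContactId] = r.NewName;
    }
}

public class EventSourcedApp
{
    private readonly IEventLog _log;
    public DomainModel Model { get; } = new DomainModel();

    public EventSourcedApp(IEventLog log)
    {
        _log = log;
        foreach (var e in _log.ReadAll()) Model.Apply(e);  // rebuild on startup
    }

    // Called from a controller action: persist first, then mutate memory.
    public void Execute(IEvent e)
    {
        _log.Append(e);
        Model.Apply(e);
    }
}
```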
It's awesomely fast both to respond to requests (15ms) and to add new features, but you do get interesting new problems; along the way I had to:
- come up with a way of migrating events, as my storage formats changed while I improved my frameworks (see the upcasting sketch after this list)
- find a good way to do fast full-text search against in-memory objects, as I had no SQL or ElasticSearch infrastructure (I ended up using Linq against in-memory Lucene RamDirectories; there's a sketch of that below too)
- deal with concurrency issues in a fairly novel manner, as all users are acting against a single in-memory model (a conventional baseline is sketched below for comparison)
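On the migration point: a sketch of the common "upcasting" approach, assuming events are stored as JSON with a type name and version (the event and its fields here are made up). Old payloads get rewritten into the current schema while the log is replayed, so the domain model only ever sees the latest version; chains of upcasters can carry v1 through v2, v3, etc.

```csharp
using Newtonsoft.Json.Linq;

public static class EventUpcaster
{
    public static JObject Upcast(string type, int version, JObject payload)
    {
        // v1 stored a single "Name"; v2 splits it into FirstName/LastName.
        if (type == "ContactAdded" && version == 1)
        {
            var parts = ((string)payload["Name"]).Split(new[] { ' ' }, 2);
            payload.Remove("Name");
            payload["FirstName"] = parts[0];
            payload["LastName"] = parts.Length > 1 ? parts[1] : "";
        }
        return payload; // now matches the current schema
    }
}
```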
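On the search point, this is roughly how the Lucene RamDirectory + Linq combination fits together (Lucene.Net 3.x API; the Contact type is made up): index the live objects into a RAMDirectory, then resolve the hit IDs back to the in-memory objects with Linq.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Documents;
using Lucene.Net.Index;
using Lucene.Net.QueryParsers;
using Lucene.Net.Search;
using Lucene.Net.Store;
using Version = Lucene.Net.Util.Version;

public class Contact { public Guid Id; public string Notes; }

public static class InMemorySearch
{
    public static List<Contact> Search(List<Contact> contacts, string text)
    {
        // Build an in-memory index over the domain objects.
        var dir = new RAMDirectory();
        var analyzer = new StandardAnalyzer(Version.LUCENE_30);
        using (var writer = new IndexWriter(dir, analyzer, true,
                                            IndexWriter.MaxFieldLength.UNLIMITED))
        {
            foreach (var c in contacts)
            {
                var doc = new Document();
                doc.Add(new Field("id", c.Id.ToString(),
                                  Field.Store.YES, Field.Index.NOT_ANALYZED));
                doc.Add(new Field("notes", c.Notes ?? "",
                                  Field.Store.NO, Field.Index.ANALYZED));
                writer.AddDocument(doc);
            }
        }

        // Query the index, then map hits back to live objects with Linq.
        var searcher = new IndexSearcher(dir, true);
        var query = new QueryParser(Version.LUCENE_30, "notes", analyzer).Parse(text);
        var ids = new HashSet<Guid>(searcher.Search(query, 20).ScoreDocs
                      .Select(h => Guid.Parse(searcher.Doc(h.Doc).Get("id"))));
        return contacts.Where(c => ids.Contains(c.Id)).ToList();
    }
}
```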
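On the concurrency point, the details of what I did are specific to my app, but a conventional baseline for a single in-memory model is to funnel every command through one writer, so the model is only ever mutated by a single thread; something like:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class SingleWriterDispatcher
{
    private readonly BlockingCollection<Action> _commands = new BlockingCollection<Action>();

    public SingleWriterDispatcher()
    {
        // One dedicated thread applies commands in arrival order.
        new Thread(() =>
        {
            foreach (var command in _commands.GetConsumingEnumerable())
                command();
        }) { IsBackground = true }.Start();
    }

    // Controllers enqueue work and await its completion.
    public Task Execute(Action command)
    {
        var tcs = new TaskCompletionSource<bool>();
        _commands.Add(() =>
        {
            try { command(); tcs.SetResult(true); }
            catch (Exception ex) { tcs.SetException(ex); }
        });
        return tcs.Task;
    }
}
```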
I'm hoping this architecture will start to become more popular - I think we are in need of a framework equivalent to Rails to take it mainstream.
That is very interesting. I am guessing this is a closed-source application? Did you do something along the lines of CQRS (the Command/Query split), or did you just write directly to the event store? At what point did appending to a log file stop working and force the switch to the cloud (or was that for unrelated reasons)?
I am also hoping it will become more popular, as the pros seem to vastly outweigh the cons. But I think you are right about the framework. From my research, it seems to be medium-to-large enterprises that are typically best suited to using and developing something like Kafka, and those enterprises typically would not open source their applications. So I definitely think a framework from a company using this at scale would be huge.
Until then, I suppose I will keep reading up and learning all I can and figure out how to implement this on a much smaller scale.
Cloud storage was just used so I didn't have to manage backups myself.
I absolutely didn't separate command and query - the commands themselves are actions which execute against the domain model, and that same domain model is used to build responses.
Another thing that gets tricky is making your application deterministic - any calls to the current time, random number or GUID generation, or to 3rd-party services, have to be recorded and replayable in the correct order for when you reconstruct your application instance. This can get tricky if you refactor your application or change its logic later.
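One way to structure the record/replay part (a sketch with made-up names): nondeterministic calls go through a context that records fresh values when running live, and dequeues the recorded ones when rebuilding. Third-party responses can be handled the same way.

```csharp
using System;
using System.Collections.Generic;

public class DeterministicContext
{
    private readonly Queue<object> _replay;  // null when running live
    private readonly List<object> _recorded = new List<object>();

    public DeterministicContext(IEnumerable<object> replayValues = null)
    {
        if (replayValues != null) _replay = new Queue<object>(replayValues);
    }

    public DateTime UtcNow() => Next(() => DateTime.UtcNow);
    public Guid NewGuid()   => Next(Guid.NewGuid);

    private T Next<T>(Func<T> live)
    {
        if (_replay != null) return (T)_replay.Dequeue();  // replaying: reuse recorded value
        var value = live();
        _recorded.Add(value);  // live: persist this alongside the event
        return value;
    }

    public IReadOnlyList<object> Recorded => _recorded;
}
```

The refactoring problem shows up here directly: if a code change alters how many times, or in what order, these calls happen per command, the recorded stream no longer lines up with the replay.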
It's worth reading up on Prevalence/MemoryImage, and looking into NEventStore also.