
ICQ didn't archive a copy of every message sent.



This doesn't add much complexity...

I haven't thought long about it (5 minutes), but this is how I would do it. I won't go into much detail (I don't use Twitter, just had a quick look at it).

1) One replicated system where you can fetch messages by id (complete body with who sent it, to whom, time, etc.).

2) User page: a list of ids.

3) Private messages: a list of ids.

4) When a new message comes in, write it to a queue. Process those messages and append their ids to the different pages. Multiple processes do this, each owning a subset of users along with the complete list of people following those users. One could add another layer to do bulk inserts.
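The steps above amount to fan-out on write: store each message once by id, and let a queue worker append that id to every follower's page. A minimal sketch, with in-memory dicts standing in for the replicated stores and all names (post, fanout_worker, etc.) made up for illustration:

```python
from collections import defaultdict, deque

messages = {}                  # message id -> full message body (step 1)
pages = defaultdict(list)      # user id -> list of message ids (steps 2/3)
followers = defaultdict(set)   # author -> set of followers
queue = deque()                # incoming message ids awaiting fan-out (step 4)
next_id = 0

def post(author, text):
    """Store the message once, then enqueue its id for fan-out."""
    global next_id
    next_id += 1
    messages[next_id] = {"id": next_id, "from": author, "text": text}
    queue.append(next_id)
    return next_id

def fanout_worker():
    """Drain the queue, appending each id to every follower's page."""
    while queue:
        mid = queue.popleft()
        author = messages[mid]["from"]
        for follower in followers[author]:
            pages[follower].append(mid)

followers["alice"] = {"bob", "carol"}
post("alice", "hello world")
fanout_worker()
print(pages["bob"])  # -> [1]
```

In a real deployment each worker would own a range of authors, so the follower lists it needs fit in its own memory.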

This could easily be done in Memcachedb. One page view takes x + 1 Memcachedb requests (x = number of items on the page). One can still optimize this with caching (static HTML pages that are deleted whenever a user's page is updated). When inserting, update the existing entry by appending the new ids.
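The "x + 1 requests" read path is one get for the page's id list, then one get per message. A sketch with a plain dict standing in for Memcachedb (which speaks the memcached protocol); the key scheme page:/msg: is an assumption:

```python
# Toy key/value store in place of Memcachedb.
store = {
    "page:bob": [3, 2, 1],   # request 1: the id list, newest first
    "msg:1": "first",
    "msg:2": "second",
    "msg:3": "third",
}

def render_page(user):
    """One get for the id list, then x gets for the message bodies."""
    ids = store["page:%s" % user]                 # 1 request
    return [store["msg:%d" % i] for i in ids]     # x requests

print(render_page("bob"))  # -> ['third', 'second', 'first']
```

With a real client the x message lookups would be a single multi-get, so a page view costs two round trips rather than x + 1.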

Everything is nicely separated. (E.g. pages for users 1-10000 are on server 1, etc. Messages can be partitioned the same way.)
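The range partitioning mentioned above is just a fixed mapping from user id to server; the bucket size and server names here are illustrative:

```python
USERS_PER_SERVER = 10000  # e.g. users 1-10000 on server 1

def server_for(user_id):
    """Map a user id to the server holding that user's pages."""
    return "server%d" % ((user_id - 1) // USERS_PER_SERVER + 1)

print(server_for(1))      # -> server1
print(server_for(10000))  # -> server1
print(server_for(10001))  # -> server2
```

Fixed ranges keep routing trivial, at the cost of resharding work when a range outgrows its server; consistent hashing is the usual alternative.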

Any thoughts on this? To twitter: hire me not him ;)



