Hacker News

Has anyone been able to use Lambda for relatively high-memory load applications? (~2GB+ RAM)

That's our biggest constraint at the moment; so far I haven't seen any good options.




Lambda currently supports a maximum of 1,536 MB of RAM.

I've been involved in developing Lambda functions that consume roughly 1.2 GB of RAM per invocation, but the memory usage is easy to predict because each function is triggered by files in S3 that are all about the same size.
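A minimal sketch of that trigger pattern (the event-parsing helper is real S3 event structure; the handler body and `process` step are hypothetical placeholders):

```python
def objects_from_event(event):
    # An S3-trigger event lists each object that fired the invocation,
    # which is why peak memory is predictable from the object size.
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event["Records"]
    ]

def handler(event, context):
    # In a real function you'd fetch each object with boto3, e.g.
    #   boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()
    # and that read is what drives the ~1.2 GB working set.
    for bucket, key in objects_from_event(event):
        pass  # process the file here (hypothetical step)
```

Since input size maps directly to memory use, you can pick the function's memory setting from the largest object you expect to see.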

They suggest breaking the problem into smaller chunks to fit within the memory limit - is that possible in your case?


Not easily! I'm doing speech recognition. I might be able to run a segment through two separate models, each with half the lexicon, and then combine the results. However, I don't think the overall savings would be enough to justify running it twice. It's getting close, though; once they raise the RAM limit further, I'll try it out.
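The split-and-merge idea above could be sketched roughly like this (the decoder is a hypothetical stand-in returning scored hypotheses; a real combination step would be more involved, which is part of why the savings may not justify the second run):

```python
def recognize_split(segment, lexicon, recognize_fn):
    """Decode a segment twice, once per half of the lexicon,
    and keep the best-scoring hypothesis overall.

    recognize_fn(segment, lexicon_part) is assumed to return a list
    of (transcript, score) pairs -- a hypothetical decoder interface.
    """
    half = len(lexicon) // 2
    results = []
    for part in (lexicon[:half], lexicon[half:]):
        # Each run only needs to hold half the lexicon in memory.
        results.extend(recognize_fn(segment, part))
    # Naive merge: take the single highest-scoring hypothesis.
    return max(results, key=lambda pair: pair[1])
```

Note the naive merge can miss utterances whose words span both halves, so this is an approximation, not a drop-in replacement for a single full-lexicon decode.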




