This would be amazing! A lot of ML use cases are largely unfeasible in Lambda with Python without serious pruning. The latest version of TensorFlow is 150 MB uncompressed; add NumPy, pandas, etc. and it adds up fast. I think 1 GB uncompressed would be pretty reasonable given the current state of ML tooling, personally.
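To make the size math concrete, here's a small sketch for measuring how much each dependency contributes to a Lambda deployment package before zipping it (the `package/` directory layout is hypothetical, just a typical `pip install --target package` setup):

```python
import os

def dir_size_mb(path):
    """Total size of all regular files under path, in megabytes."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total / (1024 * 1024)

# Example: check each dependency's uncompressed footprint against
# the deployment limit before running `zip` on the directory.
# for pkg in ("tensorflow", "numpy", "pandas"):
#     print(pkg, round(dir_size_mb(os.path.join("package", pkg))), "MB")
```

Running this over a typical ML dependency set is usually what reveals that you're already past the limit before your own code is even counted.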
As a thought, could Lambda (perhaps in cooperation with AWS SageMaker?) offer a Lambda execution environment atop the AWS Deep Learning AMI? That would solve a lot of problems for a lot of people.