As for your second question, Cortex uses Docker to containerize models. The rest of Cortex's features (deploying models as microservices, orchestrating an inference cluster, autoscaling, prediction monitoring, etc.) are outside Docker's scope.
Right now, users handle API auth by putting AWS API Gateway in front of Cortex, but incorporating AWS API Gateway into Cortex to automate this is on our short-term roadmap.
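To make that concrete, here's a minimal sketch of what a client request looks like once API Gateway fronts the Cortex API. The endpoint URL and key below are placeholders, not real Cortex values; API Gateway checks the `x-api-key` header (when an API key requirement is configured) before proxying the request through:

```python
import urllib.request

# Hypothetical values: in practice the endpoint URL and API key come from
# the API Gateway console after you create the API and a usage plan.
ENDPOINT = "https://abc123.execute-api.us-west-2.amazonaws.com/prod/my-api"
API_KEY = "example-key"

# API Gateway validates the x-api-key header, then forwards the request
# to the Cortex API behind it.
req = urllib.request.Request(
    ENDPOINT,
    data=b'{"text": "example input"}',
    headers={"Content-Type": "application/json", "x-api-key": API_KEY},
    method="POST",
)
# urllib.request.urlopen(req) would send it; omitted here since the
# endpoint is a placeholder.
```

The same request works from curl or any HTTP client; the only change on the client side is adding the `x-api-key` header.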
We also have a pretty active Gitter channel: https://gitter.im/cortexlabs/cortex