Technology Fridays: AWS Deep Learning AMIs Takes Us Closer to the AI-First Cloud

Jesus Rodriguez
Aug 11, 2017

Welcome to Technology Fridays! Today we are going to cover one of the most exciting recent releases in the cloud deep learning space: AWS Deep Learning AMIs.

Fragmentation is one of the aspects hindering the mainstream adoption of deep learning application development stacks. The release of Google’s TensorFlow a couple of years ago triggered an explosion of open source deep learning frameworks, each with its own strengths and weaknesses. Today, any development team building a deep learning solution has over a dozen open source stacks such as TensorFlow, Torch, Theano, Microsoft Cognitive Toolkit, Caffe2, Caffe, Keras, Bonsai, MxNet and many others at their disposal. More importantly, each of those frameworks requires specific infrastructure configurations in order to perform optimally and often needs to integrate with other data science technologies in order to deliver complete solutions. In summary, if you think that building a deep learning application is just a matter of picking the right framework, you are in for a surprise!

AWS Deep Learning AMIs extend AWS’s infrastructure and machine intelligence stack with the tools and frameworks that enable data scientists and researchers to build and scale deep learning applications. Built on top of the popular EC2 service, AWS Deep Learning AMIs allow data scientists to quickly launch instances pre-configured with specific frameworks and tools that expedite the implementation and operationalization of deep learning models.
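To make that concrete, here is a minimal sketch, using the AWS Python SDK (boto3), of how such a pre-configured instance could be launched programmatically. The AMI ID, key pair name, region and instance type below are placeholders and assumptions, not values taken from the release notes.

```python
# Minimal sketch, assuming boto3 is configured with valid AWS credentials.
# The AMI ID is a placeholder -- look up the current Deep Learning AMI
# for your region in the EC2 console or AWS Marketplace before using it.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",      # placeholder: Deep Learning AMI ID for your region
    InstanceType="p2.xlarge",    # GPU instance type commonly paired with the AMI
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",       # assumed existing EC2 key pair
)

instance_id = response["Instances"][0]["InstanceId"]
print("Launched Deep Learning AMI instance:", instance_id)
```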

The current release of AWS Deep Learning AMIs supports the implementation and execution of deep learning algorithms across different frameworks such as TensorFlow, Microsoft Cognitive Toolkit, Caffe, Caffe2, Theano, Keras and MxNet. The instances in the platform also come pre-configured with Jupyter, which enables the implementation of interactive deep learning models using Python 2.7 or 3.4. Additionally, the AWS Deep Learning AMIs include the AWS Python SDK, which streamlines the interoperability with other AWS services. In addition to Jupyter, AWS Deep Learning AMIs include popular toolkits such as CppLint, PyLint, Pandas, GraphViz and several others.
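As a rough illustration of that interoperability, the sketch below assumes a model artifact has already been saved from a Jupyter notebook on the instance and uses the pre-installed AWS Python SDK (boto3) to push it to S3 for other services or instances to consume. The bucket, key and file names are hypothetical.

```python
# Minimal sketch, assuming a model file has been saved locally on the
# instance and the target S3 bucket already exists. Names are placeholders.
import boto3

s3 = boto3.client("s3")

# Upload a serialized model produced in a notebook so other AWS services
# (or other instances) can pick it up.
s3.upload_file(
    Filename="model_weights.h5",         # hypothetical local artifact
    Bucket="my-deep-learning-bucket",    # hypothetical bucket name
    Key="experiments/run-001/model_weights.h5",
)
```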

Hardware acceleration plays an important role in the AWS Deep Learning AMIs. The platform is optimized for NVIDIA CUDA and cuDNN drivers as well as the Intel Math Kernel Library (MKL). Those configurations accelerate deep learning models without requiring any code modifications.
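One quick way to confirm that the GPU stack is wired up is to list the devices a framework can see. The snippet below is a minimal sketch using the TensorFlow 1.x API that shipped with the AMIs at the time; on a GPU instance it should report a /device:GPU:0 entry without any changes to model code.

```python
# Minimal sketch (TensorFlow 1.x): list the devices TensorFlow can see to
# confirm the CUDA/cuDNN-backed GPU is detected on the instance.
from tensorflow.python.client import device_lib

for device in device_lib.list_local_devices():
    print(device.name, device.device_type)  # expect a "/device:GPU:0" entry on a GPU instance
```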

One of the strongest differentiators of AWS Deep Learning AMIs is the integration with other AWS services. For instance, the platform leverages AWS IAM for access control policies, AWS SQS to exchange metadata configuration between instances, AWS Lambda for auto-scaling jobs, NAT Gateway and VPC for communication with external infrastructures, and a few other AWS services. Additionally, AWS Deep Learning AMIs integrate with CloudFormation to expedite and automate the creation of deep learning clusters. Automation is also possible via the AWS CLI.
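As a rough sketch of that CloudFormation path, the snippet below creates a stack from a template stored in S3 using boto3. The stack name, template URL and parameter names are illustrative placeholders, not an official AWS template.

```python
# Minimal sketch, assuming a CloudFormation template describing a deep
# learning cluster is available at the (placeholder) S3 URL below.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

cfn.create_stack(
    StackName="deep-learning-cluster",
    TemplateURL="https://s3.amazonaws.com/my-bucket/dl-cluster-template.json",  # placeholder
    Parameters=[
        {"ParameterKey": "KeyName", "ParameterValue": "my-key-pair"},
        {"ParameterKey": "InstanceType", "ParameterValue": "p2.xlarge"},
    ],
    Capabilities=["CAPABILITY_IAM"],  # the stack may need to create IAM roles for the cluster
)
```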

Competition?

AWS Deep Learning AMIs can be classified as a cloud deep learning infrastructure platform. GPU-optimized instances in PaaS stacks such as Azure or Google Cloud can be considered distant competitors. Multi-platform deep learning runtimes such as Bonsai, Floyd or BitFusion are also rapidly becoming relevant in the space.

Jesus Rodriguez

CEO of IntoTheBlock, President of Faktory, President of NeuralFabric and founder of The Sequence, Lecturer at Columbia University, Wharton, Angel Investor...