This video is part of the Deep Learning Summit, San Francisco, 2018.

Building Infrastructure for Deep Learning

OpenAI is a non-profit research company that does cutting-edge AI research. Kubernetes and Docker have given us the flexibility to experiment with various computing frameworks and topologies without paying a heavy infrastructure cost, and have enabled us to keep pace with deep learning research. However, our workloads are distinctly different from the well-supported microservice use case, and we've iterated on our infrastructure and tooling to optimize for our work. In this talk, we will go over some of the motivations and internals of our customizations, as well as an example of how they come together to accelerate research.
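To make the Kubernetes-for-deep-learning idea concrete, here is a minimal sketch of submitting a single GPU training run as a Kubernetes Job using the official `kubernetes` Python client. It is illustrative only and is not taken from the talk: the image name, namespace, and resource request are hypothetical, and it assumes a cluster with NVIDIA GPU scheduling enabled and a local kubeconfig.

```python
from kubernetes import client, config

# Load cluster credentials from the local kubeconfig (assumed to exist).
config.load_kube_config()

# Hypothetical training container: image and command are placeholders.
container = client.V1Container(
    name="trainer",
    image="example.com/research/trainer:latest",
    command=["python", "train.py"],
    resources=client.V1ResourceRequirements(
        # Request one GPU via the NVIDIA device plugin resource name.
        limits={"nvidia.com/gpu": "1"},
    ),
)

# Batch Job wrapping a single-pod training run; pods are not restarted on exit.
job = client.V1Job(
    metadata=client.V1ObjectMeta(name="example-training-run"),
    spec=client.V1JobSpec(
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(containers=[container], restart_policy="Never"),
        ),
    ),
)

# Submit the Job to the cluster.
client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```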

Vicki Cheung, Head of Infrastructure at OpenAI

Vicki is part of the founding team and leads infrastructure at OpenAI, where the team runs large-scale deep learning experiments with heavy numerical compute requirements. Previously, she led engineering at TrueVault and was a founding engineer at Duolingo.