This video is part of the Connected Home Summit, Boston, 2016.

Turning Everyday Devices into Health Sensors

Today's electronics contain very sensitive optical and motion sensors that can capture the subtle signals resulting from cardiorespiratory activity. I will present how a webcam can be used to measure important physiological parameters without contact with the body. In addition, I will show how an ordinary smartphone can be turned into a continuous physiological monitor. Both of these techniques reveal the surprising power of the devices around us all the time. I will show how deep learning is helping us create highly scalable, low-cost applications based on these sensor measurements.
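To make the webcam idea concrete, here is a minimal sketch of a remote photoplethysmography pipeline of the kind the talk describes; it is an illustration under stated assumptions, not code from the talk. It assumes the video has already been cropped to the face region, and the function name estimate_heart_rate is mine: spatially average the green channel (blood-volume changes modulate skin color most strongly in green), band-pass filter the trace to the plausible heart-rate range, and read the pulse rate off the dominant spectral peak.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames, fps):
    """Estimate pulse rate (BPM) from a stack of face-region video frames.

    frames: array of shape (n_frames, h, w, 3), RGB, already cropped to the face.
    fps: capture frame rate in Hz.
    """
    # Spatially average the green channel; the cardiac pulse modulates
    # skin color most strongly in green.
    trace = frames[:, :, :, 1].mean(axis=(1, 2))
    trace = trace - trace.mean()

    # Band-pass to the plausible heart-rate range (~42-240 BPM).
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, trace)

    # The dominant spectral peak within the passband gives the pulse rate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    mask = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[mask][np.argmax(spectrum[mask])]
    return peak_hz * 60.0

# Synthetic check: a 72 BPM (1.2 Hz) intensity ripple on noisy frames.
fps, seconds = 30.0, 20
t = np.arange(int(fps * seconds)) / fps
ripple = 0.5 * np.sin(2 * np.pi * 1.2 * t)
frames = np.random.rand(len(t), 8, 8, 3) * 5 + 100
frames[:, :, :, 1] += ripple[:, None, None]
print(f"{estimate_heart_rate(frames, fps):.1f} BPM")  # ~72
```

Published methods are more robust than this, for example combining all three color channels with independent component analysis and tracking the face across frames, but the green-channel-plus-spectrum core shown here is the essence of the measurement.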

Daniel McDuff, Director of Research at Affectiva

Daniel McDuff is Director of Research at Affectiva. He builds and applies scalable computer vision and machine learning tools to enable the automated recognition and analysis of emotions and physiology. At Affectiva, Daniel is building state-of-the-art facial expression recognition software and leading the analysis of the world's largest database of human emotions (currently with 8B+ data points). Daniel completed his PhD in the Affective Computing Group at the MIT Media Lab in 2014 and holds a B.A. and a master's degree from Cambridge University. His work has received nominations and awards from Popular Science magazine as one of the top inventions of 2011, South by Southwest Interactive (SXSWi), The Webby Awards, ESOMAR, and the Center for Integrated Medicine and Innovative Technology (CIMIT). His work has been covered in many publications, including The Times, The New York Times, The Wall Street Journal, BBC News, New Scientist, and Forbes. Daniel is also a Research Affiliate at the MIT Media Lab.