Kinect 2-Chain

The Kinect 2-Chain was a project I worked on for HackMIT 2015. The goal of the project was to help visually impaired users navigate. We used a Kinect 2 to map the space in front of the user and played stereo audio signals with varying pitch to indicate the direction and distance of obstacles. We also used a deep learning API so that the user could request a spoken description of the scene in front of them. We took 2nd place overall and won the Microsoft prize; some news coverage can be found here.
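
To give a feel for the obstacle-sonification idea, here is a minimal sketch (not the actual hackathon code) of how a depth frame could be turned into a stereo cue: the nearest obstacle's horizontal position sets the left/right balance, and its distance sets the pitch. The frame dimensions, depth limits, pitch range, and panning formula are all assumptions for illustration.

```python
import numpy as np

SAMPLE_RATE = 44100          # audio samples per second
MIN_DEPTH_MM = 500           # ignore readings closer than 0.5 m (sensor noise)
MAX_DEPTH_MM = 4500          # ignore readings farther than 4.5 m

def nearest_obstacle(depth_frame):
    """Return (column_fraction, distance_mm) of the closest valid depth pixel.

    depth_frame: 2D numpy array of depth readings in millimetres,
    e.g. 424 x 512 for the Kinect 2 depth camera (assumed layout).
    """
    valid = (depth_frame >= MIN_DEPTH_MM) & (depth_frame <= MAX_DEPTH_MM)
    if not valid.any():
        return None
    masked = np.where(valid, depth_frame, np.iinfo(np.int32).max)
    row, col = np.unravel_index(np.argmin(masked), masked.shape)
    return col / depth_frame.shape[1], float(depth_frame[row, col])

def obstacle_tone(column_fraction, distance_mm, duration_s=0.2):
    """Synthesize a short stereo sine tone for one obstacle.

    Closer obstacles get a higher pitch; the left/right channel gains
    follow the obstacle's horizontal position in the frame.
    """
    # Map distance to pitch: near -> 1200 Hz, far -> 300 Hz (assumed range).
    closeness = 1.0 - (distance_mm - MIN_DEPTH_MM) / (MAX_DEPTH_MM - MIN_DEPTH_MM)
    freq = 300.0 + 900.0 * closeness

    t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    tone = np.sin(2.0 * np.pi * freq * t)

    # Constant-power pan: 0.0 = far left, 1.0 = far right.
    left = np.cos(column_fraction * np.pi / 2.0) * tone
    right = np.sin(column_fraction * np.pi / 2.0) * tone
    return np.stack([left, right], axis=1)   # shape (samples, 2)

if __name__ == "__main__":
    # Fake depth frame with one "obstacle" 1.2 m away on the right-hand side.
    frame = np.full((424, 512), 4000, dtype=np.int32)
    frame[200, 400] = 1200
    hit = nearest_obstacle(frame)
    if hit is not None:
        samples = obstacle_tone(*hit)
        print(f"column={hit[0]:.2f}, distance={hit[1]:.0f} mm, "
              f"{samples.shape[0]} stereo samples generated")
```

The resulting stereo buffer could then be streamed to the user's headphones in a loop, one short tone per depth frame, so that nearer obstacles sound higher-pitched and obstacles to the left or right sound louder in the corresponding ear.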