Interacting with Physical Objects with Kinect and iPhone
Date of Issue: 2014
School of Computer Engineering
This project is a proof-of-concept prototype for a novel method of user interaction with technology. We live in a world where our gadgets keep shrinking in size while gaining functionality with each new version. With that in mind, it is possible to envision gadgets that are not available now but will be in the future. This project works on the premise of a hypothetical future device that combines the functionality of a motion and depth sensor, such as the Microsoft Kinect, with a projector. This combination opens up many possibilities for new interaction techniques and applications. This project only scratches the surface of those possibilities, demonstrating the concept with two prototype applications: one lets the user virtually draw on objects around them, and the other lets them virtually try on apparel. Both applications run on top of a framework developed by the student to find a correlation mapping between the physical space that the Kinect can see and the surface onto which the projector can project. This calibration step is the most important part of the process, forms the basis for both applications, and a similar step will be required for all future applications. This report details the development of this calibration step and the various iterations implemented to finally achieve accurate calibration.
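One common way to realize such a Kinect-to-projector correlation (for a planar projection surface) is to estimate a homography from a handful of corresponding points, e.g. projected markers whose positions the Kinect observes. The report's actual calibration procedure may differ; the sketch below is a minimal illustration using the Direct Linear Transform with numpy, and the function names are hypothetical.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst from N >= 4
    point correspondences, via the Direct Linear Transform (DLT).
    src/dst: iterables of (x, y) pairs, e.g. Kinect-image coordinates
    and the matching projector coordinates (hypothetical setup)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right-singular vector for the smallest
    # singular value of A (the approximate null space).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def apply_homography(H, pt):
    """Map a 2D point through H with the projective division."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

After calibration, any point the Kinect sees on the surface can be mapped into projector coordinates, so drawn strokes or overlaid apparel land on the physical object:

```python
src = [(0, 0), (1, 0), (0, 1), (1, 1)]      # Kinect-space markers
dst = [(1, 2), (3, 2), (1, 4), (3, 4)]      # same markers in projector space
H = estimate_homography(src, dst)
apply_homography(H, (0.5, 0.5))             # center maps to (2.0, 3.0)
```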
DRNTU::Engineering::Computer science and engineering
Final Year Project (FYP)
Nanyang Technological University