dc.contributor.authorSazali Mohammed Ali
dc.description.abstractAs drones become more popular and commercialized around the globe, new ways of controlling them have been explored. The purpose of this study is to investigate one such method that feels like second nature to the user and gives a natural flight experience: controlling the motion of a drone through human hand gestures captured by a motion controller. The implementation uses the Leap Motion sensor as the motion controller and the Parrot AR Drone 2.0 as the aircraft. The Parrot AR Drone is a commercial quadrotor with a built-in Wi-Fi system; it connects to the laptop via Wi-Fi, while the Leap Motion sensor connects to the laptop via a USB port. The Leap Motion sensor recognizes hand gestures and relays them to the laptop, which acts as the server and runs the program that serves as the platform for this implementation. JavaScript embedded in HTML is the programming language used to interact with the AR Drone and convey the hand gestures via a web browser. In the implementation, we wrote JavaScript code to interpret the hand gestures captured by the Leap Motion sensor and transmit them as motion commands to control the AR Drone.en_US
dc.format.extent34 p.en_US
dc.rightsNanyang Technological University
dc.subjectDRNTU::Engineering::Computer science and engineeringen_US
dc.titleGesture control for indoor navigation and maze exploration for dronesen_US
dc.typeFinal Year Project (FYP)en_US
dc.contributor.supervisorSundaram Suresh (SCSE)en_US
dc.contributor.schoolSchool of Computer Science and Engineeringen_US
dc.description.degreeCOMPUTER ENGINEERINGen_US
dc.contributor.researchCentre for Computational Intelligenceen_US
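The gesture-to-motion mapping described in the abstract could be sketched as follows. This is an illustrative JavaScript sketch, not the project's actual code: the `gestureToCommand` function, the dead-zone threshold, and the command names are assumptions chosen for clarity. In a full pipeline, the pitch and roll angles would come from Leap Motion hand frames, and the returned command would drive the AR Drone's corresponding motion call over its Wi-Fi link.

```javascript
// Illustrative sketch: translate a hand's tilt angles (radians), as a Leap
// Motion style sensor might report them, into a simple drone motion command.
// DEADZONE is an assumed threshold so small, unintentional tilts are ignored.
const DEADZONE = 0.25;

function gestureToCommand(pitch, roll) {
  // pitch < 0: hand tilted forward; pitch > 0: tilted back.
  if (pitch < -DEADZONE) return 'front';
  if (pitch > DEADZONE) return 'back';
  // roll > 0: hand tilted left; roll < 0: tilted right.
  if (roll > DEADZONE) return 'left';
  if (roll < -DEADZONE) return 'right';
  // Within the dead zone on both axes: hold position.
  return 'hover';
}
```

In use, the server would evaluate this mapping on each sensor frame and forward only command changes to the drone, so the quadrotor hovers whenever the hand is level.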