Hello,
My name is Rafael, and I am a research student at the University of Toronto.
We would like to use a Microsoft Kinect to identify a quadcopter's position and attitude in space. We are NOT using the Kinect to identify
people or gestures.
Before moving on to my question, I will briefly explain the idea of our project:
We have an AR Drone 2.0 with which we have established communication to read its sensors and to actuate its motors. The goal of the project is
to apply different control strategies that can guarantee positioning and hovering. However, for the first of these we need measurements of the drone's attitude and position relative to a fixed coordinate system. As we have an MS Kinect available, the idea is to use
it to acquire this information.
My question is: since the Microsoft Kinect can identify the joints of a human body with precision, is there a way to adapt its algorithm (or use something
already developed) so that it finds the equivalent "joints" of a drone, for example markers attached to it?
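To make the question more concrete, here is a rough sketch (in Python with NumPy, purely as an illustration, and assuming the markers' 3D positions have somehow already been extracted from the Kinect depth data) of the kind of computation I imagine running afterwards: a standard rigid-body fit (the Kabsch/Horn method) between the known marker layout on the drone and the measured marker positions, which would give the drone's attitude and position. All coordinates below are invented placeholders.

import numpy as np

def rigid_transform(body_pts, cam_pts):
    """Kabsch/Horn fit: find R, t such that cam_pts ~ R @ body_pts + t."""
    cb = body_pts.mean(axis=0)                 # centroid of body-frame markers
    cc = cam_pts.mean(axis=0)                  # centroid of measured markers
    H = (body_pts - cb).T @ (cam_pts - cc)     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid a reflection
    R = Vt.T @ D @ U.T                         # rotation: the drone's attitude
    t = cc - R @ cb                            # translation: the drone's position
    return R, t

# Placeholder marker layout on the drone's arms (metres, body frame):
body = np.array([[ 0.20,  0.00, 0.00],
                 [-0.20,  0.00, 0.00],
                 [ 0.00,  0.20, 0.00],
                 [ 0.00, -0.20, 0.00]])

# Fake "measured" positions: the same markers rotated 90 deg about z and
# shifted away from the camera, standing in for real Kinect measurements.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
measured = body @ Rz.T + np.array([0.5, 0.1, 2.0])

R, t = rigid_transform(body, measured)
yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
print("position (m):", t)
print("yaw (deg):", round(yaw, 1))

So the part I am unsure about is only the first step, that is, getting the markers' 3D positions out of the Kinect data, rather than the fit above.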
Thank you in advance,
Rafael Miranda.