UF Collaborators
              
  • Professor Andrew J. Kurdila (Project Leader, Optimization and Path Planning Team)
  • Professor Richard C. Lind (Autopilot Team)
  • Professor Michael Nechyba (Vision Processing Team)
  • Dr. Yunjun Xu (Virtual Environment Team)
Introduction

Micro air vehicles (MAVs) and micro air-delivered munitions (MADMs) are being pursued for missions related to persistent area dominance, such as bomb damage assessment, surveillance, local observation, target recognition, tracking, area mapping, cooperative maneuver, loiter, and payload delivery. The critical development needed to deploy such vehicles is reliable operation in uncertain environments such as urban areas. We therefore propose, and are now building, a dedicated facility centered around a virtual environment to investigate the visualization and computational technologies that will enable a MAV or MADM, operating within a coordinated team, to carry out missions with extreme agility and autonomy.

This project allows us to derive and develop vision processing aimed specifically at flight control in complex 3D environments; to design closed-loop, vision-based flight controllers for MAVs that incorporate accurate flight characteristics of the vehicle rather than simple simulations of its flight response; and to study the interaction and fusion of diverse sensor technologies, including micro-scale gyros, micro-electronics for GPS, MEMS-based accelerometers, and vision-based sensing and estimation.
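As a minimal illustration of this kind of sensor fusion, a complementary filter can blend a fast-but-drifting gyro rate with a noisy-but-drift-free accelerometer tilt angle. The filter gain, sample period, bias value, and synthetic readings below are illustrative assumptions, not measured project parameters.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyro rate (rad/s) with an accelerometer tilt angle (rad).

    The gyro integrates quickly but drifts over time; the accelerometer
    is noisy but has no long-term drift. alpha weights the gyro path
    (0.98 is an illustrative value, not a tuned one).
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

dt = 0.01                        # 100 Hz sample period (assumed)
true_rate, gyro_bias = 0.1, 0.05  # rad/s; bias models gyro drift
fused, gyro_only = 0.0, 0.0
for step in range(100):           # one second of simulated flight
    true_angle = true_rate * (step + 1) * dt
    gyro_only += (true_rate + gyro_bias) * dt   # pure integration drifts
    fused = complementary_filter(fused, true_rate + gyro_bias,
                                 true_angle, dt)
# The fused estimate stays closer to the true 0.1 rad than the
# gyro-only integration, which accumulates the bias.
```

A Kalman filter would weight the two sources by their noise statistics instead of a fixed gain, but the fixed-gain form above is the simplest way to see why fusing complementary sensors pays off.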

Furthermore, this research will tightly integrate an education plan designed to attract and train promising undergraduate and graduate students in current topics related to the proposed research. The hardware-in-the-loop facility will provide outstanding opportunities for education: a virtual environment is far more instructive than simply examining plots of time- and frequency-domain responses.

The hardware-in-the-loop vision-based control of MAVs project is organized into four main groups:

  • Virtual Environment Team: Working on designing and developing a 3D virtual urban environment, driven by real-time information from the Autopilot Team and by urban-area data, in order to significantly accelerate the development of enabling technologies.
  • Vision Processing Team: Working on image processing techniques for identifying obstacles in both virtual and real urban environments.
  • Optimization and Path Planning Team: Working on developing methodologies for optimal path planning based on the information provided by the Vision Processing Team.
  • Autopilot Team: Working on developing efficient control techniques based on the navigation information.
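To make the path-planning step above concrete, a minimal sketch of A* search on a 4-connected occupancy grid is shown below, where obstacle cells stand in for the buildings a vision system might report. The grid layout, costs, and start/goal cells are illustrative assumptions, not part of the project's actual planner.

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected occupancy grid (1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic (admissible on a 4-grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]  # (f, cost, node, path)
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier,
                               (cost + 1 + h((nr, nc)), cost + 1,
                                (nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

# Illustrative 4x4 map: 0 = free space, 1 = obstacle (a "building").
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
path = astar(grid, (0, 0), (3, 0))
```

In the real system the grid would be replaced by the 3D urban map from the Virtual Environment Team and the obstacle flags by the Vision Processing Team's detections, but the planner's structure stays the same.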