The research conducted in the lab focuses on two challenges in robotics and the connections between them. Estimation and sensor fusion, led by Prof. Campbell, allows the robot to perceive and understand the world around it. Verifiable high-level control, led by Prof. Kress-Gazit, allows users to interact with a robot at a high level while providing guarantees of correctness regarding the robot's safety and behavior.
Developing theory and tools to address these challenges in an integrated way is the key to developing truly autonomous robots.
Linear Temporal Logic MissiOn Planning (LTLMoP) – A Python-based toolbox for controlling a robot (simulated or real) using Structured English, logic, and symbolic control. Code and documentation can be found here.
Walking robot Control – Creating basic controllers to drive a walking robot.
Robots & Humans Cooperative Search Experiment – Humans and robots cooperatively search for multiple targets based on a probability density function (PDF). The robots provide environment mapping with Gaussian mixtures, PDF generation and updating, goal and path planning, navigation with collision avoidance using a VFH+ driver, and object detection.
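The core loop of PDF-based search can be sketched as follows. This is an illustrative example only, not the lab's code: the grid size, mixture parameters, and detection probability are all assumptions, and goal selection is reduced to picking the highest-probability cell.

```python
import numpy as np

def gaussian_mixture_pdf(xs, ys, means, covs, weights):
    """Evaluate a 2-D Gaussian mixture on a grid of points."""
    pdf = np.zeros(xs.shape)
    for m, c, w in zip(means, covs, weights):
        inv = np.linalg.inv(c)
        norm = w / (2 * np.pi * np.sqrt(np.linalg.det(c)))
        dx, dy = xs - m[0], ys - m[1]
        quad = inv[0, 0] * dx**2 + 2 * inv[0, 1] * dx * dy + inv[1, 1] * dy**2
        pdf += norm * np.exp(-0.5 * quad)
    return pdf

# Target belief over a 10 m x 10 m area (illustrative parameters).
xs, ys = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
pdf = gaussian_mixture_pdf(xs, ys,
                           means=[(3, 3), (7, 6)],
                           covs=[np.eye(2), 2 * np.eye(2)],
                           weights=[0.6, 0.4])

# Next search goal: the grid cell with maximum target probability.
i, j = np.unravel_index(np.argmax(pdf), pdf.shape)
goal = (xs[i, j], ys[i, j])

# Negative information: searching that cell and finding nothing scales
# its probability down by the miss rate (assumed detection prob. 0.9).
p_d = 0.9
pdf[i, j] *= (1 - p_d)
pdf /= pdf.sum()
```

In a real system the updated PDF would then be redistributed to the team and the planner would trade off travel cost against probability mass, rather than greedily chasing the peak.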
Collaborative Terrain Estimation – A planar-grid-based distributed terrain height estimation algorithm for use in a distributed data fusion network.
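The per-cell fusion step in such a network can be illustrated with the standard product-of-Gaussians (inverse-variance weighting) rule. This is a hedged sketch of the general idea, not the lab's algorithm; message passing, correlation handling, and the actual update rule are omitted.

```python
def fuse_cell(h_a, var_a, h_b, var_b):
    """Fuse two independent Gaussian height estimates of one grid cell
    by inverse-variance weighting (product of Gaussians)."""
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    h = var * (h_a / var_a + h_b / var_b)
    return h, var

# Robot A reports 2.0 m with high uncertainty; Robot B reports 1.0 m
# with low uncertainty. The fused estimate sits closer to B's value,
# and the fused variance is smaller than either input.
h, var = fuse_cell(2.0, 4.0, 1.0, 1.0)
```

A key design issue in distributed fusion, not shown here, is avoiding double-counting when the same measurement reaches a robot via multiple network paths.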
Mapping and Identification with Robot-Human Sensor Fusion – Humans and robots map an indoor environment using robotic sensors (e.g. lasers) and human sensors (e.g. object identification). Robust sensor fusion techniques are used to build a probabilistic map of objects without assuming "perfect" human input.
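One way to fuse imperfect human reports with robotic sensor data, sketched here purely for illustration (the lab's actual model is not reproduced), is a Bayesian log-odds update in which each sensor carries its own hit and false-alarm rates, so a noisy human report simply contributes a weaker likelihood ratio than a laser detection.

```python
import math

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def update(log_odds, detected, p_hit, p_false):
    """One Bayesian log-odds update for object presence, given a
    detection (or non-detection) from a sensor with known rates."""
    if detected:
        return log_odds + math.log(p_hit / p_false)
    return log_odds + math.log((1 - p_hit) / (1 - p_false))

# Prior: 10% chance an object is present at this map location.
lo = logit(0.1)
lo = update(lo, True, p_hit=0.95, p_false=0.05)  # robot laser detection
lo = update(lo, True, p_hit=0.8, p_false=0.3)    # human identification (noisier)
posterior = 1 / (1 + math.exp(-lo))
```

Because the human channel is modeled with nonzero false-alarm and miss rates, its input shifts the belief without being trusted as "perfect", which is the point made in the project description.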
Collaborative Search with Robot-Human Sensor Fusion – Categorical soft data fusion for Gaussian Mixtures via Variational Bayesian Importance Sampling, with applications to cooperative search. Humans provide hard information (sensor readings, target classifications) and soft information (approximate locations, negative information) to robots to assist in search and identification tasks.
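A rough importance-sampling sketch of soft-data fusion follows; the actual Variational Bayesian Importance Sampling method is considerably more involved, and the soft-statement likelihood below is a made-up example. Samples are drawn from a Gaussian-mixture prior over target position, weighted by the likelihood of a human's categorical statement, and used to estimate the posterior mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian-mixture prior over 1-D target position:
# equally likely near x = 2 or x = 8 (illustrative parameters).
n = 5000
comps = rng.choice(2, size=n, p=[0.5, 0.5])
samples = np.where(comps == 0,
                   rng.normal(2.0, 1.0, n),
                   rng.normal(8.0, 1.0, n))

# Human soft / negative information: "the target is not in the east
# wing (x > 5)", modeled as a low-but-nonzero likelihood for x > 5.
weights = np.where(samples > 5, 0.05, 1.0)
weights /= weights.sum()

# Posterior mean under importance weighting: the negative information
# pulls the estimate toward the x = 2 mode.
posterior_mean = float(np.sum(weights * samples))
```

In the full method the weighted samples would be refit to a new Gaussian mixture so the belief stays in closed form for the next search step, rather than stopping at a point estimate.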