Please visit the web page of our lab: www.autonomousrobotslab.com
Dr. Kostas Alexis

Master Projects & Independent Study Projects

Localization using a Solid-State LiDAR
Supervisors: Kostas Alexis (UNR)
Available
LiDAR technology has been particularly successful in enabling robots to localize themselves in GPS-denied environments. However, in most such cases the corresponding sensors are both expensive and of significant weight. Nowadays, there is a lot of research towards the development of solid-state LiDAR systems. In this project, the goal is to develop a localization pipeline that works efficiently with the data provided by such a system, exploits its particular properties, and deals with its limitations.
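As a first step, such a pipeline typically aligns consecutive scans to estimate the sensor motion. The sketch below illustrates the idea with a minimal point-to-point ICP on synthetic NumPy point clouds; the scan size, iteration count and the lack of outlier handling are simplifying assumptions for illustration, not the project's actual method.

# Minimal point-to-point ICP sketch for aligning two (sparse) solid-state
# LiDAR scans. The scans here are synthetic NumPy arrays; a real pipeline
# would read the sensor's point clouds and handle its narrow field of view.
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=20):
    """Estimate the rigid transform (R, t) that aligns source to target."""
    R, t = np.eye(3), np.zeros(3)
    src = source.copy()
    for _ in range(iterations):
        # Data association: nearest neighbour in the target scan.
        _, idx = cKDTree(target).query(src)
        matched = target[idx]
        # Closed-form rigid alignment (Kabsch / SVD).
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:          # reflection guard
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_t - R_step @ mu_s
        src = (R_step @ src.T).T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t

# Toy example: a rotated and translated copy of a random scan.
rng = np.random.default_rng(0)
scan_a = rng.uniform(-5, 5, (500, 3))
yaw = np.deg2rad(5.0)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0],
                   [np.sin(yaw),  np.cos(yaw), 0],
                   [0, 0, 1]])
scan_b = (R_true @ scan_a.T).T + np.array([0.2, -0.1, 0.0])
R_est, t_est = icp(scan_a, scan_b)
print(R_est, t_est)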
Localization and Mapping using a Thermal Camera
Supervisors: Kostas Alexis, Christos Papachristos 
Available

Thermal imaging has certain properties that make it interesting as a sensing modality to assist the problem of localization and mapping for robotic systems. In many environments where good visual features may not be available, the thermal image may turn out to be rich in information. However, certain challenges also exist due to the different characteristics of thermal imaging and the fact that different viewpoints do not lead to the same differences in image information as in visible-spectrum imaging. The goal of this project is to develop a Simultaneous Localization and Mapping (SLAM) pipeline optimized for thermal imaging onboard robotic systems.
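One practical difference from visible-spectrum pipelines is the front end: radiometric frames are typically 16-bit and low-contrast, so they need rescaling and enhancement before feature extraction. The sketch below shows one possible preprocessing chain with OpenCV; the synthetic frame, the percentile rescaling and the choice of ORB features are illustrative assumptions rather than the project's prescribed approach.

# Minimal sketch of front-end preprocessing for thermal SLAM: a raw 16-bit
# radiometric frame (synthesized here) is rescaled to 8 bits, contrast-
# enhanced with CLAHE, and ORB features are extracted. A full pipeline would
# feed these features to a visual(-thermal) odometry back end.
import cv2
import numpy as np

# Placeholder for a 16-bit thermal frame; replace with the camera driver output.
raw = np.random.randint(27000, 31000, (512, 640), dtype=np.uint16)

# Robust rescaling (percentiles reduce the effect of hot outliers).
lo, hi = np.percentile(raw, [1, 99])
frame8 = np.clip((raw.astype(np.float32) - lo) / (hi - lo), 0, 1)
frame8 = (255 * frame8).astype(np.uint8)

# Local contrast enhancement helps in low-texture thermal scenes.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(frame8)

# Feature extraction for the SLAM front end.
orb = cv2.ORB_create(nfeatures=1000)
keypoints, descriptors = orb.detectAndCompute(enhanced, None)
print(f"{len(keypoints)} thermal keypoints")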
Real-time Semantic Classification using onboard GPU
Supervisors: Kostas Alexis 
Assigned Student: Tyler Sorey
This project aims to develop a GPU-based architecture to enable the real-time, efficient and robust semantic classification of objects within the environment. The goal is to contribute a new approach that exploits the superior abilities of GPUs for parallel computing and eventually allows real-time classification using both 2D and 3D information as provided by the sensors onboard a small aerial robot. To that end, an embedded GPU system is also employed.
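As a rough illustration of the intended setup, the sketch below runs a pretrained classifier on the GPU with PyTorch; the ResNet-18 backbone, the random input tensor and the weight specifier are placeholder assumptions, not the architecture the project is expected to deliver.

# Minimal sketch of GPU-accelerated classification with PyTorch, as it could
# run on an embedded GPU (e.g. an NVIDIA Jetson). The backbone and the random
# input are placeholders; the project would use imagery (and eventually 3D
# data) streamed from the robot's onboard sensors.
import torch
import torchvision

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torchvision.models.resnet18(weights="IMAGENET1K_V1").to(device).eval()

# Fake batch standing in for a preprocessed camera frame (1 x 3 x 224 x 224).
frame = torch.rand(1, 3, 224, 224, device=device)

with torch.no_grad():
    logits = model(frame)
    probs = torch.softmax(logits, dim=1)
    conf, cls = probs.max(dim=1)
print(f"class {cls.item()} with confidence {conf.item():.2f}")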
Cooperative Autonomous Exploration using Aerial Robots
Supervisors: Kostas Alexis (UNR), Mina Kamel (ETH Zurich)
Available

This project aims to develop cooperative and distributed algorithms capable of efficiently solving the problem of autonomous exploration with a team of possibly heterogeneous aerial robots. Employing information gain-based approaches, we seek to develop advanced methods that will allow a single aerial robot to explore its 3D environment autonomously and efficiently, while methods of direct/indirect communication will enable distributed cooperative exploration. The methods to be developed will be tested in a high-fidelity simulation environment that incorporates the sensor models and accounts for the true vehicle limitations and dynamics, while the possibility of real-life experiments is also provided. Download the Project Description (PDF)
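To make the information-gain idea concrete, the sketch below scores randomly sampled viewpoints by the number of unknown voxels inside a simplified, occlusion-free sensor footprint over a toy occupancy grid; the grid layout, sensor radius and sampling strategy are illustrative assumptions only.

# Minimal sketch of the information-gain idea behind exploration planners:
# candidate viewpoints are scored by how many currently unknown voxels fall
# inside their (grossly simplified, occlusion-free) sensor footprint.
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1
grid = np.full((50, 50, 20), UNKNOWN, dtype=np.int8)
grid[:25, :, :] = FREE                      # pretend half the map is explored

def information_gain(grid, viewpoint, sensor_radius=8):
    """Count unknown voxels within sensor_radius of the viewpoint."""
    idx = np.indices(grid.shape).reshape(3, -1).T
    dist = np.linalg.norm(idx - np.asarray(viewpoint), axis=1)
    visible = idx[dist <= sensor_radius]
    return int(np.sum(grid[tuple(visible.T)] == UNKNOWN))

rng = np.random.default_rng(1)
candidates = rng.integers(0, [50, 50, 20], size=(20, 3))
best = max(candidates, key=lambda v: information_gain(grid, v))
print("best candidate viewpoint:", best)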
Real-time HUD Goggles 
Supervisors: Kostas Alexis, Tyler Sorey, Christos Papachristos
Available
In fast-paced sports, especially those where the individual's life is at risk such as skydiving, information about the current environment can be paramount both during and after the event. Skydivers generally rely on information known before entering the plane, or on what they can see or feel. Providing them with real-time, consistent information would be invaluable. To understand the importance of such data, it is enough to consider the risk to human life when under canopy. As canopy sizes decrease, the risk for the skydiver increases even further. This risk can be alleviated by having real-time updates of ground speed, angle of attack and other relevant states. This project aims to develop an automated Heads-Up Display (HUD) goggles system that can provide such information while remaining small and lightweight, robust and accurate. All of these qualities are essential: inaccurate data can be misleading and may lead to wrong situational awareness, and at the same time the system has to rely on lightweight embedded solutions, as it should not hinder the user's natural ability to perform the required tasks. Download the Project Description (PDF)
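As an example of one quantity such a HUD would show, the sketch below estimates ground speed and sink rate from two consecutive GPS fixes; the fix format, the 1 Hz rate and the equirectangular distance approximation are assumptions made for illustration, and a real system would fuse GPS with inertial data at a much higher rate before rendering anything on the display.

# Minimal sketch of one HUD quantity: ground speed and sink rate estimated
# from two consecutive GPS fixes (illustrative values).
import math

def ground_speed_and_sink(fix_prev, fix_curr, dt):
    """fix = (lat_deg, lon_deg, alt_m); returns (ground speed, sink rate) in m/s."""
    R_EARTH = 6371000.0
    lat1, lon1, alt1 = fix_prev
    lat2, lon2, alt2 = fix_curr
    # Equirectangular approximation is adequate over one fix interval.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    horizontal = R_EARTH * math.hypot(x, y)
    return horizontal / dt, (alt1 - alt2) / dt

# Example: two fixes one second apart during a descent under canopy.
speed, sink = ground_speed_and_sink((39.5440, -119.8150, 900.0),
                                    (39.5441, -119.8148, 894.0), dt=1.0)
print(f"ground speed {speed:.1f} m/s, sink rate {sink:.1f} m/s")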
GPU-based Dense Mapping for Aerial Robotics
Supervisors: Kostas Alexis
Available

Dense mapping in real time is among the most critical functionalities that an aerial robot should possess in order to conduct applications useful to society. Although some solutions exist, most of them rely on structured-light sensors and can operate only at very limited distances and under appropriate lighting conditions. Using a stereo camera for this purpose can lead to a more versatile framework, but achieving equal levels of data density in real time and at comparable update rates is a major challenge. This project aims to examine the use of embedded systems with GPUs and stereo camera systems in order to achieve this goal. Download the Project Description (PDF)
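For orientation, the sketch below computes a dense disparity map on the CPU with OpenCV's semi-global matcher and reprojects it to 3D; the synthetic image pair and the identity disparity-to-depth matrix are placeholders, and the point of the project is precisely to move such a step (or a learned alternative) onto an embedded GPU at useful rates.

# Minimal sketch of CPU stereo matching and reprojection with OpenCV.
import cv2
import numpy as np

# Crude synthetic stereo pair: the right frame is the left shifted by ~8 px.
left = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
right = np.roll(left, -8, axis=1)

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=96,
                                blockSize=7, P1=8 * 7 * 7, P2=32 * 7 * 7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Q is the 4x4 disparity-to-depth matrix from stereo rectification
# (cv2.stereoRectify); an identity placeholder is used here.
Q = np.eye(4, dtype=np.float32)
points3d = cv2.reprojectImageTo3D(disparity, Q)
print("dense point map shape:", points3d.shape)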

Short Projects

Thermal Navigation for Aerial Robotics
Supervisors: Kostas Alexis (UNR)
Available

The goal of this project is to investigate the potential of thermal navigation for small aerial robots. Through the integration of a minimal sensor suite that combines thermal vision, inertial sensors and a magnetometer, we want to examine the potential of deploying an algorithm for the localization of the aerial vehicle against its environment without visible-light cameras. This will set the basis for autonomous navigation when light conditions do not allow the operation of normal cameras and will further give rise to the possibility of night navigation for micro aerial vehicles. Download the Project Description (PDF)
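One inertial building block of such a suite is attitude estimation that does not depend on any camera. The sketch below shows a basic complementary filter blending gyro rates with accelerometer-derived roll and pitch; the axis conventions, update rate and blend factor are assumptions, and the thermal-vision front end would supply the remaining heading and position information.

# Minimal sketch of a complementary filter for roll/pitch from an IMU.
import numpy as np

def accel_roll_pitch(accel):
    """Roll and pitch (rad) from a 3-axis accelerometer reading near rest."""
    ax, ay, az = accel
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    return roll, pitch

def complementary_update(roll, pitch, gyro, accel, dt, alpha=0.98):
    """Blend integrated gyro rates with the accelerometer attitude estimate."""
    p, q, _ = gyro                 # body rates (rad/s), small-angle Euler-rate approximation
    roll_acc, pitch_acc = accel_roll_pitch(accel)
    roll = alpha * (roll + p * dt) + (1 - alpha) * roll_acc
    pitch = alpha * (pitch + q * dt) + (1 - alpha) * pitch_acc
    return roll, pitch

# Toy loop at an assumed 200 Hz IMU rate with static readings.
roll = pitch = 0.0
for _ in range(200):
    roll, pitch = complementary_update(roll, pitch,
                                       gyro=(0.01, -0.02, 0.0),
                                       accel=(0.0, 0.0, 9.81), dt=1 / 200)
print(np.degrees([roll, pitch]))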

Own ideas - high-risk projects!

Do you have your own idea about a robotics project? Are you willing to discuss a high-risk project with the understanding that things might not always work? Contact me and schedule a meeting!

Old - Accomplished Projects

Dense Mapping and Autonomous MAV Navigation using Time-of-Flight 3D Cameras and Inertial Sensors 
Supervisors: Kostas Alexis (UNR)
Assigned Student: Ashutosh Singandhupe
The goal of this project is to develop a hardware and software module that fuses the data of Time-of-Flight 3D sensors, visible-light cameras and inertial measurement units in order to a) enable accurate robot localization and 3D mapping using a Micro Aerial Vehicle (MAV), and b) empower the robot with the capacity for collision-free navigation in previously unknown environments. Download the Project Description (PDF)
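A typical early step in such a fusion pipeline is turning each Time-of-Flight depth image into a point cloud in the camera frame. The sketch below shows the standard pinhole back-projection; the resolution and intrinsics are placeholders for the actual sensor calibration.

# Minimal sketch: back-project a ToF depth image into 3D points.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """depth: HxW array in metres -> Nx3 points in the camera frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # drop invalid (zero-depth) pixels

depth = np.full((240, 320), 2.5, dtype=np.float32)   # fake wall at 2.5 m
cloud = depth_to_points(depth, fx=280.0, fy=280.0, cx=160.0, cy=120.0)
print(cloud.shape)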
Aerial Robotic Path Planning for Inspection Operations 
Supervisors: Kostas Alexis (UNR)
Assigned Student: Harinder Singh Toor
This project aims to intelligently combine methods for explicit inspection path planning (when a model of the structure to be inspected exists) with active exploitation of the probabilistic vision-based feedback available in most high-end aerial robots in order to develop an adaptive algorithm capable of “active sensing” for complete, high-fidelity structural inspection operations. By actively closing the perception-navigation loop of the system, the goal is to establish the autonomy levels required to enable autonomous complete 3D inspection in partially known or completely unknown environments. Download the Project Description (PDF)
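To illustrate the model-based half of the problem, the sketch below orders a set of coverage viewpoints into a tour with a greedy nearest-neighbour heuristic; real structural inspection planners optimize coverage and tour cost jointly, and the random viewpoints used here are purely illustrative.

# Minimal sketch: greedy nearest-neighbour ordering of inspection viewpoints.
import numpy as np

def greedy_tour(viewpoints, start_idx=0):
    """Return viewpoint indices in greedy nearest-neighbour visiting order."""
    remaining = set(range(len(viewpoints)))
    order = [start_idx]
    remaining.remove(start_idx)
    while remaining:
        last = viewpoints[order[-1]]
        nxt = min(remaining, key=lambda i: np.linalg.norm(viewpoints[i] - last))
        order.append(nxt)
        remaining.remove(nxt)
    return order

rng = np.random.default_rng(2)
viewpoints = rng.uniform(0, 20, (15, 3))     # sampled (x, y, z) viewpoints
tour = greedy_tour(viewpoints)
length = sum(np.linalg.norm(viewpoints[a] - viewpoints[b])
             for a, b in zip(tour[:-1], tour[1:]))
print(tour, f"tour length {length:.1f} m")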
Reconfigurable Multi-agent Autonomous Exploration using Aerial Robots 
Supervisors: Kostas Alexis (UNR)
Assigned Student: Sanket Lokhande
Download the Project Description (PDF)
Development of the UNR Flying Arena
Supervisors: Kostas Alexis (UNR), Luis Rodolfo Garcia Garrillo (UNR)
Available to multiple students. 2 students already assigned: Aswathi Sandeep, Alexander Wittmann
This project aims to develop the UNR Flying Arena. Located in the High-Bay laboratory, it provides UNR with a large space for the testing of unmanned aircraft. The available infrastructure includes a motion capture system which provides ground truth but also enables us to deploy safety mechanisms that can take over the control of our robots in case of emergency. In particular, this project aims to develop the relevant ROS-based framework. Download the Project Description (PDF)
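As a flavor of what such a framework contains, the sketch below is a minimal ROS node that monitors the motion-capture pose of one vehicle and warns when it leaves a predefined safe volume; the topic name, message type and arena bounds are assumptions that depend on the mocap bridge actually used.

# Minimal sketch of an arena safety monitor as a ROS (rospy) node.
import rospy
from geometry_msgs.msg import PoseStamped

ARENA_BOUNDS = {"x": (-3.0, 3.0), "y": (-3.0, 3.0), "z": (0.0, 2.5)}  # metres, assumed

def pose_callback(msg):
    p = msg.pose.position
    inside = (ARENA_BOUNDS["x"][0] < p.x < ARENA_BOUNDS["x"][1] and
              ARENA_BOUNDS["y"][0] < p.y < ARENA_BOUNDS["y"][1] and
              ARENA_BOUNDS["z"][0] < p.z < ARENA_BOUNDS["z"][1])
    if not inside:
        rospy.logwarn("Vehicle left the safe flight volume: %.2f %.2f %.2f",
                      p.x, p.y, p.z)
        # A real framework would command a position hold or landing here.

if __name__ == "__main__":
    rospy.init_node("arena_safety_monitor")
    rospy.Subscriber("/vicon/uav1/pose", PoseStamped, pose_callback)  # assumed topic
    rospy.spin()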