
Semester Projects

The course is developed around challenging, research-oriented semester projects. The projects offered for Fall 2016 are listed below.
Project #5: Aerial Robotics for Nuclear Site Characterization
Description: A century of nuclear research, war, and accidents has created a worldwide legacy of contaminated sites. Massive cleanup of that nuclear complex is underway. Our broad research goal is to address means to explore and radiation-map nuclear sites by deploying unprecedented, tightly integrated sensing, modeling, and planning on small flying robots. Within this project in particular, the goal is to develop multi-modal sensing and mapping capabilities by fusing visual cues with thermal and radiation camera data along with inertial sensor readings. Ultimately, the aerial robot should be able to derive 3D maps of its environment that are further annotated with the spatial thermal and radiation distribution. Technically, this will be achieved via the development of a multi-modal localization and mapping pipeline that exploits the different sensing modalities (inertial, visible-light, thermal, and radiation camera) in a synchronized and complementary fashion. Finally, within the project you are expected to demonstrate the autonomous multi-modal mapping capabilities via relevant experiments using a multirotor aerial robot.
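
As a rough, hypothetical sketch of the kind of map annotation described above (illustrative only, not the project's actual pipeline), the Python snippet below accumulates radiation-intensity readings into a voxel-indexed map, assuming synchronized robot poses are already available from the localization front-end; the voxel size, the RadiationMap class, and the running-mean update are invented for this example.

# Hypothetical sketch: annotate a voxel map with radiation-intensity readings
# that arrive synchronized with estimated robot poses. The voxel size and the
# running-mean update are illustrative choices, not the project's pipeline.
from collections import defaultdict

VOXEL_SIZE = 0.5  # meters; assumed map resolution


def voxel_index(position, voxel_size=VOXEL_SIZE):
    """Map a 3D position (x, y, z) to an integer voxel index."""
    return tuple(int(c // voxel_size) for c in position)


class RadiationMap:
    """Accumulates a per-voxel running mean of radiation intensity."""

    def __init__(self):
        self._sum = defaultdict(float)
        self._count = defaultdict(int)

    def add_measurement(self, position, intensity):
        idx = voxel_index(position)
        self._sum[idx] += intensity
        self._count[idx] += 1

    def intensity_at(self, position):
        idx = voxel_index(position)
        n = self._count[idx]
        return self._sum[idx] / n if n else None


if __name__ == "__main__":
    rad_map = RadiationMap()
    # Synchronized (pose, reading) pairs; values are made up for illustration.
    samples = [((1.2, 0.40, 1.0), 3.1), ((1.3, 0.45, 1.0), 2.9), ((4.0, 2.0, 1.5), 0.4)]
    for pose, reading in samples:
        rad_map.add_measurement(pose, reading)
    print(rad_map.intensity_at((1.25, 0.42, 1.0)))  # 3.0: mean of the two readings in that voxel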

Research Tasks:
  • Task 1: Thermal, LiDAR, Radiation Sensing modules integration
  • Task 2: Thermal-camera SLAM
  • Task 3: Multi-modal 3D maps
  • Task 4: Estimation of spatial distribution of heat and radiation
  • Task 5: Heat/radiation source-seeking planning
  • Task 6: Robot Evaluation and Demonstration

Team:
  • Shehryar Khattak
  • Tung Dang
  • Tuan Le
  • Nhan Pham
  • Tim Kwist
  • Daniel Mendez

Collaborators: Nevada Advanced Autonomous Systems Innovation Center - https://www.unr.edu/naasic
Budget: $2,000

Indicative Video of the Results of the Team:
Project #4: Aerial Robotics for Climate Monitoring and Control
Description: Within this project, you are requested to develop an aerial robot capable of environmental monitoring. In particular, an "environmental sensing pod" should be developed that integrates visible-light and multispectral cameras, a GPS receiver, and inertial, atmospheric-quality, and temperature sensors. Through appropriate sensor fusion, the aerial robot should be able to estimate a consistent 3D terrain/atmospheric map of its environment in which every spatial point is annotated with atmospheric measurements and the altitude at which they were taken (or, ideally, their spatial distribution). To enable advanced operational capacity, a fixed-wing aerial robot should be employed and GPS-based navigation should be automated. Ideally, the aerial robot should also be able to autonomously derive paths that ensure sufficient coverage of environmental sensing data.
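
To make the notion of annotating spatial points with atmospheric measurements a bit more concrete, here is a hypothetical Python sketch (not the project's implementation) that stores position/altitude-tagged readings and estimates the value at an unsampled point with simple inverse-distance weighting; the sample values and the weighting exponent are assumptions for illustration.

# Hypothetical sketch: annotate a trajectory with atmospheric readings and
# interpolate the value at an unsampled point using inverse-distance
# weighting (IDW). Sample data and the IDW exponent are illustrative only.
import math

# Each sample: (x, y, altitude, measurement), e.g. local coordinates in meters
# and an air-quality reading in arbitrary units.
samples = [
    (0.0, 0.0, 50.0, 12.0),
    (100.0, 0.0, 55.0, 10.5),
    (50.0, 80.0, 60.0, 14.2),
]


def idw_estimate(point, samples, power=2.0):
    """Inverse-distance-weighted estimate of the measurement at `point`."""
    num, den = 0.0, 0.0
    for x, y, z, value in samples:
        d = math.dist(point, (x, y, z))
        if d < 1e-6:  # query coincides with a sample
            return value
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den


if __name__ == "__main__":
    print(idw_estimate((40.0, 30.0, 55.0), samples))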

Research Tasks:
  • Task 1: Autopilot integration and verification
  • Task 2: Sensing modules and Processing unit integration
  • Task 3: Integration of Visual-Inertial SLAM solution
  • Task 4: Environmental-data trajectory annotation and estimation of spatial distributions
  • Task 5: Real-time plane extraction for landing
  • Task 6: Robot Evaluation and Demonstration

Team:
  • Jason Rush
  • Mat Boggs
  • Devaul Tyler Timothy
  • Frank Mascarich

Collaborators: Desert Research Institute - https://www.dri.edu/
Budget: $2,000

Indicative Video of the Results of the Team:
Project #3: Robots to Study Lake Tahoe!
Description: Water is a nexus of global struggle, and increasing pressure on water resources is driven by large-scale perturbations such as climate change, invasive species, dam development and diversions, pathogen occurrence, nutrient deposition, pollution, toxic chemicals, and increasing and competing human demands. These problems are multidimensional and require integrative, data-driven solutions enabled by environmental data collection at various scales in space and time. Currently, most ecological research that quantifies impacts from perturbations in aquatic ecosystems is based on (i) the collection of single snapshot data in space, or (ii) multiple collections from a single part of an ecosystem over time. Ecosystems are inherently complex; therefore, relying on such relatively coarse and incomplete collections in space and time can result in less than optimal data-based solutions. The goal of this project is to design and develop a platform that can be used on the surface of a lake to quantify water-quality changes in the nearshore environment (1-10 m deep). The platform should be autonomous and used to monitor the environment for water quality (temperature, turbidity, oxygen, chlorophyll-a) at a given depth.
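
As an illustrative, hypothetical sketch of the autonomous surface platform's core loop (not the project's actual software), the Python snippet below logs depth-tagged water-quality samples and computes a proportional yaw-rate command toward the next shoreline waypoint; the sample record layout, gain, and coordinates are invented for the example.

# Hypothetical sketch: log water-quality samples and steer an autonomous
# surface vessel toward the next shoreline waypoint. The record layout and
# the proportional gain are illustrative assumptions.
import math
from dataclasses import dataclass


@dataclass
class WaterSample:
    x: float              # local east coordinate [m]
    y: float              # local north coordinate [m]
    depth: float          # sensor depth [m]
    temperature: float    # [deg C]
    turbidity: float      # [NTU]
    oxygen: float         # [mg/L]
    chlorophyll_a: float  # [ug/L]


def heading_to_waypoint(x, y, wx, wy):
    """Bearing (rad) from the boat position to the next shoreline waypoint."""
    return math.atan2(wy - y, wx - x)


def yaw_rate_command(current_heading, desired_heading, k_p=0.8):
    """Proportional yaw-rate command with angle wrap-around."""
    error = math.atan2(math.sin(desired_heading - current_heading),
                       math.cos(desired_heading - current_heading))
    return k_p * error


if __name__ == "__main__":
    log = [WaterSample(10.0, 5.0, 1.5, 14.2, 3.1, 8.9, 2.4)]
    desired = heading_to_waypoint(10.0, 5.0, 25.0, 12.0)
    print(log[0])
    print(yaw_rate_command(0.3, desired))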

Research Tasks:
  • Task 1: Autopilot integration and verification
  • Task 2: Sensing modules and Processing unit integration
  • Task 3: Robot Localization and Mapping using Visual-Inertial solution
  • Task 4: Visible-light/thermal fusion for unified 3D reconstruction
  • Task 5: Robot boat autonomous navigation for shoreline tracking
  • Task 6: Robot Evaluation and Demonstration

Team:
  • Camille Bourquin
  • Stephen Williams
  • Tyler Schmidt
  • Steven King
  • Ke Xu

Collaborators: Aquatic Ecosystems Analysis Lab - http://aquaticecosystemslab.org/, NAASIC
Budget: $2,000
Project #2: Autonomous Car Navigation
Description: Autonomous transportation systems are not only an ongoing research trend but also a key factor in the progress of our societies, the safety of transportation, greener technologies, growth, and a better quality of life. The goal of this project is to develop a miniaturized autonomous car able to navigate while mapping its environment, detecting objects in it (other cars), and performing collision-avoidance maneuvers. To achieve this goal, the robot will integrate controlled steering and a perception system that fuses data from cameras, an inertial measurement unit, and depth sensors, thereby being able to robustly perform the simultaneous localization and mapping task. Finally, a local path planner will guide the steering control towards collision-free paths.
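
For illustration only, the hypothetical Python sketch below shows one simple way a local planner could pick a collision-free steering direction: candidate headings are checked against detected obstacles over a short look-ahead, and the free candidate closest to the goal bearing is selected; the candidate set, look-ahead distance, and obstacle radius are assumptions, not the project's chosen method.

# Hypothetical sketch: a minimal local planner that selects, among candidate
# steering directions, the collision-free one closest to the goal bearing.
# Obstacle radius, look-ahead distance, and candidates are illustrative.
import math

OBSTACLE_RADIUS = 0.3   # assumed inflation radius per detected obstacle [m]
LOOKAHEAD = 1.5         # length of each candidate look-ahead segment [m]
CANDIDATES = [math.radians(a) for a in range(-40, 41, 10)]  # steering offsets


def segment_is_free(x, y, heading, obstacles, step=0.1):
    """Check sampled points along a straight look-ahead segment for collisions."""
    n = int(LOOKAHEAD / step)
    for i in range(1, n + 1):
        px = x + i * step * math.cos(heading)
        py = y + i * step * math.sin(heading)
        for ox, oy in obstacles:
            if math.hypot(px - ox, py - oy) < OBSTACLE_RADIUS:
                return False
    return True


def pick_heading(x, y, heading, goal, obstacles):
    """Return the collision-free heading closest to the bearing to the goal."""
    goal_bearing = math.atan2(goal[1] - y, goal[0] - x)
    best, best_cost = None, float("inf")
    for offset in CANDIDATES:
        cand = heading + offset
        if not segment_is_free(x, y, cand, obstacles):
            continue
        cost = abs(math.atan2(math.sin(goal_bearing - cand),
                              math.cos(goal_bearing - cand)))
        if cost < best_cost:
            best, best_cost = cand, cost
    return best  # None means no collision-free candidate was found


if __name__ == "__main__":
    print(pick_heading(0.0, 0.0, 0.0, goal=(5.0, 1.0), obstacles=[(1.0, 0.0)]))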

Research Tasks:
  • Task 1: Sensing modules and Processing Unit Integration
  • Task 2: Autopilot integration and verification
  • Task 3: Robot Localization using LiDAR/RGBD/Visual-SLAM
  • Task 4: Static/Dynamic Obstacle Detection
  • Task 5: Robot Motion Collision-free Planning
  • Task 6: Robot Evaluation and Demonstration

Team:
  • Niki Silveria
  • Monique Dingle
  • Phoebe Argon
  • Jason Worsnop Cody 
  • Brett Knadle
  • Phillip Vong

Collaborators: Nevada Center for Applied Research, NAASIC
Budget: $2,000
Project #1: Smartphone-assisted Delivery Drone Landing
Description: This project will run in collaboration with Flirtey - the first parcel delivery company conducting real-life operations in the US. The goal is to develop a system that exploits direct/indirect communication between a smartphone and the aerial robot such that delivery landing "on top" of the smartphone becomes possible. Such an approach will enable commercial parcel delivery within challenging and cluttered urban environments. Within the framework of the project, we seek the most reliable, novel, but also technologically feasible solution for the problem at hand. The aerial robot will be capable of visual processing and may implement different communication protocols, while the smartphone should be considered "as available" on the market.
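
As a hypothetical illustration of the visual-servoing idea (not the project's delivered solution), the Python sketch below maps the detected phone marker's pixel offset from the image center to lateral velocity setpoints and only commands a descent once the marker is centered; the camera resolution, gains, tolerance, and axis conventions are assumptions, and the marker detection itself is taken as given.

# Hypothetical sketch: image-based visual servoing for centering the
# multirotor above a detected smartphone marker before descending.
# All constants are illustrative assumptions.
IMAGE_WIDTH, IMAGE_HEIGHT = 640, 480  # assumed camera resolution [px]
K_XY = 0.002           # [m/s per pixel] lateral velocity gain
DESCEND_RATE = 0.3     # [m/s] descent rate once centered
CENTER_TOLERANCE = 20  # [px] pixel error considered "centered"


def servo_command(marker_u, marker_v):
    """Map the marker's pixel position to (vx, vy, vz) body-velocity setpoints.

    Assumes a downward-facing camera; the sign conventions below depend on the
    actual camera mounting and are chosen only for illustration.
    """
    err_u = marker_u - IMAGE_WIDTH / 2
    err_v = marker_v - IMAGE_HEIGHT / 2
    vx = -K_XY * err_v   # move forward/backward to cancel the vertical pixel error
    vy = K_XY * err_u    # move left/right to cancel the horizontal pixel error
    centered = abs(err_u) < CENTER_TOLERANCE and abs(err_v) < CENTER_TOLERANCE
    vz = -DESCEND_RATE if centered else 0.0  # descend only once centered
    return vx, vy, vz


if __name__ == "__main__":
    print(servo_command(400, 250))  # marker off-center: lateral correction only
    print(servo_command(325, 245))  # within tolerance: descend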

Research Tasks:
  • Task 1: Autopilot integration
  • Task 2: Camera systems integration
  • Task 3: Robot-to-Phone and Phone-to-Robot cooperative localization
  • Task 4: Visual-servoing phone tracking
  • Task 5: Autonomous Landing on phone
  • Task 6: Robot Evaluation and Demonstration

Team:
  • Sajid Zeeshan
  • Lopez Austin
  • Golden Erik
  • Kevin Green

Collaborators: Flirtey - http://flirtey.com/
Budget: $2,000