Mars Rover

College of Engineering Unit(s): 
Electrical Engineering and Computer Science

Team: 
Prathyoosha Chaya

Project Description: 

This project aims to create software, deployed on the OSU Robotics Club's Mars Rover, that allows the robot to autonomously navigate outdoor terrain.

In our everyday lives as humans, navigating our environment is relatively straightforward. Our bodies are equipped with many sensory inputs, such as sight, hearing, touch, and balance, that work together to let us perceive our surroundings and avoid obstacles we might otherwise run into. For a robot, however, navigating an unknown environment is an immense challenge. This project aims to create software that allows a robot to navigate by itself, or autonomously, in an outdoor environment.

The goals for the project are based on the robot's requirements for the University Rover Challenge (URC) 2021 competition. The URC competition takes place at the Mars Desert Research Station in Utah, where the capstone team's software will be deployed. The URC autonomous navigation challenge tests the rover's navigation abilities when it runs completely autonomously. Rovers are required to autonomously traverse between visually marked goals for each stage of the mission. Each stage is timed, and the terrain becomes progressively more difficult. Failing to complete a stage, or to complete it on time, ends the mission. The gates are marked with AR tags (similar to QR codes), and GPS coordinates close to each gate are provided just before the challenge starts. Our software goals are therefore to implement the following functionality: obstacle avoidance, base station integration, rough terrain traversal, goal detection, and waypoint navigation.
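To make the waypoint navigation goal concrete, below is a minimal Python sketch of sending one goal through move_base, the standard action interface of the ROS navigation stack. The 'map' frame name and the hard-coded coordinates are placeholder assumptions, not the team's actual goal pipeline; in practice the GPS fix provided at the start of the challenge would first be converted into the rover's map frame.

    #!/usr/bin/env python
    # Minimal waypoint sketch using the standard move_base action
    # interface. The frame name and coordinates are placeholders,
    # not the team's actual goal pipeline.
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    def send_waypoint(x, y):
        """Send one map-frame goal and block until it resolves."""
        client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
        client.wait_for_server()

        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = 'map'
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.orientation.w = 1.0  # identity heading

        client.send_goal(goal)
        client.wait_for_result()
        return client.get_state()

    if __name__ == '__main__':
        rospy.init_node('waypoint_navigator')
        send_waypoint(4.0, 2.5)  # placeholder map-frame coordinates

Using the action interface rather than publishing directly to a goal topic lets the caller wait on, track, or preempt a goal in flight, which matters for timed mission stages.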

The autonomous software package developed in this project is built on the Robot Operating System (ROS) running on Ubuntu. It uses the ROS navigation stack together with RGB-D simultaneous localization and mapping (SLAM), fusing visual odometry from the ZED stereoscopic camera with GPS, IMU, and wheel encoder inputs. These sensors provide the depth, vision, and orientation feedback necessary to navigate the terrain.
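As a rough illustration of how these inputs arrive in ROS, the following node subscribes to the standard message types involved. The topic names (/zed/odom, /imu/data, /gps/fix) are assumptions made for the sketch; the actual names depend on the rover's launch files and remappings.

    #!/usr/bin/env python
    # Hypothetical monitor for the sensor inputs described above.
    # Topic names are assumptions; the real remappings depend on
    # the rover's launch configuration.
    import rospy
    from nav_msgs.msg import Odometry
    from sensor_msgs.msg import Imu, NavSatFix

    def on_odom(msg):
        p = msg.pose.pose.position
        rospy.loginfo('visual odometry: x=%.2f y=%.2f', p.x, p.y)

    def on_imu(msg):
        rospy.loginfo('imu angular z: %.3f rad/s', msg.angular_velocity.z)

    def on_fix(msg):
        rospy.loginfo('gps: %.6f, %.6f', msg.latitude, msg.longitude)

    if __name__ == '__main__':
        rospy.init_node('sensor_monitor')
        rospy.Subscriber('/zed/odom', Odometry, on_odom)
        rospy.Subscriber('/imu/data', Imu, on_imu)
        rospy.Subscriber('/gps/fix', NavSatFix, on_fix)
        rospy.spin()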

The team monitors the rover's autonomous progress from a remote base station. The base station provides an interface for entering GPS coordinates for autonomous navigation and for viewing telemetry, camera feeds, and status information from the rover.
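A minimal sketch of what the command side of such a base station could look like, assuming hypothetical /autonomy/gps_goal and /autonomy/status topics (the real interface also carries camera feeds and richer telemetry):

    #!/usr/bin/env python
    # Hypothetical base-station helper: publishes an operator-entered
    # GPS goal and echoes rover status messages. Topic names are
    # illustrative assumptions, not the team's actual interface.
    import rospy
    from sensor_msgs.msg import NavSatFix
    from std_msgs.msg import String

    def on_status(msg):
        rospy.loginfo('rover status: %s', msg.data)

    if __name__ == '__main__':
        rospy.init_node('base_station_cli')
        rospy.Subscriber('/autonomy/status', String, on_status)
        goal_pub = rospy.Publisher('/autonomy/gps_goal', NavSatFix,
                                   queue_size=1, latch=True)
        rospy.sleep(1.0)  # give subscribers time to connect

        goal = NavSatFix()
        # Approximate coordinates near the Mars Desert Research Station.
        goal.latitude, goal.longitude = 38.4064, -110.7919
        goal_pub.publish(goal)
        rospy.spin()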

Project Website(s):