- Winner of the IEEE R10 Undergraduate Project Video Competition 2023
- Category: AI and Robotics
- Featured on IEEE R10 SAC LinkedIn page
- IEEE R10 Undergraduate Student Project Video Contest Submission
SLAM (Simultaneous Localization and Mapping) is an essential technology in robotics that lets a robot estimate its position and orientation while simultaneously building a map of its environment, enabling it to carry out autonomous tasks.
*A TurtleBot using SLAM to navigate a map*
This project aims to build a mobile robot similar to a TurtleBot, a low-cost personal robot kit with open-source software.
ROS is an open-source, meta-operating system for robots. It provides the services of an operating system, including hardware abstraction, low-level device control, implementation of commonly used functionality, message passing between processes, and package management. ROS also provides tools and libraries for obtaining, building, writing, and running code across multiple computers. ROS currently only runs on Unix-based platforms; software for ROS is primarily tested on Ubuntu and macOS, though the ROS community has contributed support for Fedora, Gentoo, Arch Linux, and other Linux platforms.
- Nodes: A node is an executable that uses ROS to communicate with other nodes. A node can act as a Publisher or a Subscriber: a Publisher publishes messages of a standard message type to a particular topic, while a Subscriber subscribes to that topic and receives the messages published to it.
- Messages: Typed data structures that nodes exchange when publishing or subscribing to a topic.
- Topics: Nodes can publish messages to a topic as well as subscribe to a topic to receive messages.
- Master: Name service for ROS (i.e. helps nodes find each other)
- rosout: ROS equivalent of stdout/stderr
- roscore: Master + rosout + parameter server (the parameter server stores configuration values shared between nodes)
Client libraries needed for this project:
- rospy: Python client library
- roscpp: C++ client library
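To make the publisher/subscriber model above concrete, here is a minimal rospy sketch in the style of the standard ROS tutorials. The node and topic names (`talker`, `chatter`) are illustrative, not part of this project:

```python
#!/usr/bin/env python3
# Minimal rospy publisher: sends a greeting on the /chatter topic once a second.
import rospy
from std_msgs.msg import String

def talker():
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rospy.init_node('talker', anonymous=True)
    rate = rospy.Rate(1)  # 1 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data='hello from the talker node'))
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```

A matching Subscriber would call `rospy.Subscriber('chatter', String, callback)` and receive every message published to the topic.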
TurtleBot3 is a small, affordable, programmable, ROS-based mobile robot for education, research, hobby, and product prototyping. Its core technologies are SLAM, navigation, and manipulation, making it suitable for home service robots.
*TurtleBot3 models: Burger and Waffle*
A LIDAR (Light Detection and Ranging) sensor uses light in the form of a pulsed laser to calculate the relative distances of surrounding objects. The scanner measures how long each beam of light takes to hit a surface and reflect back and, using the speed of light, converts that time into a distance. The observed LIDAR data is then used to build precise, three-dimensional information about the robot's environment so that it can navigate smoothly while avoiding obstacles.
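In ROS, LIDAR drivers conventionally publish `sensor_msgs/LaserScan` messages, typically on a `/scan` topic (the topic name here is an assumption and may differ on a given setup). A minimal sketch that reports the nearest obstacle:

```python
#!/usr/bin/env python3
# Sketch: subscribe to the LIDAR's LaserScan messages and report the
# closest obstacle. The 'scan' topic name is the common convention.
import rospy
from sensor_msgs.msg import LaserScan

def scan_callback(msg):
    # msg.ranges holds one distance reading (in metres) per laser beam.
    valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
    if valid:
        rospy.loginfo('Closest obstacle: %.2f m', min(valid))

rospy.init_node('scan_listener')
rospy.Subscriber('scan', LaserScan, scan_callback)
rospy.spin()
```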
The Raspberry Pi is an affordable single-board computer that can run Linux operating systems such as Raspbian and Ubuntu. It is extensively used to develop programming skills and build hardware projects. It is a fast and versatile board with a set of GPIO (General Purpose Input/Output) pins that allow one to control electronic components for physical computing. This project uses a Raspberry Pi 3B.
Arduino is an open-source electronics platform based on easy-to-use hardware and software, intended for interactive projects. The Arduino Mega is a microcontroller development board based on the ATmega2560 microcontroller IC. It can be interfaced with various hardware components such as sensors and actuators, and is programmed in Arduino C, a language based on C++.
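As a purely illustrative sketch of how the Raspberry Pi might talk to the Arduino, the snippet below reads newline-terminated readings over USB serial using pyserial. The port name, baud rate, and message format are assumptions, not the project's actual firmware protocol:

```python
#!/usr/bin/env python3
# Illustrative sketch (not this project's actual protocol): read
# newline-terminated readings sent by the Arduino over USB serial.
import serial  # pyserial

# Port name and baud rate are placeholders; adjust for your setup.
with serial.Serial('/dev/ttyACM0', 115200, timeout=1) as port:
    while True:
        line = port.readline().decode('ascii', errors='ignore').strip()
        if line:
            print('Arduino says:', line)
```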
- Ubuntu 20.04
- ROS Noetic
To install ROS, refer to the ROS Wiki.
Create a catkin workspace:

```bash
mkdir -p ~/ros_ws/src
cd ~/ros_ws
catkin build
```
To automatically source this workspace every time a new shell is launched, run these commands:

```bash
echo "source ~/ros_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc
```
Clone the repository into the `src` folder of the catkin workspace:

```bash
cd ~/ros_ws/src
git clone https://github.com/IEEE-NITK/SLAMBot.git
```
Navigate back to the workspace folder and build the packages:

```bash
cd ~/ros_ws
catkin build
```
To launch the Gazebo simulation, use the command:

```bash
roslaunch slambot_simulation slambot_simulation.launch
```
To move the robot around, we will use teleop_twist_keyboard, a package for driving the robot with the keyboard:

```bash
rosrun teleop_twist_keyboard teleop_twist_keyboard.py
```
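Under the hood, teleop_twist_keyboard publishes `geometry_msgs/Twist` velocity commands, by default on the `/cmd_vel` topic. As a minimal sketch (the topic name and speeds are assumptions for this robot), the same motion can be scripted with rospy:

```python
#!/usr/bin/env python3
# Sketch: drive the robot forward for three seconds by publishing
# geometry_msgs/Twist on /cmd_vel, then command it to stop.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('scripted_drive')
pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)

cmd = Twist()
cmd.linear.x = 0.2   # forward speed in m/s (placeholder value)
cmd.angular.z = 0.0  # yaw rate in rad/s

rate = rospy.Rate(10)
end = rospy.Time.now() + rospy.Duration(3.0)
while not rospy.is_shutdown() and rospy.Time.now() < end:
    pub.publish(cmd)
    rate.sleep()
pub.publish(Twist())  # zero velocity stops the robot
```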
Before we can autonomously drive around any world, we need to provide the robot with a map, which is used to localize (position) the robot relative to obstacles or features defined in the map. We will create a new map of the default world by doing the following:
- Launch the world in Gazebo
- Launch the mapping script
- Drive around and collect data from the robot's sensors until we have a (nearly) complete and accurate map.
After launching the Gazebo simulation and teleoperation, use the following command to create a map using the gmapping algorithm:

```bash
roslaunch slambot_simulation slambot_slam.launch
```
Now, with the terminal running teleop_twist_keyboard selected, drive the robot around using the `I`, `J`, `L`, `,` and `K` keys.
*Demo video: mapping.mp4*
Once your map has all of the features defined (black for walls or obstacles, white for no obstacles, and gray/transparent for unknown regions), we need to save the map.
In a new terminal, run:

```bash
rosrun map_server map_saver -f ~/map
```
This will create two files:
- map.pgm - the image containing the white, gray, and black regions.
- map.yaml - the configuration data for the map.pgm image.
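As a quick sanity check, the map metadata can be inspected programmatically. The sketch below reads map.yaml with PyYAML; the field names (`image`, `resolution`, `origin`) follow the standard map_server format, and the path assumes the map was saved to the home directory as above:

```python
#!/usr/bin/env python3
# Sketch: inspect the metadata that map_saver wrote to map.yaml.
import os
import yaml

with open(os.path.expanduser('~/map.yaml')) as f:
    meta = yaml.safe_load(f)

print('Image file:        ', meta['image'])
print('Resolution (m/px): ', meta['resolution'])
print('Origin (x, y, yaw):', meta['origin'])
```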
Now that the map of the world is prepared, we can drive the robot autonomously inside the world. To do this, we will:
- Launch the gazebo simulation
- Launch the navigation script
- Set an initial pose estimate to align the map relative to the current sensor data (i.e. perform an initial localization)
- Set target (goal) positions and orientations for the robot to drive to
- Have the robot navigate to the goal autonomously
To launch the Gazebo simulation, in a terminal run:

```bash
roslaunch slambot_simulation slambot_simulation.launch
```
Then in a new terminal, run:

```bash
roslaunch slambot_simulation slambot_navigation.launch
```
This will open an RViz window showing the map and the robot, together with the current sensor readings.
In RViz:
- Select 2D Pose Estimate. Click and drag an arrow that estimates where the robot is currently positioned relative to the map that we created.
- This will align the robot to the map according to the current sensor feed.
- Next, select 2D Nav Goal.
- Click and drag an arrow that represents the position and orientation of where the robot needs to drive to. In order for a path to be calculated, this arrow must be inside a white or gray region, signifying that there is no known obstacle at the goal location.
- The robot will compute a local path (a yellow arc) and a global path (the blue or red spline), and will autonomously drive to the target position.
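Goals can also be sent programmatically rather than through RViz. Assuming the navigation launch file starts a standard move_base action server (common in ROS 1 navigation setups, though not confirmed by this README), a minimal sketch with placeholder coordinates in the map frame looks like this:

```python
#!/usr/bin/env python3
# Sketch: send a navigation goal to move_base instead of using RViz.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_goal')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0   # placeholder goal position
goal.target_pose.pose.position.y = 0.5
goal.target_pose.pose.orientation.w = 1.0  # face along the +x axis

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo('Navigation finished with state %d', client.get_state())
```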
*Demo video: Naviation.mp4*
The robot model used in this project is a simple differential drive: two wheels, each mounted on its own motor, plus a roller castor for additional base support.
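For intuition, differential-drive kinematics reduce to converting a body velocity command (forward speed v and yaw rate ω) into individual wheel speeds. A minimal sketch, using placeholder wheel radius and track width rather than the project's measured dimensions:

```python
# Differential-drive kinematics sketch: convert a body velocity command
# (v, omega) into left/right wheel angular velocities.
WHEEL_RADIUS = 0.033  # metres (placeholder, not the project's dimension)
TRACK_WIDTH = 0.16    # distance between the wheels, metres (placeholder)

def wheel_speeds(v, omega):
    """Return (left, right) wheel angular velocities in rad/s for a
    forward speed v (m/s) and yaw rate omega (rad/s)."""
    v_left = v - omega * TRACK_WIDTH / 2.0   # linear speed of left wheel
    v_right = v + omega * TRACK_WIDTH / 2.0  # linear speed of right wheel
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Example: 0.2 m/s forward while turning at 0.5 rad/s.
print(wheel_speeds(0.2, 0.5))
```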
The robot consists of four layers:
- Bottom-most layer: the propulsion group, comprising the battery and motors.
- Second layer: the power distribution board along with the motor drivers.
- Third layer: a Raspberry Pi single-board computer along with an Arduino Mega microcontroller.
- Top-most layer: the LIDAR.
The above-mentioned plates are 3D-printed parts, assembled with the electronic hardware using screws and supports to ensure the stability of the structure.
The design of the bot was created in Fusion 360, a commercial CAD and CAM software package. It is exported directly as a URDF (Unified Robot Description Format) file, accompanied by .stl files of the model along with .launch and .yaml files for simulating it in Gazebo. URDF is an XML file format for specifying the geometry and organization of robots in ROS, and can be generated using URDF_Exporter.
The robot is simulated in Gazebo, a powerful open-source 3D robotics simulator that can accurately and efficiently generate synthetic sensor data and offers realistic environments with high-fidelity sensor streams. Within Gazebo, a physics engine defines realistic elements of the environment such as illumination, gravity, and inertia.
This repository is licensed under the BSD-3-Clause License.