
Introduction to Software for Non‐Software Members

This page of the Wiki is meant for non-software members who wish to learn more about how this agglomeration of files works. You are also welcome to read this if you are a software member. However, this explanation alone does not provide enough knowledge to contribute to solving software problems.

Docker

Docker is a tool that allows you to package an application with everything it needs to run—like its own mini-environment—into something called a container. This container can run consistently on any computer, regardless of the underlying system, so you don’t have to worry about compatibility issues. It’s a way to ensure that "it works on my machine" actually means it will work anywhere.

To create these containers, Docker uses images, which are like blueprints that define what the container should include—such as the operating system, application, and any necessary files or configurations. You can start with an existing image, run it as a container, make any changes you want (like installing software or tweaking settings), and then save those changes as a new image. This way, you can create and share customized environments that work the same way everywhere.

The benefits of using Docker include ensuring consistency across different environments, isolating applications to prevent interference, and making it easy to move applications between systems. Docker containers are also more resource-efficient compared to traditional virtual machines, allowing for faster startup times and better use of system resources.

The major downside of Docker is its steep learning curve, especially if you're new to concepts like images and containers.

ROS

ROS, or Robot Operating System, is like the nervous system and communication network of a robot. Just like how your body has nerves that carry messages from your brain to your muscles and back, ROS connects different parts of a robot—its sensors, actuators, and control algorithms—so they can all communicate and work together.

Imagine ROS as a well-organized factory where different departments (like sensors, motors, and controllers) need to talk to each other to get things done. Each department (called a node in ROS) has its own job, like gathering data, processing information, or making decisions. The factory uses a messaging system (ROS topics) to send and receive updates, so every department knows what’s happening and can respond accordingly.
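
To make the factory analogy concrete, here is a minimal sketch of a ROS node written in Python. The node and topic names below are made up for illustration and are not our actual interface:

#!/usr/bin/env python3
# Minimal ROS "department": a node that listens on one topic and
# publishes on another. Names here are illustrative only.
import rospy
from std_msgs.msg import Float64

def on_depth(msg):
    # React to incoming data (here: subtract a pretend sensor offset).
    corrected = msg.data - 0.5
    pub.publish(Float64(corrected))

rospy.init_node("depth_corrector")                 # register the department
pub = rospy.Publisher("/depth/corrected", Float64, queue_size=1)
rospy.Subscriber("/depth/raw", Float64, on_depth)  # listen for updates
rospy.spin()                                       # keep the node alive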

ROS also has a blueprint system (packages) where all the tools, libraries, and programs a robot might need are stored. If you want to teach a robot a new skill or give it new abilities, you can add or modify these packages.

Simulation

To test our code, we developed a simulation using Unity, a popular game development platform. Just like in video games, we can create scripts and design 3D objects in Unity to behave the way we want. This allows us to model a virtual environment where we can test our robotic systems without needing to interact with the real world.

Unity has a specific tool for robotics that enables us to connect our simulation directly to the code we’ve written in ROS (Robot Operating System). This connection lets Unity and ROS exchange data in real-time. Using C# scripts in Unity, we can simulate a real-life environment, including all the sensors and actuators our robot might have.

For example, suppose we want our robot to move to a specific position. In the simulation, the robot's current position is sent from Unity to our ROS code. ROS then calculates how much effort the thrusters need to apply to move the robot towards the target position. This command is sent back to Unity, which updates the robot's position in the virtual environment accordingly.
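
Here is a rough ROS-side sketch of that loop, reusing the publish/subscribe pattern from the ROS section. The topic names, message types, and the gain are assumptions for illustration, not our real interface:

#!/usr/bin/env python3
# Read the simulated pose from Unity, compute an effort towards a target,
# and publish it back for Unity to apply to the virtual thrusters.
import rospy
from geometry_msgs.msg import Pose, Wrench

TARGET_X = 2.0   # hypothetical target position along one axis
GAIN = 5.0       # hypothetical proportional gain

def on_pose(pose):
    effort = Wrench()
    effort.force.x = GAIN * (TARGET_X - pose.position.x)  # push towards the target
    pub.publish(effort)

rospy.init_node("sim_position_demo")
pub = rospy.Publisher("/effort", Wrench, queue_size=1)
rospy.Subscriber("/sim/pose", Pose, on_pose)
rospy.spin()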

This setup allows us to test whether our code behaves as expected in a controlled, simulated environment. However, it's important to note that our simulation isn’t perfectly realistic. Just because our robot behaves correctly in the simulation doesn’t guarantee it will work the same way in the real world. Still, it’s a valuable step in the development process, giving us confidence that our code is on the right track.

Pooltests

Pool tests are more complex because they involve more than just software. This section is crucial for clear and effective communication between subteams during pool test days. Mistakes happen, but when the software team is under pressure, hearing "Why is software taking so long?" or "Are we ready yet?" every few minutes doesn't help. Below, you'll find an explanation of what the software team is actually doing when things go wrong, how you can assist the software team, and how the software team can support you.

Before pooltests

Before bringing the robot to the pool, the software team has two key responsibilities: running a dry test and checking the sensors' status. The dry test ensures the thrusters are working correctly, while the sensor check confirms that all sensors are providing accurate readings. These should be the primary focus of the software team; if we are doing anything else, something has probably gone wrong. However, there are two exceptions: Docker and sensor drivers. These are complex and require patience from everyone. For example, a single misplaced character in a Docker image can result in a 20-minute delay. If this seems abstract, think of it like designing a complex circuit board or fabricating a critical new part: it takes time, and rushing it usually makes things worse.

How can electrical and mech help?

On pool test days, there's a natural order in which subteams work on the AUV. Mechanical is usually first, ensuring all parts are properly installed and leak-proof. Next, electrical organizes the cables inside the hull. Finally, software gets involved. The sequence isn’t an issue, but not understanding its limitations can be. If mechanical encounters a delay, there's usually enough buffer time. If electrical runs into issues, it's not ideal, but manageable. However, if software encounters a problem, we start losing valuable pool test time.

We fully appreciate the importance of pool tests and understand that it can be frustrating to stand by when issues arise. However, rushing the process does NOT help; it only adds stress and worsens the work environment. If a pool test doesn’t happen, it doesn’t happen. There’s no need to make a tough situation worse. If the software team is struggling for any reason, let’s discuss it later, when emotions aren’t running high.

During the pool test, there is one important way you can help: just be present. You can bring a book, a computer, or anything else to keep yourself from getting bored. We are not mech or electrical. If the robot breaks (electrically or mechanically) or there is a leak, we probably won't know how to fix it. We don't know all the safety procedures. When you come to the pool test, we know we can rely on you if something goes wrong. So please, just come.

How can software help?

If you’re part of the mechanical team: We can carry equipment, hold tools, or fetch items you need—basically, act as an extra pair of hands.

If you’re part of the electrical team: We can do the same, assisting with anything that doesn’t require specialized knowledge.

Summary

If you are part of this team, there are boring things you have to do. It's part of the job; it's part of any job. If you understand this, don't let the downtime rot your brain: turn it into something useful. Try to learn something about the other subteams' work. From personal experience, even a rough understanding of how other subteams operate can be incredibly beneficial for your career.

Commands for Pooltests

The following is the sequence of commands to run during a pooltest. Note that lines starting with '#' are comments, not commands:

# Connected to wifi, go to AUV folder.
git status
# If git status says that you are on the noetic branch, then skip the next command.
git checkout noetic
# Update the AUV repository.
git pull
# After plugging in the router, connect to the AUV network through your computer's Wi-Fi settings.
# Now you can connect to the jetson.
ssh [email protected]
# If it times out, then the router is not plugged in correctly or the jetson is not turned on.
# If it doesn't time out, it will ask for a password.
jetson
# Now you are connected to the jetson.
# To check if any docker containers are running
docker ps
# If there aren't any docker containers, run these two commands
cd ~/AUV-2024/Docker/jetson
docker compose up
# On a new terminal window, ssh into jetson again.
# This time, if you run docker ps on the new window, you will see a container.
# If the first docker ps returns a container, skip the previous steps and continue from here.
# The name of the container should be used in the following command.
docker exec -it [name of the container] bash
# To run the dry test
roslaunch propulsion drytest.launch
# To check the sensors' status
roslaunch sensors sensors.launch
# If there is a problem with one of the sensors, make sure that the jetson can see it by running:
ls /dev/[name_of_sensor]*
# If it can't find it, check /dev/tty* and see if it's there.
# If it is, then run the udev rules script in ~/AUV-2024/Docker/jetson.
./generate_udev_rules.sh
# If it isn't, then it's an electrical problem.

Code

Propulsion

The propulsion package first arms the thrusters, which means it makes them functional. It does this by sending high-low PWM (Pulse Width Modulation) signals, specified in microseconds, to each thruster. The PWM signals directly control the motors of the individual thrusters. The package also handles transforming efforts into PWM signals. The efforts are received in the form of a wrench, which consists of a set of forces AND torques on the X, Y, and Z axes. These are then mapped into the forces of each thruster using a transformation matrix. The transformation matrix is a 6x8 matrix (6 axes of movement and 8 thrusters). Each entry represents how much a given thruster contributes to the movement on a given axis.
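
As a rough sketch of that mapping (the matrix entries below are placeholders, not our real thruster geometry, and whether the code stores this matrix or its pseudo-inverse is an implementation detail):

# Map a 6-DOF wrench to 8 thruster forces with a 6x8 allocation matrix T.
# In practice, each resulting force is then converted to a PWM value in
# microseconds for its thruster.
import numpy as np

T = np.zeros((6, 8))          # rows: Fx, Fy, Fz, Tx, Ty, Tz; columns: thrusters 1-8
T[0, 0] = T[0, 1] = 1.0       # example: thrusters 1 and 2 both contribute to surge (Fx)
T[5, 0], T[5, 1] = 0.3, -0.3  # ...and produce opposite yaw torques (Tz)

wrench = np.array([10.0, 0.0, 0.0, 0.0, 0.0, 2.0])  # desired [Fx, Fy, Fz, Tx, Ty, Tz]

# T maps thruster forces to a wrench, so the pseudo-inverse maps a desired
# wrench back to one force per thruster.
thruster_forces = np.linalg.pinv(T) @ wrench
print(thruster_forces)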

Controls

The control system's primary task is to listen to desired setpoint (or target pose) commands from the planner and continually generate an effort (force/torque) that is most likely to guide the AUV towards the desired setpoint.

There are two strategies for controlling the AUV: sending constant efforts (e.g., surge with a 5 Newton effort) and using PIDs (Proportional-Integral-Derivative controllers).
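
To make the PID idea concrete, here is a minimal single-axis sketch; the gains, sign conventions, and numbers are made up for illustration and are not our tuned values:

# A tiny PID controller for one axis (e.g., depth). Given the current
# measurement and the desired setpoint, it returns an effort that the
# propulsion package would turn into thruster PWMs.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

depth_pid = PID(kp=20.0, ki=0.5, kd=5.0)   # hypothetical gains
effort = depth_pid.update(setpoint=-2.0, measurement=-1.2, dt=0.1)
print(effort)   # negative here, meaning "push deeper" under these sign conventions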

If you hear the word "quaternions", know that it's related to rotation. In most scenarios, Euler angles are the easiest option to work with. However, when it comes to 3D rotations, Euler angles present significant limitations whereas quaternions do not. If you are interested in learning more about it, we have a wiki section explaining it.

Sensors

The sensors package is responsible for launching all the sensors' drivers, establishing the communication between our code and the physical sensors, and checking if the sensors send valid data.
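
As an illustrative sketch of what "checking if the sensors send valid data" can look like (the topic name and message type below are assumptions for this example, not necessarily our real interface):

#!/usr/bin/env python3
# Wait a few seconds for one message from a sensor topic; if nothing
# arrives, the driver or the physical connection is likely the problem.
import rospy
from sensor_msgs.msg import Imu

rospy.init_node("sensor_check")
try:
    rospy.wait_for_message("/sensors/imu", Imu, timeout=5.0)
    rospy.loginfo("IMU OK: data is being published")
except rospy.ROSException:
    rospy.logwarn("IMU not publishing after 5 seconds, check the driver and the connection")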

EKF

The Extended Kalman Filter (EKF) is the foundation of the software. Without it, no other package could do much of anything. The EKF takes input from any sensor configuration and, based on hyper-parameters and the system dynamics, estimates where the AUV is. The EKF needs a lot of pool testing to get right (the simulation does not count). Not testing the EKF prior to RoboSub 2024 was the most critical software issue behind our poor performance.

Imagine you have a robot that is trying to figure out where it is in a room. The robot can see some things around the room, like tables and chairs, but it doesn't know exactly where it is.

The EKF is like a special way the robot can use the things it sees to figure out where it is. It works by taking a guess at where the robot might be, and then checking if that guess matches what the robot sees. If the guess is a little off, the EKF can tweak the guess to make it better.

The robot keeps doing this over and over, adjusting its guess each time, until it gets a really good idea of where it is in the room. The EKF helps the robot stay on track and figure out its location, even if the room is a little tricky or the robot's sensors aren't perfect.
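
To show the guess-check-tweak loop in code, here is a heavily simplified one-dimensional, linear sketch of the idea. A real EKF also linearizes nonlinear dynamics at each step, and the noise values below are placeholders, not our tuned hyper-parameters:

x, P = 0.0, 1.0    # state estimate (position) and how uncertain we are about it
Q, R = 0.01, 0.25  # process noise and measurement noise (the hyper-parameters)

def step(x, P, velocity, dt, measurement):
    # Predict: move the guess forward using the system dynamics.
    x = x + velocity * dt
    P = P + Q                      # predicting makes us less certain
    # Update: compare the guess with what the sensor actually saw.
    K = P / (P + R)                # Kalman gain: how much to trust the sensor
    x = x + K * (measurement - x)  # tweak the guess towards the measurement
    P = (1 - K) * P                # and become more certain again
    return x, P

x, P = step(x, P, velocity=0.5, dt=0.1, measurement=0.07)
print(x, P)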

Planner

The Planner is responsible for high-level decision making/mission planning for the AUV. It uses simplified interfaces with the Vision and State Estimation packages to decide what the AUV should do next, and an interface with the Controls package to execute changes in position/orientation accordingly. Missions are built as state machines, i.e. a series of states, where each state represents a small task (for example searching for a specific object on the floor of the pool, going through a gate, etc.).

[Planner diagram]
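
As a toy illustration of a mission built as a state machine (the states, outcomes, and transitions below are made up; the real planner's states talk to Vision, State Estimation, and Controls):

# Each state is a small task that reports an outcome, and the outcome
# decides which state runs next. Real states would command the AUV.
def search_for_gate():
    print("searching for the gate...")
    return "found"      # pretend Vision spotted it

def go_through_gate():
    print("going through the gate...")
    return "done"

STATES = {"SEARCH_GATE": search_for_gate, "GO_THROUGH_GATE": go_through_gate}
TRANSITIONS = {
    ("SEARCH_GATE", "found"): "GO_THROUGH_GATE",
    ("GO_THROUGH_GATE", "done"): "FINISHED",
}

state = "SEARCH_GATE"
while state != "FINISHED":
    outcome = STATES[state]()              # run the current task
    state = TRANSITIONS[(state, outcome)]  # pick the next state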

Vision

The vision package is responsible for everything that has to do with computer vision on the AUV. This includes gathering data (images), training object detection models, streaming the cameras' feeds to Python, performing object detection on the camera feeds, estimating the position (and orientation or other features) of detected objects, and building a 3D map of where the detected objects are in relation to the AUV's pose.
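
As a hedged sketch of the first steps of that pipeline (the topic name and the detect() stub are placeholders; the real package runs our trained object detection models):

#!/usr/bin/env python3
# Receive a camera frame from ROS, convert it to an OpenCV image, and hand
# it to a detector.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def detect(frame):
    # Placeholder for the object detection model; would return bounding boxes.
    return []

def on_image(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    rospy.loginfo("detected %d objects", len(detect(frame)))

rospy.init_node("vision_demo")
rospy.Subscriber("/camera/image_raw", Image, on_image)
rospy.spin()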