Autonomous Rover

An open source project to recreate a miniature NASA Mars Rover
Funded by NASA PSGC

GitHub

The goal is to navigate to a destination successfully.

Using only a microcontroller...

And so with a single point, the Rover goes...

Anything that prevents the Rover from reaching its destination is an obstacle. The Rover must overcome these obstacles.

Destination Visual

A destination is given to the Rover, and then it calculates a straight line towards that destination.


Straight Line

It will continue to follow this line until an obstacle is detected, then turn in an arbitrary direction.

BNO055

BNO055: Accelerometer, Magnetometer, and Gyroscope.

An IMU is used to follow the straight line. It calculates the Rover's direction, movement, and orientation to get to the destination.


GPS

The GPS keeps track of where the Rover is at all times.

However, a bit more is needed ...

A straight line is not sufficient information for the Rover to reach its destination...


The Rover needs to answer the following questions to reach a destination:

  • Which direction?

  • When to turn?

  • How far away?

  • Is it even possible?

  • When to recalculate?

  • Where is it...?

The Rover needs to know when and where to turn. Much like a car would need to know what road to take.

These instructions are achieved by breaking the path into waypoints.

Each waypoint tells the Rover when and where to turn in order to reach its destination.


These points are calculated automatically, or set manually.
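
As a sketch of how such a waypoint list might be represented in code, here is a minimal C++ example; the Waypoint struct, its fields, and the coordinates are illustrative assumptions, not the project's actual code.

    #include <cstdio>
    #include <vector>

    // Hypothetical waypoint: a GPS coordinate the rover should pass through.
    struct Waypoint {
        double lat;  // latitude in degrees
        double lon;  // longitude in degrees
    };

    int main() {
        // A path broken into waypoints, set manually here;
        // they could equally be generated automatically.
        std::vector<Waypoint> path = {
            {33.4255, -111.9400},
            {33.4258, -111.9395},
            {33.4262, -111.9390},  // final destination
        };

        // The rover drives towards one waypoint at a time.
        for (const Waypoint& wp : path) {
            std::printf("Next waypoint: %.4f, %.4f\n", wp.lat, wp.lon);
        }
        return 0;
    }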

Once the Rover has calculated its path, it will use sensors to detect objects.

Ultrasound

Range: 190 cm, Field of View: 30 degrees, Method: ultrasonic sound

An Ultrasound is used to detect objects. Ultrasounds have an emitter and a receiver. The sound emitted bounces off of a nearby object. The receiver detects this sound. Distance is found by measuring the time it takes for the receiver to receive the emitted sound.
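
For a rough sense of the arithmetic, assuming sound travels at about 343 m/s, the distance is half the round-trip echo time multiplied by the speed of sound:

    d = \frac{v_{\text{sound}} \cdot t_{\text{echo}}}{2} \approx \frac{343\ \text{m/s} \cdot t_{\text{echo}}}{2}

For example, an echo of about 11 ms corresponds to roughly 1.9 m, which matches the sensor's quoted 190 cm range.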

Upon detecting an object, the Rover will turn away from the object, and return to the generated path.

It uses the IMU to calculate exactly how many meters to back up and how far to turn in place.

To future teams.


    Three core properties of autonomous navigation

  • Calculating a Path

    What directions to take and when?
  • Following the Path

    Using GPS and IMU to follow the calculated path.
  • Avoid Obstacles on Path

    Use sensors such as a camera to avoid obstacles on the calculated path.

Going to a destination successfully implies these core properties.

Process:

  • Mechanics

    Control various sensors
  • Tracking

    Path finding.
  • Obstacle Avoidance

    When, Where, What ...
  • Mapping

    Where is everyone?
  • Localization

    Where am I?
  • Simultaneously

    The brain...all must happen together.

Mechanics, Sensors, and Tracking.

The problem: we must find a method by which the rover can go from point A to point B.

One such solution is using a bearing tracking algorithm.

Tracking Algorithm

We find the bearing and check whether the bearing equals the heading. If it does, continue forwards; otherwise, spin.

If an obstacle is encountered, avoid it and then return to the tracking algorithm.
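
A minimal C++ sketch of this bearing-tracking loop is shown below; the sensor-reading functions and the tolerance value are placeholders for illustration, not the rover's actual interface.

    #include <cmath>
    #include <cstdio>

    // Placeholder sensor reads -- on the rover these would come
    // from the GPS/IMU and the ultrasound driver.
    double readHeadingDegrees() { return 90.0; }  // from the IMU
    double readBearingDegrees() { return 75.0; }  // towards the next waypoint
    bool   obstacleDetected()   { return false; } // from the ultrasound

    // Smallest signed angle from heading to bearing, in (-180, 180].
    double headingError(double bearing, double heading) {
        return std::fmod(bearing - heading + 540.0, 360.0) - 180.0;
    }

    void trackStep() {
        const double toleranceDeg = 5.0;  // assumed tolerance
        if (obstacleDetected()) {
            std::puts("avoid the obstacle, then resume tracking");
            return;
        }
        double err = headingError(readBearingDegrees(), readHeadingDegrees());
        if (std::fabs(err) <= toleranceDeg) {
            std::puts("bearing matches heading: drive forwards");
        } else {
            std::puts(err > 0 ? "spin right" : "spin left");
        }
    }

    int main() {
        trackStep();
        return 0;
    }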


Benefits:

  1. Simple
  2. Fast
  3. Memory efficient
  4. Dynamic (can withstand different environments)
  5. Effective

Tracking Algorithm Visualized

Head straight to our destination!

  • Obstacle: Turn, go forwards a bit...
  • Path: Continue following path...
  • Obstacle: Turn, go forwards a bit...

State Machine

What is described clearly requires some means of storing this behavior. One such means is a state machine.

With a state machine we can specify the exact moments at which the rover should engage certain behaviors.
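
Below is a small C++ sketch of such a state machine; the state names mirror the behaviors described here, while the transition checks are simplified stand-ins for the real sensor logic.

    #include <cstdio>

    // States corresponding to the rover behaviors described above.
    enum class State { Track, Avoid, Backup };

    // Simplified stand-ins for the real sensor checks.
    bool obstacleAhead()    { return false; }
    bool obstacleTooClose() { return false; }

    State step(State current) {
        switch (current) {
            case State::Track:
                if (obstacleTooClose()) return State::Backup;
                if (obstacleAhead())    return State::Avoid;
                std::puts("Track: follow the bearing towards the path");
                return State::Track;
            case State::Avoid:
                std::puts("Avoid: spin away, go forwards a bit");
                return State::Track;  // rejoin the tracking behavior
            case State::Backup:
                std::puts("Backup: reverse until clear");
                return State::Avoid;
        }
        return State::Track;
    }

    int main() {
        State s = State::Track;
        for (int i = 0; i < 3; ++i) s = step(s);
        return 0;
    }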


State Machine Visualized

  • Track: Go towards path.
  • Object detected: Spin... go forwards.
  • Backwards/Forwards: Back up if too close.

More in-depth overview

Paths

Obstacle Avoidance

In order to avoid an obstacle it must be detected. There are various sensors used for object detection.

Let's describe a few of them:


Sensors

  1. Ultrasound
  2. LIDAR or RADAR
  3. IR
  4. Camera

We have chosen to focus on LIDAR and Ultrasound, while the others are left for further development.

Ultrasound

Range: 190 cm, Field of View: 30 degrees, Method: ultrasonic sound

Ultrasounds have an emitter and a receiver. The sound emitted bounces off of a nearby object. The receiver detects this sound. Distance is found by measuring the time it takes for the receiver to receive the emitted sound.
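
A minimal C++ sketch of this time-of-flight calculation, assuming a speed of sound of about 343 m/s; the echo time is a hard-coded example rather than a real sensor reading.

    #include <cstdio>

    // Convert an echo round-trip time (seconds) into a distance (cm).
    // The sound travels to the object and back, so divide by two.
    double echoToDistanceCm(double echoSeconds) {
        const double speedOfSound = 343.0;                 // m/s, assumed
        return (speedOfSound * echoSeconds / 2.0) * 100.0; // meters -> cm
    }

    int main() {
        double echo = 0.0058;  // example: a 5.8 ms round trip
        std::printf("distance ~= %.1f cm\n", echoToDistanceCm(echo));
        return 0;
    }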

Ultrasound Field of View

Because an ultrasound sensor only returns a scalar distance, the exact position of the detected object cannot be found. Instead, the further the range, the less exact the angle.


LIDAR

Range: 4-5 meters, Returns: angle and distance, Method: light

LIDAR emits laser pulses many times per second while rotating at a fixed rate, allowing for 360-degree detection.

Map

LIDAR maps the environment around it. Therefore, creating an internal map for the rover is critical.

The map we created uses a node-based system. These nodes surround the rover, and intersections with the LIDAR rays are detected.

Simple Map

A very simple map is having various nodes surrounding the rover.

These nodes are spheres; when a laser ray intersects a node, that node is marked untraversable.
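
A toy C++ sketch of this idea: a grid of nodes around the rover, with a LIDAR return (angle, distance) converted to coordinates and the nearest node marked untraversable. The grid size and spacing are illustrative assumptions.

    #include <cmath>
    #include <cstdio>

    const int    GRID    = 11;   // 11x11 nodes centred on the rover (assumed)
    const double SPACING = 0.5;  // meters between nodes (assumed)
    const double PI      = 3.141592653589793;

    // true = traversable, false = blocked
    bool nodes[GRID][GRID];

    // Mark the node nearest to a LIDAR hit as untraversable.
    // angleDeg: beam angle relative to the rover, distM: measured range in meters.
    void markHit(double angleDeg, double distM) {
        double rad = angleDeg * PI / 180.0;
        double x = distM * std::cos(rad);  // rover-relative coordinates
        double y = distM * std::sin(rad);
        int col = (int)std::lround(x / SPACING) + GRID / 2;
        int row = (int)std::lround(y / SPACING) + GRID / 2;
        if (col >= 0 && col < GRID && row >= 0 && row < GRID)
            nodes[row][col] = false;
    }

    int main() {
        for (int r = 0; r < GRID; ++r)
            for (int c = 0; c < GRID; ++c)
                nodes[r][c] = true;  // everything starts traversable

        markHit(30.0, 2.0);          // example LIDAR return
        std::printf("node blocked: %s\n",
                    nodes[GRID / 2 + 2][GRID / 2 + 3] ? "no" : "yes");
        return 0;
    }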


Localization

Localization is knowing where the rover is at all times.

This consists of two elements: Orientation and Position

Orientation describes the rover's rotation: its tilt and heading angles.

This is useful for finding the heading and the incline. Let's describe a few devices.

    Orientation and Rotations

  • Gyroscope

    Measures angular velocity.
  • Magnetometer

    Measures magnetic field.
  • Accelerometer

    Measures acceleration.

Defining Rotations in 3D.

In order to rotate in 3D we must first define a representation to handle it. We will use vectors. For 3D rotation, three angles are used: yaw, pitch, and roll. We can calculate these by describing the rotation about each axis as a matrix.
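
For reference, these are the standard per-axis rotation matrices, using roll phi about x, pitch theta about y, and yaw psi about z (a common, but not universal, convention):

    R_x(\phi) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi & \cos\phi \end{pmatrix}, \quad
    R_y(\theta) = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix}, \quad
    R_z(\psi) = \begin{pmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{pmatrix}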


Full matrix.

By multiplying all three axes of rotation a final matrix is produced.

This matrix, known as the Euler rotation matrix, allows yaw, pitch, and roll to be calculated. It works similarly to a gimbal.
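
Multiplying the three per-axis matrices in one common order (yaw, then pitch, then roll; the order itself is a convention) gives:

    R = R_z(\psi) R_y(\theta) R_x(\phi) =
    \begin{pmatrix}
    \cos\psi\cos\theta & \cos\psi\sin\theta\sin\phi - \sin\psi\cos\phi & \cos\psi\sin\theta\cos\phi + \sin\psi\sin\phi \\
    \sin\psi\cos\theta & \sin\psi\sin\theta\sin\phi + \cos\psi\cos\phi & \sin\psi\sin\theta\cos\phi - \cos\psi\sin\phi \\
    -\sin\theta & \cos\theta\sin\phi & \cos\theta\cos\phi
    \end{pmatrix}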

Euler Rotation

Gyroscopes

Gyroscopes measure angular velocity, and can do so about all three axes. A mechanical gyroscope uses a spinning gimbal to lock each axis in place. When the axes are displaced, this displacement is measured as a force, which can be described mathematically via the cross product.
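
One way to write the relationship the cross product captures: a spinning rotor with angular momentum L that is forced to turn at angular velocity Omega experiences a torque

    \tau = \frac{d\mathbf{L}}{dt} = \boldsymbol{\Omega} \times \mathbf{L}

which is the force-like quantity the gyroscope senses when its axes are displaced.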


Gimbal Lock

Sadly, there is an issue with this method. When two axes align, they become locked in place.

This is a property of the Euler rotation matrix, which means the rotation matrix alone is not sufficient for accurate rotations.

Euler Rotation

Rotors

We can instead rotate via outer products. This method uses the bivector, which represents the plane produced by taking the outer product. This is similar to representing rotations via the cross product, as in the gyroscope.

Bivectors contain components just like a vector, but in terms of a plane rather than a line. Through reflections, a vector can be split into a component within the plane and a component outside the plane. These two components can be used to calculate the rotation about the three axes. This method is known as a Rotor.
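
In geometric-algebra notation (sign conventions vary), a rotor for a rotation by angle theta in the plane of a unit bivector B, and its action on a vector v, can be written as:

    R = \cos\tfrac{\theta}{2} - B\sin\tfrac{\theta}{2}, \qquad v' = R\, v\, \tilde{R}

where \tilde{R} denotes the reverse of R.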

Quaternions

When extended into 3D, the most commonly used Rotor is the Quaternion. Quaternions are a special case of a Rotor, where i, j, and k represent the rotation. Quaternions are special because they use an extension of complex numbers to represent rotations.
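
Concretely, a unit quaternion for a rotation by angle theta about a unit axis (u_x, u_y, u_z), and its action on a vector v treated as a pure quaternion, is:

    q = \cos\tfrac{\theta}{2} + (u_x i + u_y j + u_z k)\sin\tfrac{\theta}{2}, \qquad v' = q\, v\, q^{-1}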


More in-depth overview of Rotors

Rotors


Magnetometer

Even with a gyroscope and quaternions, the angular position still cannot be calculated accurately enough.

Gyroscopes measure angular velocity. By taking the integral you can derive the angular position, but this comes with a fair degree of accumulated error. A magnetometer gives the angular position directly, based on absolute world coordinates (magnetic north) instead of relative coordinates.
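
In discrete form the integration step looks like the following; any constant bias epsilon in the gyroscope reading accumulates linearly in the estimated angle, which is why the drift grows over time:

    \theta_{k+1} = \theta_k + \omega_k\,\Delta t, \qquad \theta_{\text{error}}(t) \approx \varepsilon\, t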


Errors

However, because the magnetometer measures a weak signal, Earth's magnetic field, it has a fair degree of error as well.

Many devices can produce a local magnetic field and cause the magnetometer to read falsely. In standard testing, this is usually an error of 1-7 degrees.

Accelerometer

Accelerometers seem like the perfect device. By measuring the force from acceleration, they can calculate not just orientation but also movement.

Movement can be derived by double integrating the accelerometer readings. In order for this to be achieved, gravity must first be removed.

Gravity Component

Accelerometers register an acceleration of 9.8 m/s² due to gravity. However, as the object rotates, this force is distributed across all three axes. In order to remove it, we must subtract it, or find its inverse. This can be done by multiplying with our rotation matrix and solving.
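
One common way to write this (frame and sign conventions vary): with R the rotation matrix taking world coordinates into the sensor frame, the linear acceleration is the measurement minus the rotated gravity vector:

    \mathbf{a}_{\text{linear}} = \mathbf{a}_{\text{measured}} - R\,\mathbf{g}, \qquad \mathbf{g} = (0,\ 0,\ 9.8\ \text{m/s}^2)^{T}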


Yaw

However, the yaw is still missing. By definition, an accelerometer cannot properly calculate the yaw angle. We must combine all three devices in order to get an accurate orientation...

Sensor Fusion, The Kalman Filter

Since this is a computer science project, we can use a technique commonly found in machine learning and statistics: mathematical filters. One such filter is the Kalman Filter. We use the data from all three sensors and try to predict the next value based on the previous values. A brief view of the Kalman Filter is as follows.
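
For reference, the standard discrete Kalman filter equations (omitting the control-input term), with state estimate x̂, covariance P, state model F, measurement model H, measurement z, and noise covariances Q and R:

    \text{Predict:}\quad \hat{x}_{k|k-1} = F\,\hat{x}_{k-1|k-1}, \qquad P_{k|k-1} = F\,P_{k-1|k-1}F^{T} + Q

    \text{Update:}\quad K_k = P_{k|k-1}H^{T}\big(H P_{k|k-1} H^{T} + R\big)^{-1}, \qquad
    \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k\big(z_k - H\,\hat{x}_{k|k-1}\big), \qquad
    P_{k|k} = (I - K_k H)\,P_{k|k-1}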


    Our Rotation Uses

  • Gyroscope

    Measures angular velocity, but accumulates error.
  • Magnetometer

    Measures the magnetic field, but has an error of 1-7 degrees.
  • Accelerometer

    Measures acceleration, but cannot measure Yaw.

The Kalman Filter solves these errors by combining the sensors. Each correcting the other.

More in-depth overview of Kalman Filter

Kalman

GPS

Accuracy: 3-10 meters, Update rate: once per second, Use case: global coordinates

Finding position is the second challenge. GPS is an excellent device for tracking position over a wide area. This does not mean GPS is perfect: GPS gives only an estimated position.

Calculating Bearing

Globe

The bearing is the angle at which the rover needs to turn in order to reach a certain destination. This can be represented as a vector. With an angle and a distance, the rover can travel along that imaginary line.

However, the Earth is a sphere, so this requires a method to convert the coordinates into a distance and an angle. For this we use spherical coordinates.

Spherical Coordinates and Bearing.

Spherical coordinates are a 3D coordinate system which uses a radial distance and two angles (theta, phi) to produce a vector. The bearing can then be calculated by the same principles, using spherical trigonometry.
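
For two GPS fixes (phi_1, lambda_1) and (phi_2, lambda_2), latitude and longitude in radians, the standard initial-bearing and great-circle distance formulas are:

    \theta = \operatorname{atan2}\big(\sin\Delta\lambda\cos\varphi_2,\ \cos\varphi_1\sin\varphi_2 - \sin\varphi_1\cos\varphi_2\cos\Delta\lambda\big)

    d = 2R_{\text{Earth}}\arcsin\sqrt{\sin^2\tfrac{\Delta\varphi}{2} + \cos\varphi_1\cos\varphi_2\sin^2\tfrac{\Delta\lambda}{2}}

with \Delta\lambda = \lambda_2 - \lambda_1, \Delta\varphi = \varphi_2 - \varphi_1, and R_Earth approximately 6371 km.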

Spherical Formula

More in-depth overview of Spherical Coordinates

Spherical Coordinates