5.1 Sonar Sensors
SONAR, an acronym for Sound Navigation And Ranging, models the contours of an environment based on how it catches and throws back sound waves. The sender generates a sonic, or sound, wave that travels outwards in an expanding cone, and listens for an echo. The characteristics of that echo can help the listener locate objects. This is how bats nail mosquitoes with such exemplary and satisfying efficiency, and how Seawolf submarines will nail their prey, should they ever be called upon to do so. The sonars ringing your robot, while lamentably not quite as efficient as a bat's, still provide a useful map of its surroundings, as long as you recognize and respect their inherent limitations.
Your robot is equipped with an array of sonar transducers: the gold-colored circular disks running around the perimeter of the robot's upper enclosure. On the B21 robot, the 24 sonars are about 74 centimeters above the floor; the B14's 16 sonars are about 50 cm above the floor. For purposes of identification by your application program, the sonars are numbered; sonar zero is at the left rear of the robot, and numbers ascend clockwise around the perimeter.
The robot "reads" its sonars about three times per second. For each reading, the total time between the generation of the ping and the receipt of the echo, coupled with the speed of sound in the robots environment, generates an estimate of the distance to the object that bounced back the echo.
Figure 5-2 Sonar Numbering on the B21 and B14 Robots
Figure 5-3 - Typical Sonar Beam Pattern at 50 Khz. Notice the main cone and the side cones.
As the robot's sonars fire off pings and receive echoes, they continuously update a data structure mapped to the actual collection of transducers: an array of sonarType called sonars, of size B_MAX_SENSOR_COLS. Since the sonars cannot all fire at once (they might be confused by pings simultaneously fired off by their immediate neighbors!), they fire in a pre-set pattern, usually four or six sonars at a time, with the full set of patterns fired about three times per second. As each subset of sonars returns its readings, the corresponding data structures are filled in. Each element in the array sonars, of type sonarType, looks like this:
value: An integer representing the distance, in millimeters, of the last echo received by the corresponding sonar. value is the distance of the "flight" of the ping, calculated as if every ping originated at the center axis of the robot.
mostRecent: An integer, either TRUE or FALSE. If TRUE, the data structure contains a value harvested from the most recent round of pings.
time: A timeval struct indicating the time the ping was received.
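The three fields above suggest a struct along the following lines. This is a hypothetical sketch for orientation only; the authoritative definition of sonarType lives in the BeeSoft headers.

```c
#include <sys/time.h>

/* Hypothetical layout of one sonarType element, matching the fields
 * described above; the real definition is in the BeeSoft headers. */
typedef struct {
    int value;            /* echo distance in mm, from the robot's center axis */
    int mostRecent;       /* TRUE if filled by the most recent round of pings */
    struct timeval time;  /* when the echo was received */
} sonarType;

/* Illustrative helper: is this reading fresh and a real echo? */
int isFreshReading(const sonarType *s) {
    return s->mostRecent && s->value > 0;
}
```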
Each sonar detects obstacles in a cone-shaped range that starts out, close in to the robot, with a half-angle of about 15 degrees, and spreads outwards. An obstacle's surface characteristics (smooth or textured, for example), as well as the angle at which an obstacle is placed relative to the robot, significantly affect how, and even whether, an obstacle will be detected. Wise robot programmers never assume sonar data is infallible; they always look at multiple readings and do appropriate crosschecking. The sonars can be "fooled" for any of the following reasons:
1. The sonar has no way of knowing exactly where, in its fifteen-degree and wider cone of attention, an obstacle actually is.
2. The sonar has no way of knowing the relative angle of an obstacle. Obstacles at steep angles might bounce their echoes off in a completely different direction, leaving the sonar ignorant of their existence, as it never receives an echo.
3. The sonar can be badly fooled if its ping bounces off an obliquely-angled object onto another object in the environment, which then, in turn, returns an echo to the sonar. This effect, called specular reflection, causes the sonars to overestimate the distance between the robot and the nearest obstacle.
4. Extremely smooth walls presented at steep angles, and glass walls, can seriously mislead the sonars.
On the positive side, your robot has many sonars, fairly closely spaced, providing some redundancy and enabling crosschecking. Also, sonars almost never underestimate the distance to an obstacle. Therefore, it's a prudent practice to examine the distances returned by a group of sonars and use only the lowest values.
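The lowest-value practice can be sketched as a small helper. The array of readings here is illustrative mock data, not the BeeSoft sonars global; in a real program you would pass in the value fields of a group of adjacent sonars.

```c
#include <limits.h>

/* Conservative range estimate: take the smallest reading from a group
 * of adjacent sonars, since a sonar almost never underestimates the
 * distance to an obstacle. */
int minGroupRange(const int readings[], int count) {
    int best = INT_MAX;
    for (int i = 0; i < count; i++)
        if (readings[i] < best)
            best = readings[i];
    return best;
}
```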
Or, you might record multiple readings as the robot moves about, and use the data from each to build up an occupancy grid. If several readings, from several angles and several sonars, keep detecting an obstacle in more or less the same place, it's a safe bet to mark that spot as "occupied."
Figures 5-4 and 5-5 illustrate the sonar's cone of attention and how its limitations can lead to false readings.
Figure 5-4 - How the Sonar Can be Fooled, Part I
When the beam strikes a surface with a large angle of incidence, the edge of the wavefront is reflected back to the sensor instead of the centerline. This effect, called radial error, often results in errors greater than one foot.
In addition, because of the sonar's relatively large beam (its angle is about 15 degrees), it tends to produce a rather blurred image of its surroundings. This can lead to angular error, which has deleterious effects on the robot's impressions of its surroundings similar to those caused by radial error.
Figure 5-5 - How the Sonar Can be Fooled, Part II
After striking a surface at a large angle of incidence, the echo may bounce back into oblivion rather than reflecting a strong echo back to the sonar receiver. This type of false reflection occurs when the incidence angle of the beam is greater than a critical angle, denoted above as X, which defines the cone of reflection (CR) for the surface.
A sonar beam striking a wall from outside the CR will be reflected away from the sensor, producing an unrealistically long sonar ray. The sonar beam apparently penetrates the wall!
Every surface material has its own CR half-angle, which may range from 7 or 8 degrees (for glass) to nearly 90 degrees (for rough surfaces).
The Sonar Application Program Interface (API)
To access sonar readings from your application program, you'll basically:
1. Use sonarInit() to initialize the sonar module;
2. Start the BeeSoft Scheduler and use a polled callback to occasionally examine data in sonars. This is simple, but differences in scheduling rates may mean you see the same sonar readings twice, or that a set of readings is overwritten before your program has a chance to look at it. So, you might want to use the next method.
3. To see each new set of readings exactly once, use registerSonarCallback to register a module-specific callback function with the BeeSoft sonar module. When a new pattern reading (of four or six sonars) is available, BeeSoft will put new data into the sonars array, set the mostRecent flags to TRUE, and call your user callback function so your program can take a look.
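The callback route in step 3 can be sketched as follows. B_MAX_SENSOR_COLS, sonarType, the registration function, and the dispatcher are all mocked here so the sketch stands alone; in a real program the BeeSoft headers and libraries supply them, and registerSonarCallback's actual signature may differ.

```c
#include <stdio.h>

#define B_MAX_SENSOR_COLS 24                 /* mocked; B21 value */

typedef struct { int value; int mostRecent; } sonarType;  /* simplified mock */

static sonarType sonars[B_MAX_SENSOR_COLS];
static void (*userCallback)(sonarType *, int) = 0;

/* Stand-in for the BeeSoft registration call. */
void registerSonarCallback(void (*cb)(sonarType *, int)) {
    userCallback = cb;
}

/* Roughly what the sonar module does when a pattern of echoes arrives:
 * store the new values, flag them as fresh, then invoke the user. */
void deliverPattern(const int *ids, const int *ranges, int n) {
    for (int i = 0; i < n; i++) {
        sonars[ids[i]].value = ranges[i];
        sonars[ids[i]].mostRecent = 1;
    }
    if (userCallback)
        userCallback(sonars, B_MAX_SENSOR_COLS);
}

/* A user callback that reports only fresh readings. Clearing the flag
 * here is this sketch's way of seeing each reading exactly once. */
void onSonar(sonarType *s, int n) {
    for (int i = 0; i < n; i++) {
        if (s[i].mostRecent) {
            printf("sonar %d: %d mm\n", i, s[i].value);
            s[i].mostRecent = 0;
        }
    }
}
```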
Here are the data structures and calls you need to know about to use your robot's sonars.
Sonar API Data Structures
Note: All sonar values are based on distances calculated from the center axis of the robot.
sonarType sonars[B_MAX_SENSOR_COLS] -- A global array containing the current sonar values.
bRobot.sonar_cols -- The number of sonar structures in the array, corresponding to the number of physical sonar transducers on the robot.
NO_RETURN -- The value of a sonar structure whose last ping did not return an echo, possibly because the sonar was fired into an open space larger than the sonar's effective range, or because a large amount of specular reflection ping-ponging occurred.
Sonar API Function Calls
registerSonarCallback -- Specifies which function BeeSoft should call when it has a fresh set of sonar readings available. The callback will be invoked with an updated array of sonarType, of length B_MAX_SENSOR_COLS.
sonarInit() -- Initializes the sonar system. Call this before calling any other sonar function.
void sonarStop(void); -- Halts firing of the sonars.
void sonarStart(void); -- Resumes firing of the sonars, if you've halted them.
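Putting the calls above together, a typical session follows this order. The BeeSoft functions are stubbed here with a simple state flag so the sketch stands alone; on the robot they come from the BeeSoft libraries.

```c
/* Stubs standing in for the real BeeSoft sonar calls. */
static int sonarsFiring = 0;

void sonarInit(void)  { /* set up the sonar module (stub) */ }
void sonarStart(void) { sonarsFiring = 1; }  /* begin the firing pattern */
void sonarStop(void)  { sonarsFiring = 0; }  /* quiet the sonars */

void typicalSession(void) {
    sonarInit();    /* always call this first */
    sonarStart();   /* sonars now ping in their pre-set pattern */
    /* ... run the scheduler, examine readings in callbacks ... */
    sonarStop();    /* e.g. before shutdown, or to silence the pings */
}
```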