Obstacle detection#

Introduction#

The purpose of the Obstacle detection example application is to detect objects and estimate their distance and angle from a moving platform, such as a robot. The algorithm is built on top of the Sparse IQ service and has various configuration parameters available to tailor the application to specific use cases.

The application can be seen as a combination of the Distance Detector and Speed Detector to measure both the distance to and speed of an object simultaneously.

The application utilizes the following key concepts:

1. Distance filtering: A matched filter is applied along the distance dimension to improve the signal quality and suppress noise. For a more in-depth discussion of the distance filter, see the Distance Detector documentation.

2. DFT in the sweep dimension: The velocity component at every distance point is computed by performing a Discrete Fourier Transform (DFT) in the sweep dimension of the data frame. For a more in-depth discussion on the topic of speed estimation by DFT in the sweep dimension, see the Speed Detector documentation.

3. Comparing the frame to a threshold: Peaks in the distance-velocity frames, corresponding to objects in front of the radar, are found and compared to a threshold.

4. Kalman filtering: Kalman filters are used to track the objects over time based on both position and velocity.

5. Velocity to angle conversion: The A121 radar has only a single channel and therefore cannot supply any angular information about an object on its own. However, if the robot, and therefore also the radar, is moving at a known speed, angular information can be extracted. For example, if the radar is moving at 20 cm/s and sees an object approaching at 20 cm/s, the object is likely straight in front of the radar. If the object instead appears to stand still (zero measured radial speed), it is likely 90 degrees to the side. More precisely,

\[\alpha = \cos^{-1}\left(\frac{v_{object}}{v_{robot}}\right),\]

where \(\alpha\) is the angle to the object, \(v_{object}\) is the speed of the object measured by the radar and \(v_{robot}\) is the speed of the robot or radar.

This multiple-sweep angle estimation using a moving radar can be seen as a form of Synthetic Aperture Radar (SAR) processing.
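
As a minimal sketch of the relation above (plain NumPy, independent of the detector API), the angle could be computed like this:

    import numpy as np

    def object_angle_deg(v_object: float, v_robot: float) -> float:
        """Angle between the radar's direction of motion and the object."""
        ratio = v_object / v_robot
        # Clamp to [-1, 1] so that measurement noise cannot push the ratio
        # outside the valid domain of arccos.
        ratio = np.clip(ratio, -1.0, 1.0)
        return float(np.degrees(np.arccos(ratio)))

    # The example from the text: radar moving at 20 cm/s.
    print(object_angle_deg(0.20, 0.20))  # ~0 degrees, straight ahead
    print(object_angle_deg(0.00, 0.20))  # ~90 degrees, to the side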

A121 Improvements#

If you are familiar with the A111 Obstacle detection, here is a list of improvements in the A121 Obstacle detection application:

1. Multiple sweeps per frame: A high robot speed requires a high sweep rate to determine the object angle. With the A111, each sweep needs to be transferred from the sensor before the next can be measured, which limits the maximum sweep rate. With the A121, multiple sweeps can be measured in a frame before the frame is transferred to the host. This enables higher sweep rates and therefore higher robot speeds.

2. Subsweeps and step length: By utilizing subsweeps, the application scanning range can be split into a range close to the sensor, where a low profile and short step length can be used, and a range far from the sensor, where a higher profile can be used for maximum sensitivity. Note that the Obstacle detection application in the Exploration Tool GUI currently does not support subsweeps; please run the example file described below.

3. Temperature sensor: The surrounding temperature impacts the amplitude of the measured signal and noise. With the temperature sensor integrated in the radar, the thresholds can be adjusted when the temperature changes to keep the false alarm rate of the application low and sensitivity high.

4. Better distance performance: The A121 Obstacle detection example application uses data supplied by the Sparse IQ service which has improved distance accuracy compared to the A111 IQ data service.

Limitations#

  • Angle estimation only works for static objects. The angle to a moving pet, human or other robot will be incorrect.

  • Angle estimation is only possible if the radar is moving and the correct velocity is supplied to the detection algorithm. An error in the robot velocity will lead to an error in the estimated angle.

  • The angle supplied by the algorithm is the angle between the direction of motion of the radar and the direction from the radar to the object. If, for example, an object is found at 30 degrees, it can be 30 degrees to the left or right, or even 30 degrees up or down.

Calibration and Threshold#

To determine if any objects are present, the measured signal is compared to a threshold. The threshold is based on measurements collected during calibration.

Signals from static objects end up in the zeroth bin after the DFT, so the threshold in this bin is a constant (num_mean_threshold) times the mean signal from the calibration plus a constant (num_std_threshold) times the standard deviation. The threshold for the other DFT bins, corresponding to moving objects, is based only on the standard deviation of the signal.

During calibration, num_frames_in_recorded_threshold frames are collected to estimate the background for the thresholds, so it is important that no objects are present during this measurement.
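
A sketch of how such a threshold could be built from the calibration frames is shown below. It uses plain NumPy with placeholder data; the array shapes and names are assumptions for illustration, not the exact internals of the detector:

    import numpy as np

    num_mean_threshold = 2.0
    num_std_threshold = 5.0

    # Assumed shape: (num_frames_in_recorded_threshold, num_dft_bins, num_distances),
    # holding the DFT magnitude of each calibration frame (placeholder data here).
    calibration_frames = np.abs(np.random.randn(50, 16, 100))

    mean_bg = calibration_frames.mean(axis=0)  # (num_dft_bins, num_distances)
    std_bg = calibration_frames.std(axis=0)

    # Moving-object bins: threshold based only on the standard deviation.
    threshold = num_std_threshold * std_bg
    # Zeroth (static) bin: add the mean term as well.
    threshold[0] += num_mean_threshold * mean_bg[0]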

A second part of the calibration is the offset compensation. Its purpose is to improve the distance trueness of the Obstacle detection. The compensation utilizes the loopback measurement, where the radar pulse is measured electronically on the radar, without transmitting it into the air. The location of the peak amplitude is correlated with the distance error and is used to correct the distance estimate.
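
The sketch below illustrates the idea only; the mapping from loopback peak location to distance error is a hypothetical placeholder, and the actual compensation is implemented inside the detector:

    import numpy as np

    def loopback_peak_index(loopback_sweep: np.ndarray) -> int:
        # Index of the peak amplitude of the electronically measured (loopback) pulse.
        return int(np.argmax(np.abs(loopback_sweep)))

    def compensated_distance(distance_estimate_m: float, peak_index: int,
                             offset_per_index_m: float) -> float:
        # Hypothetical linear correction derived from the loopback peak location.
        return distance_estimate_m - offset_per_index_m * peak_index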

To trigger the calibration process in the Exploration Tool GUI, simply press the button labeled “Calibrate detector”.

Subsweeps#

With subsweeps, the profile, HWAAS and step length can be adjusted along the range. One recommended configuration is to use a low profile in the subsweep close to the sensor, to detect small obstacles in front of the robot, and a higher profile and longer step length at larger distances, to detect walls and larger objects.

The optimal subsweep configuration varies between use cases, so this has to be configured manually. The Obstacle detection example application in the Exploration Tool GUI does not support subsweeps; please see acconeer-python-exploration/examples/algo/a121/obstacle/detector.py for an example of running the application with subsweeps.
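
A hedged sketch of such a split-range setup is given below, loosely following the idea in the example file; the point ranges, HWAAS values and other parameter values are illustrative assumptions:

    from acconeer.exptool import a121
    from acconeer.exptool.a121.algo.obstacle._detector import DetectorConfig

    # Close range: low profile and short step length to resolve small obstacles.
    close_subsweep = a121.SubsweepConfig(
        start_point=60,   # one point is approx. 2.5 mm, so approx. 0.15 m
        num_points=80,
        step_length=2,
        profile=a121.Profile.PROFILE_1,
        hwaas=16,
    )

    # Far range: higher profile and longer step length for maximum sensitivity.
    far_subsweep = a121.SubsweepConfig(
        start_point=240,  # approx. 0.6 m
        num_points=100,
        step_length=4,
        profile=a121.Profile.PROFILE_3,
        hwaas=8,
    )

    config = DetectorConfig(
        max_robot_speed=0.5,
        subsweep_configurations=[close_subsweep, far_subsweep],
    )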

Configuration parameters#

The configuration parameters of the Obstacle detection example application can be divided into two parts, sensor parameters and threshold parameters.

The sensor parameters are similar to the parameters used by the underlying Sparse IQ service, with a few exceptions: start_m and end_m set the sweep range, and max_robot_speed controls the sweep rate.
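
As a rough, back-of-the-envelope illustration of why max_robot_speed drives the sweep rate (the exact relation used by the application may differ): with complex IQ sweeps, the largest unambiguously measurable radial speed is approximately the sweep rate times a quarter of the wavelength, which is roughly 5 mm at 60 GHz.

    WAVELENGTH_M = 0.005  # approx. 5 mm at 60 GHz

    def min_sweep_rate_hz(max_robot_speed_mps: float) -> float:
        # Nyquist-style bound for complex (IQ) data: v_max ~= f_sweep * wavelength / 4,
        # so covering speeds up to max_robot_speed needs at least this sweep rate.
        return 4.0 * max_robot_speed_mps / WAVELENGTH_M

    print(min_sweep_rate_hz(0.5))  # 400.0 Hz for the default max_robot_speed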

During application calibration, num_frames_in_recorded_threshold frames are collected to estimate the background signal and noise. The mean and standard deviation are then scaled with num_mean_threshold and num_std_threshold to construct a threshold.

class acconeer.exptool.a121.algo.obstacle._detector.DetectorConfig(*, start_m: float = 0.15, end_m: float = 0.6, step_length: int = 2, hwaas: int = 12, profile=Profile.PROFILE_3, max_robot_speed: float = 0.5, sweeps_per_frame: int = 16, num_frames_in_recorded_threshold: int = 50, num_std_threshold: float = 5, num_mean_threshold: float = 2, update_rate: float = 50.0, subsweep_configurations: List[SubsweepConfig] | None = None, peak_sorting_method=PeakSortingMethod.CLOSEST, enable_close_proximity_detection: bool = False, dead_reckoning_duration_s: float = 0.5, kalman_sensitivity: float = 0.5)#
start_m: float#

Start point of measurement interval in meters.

end_m: float#

End point of measurement interval in meters.

step_length: int#

Used to set the step length, in units of approximately 2.5 mm.

hwaas: int#

Hardware averaging. A higher value gives better SNR but increases power consumption and lowers the maximum sweep rate.

profile: Profile#

Profile, 1-5. A higher profile gives better SNR, while a lower profile gives better distance resolution. A recommendation is Profile 1 closer than 20 cm and Profile 3 beyond.

max_robot_speed: float#

Maximum robot speed in meters per second. Used to set the sweep rate.

sweeps_per_frame: int#

Number of sweeps per frame, i.e., the length of the DFT used to estimate speed or angle.

num_frames_in_recorded_threshold: int#

Number of frames used when calibrating threshold.

num_std_threshold: float#

Number of standard deviations added to the threshold.

num_mean_threshold: float#

Number of means added to the threshold.

update_rate: float#

Sets the detector update rate.

subsweep_configurations: List[SubsweepConfig] | None#

Optional list of subsweep configurations that overrides the sensor configuration.

peak_sorting_method: PeakSortingMethod#

Sorting method of targets.

enable_close_proximity_detection: bool#

Enable close proximity detection. Utilizes the first subsweep to monitor activity very close to the sensor. Extends the result with the flag close_proximity_trig, which is True if an object suddenly appears close to the sensor. If subsweep_configurations is not set, the detector will automatically add an additional subsweep for the close proximity functionality; otherwise the first subsweep will be utilized.

dead_reckoning_duration_s: float#

Duration (s) during which the Kalman filter performs dead reckoning before the tracking is stopped.

kalman_sensitivity: float#

Specify the sensitivity of the Kalman filter. A higher value yields a more responsive filter. A lower value yields a more robust filter.
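
For reference, a minimal flat (single-subsweep) configuration using only the parameters documented above might look as follows; the values are illustrative, not recommendations:

    from acconeer.exptool.a121.algo.obstacle._detector import DetectorConfig

    config = DetectorConfig(
        start_m=0.15,
        end_m=1.0,
        step_length=2,
        hwaas=12,
        max_robot_speed=0.5,
        sweeps_per_frame=16,
        num_frames_in_recorded_threshold=50,
        num_std_threshold=5.0,
        num_mean_threshold=2.0,
        update_rate=50.0,
    )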

Detector result#

class acconeer.exptool.a121.algo.obstacle._detector.DetectorResult(*, current_velocity: float = None, processor_results: Dict[int, ProcessorResult] = None, close_proximity_trig: Dict[int, bool] | None = None)#

Processor result#

class acconeer.exptool.a121.algo.obstacle._processors.ProcessorResult(*, targets: list[Target] = NOTHING, time: float = None, extra_result: ProcessorExtraResult = NOTHING, subsweeps_extra_results: List[SubsweepProcessorExtraResult] = None)#
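
Assuming a detector result has been obtained (see the example file referenced above for how to run the detector), the documented fields could be inspected as sketched below. The dictionary keys are assumed here to be sensor ids, and the Target fields are printed generically since they are not documented in this section:

    def print_detector_result(result) -> None:
        # result is a DetectorResult as documented above.
        print(f"current_velocity: {result.current_velocity:.2f} m/s")
        for sensor_id, processor_result in result.processor_results.items():
            for target in processor_result.targets:
                print(f"sensor {sensor_id}: {target}")
        if result.close_proximity_trig is not None:
            for sensor_id, trig in result.close_proximity_trig.items():
                if trig:
                    print(f"sensor {sensor_id}: object appeared close to the sensor")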