How a robot sees, detects, and follows a person
using only a spinning laser scanner
A 2D LiDAR (Light Detection And Ranging) spins a laser beam 360° and measures the time-of-flight of each pulse to calculate distance.
Every 100ms, the LiDAR publishes a sensor_msgs/LaserScan message — an array of distances, one per angular step.
sensor_msgs/LaserScan:
header.frame_id: "laser"
angle_min: -3.14159 # -π radians
angle_max: 3.14159 # +π radians
angle_increment: 0.00436 # ~0.25° per step
time_increment: 0.000028
range_min: 0.15 # meters
range_max: 12.0 # meters
ranges: [1.2, 1.3, 1.1, inf, inf, 0.8, ...] # 1440 values
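Each entry in `ranges` is a distance along a known bearing, so the first processing step is converting the polar scan into Cartesian points in the laser frame. A minimal sketch (the function name `scan_to_points` is illustrative, not from a real library):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, range_min, range_max):
    """Convert a LaserScan ranges array to 2D (x, y) points in the laser frame."""
    points = []
    for i, r in enumerate(ranges):
        # inf, NaN, and out-of-range returns all fail this check and are dropped
        if not (range_min <= r <= range_max):
            continue
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

Invalid returns (the `inf` values in the example message) are simply skipped, so the downstream clustering step only ever sees real surface hits.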
Raw scan points are grouped into clusters — nearby points that likely belong to the same physical object.
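Because a spinning LiDAR emits points in angular order, clustering can be as simple as a single pass that starts a new cluster whenever the gap to the previous point exceeds a threshold. A sketch, using the 0.13 m `cluster_dist` threshold from the parameter list at the end of this section (function name assumed):

```python
import math

def cluster_points(points, cluster_dist=0.13):
    """Group consecutive scan points into clusters: a new cluster starts
    whenever the gap to the previous point exceeds cluster_dist (meters)."""
    clusters = []
    for p in points:
        if clusters and math.dist(clusters[-1][-1], p) <= cluster_dist:
            clusters[-1].append(p)  # close enough: same physical object
        else:
            clusters.append([p])    # gap too large: start a new cluster
    return clusters
```

A human leg at a 1 m range spans only a handful of adjacent beams, so each leg typically comes out as its own small cluster.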
A Random Forest classifier evaluates each cluster's geometric features to determine: "Is this a human leg?"
Clusters scoring above a 0.1 confidence threshold are passed on to the tracker.
Once legs are detected, a Kalman Filter maintains a smooth, continuous track of each person across frames — even when briefly occluded.
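A constant-velocity Kalman filter is a common choice here: the state carries position and velocity, only position is measured, and the predict step keeps the track alive through brief occlusions. A self-contained sketch under those assumptions (class name and noise values are illustrative tuning guesses, not from the source):

```python
import numpy as np

class LegTrackKF:
    """Constant-velocity Kalman filter for one leg track.
    State x = [px, py, vx, vy]; only position (px, py) is measured."""
    def __init__(self, px, py, dt=0.1):
        self.x = np.array([px, py, 0.0, 0.0])
        self.P = np.eye(4) * 0.5
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt           # position += velocity * dt
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0          # measure position only
        self.Q = np.eye(4) * 0.01                  # process noise (tuning guess)
        self.R = np.eye(2) * 0.05                  # measurement noise (tuning guess)

    def predict(self):
        # Propagate the state forward; call every scan, even with no detection
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, zx, zy):
        # Fuse a new leg detection into the track
        z = np.array([zx, zy])
        y = z - self.H @ self.x                    # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

During an occlusion, only `predict` runs, so the track coasts along its last estimated velocity until detections resume.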
The tracker pairs two nearby leg tracks into a single person track, published on the /people_tracked topic, using proximity and motion coherence.
Two leg clusters must be < 0.8m apart to be paired as one person
Both legs must move in the same general direction and speed
Published as the midpoint between the two paired leg positions
Each person gets a unique ID maintained across frames for reliable following
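The proximity rule above can be sketched as a greedy pairing pass: each leg is matched to the closest unpaired leg within the 0.8 m limit, and the person is placed at the midpoint (function name assumed; the motion-coherence check is omitted for brevity):

```python
import math

def pair_legs(leg_positions, max_leg_pair=0.8):
    """Greedily pair leg tracks within max_leg_pair meters of each other;
    each resulting person is reported at the midpoint of its two legs."""
    people, used = [], set()
    for i, a in enumerate(leg_positions):
        if i in used:
            continue
        # Find the closest unused leg within the pairing distance
        best, best_d = None, max_leg_pair
        for j in range(i + 1, len(leg_positions)):
            if j in used:
                continue
            d = math.dist(a, leg_positions[j])
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            b = leg_positions[best]
            used.update((i, best))
            people.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
    return people
```

A lone leg cluster (the other leg occluded, or a table leg) is left unpaired rather than reported as a person.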
The follower node reads the person's position and computes velocity commands using a simple P-controller.
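With the person's position in the robot frame (x forward, y left), the P-controller reduces to two proportional terms: one on the distance error relative to the follow distance, one on the bearing to the person. A sketch using the `Kp_linear` and `target_dist` values from the parameter list; `Kp_angular` is an assumed gain the source does not specify:

```python
import math

def follow_cmd(person_x, person_y, Kp_linear=0.5, Kp_angular=1.0, target_dist=1.0):
    """P-controller: drive toward the tracked person, stopping target_dist away.
    Returns (linear, angular) velocity commands."""
    dist = math.hypot(person_x, person_y)
    bearing = math.atan2(person_y, person_x)   # angle from heading to person
    linear = Kp_linear * (dist - target_dist)  # zero at the follow distance
    angular = Kp_angular * bearing             # turn to face the person
    return linear, angular
```

At the 1.0 m target distance the linear command is zero, and if the person steps closer the distance error goes negative, so the robot backs away instead of colliding.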
Every 100ms, this entire chain executes — from raw photons to motor commands.
Time-of-flight laser → distance array → cluster → classify → track → control. Each step reduces noise and adds intelligence.
cluster_dist: 0.13 m — grouping threshold
max_leg_pair: 0.8 m — leg pairing distance
Kp_linear: 0.5 — speed response gain
target_dist: 1.0 m — follow distance