ROS2 · LiDAR · Robotics

2D LiDAR Human Following

How a robot sees, detects, and follows a person
using only a spinning laser scanner

360° Scan Range
10Hz Update Rate
1.0m Follow Distance
01 — The Sensor

What is a 2D LiDAR?

A 2D LiDAR (Light Detection And Ranging) spins a laser beam 360° and measures the time-of-flight of each pulse to calculate distance.

Speed of Light: laser pulses travel at ~3×10⁸ m/s
🔄 360° Rotation: a motor spins the emitter continuously
📡 Output (/scan topic): an array of ~360–1440 distance values
Live LiDAR Simulation
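The time-of-flight principle above reduces to a one-line formula: the pulse travels out and back, so range is half of speed × round-trip time. A minimal sketch (the function name is illustrative):

```python
# Time-of-flight to range. The pulse covers the distance twice (out and back),
# so we halve the product of speed and elapsed time.
C = 3.0e8  # speed of light, m/s

def tof_to_range(round_trip_s: float) -> float:
    """Range = c * t / 2."""
    return C * round_trip_s / 2.0

print(tof_to_range(6.67e-9))  # a ~6.67 ns round trip ≈ 1.0 m
```

This is why LiDAR timing electronics must resolve nanoseconds: each nanosecond of error corresponds to about 15 cm of range error.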
02 — Raw Data

The LaserScan Message

Every 100ms, the LiDAR publishes a sensor_msgs/LaserScan message — an array of distances, one per angular step.

ROS2 Message
sensor_msgs/LaserScan:
  header.frame_id: "laser"
  angle_min:       -3.14159   # -π radians
  angle_max:        3.14159   # +π radians
  angle_increment:  0.00436   # ~0.25° per step
  time_increment:   0.000028
  range_min:        0.15      # meters
  range_max:       12.0       # meters
  ranges:          [1.2, 1.3, 1.1, inf, inf, 0.8, ...]  # 1440 values
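The ranges array only becomes geometry once each index is mapped to its angle and converted to a Cartesian point. A minimal sketch in plain Python (no rclpy dependency; the function name and defaults are illustrative, taken from the message fields above):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment,
                   range_min=0.15, range_max=12.0):
    """Convert a LaserScan ranges array to (x, y) points in the laser frame,
    dropping invalid returns."""
    points = []
    for i, r in enumerate(ranges):
        # inf and NaN both fail this range check and are skipped.
        if not (range_min <= r <= range_max):
            continue
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

Everything downstream (clustering, leg detection) operates on this point list rather than on the raw ranges array.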
03 — Detection Step 1

Point Clustering

Raw scan points are grouped into clusters — nearby points that likely belong to the same physical object.

1. Euclidean Distance Check: if two adjacent points are < 0.13m apart, they join the same cluster
2. Minimum Points Filter: clusters with < 3 points are noise and are discarded
3. Range Filter: ignore clusters beyond 10m from the scanner
4. Result: each cluster is one potential object (leg, wall, chair…)
Clustering Visualization
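The four steps above can be sketched in a few lines. This is a simplified version (parameter names are illustrative; a real implementation also handles the wrap-around between the first and last scan point):

```python
import math

def cluster_points(points, max_gap=0.13, min_points=3, max_range=10.0):
    """Group consecutive scan points into clusters: adjacent points closer
    than max_gap join the same cluster; tiny or far-away clusters are dropped."""
    clusters, current = [], []
    for p in points:
        if current and math.dist(current[-1], p) < max_gap:
            current.append(p)  # close enough: same object
        else:
            if current:
                clusters.append(current)
            current = [p]      # gap too large: start a new cluster
    if current:
        clusters.append(current)
    # Filters: minimum size (noise) and maximum range from the scanner.
    return [c for c in clusters
            if len(c) >= min_points
            and all(math.hypot(x, y) <= max_range for x, y in c)]
```

Because the scan is ordered by angle, a single pass over adjacent points is enough; no general-purpose nearest-neighbor search is needed.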
04 — Detection Step 2

Leg Shape Classification

A Random Forest classifier evaluates each cluster's geometric features to determine: "Is this a human leg?"

📐 Width: a human leg is 8–15 cm wide
〰️ Circularity: a leg cross-section is roughly circular
📊 Point Density: consistent spacing of scan points
📏 Linearity: low = curved (leg), high = wall
🔢 Point Count: a typical leg spans 5–20 scan points
🎯 Confidence Score: the RF outputs a 0.0–1.0 probability
🌲 Random Forest trained on thousands of real leg scans — threshold default: 0.1 confidence to track
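Before the Random Forest can vote, each cluster has to be turned into a feature vector. A sketch of two of the features listed above (illustrative formulas; real detectors use a richer, carefully engineered feature set):

```python
import math

def leg_features(cluster):
    """Compute simple geometric features for one cluster of (x, y) points."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)  # cluster centroid
    # Width: straight-line distance between the two endpoints of the arc.
    width = math.dist(cluster[0], cluster[-1])
    # Circularity proxy: variance of point-to-centroid radii
    # (low variance = points lie on a circular arc, like a leg).
    radii = [math.dist((cx, cy), p) for p in cluster]
    mean_r = sum(radii) / len(radii)
    circularity = sum((r - mean_r) ** 2 for r in radii) / len(radii)
    return {"n_points": len(cluster), "width": width, "circularity": circularity}
```

The classifier then simply checks whether this vector looks like the thousands of labeled leg clusters it was trained on.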
05 — Tracking

Kalman Filter Tracking

Once legs are detected, a Kalman Filter maintains a smooth, continuous track of each person across frames — even when briefly occluded.

PREDICT
Use previous velocity to estimate where the person will be next frame
x̂ₖ = F·xₖ₋₁ + B·uₖ
UPDATE
Correct the prediction using the new laser measurement
xₖ = x̂ₖ + K·(zₖ − H·x̂ₖ)
OUTPUT
Smooth position + velocity estimate published to /people_tracked
Kalman Filter Path Smoothing
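The predict/update cycle above can be written out for one axis (run one instance per coordinate). This sketch uses a constant-velocity model and drops the B·uₖ control term since the person is not controlled by us; the noise parameters q and r are illustrative, not tuned values:

```python
class Kalman1D:
    """Minimal constant-velocity Kalman filter for one axis."""

    def __init__(self, pos, dt=0.1, q=0.01, r=0.05):
        self.x = [pos, 0.0]                # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
        self.dt, self.q, self.r = dt, q, r

    def predict(self):
        # x̂ₖ = F·xₖ₋₁ with F = [[1, dt], [0, 1]]  (B·uₖ omitted: no control input)
        p, v = self.x
        self.x = [p + v * self.dt, v]
        P, dt = self.P, self.dt
        # P' = F·P·Fᵀ + Q, written out element by element.
        self.P = [
            [P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q,
             P[0][1] + dt * P[1][1]],
            [P[1][0] + dt * P[1][1], P[1][1] + self.q],
        ]

    def update(self, z):
        # xₖ = x̂ₖ + K·(zₖ − H·x̂ₖ) with H = [1, 0]: we only measure position.
        S = self.P[0][0] + self.r          # innovation covariance
        K = [self.P[0][0] / S, self.P[1][0] / S]
        y = z - self.x[0]                  # innovation (measurement residual)
        self.x = [self.x[0] + K[0] * y, self.x[1] + K[1] * y]
        # P' = (I − K·H)·P
        self.P = [
            [(1 - K[0]) * self.P[0][0], (1 - K[0]) * self.P[0][1]],
            [self.P[1][0] - K[1] * self.P[0][0],
             self.P[1][1] - K[1] * self.P[0][1]],
        ]
```

Because the filter carries a velocity estimate, it can keep predicting through a few occluded frames: predict() still advances the state even when no measurement arrives to update() it.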
06 — Person Detection

Two Legs → One Person

The tracker pairs two nearby leg tracks into a single person track using proximity and motion coherence.

📏 Distance Rule: two leg clusters must be < 0.8m apart to be paired as one person
🔄 Motion Coherence: both legs must move in the same general direction and at similar speed
📍 Person Position: published as the midpoint between the two paired leg positions
🆔 Persistent ID: each person gets a unique ID maintained across frames for reliable following
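The distance rule and midpoint rule above can be sketched as a greedy pairing pass (illustrative function; a real tracker also applies the motion-coherence check and keeps persistent IDs):

```python
import math

def pair_legs(leg_tracks, max_pair_dist=0.8):
    """Greedily pair (x, y) leg tracks into people: any two legs closer than
    max_pair_dist become one person located at their midpoint."""
    people, used = [], set()
    for i, a in enumerate(leg_tracks):
        if i in used:
            continue
        # Find the nearest unused leg within the pairing distance.
        best, best_d = None, max_pair_dist
        for j in range(i + 1, len(leg_tracks)):
            if j in used:
                continue
            d = math.dist(a, leg_tracks[j])
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            b = leg_tracks[best]
            used.update((i, best))
            people.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
    return people
```

Unpaired legs are simply left out here; a production tracker would keep them as single-leg candidates in case the second leg was occluded.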

07 — Control Law

Proportional Controller

The follower node reads the person's position and computes velocity commands using a simple P-controller.

Linear Velocity
v = Kp_lin × (d_measured − d_target)
Angular Velocity
ω = Kp_ang × θ_error
Too Far: d > target → v > 0 → move forward
Too Close: d < target → v < 0 → back up
Off-Center: θ ≠ 0 → ω ≠ 0 → rotate toward the person
P-Controller Response
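The two control laws fit in a few lines. This sketch assumes the person's position is given in the robot frame (x forward, y left, the usual ROS convention); the gains and velocity limits are illustrative defaults, not tuned values:

```python
import math

def follow_cmd(person_x, person_y, target_dist=1.0,
               kp_lin=0.5, kp_ang=1.0, v_max=0.5, w_max=1.0):
    """P-controller: returns (linear, angular) velocity commands."""
    d = math.hypot(person_x, person_y)      # measured distance to person
    theta = math.atan2(person_y, person_x)  # bearing error (0 = dead ahead)
    v = kp_lin * (d - target_dist)          # v = Kp_lin × (d_measured − d_target)
    w = kp_ang * theta                      # ω = Kp_ang × θ_error
    # Clamp to the robot's velocity limits before publishing to /cmd_vel.
    v = max(-v_max, min(v_max, v))
    w = max(-w_max, min(w_max, w))
    return v, w
```

Note that v naturally changes sign at the target distance, which is exactly the "too far / too close" behavior in the cases above; clamping keeps a large initial error from commanding an unsafe speed.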
08 — Complete System

The Full Pipeline

Every 100ms, this entire chain executes — from raw photons to motor commands.

🔴 LiDAR: spins laser 360° @ 10Hz → /scan
🔵 Clustering: groups nearby scan points → clusters[]
🟢 Classifier: Random Forest leg detection → /detected_legs
🟡 Kalman: tracks & smooths person position → /people_tracked
🟠 Controller: P-control on distance & angle → /cmd_vel
⚙️ Motors: differential drive wheels, driven by PWM signals
Total latency: ~10–30ms per cycle
Scan 2ms → Cluster 1ms → Classify 3ms → Kalman 2ms → Control 2ms
09 — Key Takeaways

Tuning & Summary

🎯 Core Principle

Time-of-flight laser → distance array → cluster → classify → track → control. Each step reduces noise and adds intelligence.

⚙️ Key Parameters
cluster_dist: 0.13m — grouping threshold
max_leg_pair: 0.8m — leg pairing distance
Kp_linear: 0.5 — speed response gain
target_dist: 1.0m — follow distance
🔧 Troubleshooting
Oscillates → lower Kp_lin
🐢 Too slow → raise Kp_lin
Misses legs → lower confidence
👻 False detects → raise confidence
🚀 ROS2 Stack
sllidar_ros2 — driver
ros2_leg_detector — detection
lidar_follower — controller
RViz2 — visualization