
Time Synchronization Error Budgets for Multi-Sensor Autonomy


Visual: timing-error-to-spatial-error budget stack showing clock offset, exposure delay, transport latency, fusion timestamp, vehicle speed, and sensor-specific effects.

Time synchronization is a measurement model, not a logging convenience. A camera frame, LiDAR point, radar detection, IMU sample, wheel tick, and GNSS fix are only comparable if the stack knows when the physical measurement was acquired, which clock produced the timestamp, and how much uncertainty remains.

This page defines practical time-error budgets for perception, SLAM, mapping, airside replay, and validation.


1. Why Time Error Becomes Spatial Error

The first-order spatial error from a timestamp error is:

e_position ~= v * dt
e_yaw      ~= yaw_rate * dt
e_lateral  ~= range * sin(e_yaw)

Examples:

Motion case                                             | Time error        | Resulting error
Tug docking at 2 m/s                                    | 20 ms             | 4 cm longitudinal offset
Yard vehicle at 10 m/s                                  | 50 ms             | 50 cm longitudinal offset
Road AV at 25 m/s                                       | 20 ms             | 50 cm longitudinal offset
Vehicle turning at 20 deg/s                             | 50 ms             | 1 deg yaw mismatch
Camera projecting an object at 30 m, 1 deg yaw mismatch | 50 ms at 20 deg/s | about 52 cm lateral projection error

For low-speed airside autonomy, centimeter-level map alignment can still be lost by tens of milliseconds of timestamp error. For road-speed autonomy, the same error can dominate the sensor's own measurement noise.
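The first-order relations above are easy to evaluate directly. A minimal sketch (function and field names are illustrative, not from any particular stack):

```python
import math

def spatial_error(speed_mps: float, yaw_rate_dps: float,
                  range_m: float, dt_s: float) -> dict:
    """First-order spatial errors induced by a timestamp error dt_s."""
    e_position = speed_mps * dt_s                 # longitudinal offset, m
    e_yaw_deg = yaw_rate_dps * dt_s               # heading mismatch, deg
    e_lateral = range_m * math.sin(math.radians(e_yaw_deg))  # at range, m
    return {"e_position_m": e_position,
            "e_yaw_deg": e_yaw_deg,
            "e_lateral_m": e_lateral}
```

Evaluating it at the table's entries reproduces them, e.g. `spatial_error(2.0, 0.0, 0.0, 0.020)` gives a 4 cm longitudinal offset.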


2. Timestamp Semantics

Every message should distinguish these times:

Timestamp           | Meaning                                                                                      | Use in fusion
Acquisition time    | Time the photons, laser returns, RF echoes, inertial sample, or encoder edge were measured. | Primary timestamp for perception, SLAM, and estimation.
Exposure midpoint   | Camera time at the middle of exposure.                                                       | Preferred for global-shutter image geometry when exposure is non-negligible.
Row time            | Rolling-shutter time for each image row.                                                     | Needed for motion compensation and rolling-shutter calibration.
Scan or packet time | LiDAR/radar packet or firing time, often per block or per point.                             | Needed for deskew and target tracking.
Driver publish time | Time the software driver published the message.                                              | Useful for latency metrics, not geometry.
Host receive time   | Time the host received the packet or message.                                                | Useful for network diagnostics and replay order.
Consumer time       | Time a node consumed the message.                                                            | Useful for pipeline latency and deadline monitoring.

Fusion should use acquisition time. Publish, receive, and consumer times are diagnostic fields. If a driver stamps messages with now() at publish time, the system has converted transport jitter into geometric error.
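One way to keep the primary timestamp separate from the diagnostic ones is to make the distinction explicit in the message type. A sketch (field and enum-string names are illustrative assumptions, not a standard schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StampedMeasurement:
    # Primary timestamp: when the physical quantity was acquired.
    acquisition_time_ns: int
    clock_source: str            # e.g. "ptp:gm-a", "host-monotonic"
    sync_state: str              # e.g. "locked", "holdover", "free-running"
    # Diagnostic timestamps: never used for geometry.
    publish_time_ns: Optional[int] = None
    receive_time_ns: Optional[int] = None
    consume_time_ns: Optional[int] = None

    def transport_latency_ns(self) -> Optional[int]:
        """Latency metric derived from diagnostic fields."""
        if self.receive_time_ns is None:
            return None
        return self.receive_time_ns - self.acquisition_time_ns
```

Fusion code then reads `acquisition_time_ns` only; latency dashboards read the optional fields.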

2.1 Clock Domains

Typical vehicle clock domains:

GNSS constellation time / UTC
  -> receiver time and PPS
  -> PTP or gPTP grandmaster
  -> Ethernet PHCs and sensor clocks
  -> host system clocks
  -> middleware clocks and replay clocks

Each conversion should be explicit. A log should record not only the timestamp value, but also the clock source and synchronization state that made the value credible.


3. Synchronization Mechanisms

Mechanism              | Typical use                              | Strength                                                         | Failure mode
Hardware trigger       | Cameras, strobes, some LiDARs            | Deterministic acquisition start.                                 | Trigger edge is confused with exposure midpoint or readout completion.
PPS                    | GNSS receivers, IMUs, time servers       | Simple one-pulse-per-second phase reference.                     | Time-of-week or UTC association is missing or wrong.
IEEE 1588 PTP          | Ethernet sensors and compute             | Sub-microsecond to microsecond class with hardware timestamping. | Switches or NICs fall back to software timestamping; grandmaster failover causes a step.
IEEE 802.1AS gPTP      | TSN domains, automotive Ethernet         | PTP profile for time-sensitive networks.                         | Not all endpoints support the same profile or domain.
GNSS disciplined clock | Global traceability and incident replay  | Ties vehicle time to UTC/GNSS time.                              | GNSS outage, spoofing, jamming, or poor holdover oscillator.
Software timestamping  | Prototypes and low-cost sensors          | Easy to deploy.                                                  | Load-dependent jitter and network delay dominate.
Offline alignment      | Dataset repair and calibration           | Can estimate residual offset from motion.                        | Degenerate motion creates plausible but wrong offsets.

PTP/gPTP only solves the network-clock problem when hardware timestamping is enabled through the NIC, switch, and sensor. A PTP daemon that synchronizes a host clock cannot recover the acquisition time of an unsynchronized camera unless the camera exposes either hardware timestamps or a trigger relationship.


4. Error Budget Model

A practical timestamp error budget for sensor i can be written as:

t_true_i = t_msg_i
         + b_clock_i
         + d_clock_i * (t - t0)
         + b_trigger_i
         + b_driver_i
         + b_transport_i
         + n_jitter_i

sigma_t_i^2 =
    sigma_clock_i^2
  + sigma_trigger_i^2
  + sigma_driver_i^2
  + sigma_transport_i^2
  + sigma_model_i^2

where:

  • b_clock_i is clock offset.
  • d_clock_i is clock drift or frequency error.
  • b_trigger_i is trigger-to-acquisition delay.
  • b_driver_i is fixed driver latency if the timestamp is not hardware time.
  • b_transport_i is network or bus delay.
  • n_jitter_i is residual variable error.
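The deterministic and stochastic terms of this budget can be evaluated directly. A minimal sketch, assuming all quantities are in seconds and using t_msg for t in the drift term (a common first-order simplification):

```python
import math

def corrected_time(t_msg, t0, b_clock, d_clock, b_trigger, b_driver, b_transport):
    """Apply the deterministic budget terms: biases plus linear clock drift.

    Assumes drift is evaluated at t_msg, i.e. t - t0 ~= t_msg - t0.
    """
    return (t_msg + b_clock + d_clock * (t_msg - t0)
            + b_trigger + b_driver + b_transport)

def sigma_t(sig_clock, sig_trigger, sig_driver, sig_transport, sig_model):
    """Root-sum-square of the independent stochastic terms."""
    return math.sqrt(sig_clock**2 + sig_trigger**2 + sig_driver**2
                     + sig_transport**2 + sig_model**2)
```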

For two sensors, relative timing uncertainty is:

sigma_dt_ij = sqrt(sigma_t_i^2 + sigma_t_j^2 - 2 * cov(t_i, t_j))

Shared clock sources create correlation. If two cameras share the same trigger and clock, their relative timing can be better than their absolute UTC timing. If two sensors are independently stamped by host receive time, their jitter is often not independent under CPU or network load.
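The relative-timing formula makes the effect of correlation concrete: a positive cov term shrinks the pairwise uncertainty even when each absolute sigma is unchanged. A minimal sketch:

```python
import math

def sigma_dt(sigma_i: float, sigma_j: float, cov_ij: float = 0.0) -> float:
    """Relative timing uncertainty between sensors i and j.

    cov_ij > 0 models a shared clock or trigger; cov_ij = 0 models
    independent stamping.
    """
    return math.sqrt(sigma_i**2 + sigma_j**2 - 2.0 * cov_ij)
```

With sigma_i = sigma_j = 1 ms and full correlation (cov_ij = 1e-6 s^2), the relative uncertainty collapses to zero, which is why two cameras on one trigger can be tightly aligned to each other while both drift against UTC.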

4.1 Example Budget Targets

Fusion path              | Preferred target                        | Reason
IMU propagation          | < 0.1 ms sample interval error          | Bias and integration errors grow quickly at high rate.
LiDAR point deskew       | < 1 ms point or packet timing           | Turning motion bends scans when per-point timing is wrong.
LiDAR-camera projection  | < 2-5 ms relative offset                | Projection error is visible on moving ego vehicle and moving actors.
Radar-camera association | < 5-10 ms relative offset               | Radar tracks and image boxes diverge under actor motion.
Wheel odometry to IMU    | < 2-5 ms                                | Slip detection and yaw-rate residuals need aligned samples.
Airside low-speed replay | < 10 ms plus recorded latency metadata  | Incident reconstruction needs traceability, not just approximate order.

These are engineering targets, not universal standards. The right value depends on vehicle speed, yaw rate, range, perception tolerance, and validation claims.


5. Sensor-Specific Timing Notes

5.1 Cameras

For global shutter cameras, timestamp the exposure start, exposure midpoint, or trigger edge consistently. If the exposure time is T_exp, an image triggered at exposure start is often best modeled at:

t_image ~= t_trigger + trigger_delay + T_exp / 2

For rolling shutter cameras:

t_row(row_index) = t_frame_start + row_index * row_readout_time

A single frame timestamp is insufficient for metric projection during vehicle motion unless the rolling-shutter model is included.
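Both camera timing models are one-line functions. A sketch, assuming times in seconds and a trigger-at-exposure-start camera:

```python
def image_time(t_trigger_s: float, trigger_delay_s: float,
               exposure_s: float) -> float:
    """Global shutter: model the image at the exposure midpoint."""
    return t_trigger_s + trigger_delay_s + exposure_s / 2.0

def row_time(t_frame_start_s: float, row_index: int,
             row_readout_s: float) -> float:
    """Rolling shutter: each image row has its own acquisition time."""
    return t_frame_start_s + row_index * row_readout_s
```

For a 10 us per-row readout, row 480 of a frame is already 4.8 ms later than row 0, which is larger than the LiDAR-camera budget target above.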

5.2 LiDAR

Spinning and scanning LiDARs acquire points over tens to hundreds of milliseconds. Deskew requires per-point, per-column, or per-packet timing:

p_base(t_ref) = T_base(t_ref)<-base(t_point) * T_base<-lidar * p_lidar(t_point)

where T_base(t_ref)<-base(t_point) is the ego-motion transform from the base frame at the point's firing time to the base frame at the scan reference time, and T_base<-lidar is the LiDAR extrinsic calibration.

Using one timestamp for a whole scan is only acceptable when ego motion is small relative to map and perception tolerances.
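A planar, constant-velocity sketch of the per-point correction (first-order only, ignoring rotation-translation coupling within the interval; signatures are illustrative):

```python
import math

def deskew_point_2d(p_point, t_point, t_ref, v_xy, yaw_rate):
    """Move a 2D point sensed at t_point (in the base frame at t_point)
    into the base frame at t_ref.

    Assumes constant ego velocity v_xy (m/s, base frame) and constant
    yaw_rate (rad/s) over the interval.
    """
    dt = t_ref - t_point
    dyaw = yaw_rate * dt
    # Remove the ego translation accumulated between the two times.
    dx = p_point[0] - v_xy[0] * dt
    dy = p_point[1] - v_xy[1] * dt
    # Rotate into the base(t_ref) frame (inverse of the heading change).
    c, s = math.cos(dyaw), math.sin(dyaw)
    return (c * dx + s * dy, -s * dx + c * dy)
```

At 1 m/s forward motion and 1 s between firing and reference time, a point straight ahead at 10 m is corrected to 9 m, which is exactly the e_position ~= v * dt term from Section 1.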

5.3 Radar

Radar detections may be produced after chirp integration, FFT processing, and tracking. The timestamp should refer to the measurement epoch, not the time the object list left the sensor. Radar velocity residuals are especially sensitive to ego-motion time alignment.

5.4 IMU and Wheel Encoders

IMUs should provide monotonic sample times with known scale and clock source. Wheel encoders should timestamp tick edges or integration intervals. If encoder counts are polled over CAN at variable intervals, the covariance must include polling jitter and quantization.

5.5 GNSS/RTK

GNSS receivers expose position epochs, receiver clock status, PPS, timepulse, fix type, correction age, and sometimes raw measurement time tags. Do not stamp GNSS fixes with serial or TCP arrival time. For RTK, stale corrections can make the position look precise while its epoch or integrity is no longer valid.
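Correction staleness is cheap to gate on explicitly. A sketch, assuming the receiver reports a fix-type string and correction age; the 10 s default is illustrative, not a standard:

```python
def rtk_epoch_valid(fix_type: str, correction_age_s: float,
                    max_age_s: float = 10.0) -> bool:
    """Accept an RTK epoch only if the fix flag is set AND the
    correction stream is fresh enough. Fix-type strings are
    illustrative; map them to your receiver's enumeration."""
    return fix_type in ("rtk_fixed", "rtk_float") and correction_age_s <= max_age_s
```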


6. Logs and Replay

Replay changes the meaning of time unless the log preserves the original timing contract.

Log at minimum:

  • acquisition timestamp from the sensor or synchronized driver
  • host receive timestamp
  • publish timestamp if different from acquisition time
  • PTP/gPTP offset, path delay, grandmaster identity, and lock state
  • PPS lock and GNSS time status
  • trigger counters, dropped frames, packet loss, and sequence numbers
  • driver queue depth and middleware message age
  • transform tree and calibration artifact IDs
  • replay clock policy and rate
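The minimum-logging checklist above maps naturally onto a per-message record. A sketch of such a record (field names are illustrative, not a published log format):

```python
from dataclasses import dataclass

@dataclass
class TimingLogRecord:
    """Per-message timing fields worth logging alongside the payload."""
    acq_ns: int          # acquisition timestamp from sensor or synced driver
    recv_ns: int         # host receive timestamp
    pub_ns: int          # driver publish timestamp
    ptp_offset_ns: int   # offset to grandmaster at log time
    ptp_gm_id: str       # grandmaster identity
    ptp_locked: bool     # PTP lock state
    pps_locked: bool     # PPS / GNSS time status
    seq: int             # sensor sequence number (detects drops)
    calib_id: str        # calibration artifact ID valid at acq_ns

    def message_age_ns(self) -> int:
        """Transport latency diagnostic: receive minus acquisition."""
        return self.recv_ns - self.acq_ns
```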

Replay systems should be able to answer:

  1. What was the physical acquisition time of this measurement?
  2. Was the sensor synchronized at that time?
  3. Was the transform valid at that acquisition time?
  4. Did the online system process the same measurement before its deadline?
  5. Are replayed messages ordered by acquisition time or by recorded arrival?

For incident replay, arrival-order playback may reproduce software timing bugs, while acquisition-time playback may reproduce the intended estimator behavior. Both are useful, but they answer different questions.
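The two playback policies differ only in the sort key, but they can produce different message orders from the same log. A sketch, with messages as dicts whose key names (`acq_ns`, `recv_ns`) are illustrative:

```python
def replay_order(messages, policy="acquisition"):
    """Order logged messages for replay.

    policy="acquisition" reproduces intended estimator behavior;
    policy="arrival" reproduces the online software's observed order.
    """
    key = "acq_ns" if policy == "acquisition" else "recv_ns"
    return sorted(messages, key=lambda m: m[key])
```

A message that was acquired first but arrived second (e.g. a delayed LiDAR packet) swaps position between the two policies, which is exactly the case where the two replays answer different questions.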


7. Effects on Perception, SLAM, Mapping, and Validation

Function   | Timing effect
Perception | Misaligned camera/LiDAR/radar observations create ghost objects, poor association, false velocity, and degraded training labels.
SLAM       | Scan matching residuals grow during turns; visual-inertial constraints become biased; loop closures may compensate for time error as pose error.
Mapping    | Map edges smear, lane and stand markings shift, poles duplicate, and LiDAR intensity maps become inconsistent between passes.
Validation | Dataset labels become non-reproducible; incident timelines are disputable; latency and safety-margin claims cannot be defended.

For airside vehicles, timing affects stand docking, personnel detection, aircraft clearance, jet-blast zone mapping, and after-action reconstruction. A vehicle moving slowly can still violate a 5 cm clearance budget if a synchronized sensor path silently falls back to host receive timestamps.


8. Failure Modes and Health Checks

Failure mode                              | Observable signal                                            | Mitigation
Sensor stamps publish time                | Message age looks small but fusion residuals grow with load. | Driver review, hardware timestamp tests, load-injection validation.
PTP grandmaster change                    | Clock offset step or frequency ramp.                         | Log grandmaster ID and holdover state; reject large time jumps.
PPS present without time-of-week          | One-second phase is right but absolute epoch is wrong.       | Verify UTC/GNSS time validity before accepting PPS-derived time.
Rolling shutter treated as global shutter | Projection residual varies by image row during motion.       | Use global shutter or estimate row time.
Whole LiDAR scan treated as instantaneous | Curved walls and doubled edges during turns.                 | Use per-point timing and deskew.
Correction stream latency ignored         | RTK fix remains flagged but position lags or jumps.          | Monitor correction age and receiver quality flags.
Replay uses wall-clock receive time       | Offline results differ from online behavior.                 | Preserve acquisition time and replay policy explicitly.

Useful runtime monitors:

  • PTP offset and path delay by interface
  • sensor clock offset estimates
  • message age by sensor and topic
  • out-of-order timestamp counts
  • dropped trigger or sequence counts
  • LiDAR-camera reprojection residual vs vehicle speed and yaw rate
  • radar velocity residual vs IMU/wheel ego-motion
  • map residual vs scan time and deskew status
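The out-of-order counter in the list above is one of the simplest monitors to implement. A minimal per-stream sketch:

```python
class TimestampMonitor:
    """Count out-of-order acquisition timestamps in one sensor stream."""

    def __init__(self) -> None:
        self.last_ns = None       # highest acquisition time seen so far
        self.out_of_order = 0     # messages that arrived with older stamps
        self.total = 0

    def observe(self, t_ns: int) -> None:
        self.total += 1
        if self.last_ns is not None and t_ns < self.last_ns:
            self.out_of_order += 1
        if self.last_ns is None or t_ns > self.last_ns:
            self.last_ns = t_ns
```

A nonzero count on a stream that claims hardware timestamping is a strong hint that a driver has silently fallen back to host or publish time.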

9. Source-Backed Design Rules

  1. Treat acquisition time as the measurement timestamp.
  2. Store publish and receive times as diagnostics, not substitutes.
  3. Use hardware trigger, PPS, PTP, or gPTP for production fusion paths.
  4. Include clock offset and drift in the error budget.
  5. Turn temporal offset into an estimated calibration state when hardware synchronization cannot be guaranteed.
  6. Validate under motion, CPU load, network load, GNSS outage, and replay.
  7. Make logs traceable enough to reconstruct both physical time and software processing time.
