Sparse LiDAR Map Cleaning Validation

Last updated: 2026-05-09

Sparse LiDAR makes map cleaning harder because missing evidence can look like empty space, dynamic artifacts can leak through moving objects, and aggressive cleaners can erase valid structure. Airside sites amplify the problem: open aprons offer few vertical landmarks, long sight lines, reflective aircraft surfaces, low fixtures, and frequently moved temporary equipment.

Validation Claim

For the validated LiDAR configuration and airport zones, the map-cleaning pipeline remains conservative under sparse observations: it removes dynamic artifacts only when supported by geometric, temporal, radiometric, or review evidence, and it preserves localization-critical static structure.

Sparse Failure Modes

| Failure | Sparse-data cause | Airside example | Safety impact |
| --- | --- | --- | --- |
| Dynamic artifact retained | rays pass through/around moving object and hit background | tug or bus crossing a stand survey | ghost structure pollutes localization map |
| Static erosion | cleaner overreacts to missing returns | pole, sign, chock, curb, terminal edge | lost localization or boundary evidence |
| False free space | unobserved cells treated as empty | aircraft-occluded service lane | planner assumes clearance it does not have |
| Intensity misclassification | range, incidence, wet surface, or sensor aging shifts intensity | reflective markings or aircraft skin | valid asset deleted or ghost retained |
| Reversion failure | removed static points are not restored | low vertical fixtures and sparse poles | map quality degrades silently |
| Parameter brittleness | thresholds tuned on dense road LiDAR | 16/32-beam apron pass | cleaner does not transfer |
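The "false free space" row is the cheapest failure to reproduce in miniature: any cleaner that conflates "no return" with "empty" will mark an occluded lane as clear. A minimal sketch of naive versus conservative unknown handling (the cell states and update rules are illustrative assumptions, not any specific cleaner's API):

```python
# Three-state cell model: illustrative, not a production occupancy grid.
UNKNOWN, FREE, OCCUPIED = "unknown", "free", "occupied"

def naive_update(cell_state, observed, hit):
    # Naive cleaner: anything without a return this pass becomes free,
    # i.e. missing evidence is treated as emptiness.
    if observed and hit:
        return OCCUPIED
    return FREE

def conservative_update(cell_state, observed, hit):
    # Conservative cleaner: only cells a ray actually traversed or hit
    # change state; occluded cells keep their previous state.
    if not observed:
        return cell_state
    return OCCUPIED if hit else FREE

# A service lane occluded by a parked aircraft: no rays reach it this pass.
occluded_lane = UNKNOWN
assert naive_update(occluded_lane, observed=False, hit=False) == FREE
assert conservative_update(occluded_lane, observed=False, hit=False) == UNKNOWN
```

The planner-facing difference is exactly the "unknown-not-free" semantics gated later in this document: the naive rule hands the planner clearance it never measured.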

Test Matrix

| Dimension | Required slices | Acceptance focus |
| --- | --- | --- |
| LiDAR density | nominal, degraded beam count, packet drop, lower-resolution sensor | metric stability and safe degradation |
| Range | near, mid, long range to terminal edge and stands | static retention by range bin |
| Incidence angle | shallow ground, vertical edges, curved aircraft surfaces | intensity and geometry residual robustness |
| Weather/surface | dry, wet, night lighting, glare, de-icing residue if in ODD | false deletion and false retention changes |
| Object motion | moving GSE, parked-then-removed GSE, aircraft present/absent | dynamic rejection without static erosion |
| Pose quality | nominal, bounded jitter, time offset, GNSS-denied replay | cleaner sensitivity to registration error |
| Scene structure | open apron, terminal edge, service road, gate equipment cluster | localization observability after cleaning |
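One way to keep the matrix honest is to enumerate it as an explicit slice grid so that no combination is silently skipped. A hedged sketch over three of the dimensions above (slice names abbreviated; the dict layout is an assumption, not any test framework's API):

```python
from itertools import product

# Illustrative subset of the test matrix; real slice lists come from the table.
MATRIX = {
    "density": ["nominal", "degraded_beams", "packet_drop", "low_res"],
    "range": ["near", "mid", "long"],
    "object_motion": ["moving_gse", "parked_then_removed", "aircraft_present_absent"],
}

def slices(matrix):
    # Yield every combination of slices as a named run configuration.
    keys = list(matrix)
    for combo in product(*(matrix[k] for k in keys)):
        yield dict(zip(keys, combo))

runs = list(slices(MATRIX))
assert len(runs) == 4 * 3 * 3  # full cross product: 36 replay runs
assert runs[0] == {"density": "nominal", "range": "near",
                   "object_motion": "moving_gse"}
```

In practice some combinations are pruned as out-of-ODD, but pruning should be an explicit, reviewable step applied to the full grid rather than an implicit gap in the run list.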

Metrics

| Metric | Definition | Gate use |
| --- | --- | --- |
| Sparse static preservation | static retention by range, beam, and incidence bin | blocks over-cleaning in weak-observation zones |
| Dynamic artifact rejection | removed dynamic/artifact labels divided by all artifact labels | confirms the cleaner still works under sparsity |
| Unknown-not-free rate | unobserved or occluded cells retained as unknown instead of free | protects planning semantics |
| Intensity residual stability | calibrated intensity disagreement by material and range | checks RI-DVP-style radiometric assumptions |
| Reversion recovery | valid static points restored after aggressive candidate removal | catches one-way deletion bugs |
| Localization health | NDT/ICP score, covariance, inliers, residuals, relocalization success | final safety-relevant map quality signal |
| Reviewer escalation | percentage of sparse decisions sent to human review | confirms uncertainty is exposed |
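Two of these metrics can be sketched directly from their definitions. The point and cell tuple formats below are illustrative assumptions, and the 25 m bin width is a placeholder rather than a release value:

```python
from collections import defaultdict

def static_preservation_by_range(points, bin_m=25.0):
    """points: iterable of (range_m, was_static, retained) tuples.
    Returns per-range-bin retention of ground-truth static points."""
    kept, total = defaultdict(int), defaultdict(int)
    for rng, was_static, retained in points:
        if not was_static:
            continue  # the metric only scores ground-truth static points
        b = int(rng // bin_m)
        total[b] += 1
        kept[b] += int(retained)
    return {b: kept[b] / total[b] for b in sorted(total)}

def unknown_not_free_rate(cells):
    """cells: iterable of (observed, state) for cells in occluded regions.
    Fraction of unobserved cells correctly retained as 'unknown'."""
    unobserved = [state for observed, state in cells if not observed]
    if not unobserved:
        return 1.0  # vacuously safe: nothing was unobserved
    return sum(state == "unknown" for state in unobserved) / len(unobserved)

pts = [(10.0, True, True), (12.0, True, True),
       (60.0, True, False), (62.0, True, True)]
assert static_preservation_by_range(pts) == {0: 1.0, 2: 0.5}
assert unknown_not_free_rate([(False, "unknown"), (False, "free")]) == 0.5
```

Binning by range (and, in the full metric, by beam and incidence) is what makes the gate sensitive to the long-range, weak-observation zones where a global average would hide erosion.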

Gate Rules

| Gate | Pass condition | Blocker |
| --- | --- | --- |
| Sparse input declaration | LiDAR model, channel count, scan pattern, range limits, packet loss, and mounting are recorded | candidate uses untracked sensor assumptions |
| Conservative unknown handling | unobserved space cannot become free space without positive evidence | cleaner converts missing data into clearance |
| Cross-session evidence | removals in sparse zones are supported by repeated observations or review | one noisy pass drives permanent deletion |
| Localization replay | sparse-zone replay remains within release thresholds | residual or covariance worsens in open apron |
| Parameter lock | thresholds are frozen before holdout sparse tests | post-hoc tuning on acceptance set |
| Fallback path | sparse-data warning can block publication or require overlay review | silent acceptance of low-evidence tiles |
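The gate table translates naturally into an executable checklist that either publishes the map or names its blockers. A sketch under assumed field names and illustrative thresholds (not release values):

```python
# Gate names follow the table; the run-report fields are assumptions.
GATES = {
    "sparse_input_declaration": lambda r: r["sensor_metadata_recorded"],
    "conservative_unknown_handling": lambda r: r["unknown_to_free_no_evidence"] == 0,
    "cross_session_evidence": lambda r: r["single_pass_deletions"] == 0,
    "localization_replay": lambda r: r["worst_zone_residual"] <= r["release_residual"],
    "parameter_lock": lambda r: r["thresholds_frozen_before_holdout"],
    "fallback_path": lambda r: r["sparse_warning_can_block"],
}

def evaluate(run_report):
    # Any failed gate is a blocker; publication requires an empty list.
    blockers = [name for name, rule in GATES.items() if not rule(run_report)]
    return {"publish": not blockers, "blockers": blockers}

run = {
    "sensor_metadata_recorded": True,
    "unknown_to_free_no_evidence": 0,
    "single_pass_deletions": 3,        # one noisy pass drove deletions
    "worst_zone_residual": 0.21,
    "release_residual": 0.25,
    "thresholds_frozen_before_holdout": True,
    "sparse_warning_can_block": True,
}
assert evaluate(run) == {"publish": False,
                         "blockers": ["cross_session_evidence"]}
```

Keeping every gate as a pure predicate over a recorded run report is what makes the "parameter lock" gate auditable: the report is produced before the holdout data is touched.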

Practical Procedure

  1. Build dense-reference labels where possible using repeated slow passes, static survey, or manual inspection.
  2. Downsample and corrupt the same sequences to emulate sparse sensor modes.
  3. Run ERASOR, Removert, MapCleaner, RI-DVP-style, or production cleaner variants on the same inputs.
  4. Compare retained static, removed dynamic, restored static, unknown, and rejected layers.
  5. Replay localization on every candidate map and inspect the weak-feature apron zones first.
  6. Treat sparse disagreement as a publication risk, not as a visualization issue.
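Step 2 can be sketched with two simple corruptions: beam decimation to emulate a lower channel count, and contiguous packet drops to emulate transport loss. The point layout `(ring, azimuth_deg, range_m)`, the packet size, and the drop model are all illustrative assumptions:

```python
import random

def decimate_beams(points, keep_every=2):
    # Emulate a lower beam count (e.g. 64 -> 32) by keeping every Nth ring.
    return [p for p in points if p[0] % keep_every == 0]

def drop_packets(points, drop_prob=0.3, packet_size=32, seed=0):
    # Emulate UDP packet loss: points arrive in contiguous chunks, so loss
    # removes whole angular sectors, not random individual points.
    rng = random.Random(seed)  # seeded so the corruption is replayable
    out = []
    for i in range(0, len(points), packet_size):
        if rng.random() >= drop_prob:
            out.extend(points[i:i + packet_size])
    return out

# Toy dense sweep: 64 rings x 10 azimuth steps at a fixed 30 m range.
dense = [(ring, az, 30.0) for ring in range(64) for az in range(10)]
thinned = decimate_beams(dense)
sparse = drop_packets(thinned)
assert len(thinned) == len(dense) // 2
assert len(sparse) < len(thinned)  # at least one packet was dropped
```

Applying both corruptions to the same reference sequence is what lets the comparison in step 4 attribute metric changes to sparsity itself rather than to scene differences.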

Sources

Research notes collected from public sources.