
Technology Readiness Assessment

POC-by-POC Readiness, Dependencies, and Go/No-Go Criteria


Technology Readiness Levels (TRL)

| TRL | Definition | Deployment Evidence Lens |
|---|---|---|
| 1 | Basic principles observed | Research paper, standard, or credible technical principle published |
| 2 | Technology concept formulated | Architecture or operating concept defined for an explicit ODD |
| 3 | Proof of concept | Works on recorded data, benchmark data, or a narrow offline prototype |
| 4 | Lab validation | Works in simulation, bench test, or controlled replay with measurable criteria |
| 5 | Relevant environment validation | Works on vehicle, robot, or field equipment in shadow mode or supervised trials |
| 6 | Prototype demonstrated | Performs the target task autonomously in a controlled area with fallback controls |
| 7 | System prototype in operational environment | Runs in the real ODD with operational monitors, evidence logging, and stakeholder acceptance |
| 8 | System complete and qualified | Safety case, operating procedure, and release evidence accepted for the scoped deployment |
| 9 | System proven in operations | Multi-site or fleet-scale operation with post-market monitoring and incident learning |

POC Readiness Matrix

POC 1: Self-Supervised Scene Prediction (Occupancy World Model)

| Component | Current TRL | Target TRL | Blocker | Risk |
|---|---|---|---|---|
| PointPillars BEV encoder | 6 (nuScenes pretrained, TensorRT on Orin proven) | 7 | Airside fine-tuning data | Low |
| VQ-VAE/FSQ tokenizer | 4 (OccWorld proven on nuScenes) | 6 | Custom training on airside BEV | Low |
| Transformer world model | 4 (OccWorld, DrivingGPT proven) | 6 | Airside fine-tuning, compute | Medium |
| Self-supervised training pipeline | 3 (concept from comma.ai, Dreamer) | 5 | Data pipeline from bags | Medium |
| Occupancy prediction quality | 2 (untested on airside data) | 5 | Domain gap, novel objects | Medium |

**Overall TRL:** 2-3 → **Target:** 5. **Go/No-Go:** Achieve IoU > 0.3 at 1 s on airside replay data within 4 weeks.
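The go/no-go metric above is a standard occupancy IoU between the predicted grid at the 1 s horizon and the observed grid at that timestamp. A minimal sketch, assuming BEV occupancy is available as 2D arrays (the function name and binarization threshold are illustrative, not part of any specific codebase):

```python
import numpy as np

def occupancy_iou(pred: np.ndarray, truth: np.ndarray, threshold: float = 0.5) -> float:
    """IoU between a predicted occupancy grid (probabilities in [0, 1]) and a
    ground-truth binary grid, both shaped (H, W) in BEV."""
    p = pred >= threshold                  # binarize predicted occupancy
    t = truth.astype(bool)
    union = np.logical_or(p, t).sum()
    if union == 0:                         # both grids empty: define IoU as 1
        return 1.0
    return float(np.logical_and(p, t).sum() / union)

# Toy example: prediction recovers 2 of 3 occupied cells, no false positives.
truth = np.zeros((4, 4)); truth[1, 1] = truth[1, 2] = truth[2, 2] = 1
pred = np.zeros((4, 4)); pred[1, 1] = pred[1, 2] = 0.9
print(occupancy_iou(pred, truth))  # 2 / 3 ≈ 0.667
```

In practice this would be averaged over replay frames, with the prediction rolled out from tokens one second ahead of the frame it is scored against.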

POC 2: Learned 3D Detection (CenterPoint)

| Component | Current TRL | Target TRL | Blocker | Risk |
|---|---|---|---|---|
| CenterPoint/PointPillars | 7 (production in Waymo, Autoware) | 7 | None — proven | Low |
| nuScenes pretrained model | 6 (OpenPCDet, TensorRT proven) | 7 | None | Low |
| Auto-labeling pipeline | 4 (concept proven, tools exist) | 6 | Label quality on airside | Medium |
| Custom class training | 3 (airside classes undefined) | 6 | Class definition, annotation | Medium |
| TensorRT on Orin | 7 (6.84 ms measured) | 8 | None | Low |

**Overall TRL:** 4 → **Target:** 7. **Go/No-Go:** Achieve mAP > 40% on 10+ airside classes with < 25 ms inference on Orin.
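For reference, per-class AP on nuScenes-style detectors is computed with center-distance matching rather than box IoU. A minimal single-class sketch (the function, match threshold, and toy inputs are illustrative, not the official nuScenes devkit implementation):

```python
import numpy as np

def average_precision(preds, gts, match_dist=2.0):
    """Single-class AP with center-distance matching.
    preds: list of (score, x, y); gts: list of (x, y) ground-truth centers."""
    preds = sorted(preds, key=lambda p: -p[0])     # highest confidence first
    matched = [False] * len(gts)
    tp, fp = [], []
    for score, px, py in preds:
        best, best_d = -1, match_dist
        for i, (gx, gy) in enumerate(gts):         # greedy nearest unmatched GT
            d = ((px - gx) ** 2 + (py - gy) ** 2) ** 0.5
            if not matched[i] and d <= best_d:
                best, best_d = i, d
        if best >= 0:
            matched[best] = True; tp.append(1); fp.append(0)
        else:
            tp.append(0); fp.append(1)
    tp, fp = np.cumsum(tp), np.cumsum(fp)
    recall = tp / max(len(gts), 1)
    precision = tp / np.maximum(tp + fp, 1)
    # 101-point interpolated AP
    return float(np.mean([precision[recall >= r].max(initial=0.0)
                          for r in np.linspace(0, 1, 101)]))

# Two correct detections, one false positive, two ground-truth objects.
preds = [(0.9, 0.1, 0.0), (0.8, 50.0, 50.0), (0.7, 10.2, 0.0)]
gts = [(0.0, 0.0), (10.0, 0.0)]
print(round(average_precision(preds, gts), 3))  # 0.835
```

mAP for the go/no-go is then the mean of this value over the 10+ airside classes.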

POC 3: Prediction-Aware Frenet Planner

| Component | Current TRL | Target TRL | Blocker | Risk |
|---|---|---|---|---|
| Frenet planner (existing) | 8 (production in reference airside AV stack) | 8 | None — exists | Low |
| World model cost function | 2 (concept from Think2Drive, WorldRFT) | 5 | POC 1 must work first | High |
| Batched GPU trajectory eval | 4 (18.7 ms benchmark exists) | 6 | C++/Python interop latency | Medium |
| Safety fallback to traditional | 3 (Simplex concept designed) | 6 | Arbitration logic | Medium |

**Overall TRL:** 2 → **Target:** 5. **Go/No-Go:** Shadow-mode agreement > 80% with the production planner; total latency < 200 ms. **Dependency:** Requires POC 1 working.
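The "safety fallback to traditional" row refers to Simplex-style arbitration: the learned planner's output is used only while it stays inside a monitored safety envelope, otherwise the certified Frenet trajectory takes over. A minimal sketch, with hypothetical envelope checks (clearance and acceleration limits) standing in for the real arbitration logic:

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    min_clearance_m: float   # closest predicted approach to any obstacle
    max_accel_mps2: float    # peak commanded acceleration
    source: str

def simplex_arbitrate(learned: Trajectory, baseline: Trajectory,
                      min_clearance_m: float = 2.0,
                      max_accel_mps2: float = 2.5) -> Trajectory:
    """Simplex-style decision logic: accept the learned (high-performance)
    trajectory only while it satisfies the safety envelope; otherwise fall
    back to the certified baseline planner."""
    safe = (learned.min_clearance_m >= min_clearance_m
            and learned.max_accel_mps2 <= max_accel_mps2)
    return learned if safe else baseline

learned = Trajectory(min_clearance_m=1.2, max_accel_mps2=1.0, source="world-model")
baseline = Trajectory(min_clearance_m=3.0, max_accel_mps2=1.5, source="frenet")
print(simplex_arbitrate(learned, baseline).source)  # frenet: clearance violated
```

In shadow mode the arbiter only logs which trajectory it would have chosen, which is exactly the agreement statistic the go/no-go criterion measures.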

POC 4: Jet Blast Hazard Mapping

| Component | Current TRL | Target TRL | Blocker | Risk |
|---|---|---|---|---|
| ADS-B receiver + decoding | 7 (dump1090/readsb mature) | 8 | $30 hardware purchase | Very Low |
| Aircraft type lookup table | 5 (CFD data published for major types) | 7 | Incomplete for all types | Low |
| Hazard zone computation | 4 (geometry straightforward) | 7 | Validation against real data | Low |
| ROS integration | 3 (not built yet) | 7 | Simple engineering | Very Low |
| Planner integration | 3 (zone → cost function mapping) | 6 | Zone manager compatibility | Low |

**Overall TRL:** 3 → **Target:** 7. **Go/No-Go:** Correct zone visualization for 5+ aircraft types in RViz; zero false negatives.
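The hazard zone computation is plain geometry once the per-type blast envelope is known: sweep a trapezoid rearward from the aircraft position along the reverse of its ADS-B heading. A minimal sketch; the envelope figures in `BLAST_TABLE` are illustrative placeholders, not published CFD values:

```python
import math

# Hypothetical per-type blast envelope (length behind tail in m,
# half-width at the far end in m). Real values come from the CFD lookup table.
BLAST_TABLE = {"A320": (120.0, 25.0), "B738": (130.0, 25.0), "B77W": (250.0, 40.0)}

def jet_blast_polygon(x, y, heading_rad, ac_type):
    """Trapezoidal hazard zone extending rearward from the aircraft position.
    Returns four (x, y) corners in map frame."""
    length, half_w = BLAST_TABLE[ac_type]
    bx, by = -math.cos(heading_rad), -math.sin(heading_rad)  # rearward unit vector
    lx, ly = -by, bx                                         # lateral unit vector
    near_half_w = half_w * 0.3                               # narrower at the tail
    return [
        (x + lx * near_half_w, y + ly * near_half_w),
        (x - lx * near_half_w, y - ly * near_half_w),
        (x + bx * length - lx * half_w, y + by * length - ly * half_w),
        (x + bx * length + lx * half_w, y + by * length + ly * half_w),
    ]

# A320 at the origin heading east (0 rad): zone extends 120 m to the west.
poly = jet_blast_polygon(0.0, 0.0, 0.0, "A320")
```

Publishing this polygon as a ROS marker gives the RViz visualization the go/no-go criterion asks for; mapping it into the planner's zone manager is the separate integration row above.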

POC 5: LiDAR FOD Detection

| Component | Current TRL | Target TRL | Blocker | Risk |
|---|---|---|---|---|
| PCD map (existing) | 8 (two maps in workspace) | 8 | None | Low |
| Map differencing algorithm | 5 (open3d KD-tree, well-understood) | 7 | Threshold tuning | Low |
| Clustering + filtering | 5 (DBSCAN standard) | 7 | False positive rate | Medium |
| Persistence filtering | 3 (multi-frame confirmation) | 6 | Tracking across frames | Low |
| ROS node | 3 (not built yet) | 7 | Simple engineering | Very Low |

**Overall TRL:** 3 → **Target:** 7. **Go/No-Go:** Detect a 10 cm object at 25 m range with < 5 false alarms/hour.
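The differencing and clustering stages can be sketched compactly. This version uses scipy's KD-tree in place of open3d and a simple connected-components pass standing in for DBSCAN; thresholds and the toy data are illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def fod_candidates(scan: np.ndarray, base_map: np.ndarray,
                   diff_thresh: float = 0.3) -> np.ndarray:
    """Scan points farther than diff_thresh from every point in the prior
    PCD map are change candidates (possible FOD)."""
    dist, _ = cKDTree(base_map).query(scan, k=1)
    return scan[dist > diff_thresh]

def cluster(points: np.ndarray, eps: float = 0.5, min_pts: int = 3):
    """DBSCAN-like grouping: connected components of the eps-neighborhood
    graph, keeping components with at least min_pts points."""
    if len(points) == 0:
        return []
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]; i = parent[i]
        return i
    for a, b in cKDTree(points).query_pairs(eps):  # union-find over close pairs
        parent[find(a)] = find(b)
    groups = {}
    for i in range(len(points)):
        groups.setdefault(find(i), []).append(i)
    return [points[idx] for idx in groups.values() if len(idx) >= min_pts]

# Toy example: flat 1 m grid map plus a small object 0.5 m above it.
gx, gy = np.meshgrid(np.arange(10.0), np.arange(10.0))
base = np.c_[gx.ravel(), gy.ravel(), np.zeros(100)]
obj = np.array([[5.5, 5.5, 0.5], [5.6, 5.5, 0.5], [5.5, 5.6, 0.5], [5.6, 5.6, 0.5]])
scan = np.vstack([base, obj])
clusters = cluster(fod_candidates(scan, base))
print(len(clusters))  # 1 candidate object
```

Persistence filtering then requires a cluster centroid to reappear within a gating radius over several consecutive frames before an alert is raised, which is the main lever against the false-alarm budget in the go/no-go criterion.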

POC 6: 3DGS Digital Twin

| Component | Current TRL | Target TRL | Blocker | Risk |
|---|---|---|---|---|
| 3DGS training from PCD | 5 (gsplat, GS-LiDAR proven) | 6 | Airport-scale (large scene) | Medium |
| Novel view rendering | 6 (3DGS mature, 135 FPS) | 7 | Ground plane quality | Medium |
| Dynamic object removal | 4 (DeSiRe-GS approach exists) | 5 | Aircraft/GSE removal | Medium |
| Synthetic LiDAR rendering | 3 (GS-LiDAR research stage) | 5 | Ray-casting through Gaussians | High |
| Airport-scale tiling | 3 (CityGaussian concept exists) | 5 | Engineering effort | Medium |

**Overall TRL:** 3 → **Target:** 5. **Go/No-Go:** Render novel views with PSNR > 25 dB; ground plane smooth enough for planning.
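The PSNR threshold is computed against held-out camera views in the usual way; a minimal sketch (function name is illustrative, images assumed as float arrays in [0, 1]):

```python
import numpy as np

def psnr_db(rendered: np.ndarray, reference: np.ndarray, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio between a rendered novel view and a
    held-out reference image."""
    mse = np.mean((rendered.astype(np.float64) - reference.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return float(10.0 * np.log10(peak ** 2 / mse))

# An image whose pixels are uniformly off by 0.05 scores
# 10 * log10(1 / 0.0025) ≈ 26 dB, just above the 25 dB gate.
ref = np.zeros((8, 8))
out = ref + 0.05
print(round(psnr_db(out, ref), 2))  # 26.02
```

Note PSNR says nothing about geometric smoothness, so the ground-plane half of the criterion needs a separate check (e.g. height variance of rendered depth over known-flat apron areas).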

POC 7: Open-Vocab GSE Detection (Requires Cameras)

| Component | Current TRL | Target TRL | Blocker | Risk |
|---|---|---|---|---|
| YOLO-World | 7 (production-ready, 52 FPS) | 8 | Camera hardware needed | Low |
| Airside prompt library | 2 (designed but untested) | 5 | Prompt tuning on real images | Medium |
| 2D-to-3D lifting | 4 (frustum projection well-understood) | 6 | LiDAR-camera calibration | Medium |
| TensorRT on Orin | 5 (YOLO TensorRT proven) | 7 | Re-parameterization export | Low |

**Overall TRL:** 2 → **Target:** 6. **Go/No-Go:** Detect 10+ GSE types at > 50% recall, zero-shot, at < 50 ms on Orin. **Dependency:** Requires camera hardware.
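The 2D-to-3D lifting row is the standard frustum trick: project LiDAR points into the image, keep those inside the 2D detection box, and summarize them into a 3D position. A minimal sketch assuming points are already transformed into the camera frame via the LiDAR-camera extrinsics (function name and robust statistic are illustrative):

```python
import numpy as np

def lift_box_to_3d(points_cam: np.ndarray, K: np.ndarray, box_xyxy):
    """Frustum lifting: project camera-frame LiDAR points (N x 3, z forward)
    through intrinsics K, keep those inside the 2D box, and return the
    median 3D point as the object position (None if the frustum is empty)."""
    z = points_cam[:, 2]
    valid = z > 0.1                              # keep points in front of the camera
    uv = (K @ points_cam[valid].T).T
    uv = uv[:, :2] / uv[:, 2:3]                  # perspective divide
    x1, y1, x2, y2 = box_xyxy
    inside = ((uv[:, 0] >= x1) & (uv[:, 0] <= x2)
              & (uv[:, 1] >= y1) & (uv[:, 1] <= y2))
    hits = points_cam[valid][inside]
    return None if len(hits) == 0 else np.median(hits, axis=0)

K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
pts = np.array([[1.0, 0.0, 10.0], [1.1, 0.0, 10.0],
                [0.9, 0.0, 10.0], [-5.0, 0.0, 10.0]])
center = lift_box_to_3d(pts, K, (350, 220, 390, 260))  # -> [1.0, 0.0, 10.0]
```

The median keeps a stray background point in the frustum from dragging the estimate off the object, which is why calibration error is the main risk in that row.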

POC 8: Turnaround Phase Estimator

| Component | Current TRL | Target TRL | Blocker | Risk |
|---|---|---|---|---|
| A-CDM data ingestion | 3 (API endpoints documented) | 5 | Airport data access agreement | High |
| GBRT/LSTM model | 5 (well-understood ML problem) | 6 | Training data availability | Medium |
| Flight schedule integration | 4 (AODB APIs documented) | 6 | Integration complexity | Medium |
| Phase prediction accuracy | 2 (untested) | 5 | Need labeled turnaround data | High |

**Overall TRL:** 2 → **Target:** 5. **Go/No-Go:** 80% phase accuracy and ±5 min pushback prediction. **Dependency:** Requires airport operations data access.
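Before any learned model, phase accuracy can be baselined with a milestone-driven state machine over A-CDM-style events. The event names and phase boundaries below are hypothetical placeholders; real milestone codes and phase splits depend on the airport's A-CDM implementation:

```python
# Hypothetical mapping from milestone events to the turnaround phase
# that begins when the event is observed.
PHASE_AFTER_EVENT = [
    ("AIBT", "deboarding"),                 # actual in-block time
    ("deboarding_complete", "servicing"),
    ("fueling_complete", "boarding"),
    ("doors_closed", "pushback_pending"),
    ("AOBT", "departed"),                   # actual off-block time
]

def estimate_phase(events_seen: set) -> str:
    """Return the latest phase whose triggering milestone has been observed."""
    phase = "pre_arrival"
    for event, next_phase in PHASE_AFTER_EVENT:
        if event in events_seen:
            phase = next_phase
    return phase

print(estimate_phase({"AIBT", "deboarding_complete"}))  # servicing
```

The GBRT/LSTM model then has to beat this rule-based baseline on both phase accuracy and pushback-time error to justify its training-data cost.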


Critical Path Analysis

```
Week 1-2:  POC 4 (Jet Blast) ─── No dependencies, immediate start
           POC 5 (FOD) ───────── No dependencies, immediate start

Week 2-4:  POC 2 (Detection) ─── Needs: auto-labeling pipeline
           POC 1 (World Model) ── Needs: BEV encoder + bag processing

Week 4-6:  POC 3 (Planner) ───── Needs: POC 1 working
           POC 6 (Digital Twin) ─ Needs: compute (A100 cloud)

Week 6-8:  POC 7 (Open-Vocab) ── Needs: camera hardware
           POC 8 (Turnaround) ── Needs: airport data access
```

Minimum viable demonstration (Week 4): POC 4 (jet blast zones on map) + POC 5 (FOD alerts) + POC 2 (10+ object detection) = Tangible safety improvements with zero world model dependency.

World model demonstration (Week 6): POC 1 (occupancy prediction visualization) + POC 3 (prediction-aware planner in shadow mode) = First proof that world models add value for airside.


Infrastructure Requirements

| Requirement | POCs That Need It | Cost | Lead Time |
|---|---|---|---|
| Cloud GPU (1x A100) | 1, 2, 6 | $200-500 | 1 day |
| ADS-B receiver (RTL-SDR) | 4 | $30 | 3 days shipping |
| Camera hardware (6-8 cameras) | 7 | $500-2,000 | 2-4 weeks |
| Airport operations data access | 8 | $0 (partnership) | 4-12 weeks |
| Bag file organization | 1, 2, 5 | $0 (engineering time) | 1-2 weeks |
| NVIDIA Orin (if not already available) | 2, 7 | $1,000-2,000 | 1-2 weeks |

Assessment based on technology analysis across 153 research documents. TRL definitions adapted from NASA/ISO 16290 for airside AV context.

Compiled from publicly available research notes.