Lesson 1. GPS/IMU and time synchronisation: RTK, PPP options, IMU drift traits, timestamping and sync protocols
This part introduces GPS, IMU, and timing needs. It compares RTK and PPP, explains IMU drift traits, and covers timestamping and sync protocols for accurate, low-latency sensor fusion and positioning.
- GNSS accuracy and availability limits
- RTK and PPP correction strategies
- IMU bias, noise, and drift models
- Time bases and timestamp policies
- PPS, PTP, and IEEE 1588 usage
- Clock monitoring and fault handling
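As a small taste of the sync material, the sketch below works through the standard two-way offset and delay calculation used by PTP/IEEE 1588. The timestamp values and the alarm threshold are made up for illustration.

```python
# Minimal sketch of the PTP (IEEE 1588) two-way time exchange.
# t1: master sends Sync, t2: slave receives it,
# t3: slave sends Delay_Req, t4: master receives it.
# All values are illustrative, in seconds.

def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Return (offset, path_delay) of the slave clock relative to the master,
    assuming a symmetric network path."""
    path_delay = ((t4 - t1) - (t3 - t2)) / 2.0
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    return offset, path_delay

# Example exchange: the slave clock runs about 120 us ahead of the master.
t1, t2, t3, t4 = 10.000000, 10.000170, 10.000400, 10.000330
offset, delay = ptp_offset_and_delay(t1, t2, t3, t4)

ALARM_THRESHOLD_S = 50e-6  # illustrative clock-monitoring limit
print(f"offset = {offset*1e6:.1f} us, path delay = {delay*1e6:.1f} us")
if abs(offset) > ALARM_THRESHOLD_S:
    print("clock offset out of tolerance: flag the sensor's timestamps")
```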
Lesson 2. Perception stack parts: detection, classification, tracking, lane model estimation, gap acceptance estimation
This part breaks down the perception stack into detection, classification, tracking, and lane modelling. It also looks at gap acceptance estimation and how these parts work together for safe lane keeping and manoeuvre choices.
- Object detection and region proposals
- Object classification and attributes
- Multi-object tracking and ID management
- Lane model estimation and quality
- Gap acceptance and TTC estimation
- Interfaces to planning and control
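To make TTC and gap acceptance concrete, here is a minimal sketch of both checks. The 1.5 s minimum time gap and the example speeds and ranges are illustrative assumptions, not values from the lesson.

```python
# Minimal sketch of time-to-collision (TTC) and a simple time-gap acceptance
# check for a lane change. Thresholds and values are illustrative only.

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """TTC to the object ahead; infinite if we are not closing on it."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps

def gap_accepted(lead_gap_m: float, rear_gap_m: float,
                 ego_speed_mps: float, rear_vehicle_speed_mps: float,
                 min_time_gap_s: float = 1.5) -> bool:
    """Accept a lane-change gap only if both the gap to the new lead vehicle
    and the gap left to the vehicle behind exceed a minimum time gap."""
    lead_time_gap = lead_gap_m / max(ego_speed_mps, 0.1)
    rear_time_gap = rear_gap_m / max(rear_vehicle_speed_mps, 0.1)
    return lead_time_gap >= min_time_gap_s and rear_time_gap >= min_time_gap_s

print(time_to_collision(range_m=60.0, closing_speed_mps=10.0))   # 6.0 s
print(gap_accepted(lead_gap_m=45.0, rear_gap_m=50.0,
                   ego_speed_mps=30.0, rear_vehicle_speed_mps=32.0))  # True
```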
Lesson 3. Sensor roles by function: splitting duties for lane keeping, object detection/tracking, and positioning
This part assigns roles to each sensor type. You'll see how radar, lidar, cameras, and GNSS/IMU share duties for lane keeping, object detection and tracking, and positioning in a balanced, fault-tolerant setup.
- Lane keeping sensing responsibilities
- Object detection and confirmation roles
- Longitudinal and lateral tracking duties
- Localization and map alignment roles
- Redundancy and graceful degradation
- Role allocation for highway pilot
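A role split like the one discussed here can be captured in a simple allocation table. The sketch below is one illustrative assignment of primary and confirming sensors with a degradation note per function; it is an example, not a prescribed design.

```python
# Illustrative role-allocation table for a highway pilot: which sensor is
# primary for each function, which confirms it, and what happens when the
# primary degrades. The assignments are an example, not a fixed requirement.

ROLE_ALLOCATION = {
    "lane_keeping": {
        "primary": "front_camera",
        "confirming": "hd_map + gnss_imu",
        "on_primary_loss": "hold lane with map-relative localisation, reduce speed",
    },
    "object_detection_tracking": {
        "primary": "front_radar",
        "confirming": "front_camera, lidar",
        "on_primary_loss": "increase following distance, camera/lidar-only tracking",
    },
    "positioning": {
        "primary": "gnss_imu",
        "confirming": "lidar/camera map matching",
        "on_primary_loss": "dead reckoning for a bounded time, then driver handover",
    },
}

for function, roles in ROLE_ALLOCATION.items():
    print(f"{function}: primary={roles['primary']}, confirming={roles['confirming']}")
```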
Lesson 4. Calibration, extrinsics, and online self-checks: calibration checks, boresight checks, and integrity monitoring
This part focuses on calibration and integrity checks. It covers intrinsic and extrinsic calibration, boresight checks, online self-checks, and health metrics that spot misalignment or sensor faults before they affect safety.
- Intrinsic calibration of cameras and lidar
- Extrinsic calibration between sensors
- Boresight checks for radar and cameras
- Online self-checks and residual tests
- Health metrics and fault thresholds
- Recalibration triggers and workflows
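One common online self-check is a boresight residual monitor: for objects matched between two sensors, watch the mean azimuth residual and flag a likely misalignment if it drifts. The sketch below assumes an illustrative 0.5 degree limit and a 200-sample window.

```python
# Minimal sketch of an online boresight self-check: for objects matched between
# radar and camera, monitor the mean azimuth residual over a sliding window and
# flag a possible misalignment if it exceeds a threshold. The threshold and
# window size are illustrative assumptions.
from collections import deque

class BoresightMonitor:
    def __init__(self, window: int = 200, limit_deg: float = 0.5):
        self.residuals = deque(maxlen=window)
        self.limit_deg = limit_deg

    def add_match(self, radar_azimuth_deg: float, camera_azimuth_deg: float) -> None:
        self.residuals.append(radar_azimuth_deg - camera_azimuth_deg)

    def misaligned(self) -> bool:
        if len(self.residuals) < self.residuals.maxlen:
            return False  # not enough evidence yet
        mean_residual = sum(self.residuals) / len(self.residuals)
        return abs(mean_residual) > self.limit_deg

monitor = BoresightMonitor()
for _ in range(200):
    monitor.add_match(radar_azimuth_deg=2.8, camera_azimuth_deg=2.1)  # ~0.7 deg bias
print(monitor.misaligned())  # True: a consistent bias suggests a boresight error
```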
Lesson 5. Typical car sensor specs: front radar (ranges, resolution, update rate, field of view)
This part reviews front radar specs and their design impact. It covers maximum range, range and speed resolution, update rate, and field of view, and how these affect highway cut-in detection, tracking stability, and safety buffers.
- Maximum and minimum detection range
- Range, angle, and velocity resolution
- Update rate and tracking latency
- Horizontal and vertical field of view
- Multi-path, clutter, and interference
- Highway pilot radar performance needs
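The headline radar numbers follow from two standard FMCW relations: range resolution c/(2B) and velocity resolution lambda/(2*Tcpi). The sketch below evaluates them for an assumed 77 GHz radar with a 500 MHz sweep and a 20 ms processing interval; the figures are illustrative, not taken from a specific datasheet.

```python
# Back-of-the-envelope FMCW radar resolution estimates.
# Range resolution:    dR = c / (2 * B)         (B = sweep bandwidth)
# Velocity resolution: dv = lambda / (2 * Tcpi) (Tcpi = coherent processing interval)
# The example numbers are illustrative, not a specific product's datasheet.

C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    return C / (2.0 * bandwidth_hz)

def velocity_resolution_mps(carrier_hz: float, cpi_s: float) -> float:
    wavelength = C / carrier_hz
    return wavelength / (2.0 * cpi_s)

# Assumed 77 GHz front radar with a 500 MHz sweep and a 20 ms processing interval.
print(f"range resolution:    {range_resolution_m(500e6):.2f} m")              # ~0.30 m
print(f"velocity resolution: {velocity_resolution_mps(77e9, 0.02):.3f} m/s")  # ~0.10 m/s
```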
Lesson 6. Typical car camera specs: resolution, frame rate, dynamic range, lens FOV, mounting and calibration needs
This part covers the key camera specs for self-driving. It addresses resolution, frame rate, dynamic range, lens field of view, and mounting and calibration needs, linking each to lane detection, object recognition, and fusion.
- Image resolution and pixel size
- Frame rate and exposure control
- Dynamic range and HDR techniques
- Lens FOV and distortion profiles
- Mounting rigidity and placement
- Intrinsic and extrinsic calibration
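A quick way to connect resolution and FOV to detection range is to count the pixels a target subtends. The sketch below assumes an ideal, undistorted lens and uses an illustrative 1920-pixel-wide image with a 60 degree horizontal FOV.

```python
# How many pixels a target subtends at a given distance, assuming an ideal
# (undistorted) lens. Camera resolution, FOV, and target size are illustrative.
import math

def pixels_on_target(image_width_px: int, hfov_deg: float,
                     target_width_m: float, distance_m: float) -> float:
    px_per_deg = image_width_px / hfov_deg
    angular_width_deg = math.degrees(2.0 * math.atan(target_width_m / (2.0 * distance_m)))
    return angular_width_deg * px_per_deg

# 1920 px wide image, 60 deg horizontal FOV, 1.8 m wide car at 100 m and 200 m.
print(f"{pixels_on_target(1920, 60.0, 1.8, 100.0):.1f} px")  # ~33 px
print(f"{pixels_on_target(1920, 60.0, 1.8, 200.0):.1f} px")  # ~17 px
```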
Lesson 7. Typical car lidar specs: range, angular resolution, point rate, weather performance, mounting considerations
This part explains car lidar specs and trade-offs. You'll look at range, angular resolution, point rate, weather and dirt performance, and mounting limits that shape coverage, blockages, and fusion design.
- Detection range and reflectivity limits
- Horizontal and vertical angular resolution
- Point rate, scan pattern, and density
- Rain, fog, and dust performance
- Vibration, height, and occlusion issues
- Cleaning, heating, and contamination
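The same pixels-on-target reasoning applies to lidar via angular resolution. The sketch below estimates how many points land on a car-sized target at range; the resolutions and target dimensions are illustrative.

```python
# Rough estimate of how many lidar points land on a target at range, from the
# sensor's horizontal and vertical angular resolution. All numbers illustrative.
import math

def points_on_target(target_w_m: float, target_h_m: float, distance_m: float,
                     h_res_deg: float, v_res_deg: float) -> float:
    ang_w = math.degrees(2.0 * math.atan(target_w_m / (2.0 * distance_m)))
    ang_h = math.degrees(2.0 * math.atan(target_h_m / (2.0 * distance_m)))
    return (ang_w / h_res_deg) * (ang_h / v_res_deg)

# 1.8 m x 1.5 m car rear at 100 m, with 0.1 deg horizontal / 0.2 deg vertical resolution.
print(f"{points_on_target(1.8, 1.5, 100.0, 0.1, 0.2):.0f} points")  # ~44 points
```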
Lesson 8. Sensor fusion setups: low-, mid-, high-level fusion trade-offs and recommended approach for highway pilot
This part reviews sensor fusion setups and trade-offs. It compares low-, mid-, and high-level fusion, then recommends a mid-level approach for highway pilot, stressing robustness, low latency, and architectural simplicity.
- Low-level fusion and raw data sharing
- Mid-level fusion with object lists
- High-level fusion of decisions
- Latency, bandwidth, and compute costs
- Failure isolation and redundancy
- Highway pilot fusion reference design
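Mid-level fusion works on per-sensor object lists, so the core step is association. The sketch below is a deliberately simple greedy nearest-neighbour match with a Euclidean gate; the 2.5 m gate is an assumption, and a production tracker would normally use Mahalanobis gating and an optimal assignment step.

```python
# Minimal sketch of mid-level fusion: greedily associate radar and camera
# object lists by position, with a simple Euclidean gate. The 2.5 m gate and
# the greedy matching are illustrative simplifications.
import math

radar_objects  = [{"id": "r1", "x": 52.0, "y": 0.3}, {"id": "r2", "x": 80.5, "y": 3.6}]
camera_objects = [{"id": "c7", "x": 51.2, "y": 0.1}, {"id": "c9", "x": 120.0, "y": -3.5}]

def associate(list_a, list_b, gate_m=2.5):
    pairs, used_b = [], set()
    for a in list_a:
        best, best_d = None, gate_m
        for b in list_b:
            if b["id"] in used_b:
                continue
            d = math.hypot(a["x"] - b["x"], a["y"] - b["y"])
            if d < best_d:
                best, best_d = b, d
        if best is not None:
            used_b.add(best["id"])
            pairs.append((a["id"], best["id"], best_d))
    return pairs

print(associate(radar_objects, camera_objects))  # [('r1', 'c7', ~0.82)]
```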
Lesson 9. HD map data features: lane-level geometry, speed limits, merge tags, lane links, confidence and versioning
This part details HD map lane geometry, features, and metadata. You'll see how speed limits, lane links, merge tags, confidence, and versioning support planning, positioning, and safe manoeuvres as road layouts change.
- Lane centerlines and boundaries
- Lane-level speed limits and rules
- Merge, split, and turn lane tagging
- Lane connectivity graphs and topology
- Confidence scores and freshness flags
- Map versioning and change management
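Lane-level map content of this kind is often handled as small records plus a connectivity graph. The sketch below uses made-up field names to show a lane record with successors, a merge tag, a confidence score, and a map version, plus a successor lookup over it; it does not follow any particular HD map format.

```python
# Illustrative lane-level map records: connectivity, speed limit, merge tags,
# confidence, and version. Field names are invented for the example and do not
# match any specific HD map schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Lane:
    lane_id: str
    speed_limit_kph: float
    successors: List[str] = field(default_factory=list)
    merge_into: Optional[str] = None   # lane this one merges into, if any
    confidence: float = 1.0            # 0..1 quality/freshness score
    map_version: str = "2024.06"

lanes = {
    "L100": Lane("L100", 120.0, successors=["L101"]),
    "L101": Lane("L101", 120.0, successors=["L102"], confidence=0.8),
    "L200": Lane("L200", 80.0, successors=["L101"], merge_into="L101"),
}

def next_lanes(lane_id: str):
    """Follow the connectivity graph one step forward."""
    return [lanes[s] for s in lanes[lane_id].successors if s in lanes]

for lane in next_lanes("L200"):
    print(lane.lane_id, lane.speed_limit_kph, lane.confidence, lane.map_version)
```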