Fisheye Camera Extrinsic EOL Calibration

This post is an application of the EOL calibration described in the EOL calibration article. Detect Image Corners: Detecting corners directly on the original fisheye image is challenging due to its severe distortion, so we instead detect the corners on the BEV image and then project them back onto the original image for further refinement. Set initial extrinsics (installation parameters (angles and positions) or parameters from joint calibration with LiDAR), construct a 20m×20m grid with a resolution of 0.01m on the ground plane (z=0) of the ego coordinate system, and generate a BEV projected image; ...
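As a rough sketch of the BEV grid construction (not the post's exact implementation), the Python snippet below samples ground-plane points in the ego frame and projects them into the fisheye image with OpenCV's fisheye model; the intrinsics `K`, distortion `D`, the initial extrinsics, and the coarser 0.1 m grid step are all placeholder assumptions.

```python
import numpy as np
import cv2

# Placeholder fisheye intrinsics (equidistant model) and distortion k1..k4.
K = np.array([[300.0, 0.0, 640.0],
              [0.0, 300.0, 480.0],
              [0.0, 0.0, 1.0]])
D = np.zeros((4, 1))

# Initial extrinsics: camera from ego (installation guess, placeholder values).
R_cam_ego, _ = cv2.Rodrigues(np.array([-np.pi / 2, 0.0, 0.0]))
t_cam_ego = np.array([0.0, 0.0, 1.2])

# Ground-plane grid in the ego frame (z = 0). A coarse 0.1 m step keeps the
# example small; the post uses 0.01 m over a 20 m x 20 m area.
xs, ys = np.meshgrid(np.arange(-10.0, 10.0, 0.1), np.arange(-10.0, 10.0, 0.1))
grid_ego = np.stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)], axis=1)

# Keep only grid points that land in front of the camera before projecting.
grid_cam = grid_ego @ R_cam_ego.T + t_cam_ego
grid_cam = grid_cam[grid_cam[:, 2] > 0.1]

# Project into the fisheye image; these pixel locations define the sampling
# pattern used to render the BEV image (and to map BEV corners back later).
pix, _ = cv2.fisheye.projectPoints(
    grid_cam.reshape(1, -1, 3), np.zeros(3), np.zeros(3), K, D)
print(pix.shape)  # (1, N, 2) pixel coordinates
```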

February 21, 2025 · 2 min · 318 words · Fuwei Li

Coordinate Systems in Autonomous Driving

In this post, we will discuss the coordinate systems commonly used in autonomous driving. In practice, different positioning providers may define their own coordinate systems. This post will introduce the fundamental concepts of these coordinate systems and explain how to convert between them for your specific needs. Pose on the Earth: When we talk about pose, we refer to the position and orientation of an object in the world. In the context of autonomous driving, the world is the Earth. Therefore, pose describes the position and orientation of an object relative to the Earth. ...
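As a minimal, illustrative sketch of how a pose (position plus orientation) lets you convert between frames, a pose can be packed into a 4x4 homogeneous transform and chained by matrix multiplication; the frame names and numbers below are made up.

```python
import numpy as np

def pose_to_T(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack a rotation matrix and a translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative poses: the ego vehicle in a world frame, and a sensor on the ego.
T_world_ego = pose_to_T(np.eye(3), np.array([100.0, 50.0, 0.0]))
T_ego_sensor = pose_to_T(np.eye(3), np.array([1.5, 0.0, 1.2]))

# Chaining poses converts between frames: the sensor pose expressed in the world.
T_world_sensor = T_world_ego @ T_ego_sensor

# A point observed in the sensor frame, expressed in world coordinates.
p_sensor = np.array([10.0, 0.0, 0.0, 1.0])  # homogeneous coordinates
p_world = T_world_sensor @ p_sensor
print(p_world[:3])
```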

February 7, 2025 · 5 min · 1000 words · Fuwei Li

A Deep Dive into End-of-Line Camera Extrinsic Calibration for Autonomous Vehicles

In this post, we will discuss end-of-line (EOL) camera calibration, especially camera bird's-eye view (BEV) extrinsic calibration. Suggested Pipeline:
1. Do single-camera intrinsic calibration
2. Measure each corner in the world coordinate system
3. Refine the corners' coordinates according to board constraints (planar, parallel, equally spaced)
4. Find the plane equation in the world coordinate system
5. Initialize each camera's extrinsic parameters by solving the perspective-n-point (PnP) problem (see the sketch after this list)
6. Estimate the 3D coordinates by the intersection of the image ray and the board plane ...
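The following is a small, self-contained sketch of the PnP initialization step (step 5), assuming OpenCV; the intrinsics and board corners are synthetic placeholders, and in the real pipeline the 2D corners come from detection rather than from a ground-truth projection.

```python
import numpy as np
import cv2

# Assumed inputs (placeholders): intrinsics from step 1, and 3D board corner
# positions measured/refined in the world frame (steps 2-4).
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)
world_corners = np.random.uniform(-1.0, 1.0, (20, 3)) + np.array([0.0, 0.0, 5.0])

# For this self-contained sketch we synthesize the 2D detections from a
# ground-truth pose; in practice they come from corner detection/refinement.
rvec_gt = np.array([0.05, -0.02, 0.01])
tvec_gt = np.array([0.1, 0.2, 0.3])
image_corners, _ = cv2.projectPoints(world_corners, rvec_gt, tvec_gt, K, dist)

# Step 5: solve PnP to obtain an initial world-to-camera extrinsic estimate.
ok, rvec, tvec = cv2.solvePnP(world_corners, image_corners, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R_cam_world, _ = cv2.Rodrigues(rvec)
T_cam_world = np.eye(4)
T_cam_world[:3, :3] = R_cam_world
T_cam_world[:3, 3] = tvec.ravel()
print(np.allclose(rvec.ravel(), rvec_gt, atol=1e-4))  # sanity check
```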

February 3, 2025 · 11 min · 2234 words · Fuwei Li

An In-Depth Guide to 3D Gaussian Splatting for Real-Time 3D Reconstruction

I highly recommend reading two papers: [5], which provides a comprehensive description of the splatting process, and [3], which provides an efficient implementation of it. Problem Formulation: Volume Rendering with Radiance Fields [1, 2]. (Figure from https://www.cs.cornell.edu/courses/cs5670/2022sp/lectures/lec22_nerf_for_web.pdf.) The volume density $\sigma(\mathbf{x})$ can be interpreted as the differential probability of a ray terminating (being absorbed or scattered) at an infinitesimal particle at location $\mathbf{x}$. ...
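The discrete form of that rendering integral is the usual alpha-compositing quadrature; below is a tiny numpy illustration with made-up per-sample densities, colors, and spacings along a single ray.

```python
import numpy as np

# Made-up samples along one camera ray: densities sigma_i, colors c_i, and
# distances delta_i between consecutive samples (placeholder values).
sigma = np.array([0.0, 0.5, 2.0, 4.0, 0.1])   # volume density at each sample
color = np.random.rand(5, 3)                  # RGB at each sample
delta = np.full(5, 0.1)                       # spacing between samples (meters)

# Standard quadrature of the volume rendering integral:
# alpha_i = 1 - exp(-sigma_i * delta_i), T_i = prod_{j<i} (1 - alpha_j),
# C = sum_i T_i * alpha_i * c_i.
alpha = 1.0 - np.exp(-sigma * delta)
T = np.concatenate([[1.0], np.cumprod(1.0 - alpha)[:-1]])
C = np.sum((T * alpha)[:, None] * color, axis=0)
print(C)  # rendered pixel color for this ray
```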

December 28, 2024 · 15 min · 3131 words · Fuwei Li

Understanding FAST-LIO: Supplementary Derivations and Explanations

In this document, we provide detailed derivations complementing those presented in [2]. Details of the Derivation: Discrete Model. Based on the $\boxplus$ operation defined above, we can discretize the continuous model in (1) at the IMU sampling period $\Delta t$ using a zero-order hold. The resultant discrete model is $$ \mathbf{x}_{i+1} = \mathbf{x}_i \boxplus (\Delta t\, f(\mathbf{x}_i, \mathbf{u}_i, \mathbf{w}_i)) $$ where $i$ is the index of IMU measurements, and the function $f$, state $\mathbf{x}$, input $\mathbf{u}$, and noise $\mathbf{w}$ are defined below: ...
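To make the discrete update concrete, here is a toy zero-order-hold propagation of an orientation/velocity/position state, where $\boxplus$ acts as $\mathbf{R}\,\mathrm{Exp}(\cdot)$ on the rotation block and as plain addition on the vector blocks. The actual FAST-LIO state also carries biases, gravity, and extrinsics, so treat this only as an illustration.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def so3_exp(phi: np.ndarray) -> np.ndarray:
    """Exponential map from an axis-angle vector in R^3 to a rotation matrix."""
    return Rotation.from_rotvec(phi).as_matrix()

def propagate(R, v, p, gyro, accel, dt, g=np.array([0.0, 0.0, -9.81])):
    """One zero-order-hold step of x_{i+1} = x_i boxplus (dt * f(x_i, u_i, 0)).

    The rotation block uses R * Exp(dt * omega); the velocity and position
    blocks reduce to ordinary vector addition. IMU biases are ignored here.
    """
    R_next = R @ so3_exp(gyro * dt)      # attitude update
    v_next = v + (R @ accel + g) * dt    # velocity update
    p_next = p + v * dt                  # position update
    return R_next, v_next, p_next

# One illustrative IMU sample at 100 Hz.
R, v, p = np.eye(3), np.zeros(3), np.zeros(3)
R, v, p = propagate(R, v, p, gyro=np.array([0.0, 0.0, 0.1]),
                    accel=np.array([0.0, 0.0, 9.81]), dt=0.01)
```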

December 27, 2024 · 7 min · 1401 words · Fuwei Li

Iterative Closest Point Uncovered: Mathematical Foundations and Applications

In this post, we will discuss the Iterative Closest Point (ICP) problem: from point-to-point and point-to-plane ICP to generalized ICP. Problem Formulation: Let two 3D point-sets $\mathcal{X} = \{\mathbf{x}_i\}, i = 1, \ldots, N$ and $\mathcal{Y} = \{\mathbf{y}_j\}, j = 1, \ldots, M$, where $\mathbf{x}_i, \mathbf{y}_j \in \mathbb{R}^3$ are point coordinates, be the data point-set and the model point-set, respectively. The goal is to estimate a rigid motion with rotation $\mathbf{R} \in SO(3)$ and translation $\mathbf{t} \in \mathbb{R}^3$ that minimizes the following $L_2$-error $E$: ...
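For point-to-point ICP with known correspondences, the inner alignment step that minimizes this error has a closed-form SVD (Kabsch/Umeyama) solution; the numpy sketch below shows that step only, omitting correspondence search and iteration.

```python
import numpy as np

def align_point_to_point(X: np.ndarray, Y: np.ndarray):
    """Closed-form R, t minimizing sum_i ||R x_i + t - y_i||^2.

    X, Y are (N, 3) arrays of corresponding data and model points.
    """
    cx, cy = X.mean(axis=0), Y.mean(axis=0)
    H = (X - cx).T @ (Y - cy)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # reflection guard
    R = Vt.T @ S @ U.T
    t = cy - R @ cx
    return R, t

# Sanity check on synthetic correspondences.
X = np.random.rand(100, 3)
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
Y = X @ R_true.T + np.array([0.5, -0.2, 1.0])
R_est, t_est = align_point_to_point(X, Y)
```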

December 26, 2024 · 19 min · 3968 words · Fuwei Li

Radar Signal Processing: A Tutorial

System Diagram: see Appendix VII, Figure 1, for the system diagram of a typical 4D mmWave radar signal processing chain (figure from [1]). Single Object Tx-Rx Model: Below is a mathematical formalization of each major step in the traditional 4D mmWave frequency-modulated continuous-wave (FMCW) radar signal processing chain, from transmitted signals through to point-cloud generation. Please note that these equations represent a general framework; actual implementations may vary slightly depending on specific system parameters and design choices. ...
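As a minimal illustration of the first stage of that chain (using placeholder radar parameters, not the post's notation), the snippet below simulates the dechirped beat signal of a single reflector and recovers its range with an FFT, using the standard relation $f_b = 2 S R / c$ for chirp slope $S$.

```python
import numpy as np

# Placeholder FMCW chirp parameters.
c = 3e8                      # speed of light (m/s)
B = 1e9                      # chirp bandwidth (Hz)
T_chirp = 50e-6              # chirp duration (s)
S = B / T_chirp              # chirp slope (Hz/s)
fs = 20e6                    # ADC sampling rate (Hz)
N = int(fs * T_chirp)        # samples per chirp

# Single reflector at 30 m: after mixing (dechirping), the beat frequency is
# f_b = 2 * S * R / c, so range maps linearly to a frequency bin.
R_target = 30.0
f_beat = 2 * S * R_target / c
t = np.arange(N) / fs
beat = np.exp(2j * np.pi * f_beat * t)

# Range FFT: peak bin index -> beat frequency -> range estimate.
spectrum = np.fft.fft(beat * np.hanning(N))
k = np.argmax(np.abs(spectrum[: N // 2]))
R_est = (k * fs / N) * c / (2 * S)
print(f"estimated range: {R_est:.2f} m")
```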

December 26, 2024 · 23 min · 4800 words · Fuwei Li

Pose Tracking with Iterative Extended Kalman Filter

Tracking the ego pose is critical in autonomous driving. In this article, we will discuss how to fuse the IMU, wheel encoder, GPS, etc., to track the ego pose. We will derive the pose tracking algorithm based on the iterative extended Kalman filter. This document mainly follows [1] and [2]. Preliminaries: Let $\mathcal{M}$ be the manifold of dimension $n$ under consideration (e.g., $\mathcal{M} = SO(3)$). Since manifolds are locally homeomorphic to $\mathbb{R}^n$, we can establish a bijective mapping from a local neighborhood on $\mathcal{M}$ to its tangent space $\mathbb{R}^n$ via two encapsulation operators $\boxplus$ and $\boxminus$: ...
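For example, on $\mathcal{M} = SO(3)$ these operators are commonly defined through the exponential and logarithm maps (this is the FAST-LIO-style convention; the exact ordering may differ by author):

$$ \mathbf{R} \boxplus \mathbf{r} = \mathbf{R}\,\mathrm{Exp}(\mathbf{r}), \qquad \mathbf{R}_1 \boxminus \mathbf{R}_2 = \mathrm{Log}\left(\mathbf{R}_2^{\top} \mathbf{R}_1\right), \quad \mathbf{r} \in \mathbb{R}^3. $$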

December 24, 2024 · 15 min · 3073 words · Fuwei Li

Position Filtering with Ego Motion Compensation

Object position tracking is typically done from the ego vehicle's perspective; however, the ego's own motion makes tracking the object somewhat difficult. The basic idea is to perform the tracking in the world coordinate system and then transform the results into the ego car's coordinate system. In this post, we will discuss how to concisely incorporate the ego's motion into the object tracking process. Continuous Form, Coordinate Definitions: the target's trajectory in the world coordinate system: $\mathbf{o}(t)$; the ego car's trajectory in the world coordinate system: $\mathbf{g}(t)$; the ego car's heading angle: $\theta(t)$; the observed target's coordinates relative to the ego car: $\mathbf{x}(t)$. ...
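With these definitions, and assuming planar motion with a 2D rotation matrix $\mathbf{R}(\theta)$ that takes ego coordinates to world coordinates, the quantities are related by

$$ \mathbf{x}(t) = \mathbf{R}^{\top}(\theta(t)) \left( \mathbf{o}(t) - \mathbf{g}(t) \right), $$

i.e., the observation is the world-frame offset of the target from the ego car, rotated into the ego frame.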

December 2, 2024 · 11 min · 2210 words · Fuwei Li

Angle Kalman Filter

Besides object position tracking, heading angle tracking is also critical in autonomous driving. In this article, we will discuss how to track the angle of an object using the Kalman filter and how to do motion compensation. Wrap the Angle: In the paper "On wrapping the Kalman filter and estimating with the SO(2) group", the authors conclude that the approach "based on the mathematically grounded framework of filtering on Lie groups, yields the same result as heuristically wrapping the angular variable within the EKF framework". ...
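In practice, the heuristic wrapping means keeping the heading state and, crucially, the measurement innovation in $[-\pi, \pi)$; a tiny sketch of just that wrapping step (the rest of the EKF update is omitted) is shown below.

```python
import numpy as np

def wrap_to_pi(angle: float) -> float:
    """Wrap an angle (radians) into [-pi, pi)."""
    return (angle + np.pi) % (2.0 * np.pi) - np.pi

# The innovation between a measured and a predicted heading must be wrapped
# before the Kalman update, otherwise a jump across +/-pi looks like a
# near-2*pi error instead of a small one.
z, x_pred = np.deg2rad(179.0), np.deg2rad(-179.0)
innovation = wrap_to_pi(z - x_pred)
print(np.rad2deg(innovation))  # about -2 degrees, not ~358
```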

November 30, 2024 · 2 min · 362 words · Fuwei Li