Index

Symbols

2D LIDAR
2D rigid transforms
3D LIDAR
3D reconstruction

A

accelerometer
Ackermann steering
action
    Ackermann steering
    differential drive
    discrete actions
    omnidirectional motion
    quadrotor flight
    stochastic action
action model
action space
action values
activation function
aerodynamic force
aerodynamics
all nearest neighbors
ancestral sampling
angular velocity

B

backward Euler method
Bayes filter
Bayes filtering
Bayes net
    Bayesian network
Bayes' theorem
Bayesian network
    Bayes net
Bayesian networks
Bayesian view
bearing
belief state
belief transition function
Bellman equation
bias
binary factors
body frame
boustrophedon
Bundle Adjustment

C

camera calibration model
camera matrix
Camera Obscura
Cartesian coordinate frame
cascaded controller
CDF
    cumulative distribution function
chain rule
classification
CNN
    convolutional neural network
collision-free path
complete
complete algorithms
conditional Gaussian density
conditional independence
conditional probability
conditional probability distribution
conditional probability table
    CPT
conditionally independent
configuration
configuration space
continuous time
control tape
control variables
controlled Markov chain
controller gain
convolutional layer
convolutional neural network
    CNN
convolutional neural networks
cost
cost map
covariance matrix
CPT
    conditional probability table
cross entropy
cumulative distribution function
    CDF

D

DAG
    directed acyclic graph
DBN
    dynamic Bayes net
DDR
decision boundary
decision tree
deep learning
deep reinforcement learning
    DQN
    DRL
    policy gradient methods
    policy optimization
density
differential drive robot
digital cameras
direct ToF
directed acyclic graph
    DAG
discount factor
discrete time
discrete-time system
disparity
drag
drag coefficient
driving direction
DRL
    deep reinforcement learning
drone
dynamic Bayes net
    DBN
dynamics

E

edge detection
elimination algorithm
EM algorithm
    expectation-maximization
empirical mean
event
execution phase
expectation
expectation-maximization
    EM algorithm
expected reward
expected value
experience replay
extrinsic calibration

F

factor
factor graph
factor graphs
factored probability distribution
factors
feedback control
field of view
    FOV
filtering distribution
FLU convention
focal length
forward Euler method
forward model
forward simulation
forward velocity kinematics
FOV
    field of view
frequentist view
fundamental stereo equation

G

Gaussian
Gaussian Bayes nets
Gaussian factor graphs
generative model
Gibbs sampling
GPS
gradient descent
graphical model
graphical models
greedy action selection
greedy planning
group
GTSAM
gyroscope

H

hat operator
Hessian
hidden Markov model
    HMM
hidden state
hidden variables
histogram
HMM
    hidden Markov model
homogeneous transformation matrix
horizontal Sobel edge detector

I

ICP
    iterative closest points
IMU
    inertial measurement unit
indirect ToF
inertia matrix
inertial measurement unit
    IMU
inference
Information Matrix
initial state
intrinsic camera coordinates
inverse problem
inverse transform sampling
inverse velocity kinematics
iterative closest points
    ICP

J

Jacobian matrix
jerk
joint probability distribution

K

Kahn's algorithm
Kalman filter
Kalman smoothing
kinematics

L

lane switching
lateral control
law of total probability
learning
    deep learning
    deep reinforcement learning
    density estimation
    neural radiance fields
    reinforcement learning
    system identification
lens
LIDAR
Lie group
likelihood
likelihood factor
likelihood function
linear velocity
localization
longitudinal control
loop closure

M

machine learning
manifold
MAP
    maximum a posteriori
marginal probability distribution
Markov chain
Markov decision process
    MDP
Markov localization
Markov property
mass
maximum a posteriori
    MAP
maximum likelihood
maximum likelihood estimate
    ML estimate
MDP
    Markov decision process
mean
mean squared error
measurement model
measurement phase
micro aerial vehicles
mixture density
ML estimate
    maximum likelihood estimate
Monte Carlo EM
motion model
motion primitives
multivariate Gaussian density

N

navigation frame
nearest neighbor
NeRF
    neural radiance field
neural networks
neural radiance field
    NeRF
noncommutative group
nonlinear optimization
nuisance variables

O

objective function
observations
occupancy map
opacity
optimal policy
optimal value function
optimization
optimization variable
outcome
output features

P

partially observable Markov decision process
    POMDP
particle filter
path
path planning
PDF
    probability density function
perception
    computer vision
    hidden Markov models
    localization
    MAP estimate
    maximum likelihood
    SLAM
    visual SLAM
Perceptron
photography
pinhole camera model
pinhole projection
pitch
pixels
planning
    decision theory
    Markov decision process
    motion primitives
    path planning
    trajectory optimization
    value iteration
PMF
    probability mass function
point cloud
point cloud map
policy
policy iteration
polynomial trajectories
POMDP
    partially observable Markov decision process
pooling layer
pose
pose alignment
pose constraints
PoseSLAM
posterior
posterior marginals
prediction phase
predictive distribution
principal point
principle of optimality
prior
prior probability distribution
PRM
    probabilistic road map
probabilistic road map
    PRM
probabilistically complete
probability density function
    Gaussian
    PDF
probability distribution
probability mass function
    PMF
probability theory
proportional
PyTorch

R

random variable
    discrete random variable
range
rapidly-exploring random tree
    RRT
ray direction
rectified linear unit
    ReLU
regression
reinforcement learning
    policy iteration
    Q-learning
ReLU
    rectified linear unit
resolution
reward
RFID
RGB
RGB-D camera
robot localization
robot state
roll
rollout
rotation matrix
RRT
    rapidly-exploring random tree

S

sample space
sampling
    from Markov chains
    simulating stochastic actions
sampling-based algorithms
sampling-based methods
SDF
    signed distance function
SE(2)
SE(3)
semantic segmentation
sense-think-act
sensing
    continuous state
    dynamic Bayes nets
    inertial
    LIDAR
    sensor models
    sensors
    stereo vision
sensor coordinates
sensor fusion
sensor model
sensors
    continuous-valued sensing
    multi-valued sensing
SfM
    structure from motion
SGD
    stochastic gradient descent
sigmoid function
signed distance function
    SDF
simulation
simulation by sampling
simultaneous localization and mapping
    SLAM
skew symmetric matrix
SLAM
    simultaneous localization and mapping
sliding direction
smoothing
Sobel filter
sparse Jacobian
special Euclidean group
special Euclidean group of order 2
special orthogonal group of order 2
splines
state
    2D pose space
    3D pose space
    configuration space
    continuous state
    discrete state
state space
state transition matrix
state transition model
statics
statistic
statistically independent
statistics
stereo baseline
stochastic gradient descent
    SGD
stochastic policy
structure from motion
    SfM
sufficient statistic
supervised learning
system of normal equations

T

TAMP
    task and motion planning
task and motion planning
    TAMP
task planning
task plans
task-level planning
Time of Flight
time series
ToF
topological sort
torque
trajectory
trajectory optimization
trajectory planning
transformation
translational velocity
transmittance
trapezoidal method
trilinear interpolation

U

unary factors
undirected graph
uniform probability distribution
unmanned aerial vehicles
utility function

V

value function
value iteration
variables
variance
vectored thrust
visual odometry
visual SLAM
Viterbi Algorithm
volume rendering
voxel grid

W

weak law of large numbers
world state

Y

yaw