The probabilistic forward projection can be considered as a Markov
process because the "decision" part is removed once the actions are
given. Suppose that $x_k$ is given and $u_k$ is applied. What is the
probability distribution over $x_{k+1}$? This was already specified
in (10.6) and is the one-stage forward projection.
Now consider the two-stage probabilistic forward projection,
$P(x_{k+2} \mid x_k, u_k, u_{k+1})$. This can be computed by marginalization
as
$$P(x_{k+2} \mid x_k, u_k, u_{k+1}) = \sum_{x_{k+1} \in X} P(x_{k+2} \mid x_{k+1}, u_{k+1}) \, P(x_{k+1} \mid x_k, u_k) . \tag{10.16}$$
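As a concrete illustration (not from the text), the following sketch evaluates (10.16) for a hypothetical two-state problem; the action labels "a" and "b" and all transition probabilities are invented for the example.

```python
# Hypothetical two-state example of the marginalization in (10.16).
# T[u][i, j] = P(x_{k+1} = i | x_k = j, u); the numbers are made up.
import numpy as np

T = {
    "a": np.array([[0.9, 0.3],
                   [0.1, 0.7]]),
    "b": np.array([[0.5, 0.0],
                   [0.5, 1.0]]),
}

def two_stage(x_k, u_k, u_k1):
    """Distribution over x_{k+2} given x_k, u_k, and u_{k+1}, as in (10.16)."""
    dist = np.zeros(2)
    for x_k2 in range(2):
        for x_k1 in range(2):  # marginalize over the intermediate state x_{k+1}
            dist[x_k2] += T[u_k1][x_k2, x_k1] * T[u_k][x_k1, x_k]
    return dist

print(two_stage(0, "a", "b"))  # [0.45, 0.55] for these made-up probabilities
```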
Let $v$ denote an $n$-dimensional column vector that represents any
probability distribution over $X$, in which $n = |X|$. For a fixed action
$u$, let $M_u$ denote the $n \times n$ state transition matrix whose entry
in row $i$ and column $j$ is $P(x_{k+1} = i \mid x_k = j, u)$. The product
$M_u v$ yields a column vector that represents the probability distribution
over $X$ that is obtained after starting with $v$ and applying $u$. The
matrix multiplication performs $n$ inner products, each of which is a
marginalization as shown in (10.13). The forward projection at any stage,
$k$, can now be expressed using a product of state transition matrices.
Suppose that the action sequence $(u_1, u_2, \ldots, u_{k-1})$ is fixed.
Let $v_1$ denote the probability vector whose only nonzero entry is a one
in the component corresponding to $x_1$, which indicates that $x_1$ is
known (with probability one). The forward projection can be computed as
$$v_k = M_{u_{k-1}} v_{k-1} , \tag{10.18}$$

which, applied repeatedly from the initial condition $v_1$, yields

$$v_k = M_{u_{k-1}} M_{u_{k-2}} \cdots M_{u_2} M_{u_1} v_1 . \tag{10.19}$$

The component of $v_k$ corresponding to a state $x_k$ gives $P(x_k \mid x_1, u_1, \ldots, u_{k-1})$.
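A minimal sketch of (10.18) and (10.19), assuming a hypothetical three-state problem: each column of $M_u$ holds the distribution $P(x_{k+1} = \cdot \mid x_k = j, u)$, so repeatedly multiplying the current distribution by the matrix of the applied action propagates the forward projection. All matrices and action labels below are invented for illustration.

```python
# Hypothetical three-state illustration of the matrix form (10.18)-(10.19).
# M[u][i, j] = P(x_{k+1} = i | x_k = j, u); each column sums to one.
import numpy as np

M = {
    "a": np.array([[0.8, 0.2, 0.0],
                   [0.2, 0.6, 0.5],
                   [0.0, 0.2, 0.5]]),
    "b": np.array([[1.0, 0.5, 0.0],
                   [0.0, 0.5, 0.5],
                   [0.0, 0.0, 0.5]]),
}

def forward_projection(x1, actions):
    """Distribution over x_k after applying (u_1, ..., u_{k-1}) from known x_1."""
    v = np.zeros(3)
    v[x1] = 1.0                 # x_1 is known with probability one
    for u in actions:           # v_k = M_{u_{k-1}} v_{k-1}, as in (10.18)
        v = M[u] @ v
    return v

print(forward_projection(0, ["a", "b", "a"]))  # forward projection at stage 4
```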