Based on emerging sensing and communication technology, connected and automated vehicles (CAVs) receive various types of information while traveling, such as geographic data, traffic conditions, signal timing, vehicle dynamics, and engine status. Most of this information is temporally dynamic and spatially decentralized. For example, in a connected eco-driving system, dynamic traffic information is a key input for designing a safe and energy-efficient trajectory of the host CAV, but the acquisition of that information is constrained by the communication and sensing range. It is therefore a great challenge to design a robust speed profile that adapts to uncertain downstream traffic conditions.
A Markov Decision Process (MDP) based approach is therefore developed in this research. Multiple decision points are distributed within the potential queuing area, so that the eco-driving process is decomposed into a sequence of actions with energy consumption as the cost function. The optimal decision at each state corresponds to an adaptive and robust eco-driving strategy that minimizes the expected energy consumption over all possible subsequent actions. Numerical experiments are conducted to validate the proposed model under different powertrain systems, including internal combustion engine vehicles, electric vehicles, and plug-in hybrid electric vehicles. This method provides a proactive, rather than passive, way to adapt to the dynamic uncertainty in traffic-information acquisition, and shows a significant advantage in energy saving.
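The decision-point decomposition described above can be sketched as a finite-horizon MDP solved by backward induction. The sketch below is illustrative only: the states, actions, energy costs, and transition probabilities are hypothetical placeholders, not the paper's calibrated model, and the stage structure (one decision per point in the potential queuing area) is an assumption for exposition.

```python
# Toy finite-horizon MDP for eco-driving decision points, solved by
# backward induction. All numbers below are assumed for illustration.

# States: traffic condition observed at a decision point (hypothetical).
STATES = ["free_flow", "queued"]
# Actions: candidate speed strategies between decision points (hypothetical).
ACTIONS = ["maintain", "glide"]

N_STAGES = 3  # assumed number of decision points in the queuing area

# ENERGY[s][a]: assumed energy cost (kWh) of taking action a in state s.
ENERGY = {
    "free_flow": {"maintain": 0.30, "glide": 0.25},
    "queued":    {"maintain": 0.50, "glide": 0.35},
}

# P[s][a][s2]: assumed probability of next state s2 given state s, action a.
P = {
    "free_flow": {"maintain": {"free_flow": 0.7, "queued": 0.3},
                  "glide":    {"free_flow": 0.8, "queued": 0.2}},
    "queued":    {"maintain": {"free_flow": 0.2, "queued": 0.8},
                  "glide":    {"free_flow": 0.4, "queued": 0.6}},
}

def solve():
    """Backward induction: V_k(s) = min_a E[energy(s, a) + V_{k+1}(s')]."""
    V = {s: 0.0 for s in STATES}  # zero terminal cost past the last stage
    policy = []                   # policy[k][s] = optimal action at stage k
    for _ in range(N_STAGES):
        new_V, stage_policy = {}, {}
        for s in STATES:
            best_a, best_cost = None, float("inf")
            for a in ACTIONS:
                cost = ENERGY[s][a] + sum(P[s][a][s2] * V[s2] for s2 in STATES)
                if cost < best_cost:
                    best_a, best_cost = a, cost
            new_V[s], stage_policy[s] = best_cost, best_a
        V = new_V
        policy.insert(0, stage_policy)  # earlier stages go to the front
    return V, policy

expected_cost, policy = solve()
print(expected_cost)  # minimum expected energy from each initial state
print(policy)         # stage-by-stage strategy, one dict per decision point
```

The returned policy is exactly the "adaptive and robust" object in the text: at each decision point it maps the currently observed traffic state to the action minimizing the expectation of energy consumption over all possible following actions, rather than committing in advance to one fixed speed profile.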