Multiple-Environment Markov Decision Processes: Efficient Analysis and Applications
| Authors | |
|---|---|
| Year of publication | 2020 |
| Type | Article in Proceedings |
| Conference | Proceedings of the International Conference on Automated Planning and Scheduling |
| Web | https://ojs.aaai.org//index.php/ICAPS/article/view/6644 |
| Keywords | decision making; Markov decision processes; contextual recommendations |
| Description | Multiple-environment Markov decision processes (MEMDPs) are MDPs equipped not with one but with several probabilistic transition functions, which represent the various possible unknown environments. While previous research on MEMDPs focused on theoretical properties of the long-run average payoff, we study them under the discounted-sum payoff and focus on their practical advantages and applications. MEMDPs can be viewed as a special case of partially observable MDPs and mixed-observability MDPs: the state of the system is perfectly observable, but the environment is not. We show that the specific structure of MEMDPs allows for more efficient algorithmic analysis, in particular for faster belief updates. We demonstrate the applicability of MEMDPs in several domains. In particular, we formalize the sequential decision-making approach to contextual recommendation systems as MEMDPs and substantially improve over the previous MDP-based approach. |
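As a rough illustration of the belief-update idea mentioned in the description (this is not code from the paper): because the state of an MEMDP is fully observable, the belief only has to track which environment is active, so each observed transition simply reweights the candidate environments by their likelihoods. The sketch below assumes a tabular representation; the function name `update_belief` and the `transitions` array layout are hypothetical.

```python
import numpy as np

def update_belief(belief, transitions, state, action, next_state):
    """Bayesian belief update over the candidate environments of an MEMDP.

    Since the state itself is fully observable, the belief ranges only over
    the finite set of environments: environment i is reweighted by the
    likelihood P_i(next_state | state, action) of the observed step.

    belief      -- array of shape (n_envs,), current probability per environment
    transitions -- array of shape (n_envs, n_states, n_actions, n_states),
                   where transitions[i, s, a, t] = P_i(t | s, a)
    """
    likelihoods = transitions[:, state, action, next_state]
    posterior = belief * likelihoods
    total = posterior.sum()
    if total == 0.0:
        # The observed transition is impossible in every environment that is
        # still considered possible; in a well-formed MEMDP this cannot occur.
        raise ValueError("observed transition has zero probability in all environments")
    return posterior / total


if __name__ == "__main__":
    # Two candidate environments over 2 states and 1 action, differing only
    # in how likely the transition 0 -> 1 is.
    transitions = np.zeros((2, 2, 1, 2))
    transitions[0, 0, 0] = [0.9, 0.1]   # environment 0: mostly stays in state 0
    transitions[1, 0, 0] = [0.2, 0.8]   # environment 1: mostly jumps to state 1
    transitions[:, 1, 0, 1] = 1.0       # state 1 is absorbing in both

    belief = np.array([0.5, 0.5])
    belief = update_belief(belief, transitions, state=0, action=0, next_state=1)
    print(belief)  # weight shifts strongly toward environment 1
```

In this toy run the uniform prior moves to roughly (0.11, 0.89) after a single 0 -> 1 transition, reflecting that the update is over a small, fixed set of environments rather than over a full POMDP belief space.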