{"date_published":"2011-02-16T00:00:00Z","oa":1,"publication_status":"published","publication_identifier":{"issn":["2664-1690"]},"title":"Energy and mean-payoff parity Markov decision processes","oa_version":"Published Version","file":[{"file_size":329976,"date_created":"2018-12-12T11:52:57Z","access_level":"open_access","creator":"system","checksum":"824d6c70e6d3feb3e836b009e0b3cf73","date_updated":"2020-07-14T12:46:41Z","file_name":"IST-2011-0001_IST-2011-0001.pdf","content_type":"application/pdf","relation":"main_file","file_id":"5458"}],"day":"16","doi":"10.15479/AT:IST-2011-0001","date_updated":"2025-04-15T08:12:14Z","author":[{"orcid":"0000-0002-4561-241X","id":"2E5DCA20-F248-11E8-B48F-1D18A9856A87","first_name":"Krishnendu","last_name":"Chatterjee","full_name":"Chatterjee, Krishnendu"},{"full_name":"Doyen, Laurent","last_name":"Doyen","first_name":"Laurent"}],"page":"20","language":[{"iso":"eng"}],"publisher":"IST Austria","pubrep_id":"23","type":"technical_report","related_material":{"record":[{"id":"3345","relation":"later_version","status":"public"}]},"has_accepted_license":"1","month":"02","year":"2011","_id":"5387","alternative_title":["IST Austria Technical Report"],"abstract":[{"lang":"eng","text":"We consider Markov Decision Processes (MDPs) with mean-payoff parity and energy parity objectives. In system design, the parity objective is used to encode ω-regular specifications, and the mean-payoff and energy objectives can be used to model quantitative resource constraints. The energy condition re- quires that the resource level never drops below 0, and the mean-payoff condi- tion requires that the limit-average value of the resource consumption is within a threshold. While these two (energy and mean-payoff) classical conditions are equivalent for two-player games, we show that they differ for MDPs. We show that the problem of deciding whether a state is almost-sure winning (i.e., winning with probability 1) in energy parity MDPs is in NP ∩ coNP, while for mean- payoff parity MDPs, the problem is solvable in polynomial time, improving a recent PSPACE bound."}],"status":"public","department":[{"_id":"KrCh"}],"citation":{"mla":"Chatterjee, Krishnendu, and Laurent Doyen. Energy and Mean-Payoff Parity Markov Decision Processes. IST Austria, 2011, doi:10.15479/AT:IST-2011-0001.","ama":"Chatterjee K, Doyen L. Energy and Mean-Payoff Parity Markov Decision Processes. IST Austria; 2011. doi:10.15479/AT:IST-2011-0001","apa":"Chatterjee, K., & Doyen, L. (2011). Energy and mean-payoff parity Markov decision processes. IST Austria. https://doi.org/10.15479/AT:IST-2011-0001","ista":"Chatterjee K, Doyen L. 2011. Energy and mean-payoff parity Markov decision processes, IST Austria, 20p.","ieee":"K. Chatterjee and L. Doyen, Energy and mean-payoff parity Markov decision processes. IST Austria, 2011.","short":"K. Chatterjee, L. Doyen, Energy and Mean-Payoff Parity Markov Decision Processes, IST Austria, 2011.","chicago":"Chatterjee, Krishnendu, and Laurent Doyen. Energy and Mean-Payoff Parity Markov Decision Processes. IST Austria, 2011. https://doi.org/10.15479/AT:IST-2011-0001."},"date_created":"2018-12-12T11:39:02Z","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","file_date_updated":"2020-07-14T12:46:41Z","ddc":["000","005"]}