<?xml version="1.0" encoding="UTF-8"?>
<!-- generator="FeedCreator 1.8" -->
<?xml-stylesheet href="http://masplan.org/lib/exe/css.php?s=feed" type="text/css"?>
<rdf:RDF
    xmlns="http://purl.org/rss/1.0/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
    <channel rdf:about="http://masplan.org/feed.php">
        <title>MASPlan.org - models-and-methods</title>
        <description>Multiagent Sequential Decision Making under Uncertainty</description>
        <link>http://masplan.org/</link>
        <image rdf:resource="http://masplan.org/_media/wiki:dokuwiki.svg" />
        <dc:date>2026-05-14T00:33:06+00:00</dc:date>
        <items>
            <rdf:Seq>
                <rdf:li rdf:resource="http://masplan.org/models-and-methods:mmdp?rev=1399352758&amp;do=diff"/>
                <rdf:li rdf:resource="http://masplan.org/models-and-methods:overview?rev=1399370428&amp;do=diff"/>
                <rdf:li rdf:resource="http://masplan.org/models-and-methods:ti-dec-mdp?rev=1399352862&amp;do=diff"/>
            </rdf:Seq>
        </items>
    </channel>
    <image rdf:about="http://masplan.org/_media/wiki:dokuwiki.svg">
        <title>MASPlan.org</title>
        <link>http://masplan.org/</link>
        <url>http://masplan.org/_media/wiki:dokuwiki.svg</url>
    </image>
    <item rdf:about="http://masplan.org/models-and-methods:mmdp?rev=1399352758&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-05-06T05:05:58+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>mmdp</title>
        <link>http://masplan.org/models-and-methods:mmdp?rev=1399352758&amp;do=diff</link>
        <description>Multiagent Markov Decision Process (MMDP)

Formalized in 1996 by Craig Boutilier [1], the Multiagent MDP is one of the earliest formalizations of an MDP framework for multiple decision-making agents, and likewise one of the simplest. The MMDP specifies the transition of the world state as a function not of a single action variable (as in the MDP) but of a joint action comprising the $ n $ agents&#039; individual actions.</description>
    </item>
    <item rdf:about="http://masplan.org/models-and-methods:overview?rev=1399370428&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-05-06T10:00:28+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>overview</title>
        <link>http://masplan.org/models-and-methods:overview?rev=1399370428&amp;do=diff</link>
        <description>Problem Subclasses, Models, and Methods

The NEXP-complete complexity of solving the Dec-POMDP has led researchers to carve out problem subclasses wherein traction may be gained.  For better or for worse, it has become common practice within this research community to define and publish a new model for each such subclass, leading to a veritable alphabet soup of acronyms.  And so, the ambition behind this section of the wiki is to bring some order to the large body of work that has emerged.</description>
    </item>
    <item rdf:about="http://masplan.org/models-and-methods:ti-dec-mdp?rev=1399352862&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-05-06T05:07:42+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>ti-dec-mdp</title>
        <link>http://masplan.org/models-and-methods:ti-dec-mdp?rev=1399352862&amp;do=diff</link>
        <description>Transition (and Observation) Independent Decentralized MDP (TOI-Dec-MDP)

The TOI-Dec-MDP, introduced by Becker, Zilberstein, Lesser, and Goldman [1], was one of the first subclasses of Dec-POMDPs developed expressly with the intention of gaining traction on structured problems.  It introduces several key assumptions that limit the modes by which agents interact.  The TOI-Dec-MDP is a factored model, where the world state $s$ comprises each agent&#039;s local state $s_i$, which the respective agent obs…</description>
    </item>
</rdf:RDF>
