
Concepts and Algorithms for Computing Maximum Entropy Distributions for Knowledge Bases with Relational Probabilistic Conditionals

Book, paperback
309 pages
English
Many practical problems are concerned with incomplete and uncertain knowledge about domains where relations among different objects play an important role. Relational probabilistic conditionals provide an adequate way to express such uncertain, rule-like knowledge of the form "If A holds, then B holds with probability p". Recently, the aggregating semantics for such conditionals has been proposed, which, combined with the principle of maximum entropy (ME), allows probabilistic reasoning in a relational domain. However, no specialized algorithms exist that make ME reasoning under aggregating semantics feasible in practice. The main topic of this publication is the development, implementation, evaluation, and improvement of the first algorithms tailored to solving the ME optimization problem under aggregating semantics. We demonstrate how the equivalence of worlds can be exploited to compute the ME distribution more efficiently. We further introduce an algorithm that works on weighted conditional impacts (WCI) instead of worlds, and we present a novel algorithm that computes the WCI of a conditional by combinatorial means. These algorithms allow us to process larger examples that could not be computed at all before, and they can also be beneficial for other relational ME semantics.
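To give a flavor of the ME principle the blurb refers to, the following minimal Python sketch computes the maximum entropy distribution over the four worlds of two propositional atoms A and B under a single conditional (B | A)[0.8]. This is a deliberately simplified, propositional toy case, not the book's relational, aggregating-semantics algorithms; the feature encoding, the use of numpy/scipy, and all identifiers are assumptions of this sketch.

# Toy sketch (not the book's algorithm): maximum-entropy distribution over the
# four propositional worlds of two atoms A, B, constrained by the single
# conditional (B | A)[0.8], i.e. P(A and B) = 0.8 * P(A).
import numpy as np
from scipy.optimize import brentq

P_TARGET = 0.8  # probability attached to the conditional (B | A)[0.8]

# Worlds as (A, B) truth-value pairs.
worlds = [(a, b) for a in (0, 1) for b in (0, 1)]

# Lagrangian feature of the conditional: 1 - p on verifying worlds (A and B),
# -p on falsifying worlds (A and not B), 0 on worlds where A is false.
g = np.array([a * b - P_TARGET * a for a, b in worlds], dtype=float)

def me_distribution(lam):
    """Exponential-family (Gibbs) distribution P(w) proportional to exp(lam * g(w))."""
    weights = np.exp(lam * g)
    return weights / weights.sum()

def constraint_gap(lam):
    """Expected feature value; zero exactly when the conditional is satisfied."""
    return me_distribution(lam) @ g

# The gap is monotone in lam (its derivative is the variance of g under the
# current distribution), so a bracketing root finder suffices to pick the
# Lagrange multiplier of the single constraint.
lam_star = brentq(constraint_gap, -50.0, 50.0)
p_me = me_distribution(lam_star)

p_A  = sum(p for p, (a, _) in zip(p_me, worlds) if a)
p_AB = sum(p for p, (a, b) in zip(p_me, worlds) if a and b)
print("ME distribution:", dict(zip(worlds, np.round(p_me, 4))))
print("P(B | A) =", round(p_AB / p_A, 4))  # approximately 0.8

In this toy case a single multiplier can be bracketed directly because the constraint gap is monotone in it. The book's contribution lies in making this kind of optimization tractable when the worlds range over a relational domain, for instance by grouping equivalent worlds and working on weighted conditional impacts instead of individual worlds.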
