Statistical relational models, such as Markov logic networks, seek to compactly describe properties of relational domains by representing general principles about objects belonging to particular classes. Models are intended to be independent of the set of objects to which these principles are applied, and it is assumed that the principles will soundly generalize across arbitrary sets of objects. In this paper, we point out limitations of models that represent such principles with a fixed set of parameters and discuss the conditions under which the soundness of fixed parameters is indeed questionable. We propose a novel representation formalism, adaptive Markov logic networks, which allows more flexible representations of relational domains: the model's parameters are expressed as functions over attributes of the instantiation at hand and are thus dynamically adjusted to fit the instantiation's properties. We empirically demonstrate the value of our learning and representation system in a simple but well-motivated example domain.
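The core idea the abstract describes can be sketched as follows. In a standard Markov logic network, each formula carries a fixed weight and the log-potential of a world is the weighted sum of true-grounding counts; the adaptive variant replaces each fixed weight with a function of attributes of the instantiation (for example, its domain size). The particular functional form below (linear in the logarithm of domain size) and the parameter names `a` and `b` are illustrative assumptions, not the paper's actual parameterization.

```python
import math

def adaptive_weight(a: float, b: float, domain_size: int) -> float:
    """Formula weight expressed as a function of an instantiation attribute.

    Here the attribute is the domain size; the log-linear form is an
    assumption chosen purely for illustration.
    """
    return a + b * math.log(domain_size)

def unnormalized_log_prob(counts, weights):
    """Standard MLN log-potential: sum over formulas of weight * count
    of true groundings of that formula in the world."""
    return sum(w * n for w, n in zip(weights, counts))

# Example: a formula whose effective weight shrinks as the domain grows,
# so the log-potential is not dominated by the growing grounding counts.
for n_objects in (10, 100, 1000):
    w = adaptive_weight(a=2.0, b=-0.5, domain_size=n_objects)
    print(n_objects, round(w, 3))
```

With a fixed weight, the contribution of a formula scales directly with the number of its groundings, which grows with the domain; making the weight a decreasing function of domain size is one simple way the same general principle can remain sound across instantiations of different sizes.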