ePrints.FRI - University of Ljubljana, Faculty of Computer and Information Science

Information-Theoretic Exploration and Evaluation of Models

Aleks Jakulin (2004) Information-Theoretic Exploration and Evaluation of Models.

Full text: PDF, download (214 kB)

    Abstract

    No information-theoretic quantity, such as entropy or Kullback-Leibler divergence, is meaningful without first assuming a probabilistic model. In Bayesian statistics, the model itself is uncertain, so the resulting information-theoretic quantities should also be treated as uncertain. Information theory provides a language for asking meaningful decision-theoretic questions about black-box probabilistic models, where the chosen utility function is log-likelihood. We show how general hypothesis testing can be developed from these conclusions, while also handling the problem of multiple comparisons. Furthermore, we use mutual and interaction information to disentangle and visualize the structure inside black-box probabilistic models. With examples, we show how misleading non-generative models can be about the informativeness of attributes.
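
    To make the quantities mentioned in the abstract concrete, the following is a minimal sketch (not taken from the paper; the function names and toy data are hypothetical) of how mutual information and interaction information can be estimated from discrete attribute samples. The XOR example illustrates the abstract's point: each attribute alone appears uninformative about the class, while the positive three-way interaction information reveals their joint informativeness.

    ```python
    # Minimal sketch: empirical entropy, mutual information, and interaction
    # information (McGill's convention) for discrete attributes, estimated
    # from raw counts. Toy data and names are illustrative assumptions only.
    from collections import Counter
    import math

    def entropy(samples):
        """Empirical Shannon entropy (bits) of a sequence of discrete symbols."""
        n = len(samples)
        return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

    def mutual_information(xs, ys):
        """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
        return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

    def interaction_information(xs, ys, zs):
        """I(X;Y;Z) = I(X;Y|Z) - I(X;Y)
                    = H(XY) + H(YZ) + H(XZ) - H(X) - H(Y) - H(Z) - H(XYZ).
        Positive values indicate synergy, negative values redundancy."""
        hx, hy, hz = entropy(xs), entropy(ys), entropy(zs)
        hxy = entropy(list(zip(xs, ys)))
        hyz = entropy(list(zip(ys, zs)))
        hxz = entropy(list(zip(xs, zs)))
        hxyz = entropy(list(zip(xs, ys, zs)))
        return hxy + hyz + hxz - hx - hy - hz - hxyz

    # Toy example: class c is the XOR of binary attributes a and b.
    a = [0, 0, 1, 1] * 25
    b = [0, 1, 0, 1] * 25
    c = [x ^ y for x, y in zip(a, b)]
    print(mutual_information(a, c))          # ~0 bits: a alone says nothing about c
    print(interaction_information(a, b, c))  # ~+1 bit: strong 3-way synergy
    ```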

    Item Type: Article
    Keywords: Kullback-Leibler divergence, Bayesian model comparison, variable importance
    Language of Content: English
    Institution: University of Ljubljana
    Department: Faculty of Computer and Information Science
    Divisions: Faculty of Computer and Information Science > Artificial Intelligence Laboratory
    Item ID: 145
    Date Deposited: 27 Oct 2004
    Last Modified: 13 Aug 2011 00:32
    URI: http://eprints.fri.uni-lj.si/id/eprint/145
