Apple or orange?

Measuring Concept Relatedness Using Language Models

Over the years, the notion of concept relatedness has attracted considerable attention. A variety of approaches, based on ontology structure, information content, association, or context, have been proposed to indicate the relatedness of abstract ideas. In this paper we present a novel context-based measure of concept relatedness: the cross-entropy reduction between language models of concepts, which are estimated from document-concept assignments. After introducing our method, we compare it to previously proposed methods by evaluating its results against relatedness judgments provided by human assessors. The approach shows improved or competitive results compared to state-of-the-art methods on two test sets in the biomedical domain.
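To make the idea concrete, below is a minimal sketch of the kind of computation involved: unigram concept language models estimated from the documents assigned to each concept, with a collection model used for smoothing and as the baseline against which the cross-entropy reduction is measured. The smoothing scheme, the choice of baseline, and all names and data in the sketch are illustrative assumptions, not the paper's exact formulation.

from collections import Counter
import math

LAMBDA = 0.5  # Jelinek-Mercer smoothing weight (illustrative value, an assumption)

def unigram_counts(docs):
    """Aggregate term counts over a list of tokenized documents."""
    counts = Counter()
    for doc in docs:
        counts.update(doc)
    return counts

def mle(counts):
    """Maximum-likelihood unigram distribution from term counts."""
    total = sum(counts.values())
    return {term: count / total for term, count in counts.items()}

def smoothed_prob(term, concept_model, collection_model):
    """Concept-model probability smoothed with the collection model."""
    return (LAMBDA * concept_model.get(term, 0.0)
            + (1 - LAMBDA) * collection_model.get(term, 1e-9))

def cross_entropy(p_model, q_model, collection_model):
    """Cross entropy H(p, q), with q smoothed by the collection model."""
    return -sum(p * math.log(smoothed_prob(term, q_model, collection_model))
                for term, p in p_model.items())

def cross_entropy_reduction(source, target, collection_model):
    """How much better the target concept model explains the source concept's
    terms than the collection model does (higher = more related)."""
    baseline = cross_entropy(source, collection_model, collection_model)
    return baseline - cross_entropy(source, target, collection_model)

# Toy document-concept assignments (hypothetical data).
docs_apple = [["apple", "fruit", "tree"], ["apple", "juice", "fruit"]]
docs_orange = [["orange", "fruit", "citrus"], ["orange", "juice"]]

collection = mle(unigram_counts(docs_apple + docs_orange))
apple = mle(unigram_counts(docs_apple))
orange = mle(unigram_counts(docs_orange))

print(cross_entropy_reduction(apple, orange, collection))

In this sketch a larger reduction means the target concept's language model explains the source concept's typical vocabulary much better than the background collection does, which is taken as a signal of relatedness.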

  • [PDF] D. Trieschnigg, E. Meij, M. de Rijke, and W. Kraaij, “Measuring concept relatedness using language models,” in Proceedings of the 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2008), 2008.
    [Bibtex]
    @inproceedings{SIGIR:2008:trieschnigg,
    Author = {Trieschnigg, Dolf and Meij, Edgar and de Rijke, Maarten and Kraaij, Wessel},
    Booktitle = {Proceedings of the 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval},
    Series = {SIGIR 2008},
    Title = {Measuring concept relatedness using language models},
    Year = {2008},
    Url = {http://doi.acm.org/10.1145/1390334.1390523}}