Affect-based information retrieval

Arapakis, Ioannis (2010) Affect-based information retrieval. PhD thesis, University of Glasgow.

One of the main challenges Information Retrieval (IR) systems face today originates from the semantic gap problem: the semantic difference between a user’s query representation and the internal representation of an information item in a collection. The gap widens further when the user is driven by an ill-defined information need, often the result of an anomaly in his/her current state of knowledge. The search queries formulated and submitted to the retrieval system to locate relevant items then produce poor results that do not address the user’s information needs.

To deal with information need uncertainty, IR systems have in the past employed a range of feedback techniques, varying from explicit to implicit. The first category requires users to communicate explicit relevance judgments in return for better query reformulations and recommendations of relevant results. However, this comes at the expense of users’ cognitive resources and introduces an additional layer of complexity to the search process. Implicit feedback techniques, on the other hand, infer what is relevant from observations of user search behaviour, thereby relieving users of the cognitive burden of document rating and relevance assessment. Both categories of relevance feedback (RF) techniques, however, determine topical relevance with respect to the cognitive and situational levels of interaction, failing to acknowledge the importance of emotions in cognition and decision making.

In this thesis I investigate the role of emotions in the information seeking process and develop affective feedback techniques for interactive IR. This novel feedback framework aims to aid the search process and facilitate a more natural and meaningful interaction. I develop affective models that determine topical relevance based on information gathered from various sensory channels, and enhance their performance using personalisation techniques. Furthermore, I present an operational video retrieval system that employs affective feedback to enrich user profiles and offers meaningful recommendations of unseen videos.
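To make the idea of an affective relevance model concrete: the thesis keywords indicate that support vector machines are trained on features drawn from facial expression analysis and physiological signals. The minimal, self-contained sketch below substitutes a simple perceptron for the SVM and uses invented feature names (valence, arousal) and toy data; it is an illustration of the general technique, not the thesis's actual implementation.

```python
# Hypothetical sketch of an affective relevance classifier. The feature
# names (valence, arousal), the training data, and the perceptron learner
# are illustrative stand-ins; the thesis itself uses support vector
# machines over facial-expression and physiological features.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn a linear rule: w.x + b > 0  =>  document judged relevant."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            if err:  # misclassified: nudge the boundary toward the sample
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy affective observations per viewed item: (valence, arousal),
# labelled 1 = user judged the item relevant, 0 = not relevant.
samples = [(0.9, 0.8), (0.8, 0.6), (0.7, 0.9),
           (0.1, 0.2), (0.2, 0.1), (0.3, 0.3)]
labels = [1, 1, 1, 0, 0, 0]

w, b = train_perceptron(samples, labels)
print(predict(w, b, (0.85, 0.7)))  # prints 1: high positive affect -> relevant
```

In a feedback loop of this kind, the predicted relevance label would replace an explicit user judgment, feeding query reformulation or recommendation without interrupting the search task.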

The use of affective feedback as a surrogate for the information need is formalised as the Affective Model of Browsing: a cognitive model that motivates the use of evidence extracted from the psycho-somatic mobilisation that occurs during cognitive appraisal. Finally, I address some of the ethical and privacy issues that arise from the social-emotional interaction between users and computer systems. This study involves questionnaire data gathered over three user studies, from 74 participants of different educational backgrounds, ethnicities and search experience. The results show that affective feedback is a promising area of research that can improve many aspects of the information seeking process, such as indexing, ranking and recommendation. Eventually, relevance inferences obtained from affective models may provide a more robust and personalised form of feedback, allowing us to deal more effectively with issues such as the semantic gap.

Item Type: Thesis (PhD)
Qualification Level: Doctoral
Keywords: Information retrieval, multimedia retrieval, affective feedback, user profiling, facial expression analysis, physiological signal processing, biometrics, personalisation, classification, support vector machines
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
B Philosophy. Psychology. Religion > BF Psychology
Colleges/Schools: College of Science and Engineering > School of Computing Science
Supervisor's Name: Jose, Prof. Joemon
Date of Award: 2010
Depositing User: Dr Ioannis Arapakis
Unique ID: glathesis:2010-1867
Copyright: Copyright of this thesis is held by the author.
Date Deposited: 13 Jul 2010
Last Modified: 10 Dec 2012 13:47
