Author ORCID Identifier

Tsai - https://orcid.org/0000-0001-9188-0362

Document Type

Conference Proceeding

Publication Date

3-17-2019

Publication Title

IUI '19: Proceedings of the 24th International Conference on Intelligent User Interfaces

Volume

March 2019

First Page

391

Last Page

396

Abstract

Hybrid social recommender systems use social relevance from multiple sources to recommend relevant items or people to users. To make hybrid recommendations more transparent and controllable, several researchers have explored interactive hybrid recommender interfaces, which allow for a user-driven fusion of recommendation sources. In this line of work, the intelligent user interface has been investigated as an approach to increasing transparency and improving the user experience. In this paper, we attempt to further promote the transparency of recommendations by augmenting an interactive hybrid recommender interface with several types of explanations. We evaluate user behavior patterns and subjective feedback in a within-subjects study (N=33). Results from the evaluation show the effectiveness of the proposed explanation models. The results of the post-treatment survey indicate a significant improvement in the perception of explainability, but this improvement comes with a lower degree of perceived controllability.

Comments

© {Authors | ACM} 2019. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in IUI '19: Proceedings of the 24th International Conference on Intelligent User Interfaces, https://doi.org/10.1145/3301275.3302318
