Artificial Intelligence Review (2005) 24: 179–197
DOI 10.1007/s10462-005-4612-x
© Springer 2005

Explanation in Recommender Systems

DAVID MCSHERRY
School of Computing and Information Engineering, University of Ulster, Coleraine BT52 1SA, Northern Ireland, UK (e-mail: dmg.mcsherry@ulster.ac.uk)

Abstract. There is increasing awareness in recommender systems research of the need to make the recommendation process more transparent to users. Following a brief review of existing approaches to explanation in recommender systems, we focus in this paper on a case-based reasoning (CBR) approach to product recommendation that offers important benefits in terms of the ease with which the recommendation process can be explained and the system’s recommendations can be justified. For example, recommendations based on incomplete queries can be justified on the grounds that the user’s preferences with respect to attributes not mentioned in her query cannot affect the outcome. We also show how the relevance of any question the user is asked can be explained in terms of its ability to discriminate between competing cases, thus giving users a unique insight into the recommendation process.

Keywords: attribute-selection strategy, case-based reasoning, explanation, recommender systems

1. Introduction

The importance of intelligent systems having the ability to explain their reasoning is well recognised in domains such as medical decision making and intelligent tutoring (e.g. Armengol et al., 2001; Sørmo and Aamodt, 2002; Evans-Romaine and Marling, 2003). In an intelligent tutoring system, for example, communicating the reasoning process to students may be as important as finding the right solution. Until recently, explanation in recommender systems appears to have been a relatively neglected issue. However, recent research has highlighted the importance of making the recommendation process more transparent to users and the potential role of explanation in achieving this objective (Herlocker et al., 2000; Shimazu, 2002; McSherry, 2002b, 2003b; Reilly et al., 2005).

Herlocker et al. (2000) suggest that the black box image of recommender systems may be one of the reasons why they have gained much less acceptance in high-risk domains such as holiday packages or investment portfolios than in low-risk domains such as CDs or movies. They
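The abstract's idea of explaining a question's relevance by its ability to discriminate between competing cases can be illustrated with a minimal sketch. The pairwise-discrimination score and the holiday cases below are invented for illustration only; the paper's own attribute-selection measure may differ.

```python
# Illustrative sketch: rank unasked attributes by how well their values
# discriminate between the cases still competing for recommendation.
# An attribute that takes the same value in every competing case cannot
# help, which is one way to explain why a question is (ir)relevant.

def discrimination_score(cases, attribute):
    """Fraction of competing-case pairs that this attribute tells apart."""
    values = [case[attribute] for case in cases]
    n = len(values)
    if n < 2:
        return 0.0
    distinct_pairs = sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if values[i] != values[j]
    )
    return distinct_pairs / (n * (n - 1) / 2)

def most_discriminating(cases, unasked):
    """Choose the next question as the attribute with the highest score."""
    return max(unasked, key=lambda a: discrimination_score(cases, a))

# Hypothetical competing cases in a holiday-recommendation setting.
competing = [
    {"type": "walking", "price": 450, "region": "alps"},
    {"type": "walking", "price": 700, "region": "alps"},
    {"type": "skiing",  "price": 700, "region": "alps"},
]

# 'region' is identical in all competing cases, so asking about it cannot
# affect the outcome; 'type' and 'price' each separate some of the cases.
print(most_discriminating(competing, ["type", "price", "region"]))
```

A system using a measure of this kind can explain each question it asks ("this question distinguishes between the remaining candidates") and, conversely, justify ignoring unasked attributes whose values are shared by all competing cases.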