2001). Most of the dialogue features associated with mixed-initiative interaction in CBR (Aha et al., 2001; McSherry, 2001a, 2002a) are supported in Top Case, though not all are shown in the example dialogue. At any stage of the recommendation dialogue, for example, the user can specify a preferred value for an attribute other than the one considered most useful by Top Case. The user can also indicate her indifference to an attribute for which she is asked to specify a preferred value.

Sørmo et al. (2005, this issue) distinguish between different types of explanation in CBR according to the goals they support, such as explaining why the proposed solution is a good solution (Justification), explaining how the system reached the solution (Transparency), or explaining why a question is relevant (Relevance). The explanations provided by Top Case when asked why it is recommending a particular case can be seen to address the Justification goal. As well as addressing the Relevance goal, the explanations it provides when asked to explain the relevance of the questions it asks may also contribute to the Transparency goal of increasing the user's understanding of how the solution was obtained.

CBR Strategist (McSherry, 2001a) is a CCBR tool for fault diagnosis in which attribute selection is based on the reasoning strategies used by doctors, such as confirming a target diagnosis or eliminating a competing diagnosis (Elstein et al., 1978; Kassirer and Kopelman, 1991). As in iNN, an important benefit of CBR Strategist's goal-driven approach to attribute selection is that the relevance of any question the user is asked can be explained in terms of the purpose for which it was selected. Driven by an algorithm for strategic induction of decision trees (McSherry, 1999), CBR Strategist is best suited to diagnosis and classification tasks in which the number of outcome classes is small. This is not the case in product recommendation, where it is typical for each outcome class (a unique product or service) to be represented by a single case (McSherry, 2001b).
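To make the idea of goal-driven attribute selection concrete, the sketch below is a minimal, hypothetical illustration rather than the published CBR Strategist or Top Case algorithms: each question is chosen to serve an explicit strategy, such as confirming the current target diagnosis or eliminating its competitors, and that purpose is recorded so the system can answer "why are you asking?" in the terms described above. All names, data structures, and the scoring heuristic here are invented for illustration.

# Minimal, hypothetical sketch of goal-driven question selection with
# purpose-based relevance explanations (assumptions only; not the
# published CBR Strategist or Top Case implementations).

from dataclasses import dataclass


@dataclass
class Question:
    attribute: str   # the attribute the user is asked about
    purpose: str     # the strategy the question serves (its "relevance")


def select_question(cases, target, asked):
    """Pick the unasked attribute that best discriminates the target
    outcome from its competitors, and record the purpose it serves."""
    target_case = next(c for c in cases if c["diagnosis"] == target)
    competitors = [c for c in cases if c["diagnosis"] != target]
    best, best_score = None, -1
    for attr in target_case["features"]:
        if attr in asked:
            continue
        # Crude discriminating power: how many competitors disagree with
        # the target case on this attribute (one case per outcome class).
        target_value = target_case["features"][attr]
        score = sum(1 for c in competitors if c["features"][attr] != target_value)
        if score > best_score:
            best, best_score = attr, score
    purpose = (f"to confirm the target diagnosis '{target}' or eliminate "
               f"competing diagnoses that differ on '{best}'")
    return Question(attribute=best, purpose=purpose)


# Toy case base: one case per outcome class, as is typical in recommendation.
cases = [
    {"diagnosis": "faulty fuse",  "features": {"power_light": "off", "noise": "none"}},
    {"diagnosis": "worn bearing", "features": {"power_light": "on",  "noise": "grinding"}},
    {"diagnosis": "loose belt",   "features": {"power_light": "on",  "noise": "squeal"}},
]

q = select_question(cases, target="faulty fuse", asked=set())
print(f"Question: what is the value of '{q.attribute}'?")
print(f"Why this question is relevant: it was selected {q.purpose}.")

The only point of the sketch is that the purpose recorded at selection time doubles as the explanation of the question's relevance, which is the property attributed above to goal-driven approaches such as CBR Strategist and iNN.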
6. Conclusions

Following a brief discussion of existing approaches to explanation in recommender systems and lessons learned from this research, we have focused in this paper on the benefits of iNN, a CBR approach to product recommendation, in terms of making the recommendation process more transparent to users. We have presented a detailed account of