Argumentative Explanations for Recommendations Based on Reviews
Recommender systems (RS) assist users in making decisions across a wide range of tasks, while preventing them from being overwhelmed by an enormous number of choices. RS are so prevalent that many users of information-based technologies interact with them on a daily basis. However, many of these systems are still perceived as black boxes by users, who often have no way of seeing or requesting the reasons why certain items are recommended, which can foster negative attitudes towards RS. Providing explanations in RS can bring several advantages for users' decision making and for the overall user experience. Although different explanatory approaches have been proposed so far, the general lack of user evaluation and validation of explainable RS concepts and implementations has left open many questions about how such explanations should be structured and presented. Furthermore, while explanations in RS have so far been presented mostly in a static, non-interactive manner, some initial work in explainable artificial intelligence has begun to address interactive explanations that enable users to examine system decisions in detail. Still, little is known about how interactive interfaces in RS should be conceptualized and designed so that explanatory aims such as transparency and trust are met. This dissertation investigates interactive, conversational explanations that allow users to freely explore explanatory content at will. Our work is grounded in explainable RS methods that exploit user reviews, and is inspired by dialog models and formal argument structures. Following a user-centered approach, this dissertation proposes an interface design for explanations as interactive argumentation, which was empirically validated through several user studies. To this end, we implemented an RS able to provide explanations both through graphical user interface (GUI) navigation and through a natural language interface. The latter consists of a conversational agent for explainable RS, which supports conversation flows ...
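To make the abstract's core idea concrete, the following Python sketch illustrates one plausible way to derive pro/con "arguments" for a recommended item from aspect-level sentiment mined from its user reviews. This is a hypothetical illustration, not the dissertation's implementation: the aspect names, the minimum-support threshold, and the data structures are all our assumptions.

```python
# Illustrative sketch only: turning aspect-level sentiment mined from an
# item's reviews into pro/con arguments that explain a recommendation.
# In a real system, (aspect, sentiment) pairs would come from an
# aspect-based sentiment model run over the reviews.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Argument:
    aspect: str    # e.g. "location" (hypothetical aspect name)
    stance: str    # "pro" or "con"
    support: int   # number of reviews backing this argument

def build_arguments(aspect_mentions, min_support=2):
    """Aggregate (aspect, sentiment) pairs into supported arguments."""
    pos, neg = Counter(), Counter()
    for aspect, sentiment in aspect_mentions:
        (pos if sentiment == "positive" else neg)[aspect] += 1
    args = [Argument(a, "pro", n) for a, n in pos.items() if n >= min_support]
    args += [Argument(a, "con", n) for a, n in neg.items() if n >= min_support]
    return sorted(args, key=lambda a: a.support, reverse=True)

def explain(item, args):
    """Render the arguments as a short pro/con explanation."""
    lines = [f"Why '{item}' was recommended:"]
    for a in args:
        sign = "+" if a.stance == "pro" else "-"
        verb = "praised" if a.stance == "pro" else "criticised"
        lines.append(f"  {sign} {a.aspect} ({verb} in {a.support} reviews)")
    return "\n".join(lines)

if __name__ == "__main__":
    # Hypothetical mined mentions for a hotel recommendation.
    mentions = [("location", "positive"), ("location", "positive"),
                ("location", "positive"), ("breakfast", "positive"),
                ("breakfast", "positive"), ("wifi", "negative"),
                ("wifi", "negative")]
    print(explain("Hotel Aurora", build_arguments(mentions)))
```

In an interactive, conversational setting such as the one the dissertation describes, each argument could serve as an entry point for follow-up questions (e.g., asking for the reviews behind a "con"), rather than being shown as a static list.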
2022-04-28
Theses
Electronic Resource
English
Embodied carbon emissions in buildings: explanations, interpretations, recommendations
DOAJ | 2022
On the argumentative work of map-based visualisation
Online Contents | 2011
On the argumentative work of map-based visualisation
Elsevier | 2011
The Argumentative Turn in Policy Analysis and Planning
Online Contents | 1995