Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Empirical methods for modelling persuadees in dialogical argumentation

Hunter, Anthony and Polberg, Sylwia ORCID: https://orcid.org/0000-0002-0811-0226 2018. Empirical methods for modelling persuadees in dialogical argumentation. Presented at: 2017 IEEE 29th International Conference on Tools with Artificial Intelligence (ICTAI), Boston, MA, USA, 6-8 November 2017. 2017 IEEE 29th International Conference on Tools with Artificial Intelligence (ICTAI). IEEE, pp. 382-389. 10.1109/ICTAI.2017.00066

Full text: empirical_copyright.pdf (PDF, Accepted Post-Print Version, 195 kB)

Abstract

For a participant to play persuasive arguments in a dialogue, s/he may create a model of the other participants. This may include an estimation of what arguments the other participants find believable, convincing, or appealing. The participant can then choose to put forward those arguments that have high scores in the desired criteria. In this paper, we consider how we can crowd-source opinions on the believability, convincingness, and appeal of arguments, and how we can use this information to predict opinions for specific participants on the believability, convincingness, and appeal of specific arguments. We evaluate our approach by crowd-sourcing opinions from 50 participants about 30 arguments. We also discuss how this form of user modelling can be used in a decision-theoretic approach to choosing moves in dialogical argumentation.
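
To make the idea concrete, the following is a minimal illustrative sketch (not the paper's implementation): assuming a persuader already has predicted scores for a specific persuadee on the believability, convincingness, and appeal of each candidate argument, it selects the argument that maximises a weighted combination of those criteria. The weights and score values here are hypothetical examples.

# Hypothetical sketch, not the authors' method: pick the argument with the
# highest weighted combination of predicted believability, convincingness,
# and appeal for a given persuadee.

from dataclasses import dataclass

@dataclass
class ArgumentScores:
    name: str
    believability: float   # predicted score in [0, 1] (assumed scale)
    convincingness: float  # predicted score in [0, 1] (assumed scale)
    appeal: float          # predicted score in [0, 1] (assumed scale)

def best_argument(candidates, w_bel=0.4, w_con=0.4, w_app=0.2):
    """Return the candidate with the highest weighted score (illustrative only)."""
    def score(a):
        return w_bel * a.believability + w_con * a.convincingness + w_app * a.appeal
    return max(candidates, key=score)

if __name__ == "__main__":
    args = [
        ArgumentScores("A1", 0.8, 0.6, 0.5),
        ArgumentScores("A2", 0.5, 0.9, 0.7),
    ]
    print(best_argument(args).name)  # prints "A2" under these example weights

In the paper's decision-theoretic framing, such predicted scores would come from models learned from crowd-sourced opinions rather than being given directly.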

Item Type: Conference or Workshop Item (Paper)
Date Type: Published Online
Status: Published
Schools: Computer Science & Informatics
Publisher: IEEE
ISBN: 978-1-5386-3876-7
ISSN: 2375-0197
Funders: EPSRC Project EP/N008294/1 “Framework for Computational Persuasion”.
Date of First Compliant Deposit: 12 March 2019
Date of Acceptance: 6 September 2017
Last Modified: 25 Oct 2022 13:43
URI: https://orca.cardiff.ac.uk/id/eprint/120623

Citation Data

Cited 7 times in Scopus.
