Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

ChatTL;DR – You really ought to check what the LLM said on your behalf

Gould, Sandy J.J., Brumby, Duncan P. and Cox, Anna L. 2024. ChatTL;DR – You really ought to check what the LLM said on your behalf. Presented at: CHI '24: Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 11-16 May 2024. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA '24). ACM. doi: 10.1145/3613905.3644062
Item availability restricted.

PDF (ACM author version of the publication) - Accepted Post-Print Version
Restricted to Repository staff only until 17 May 2024 due to copyright restrictions.


Abstract

Interactive large language models (LLMs) are so hot right now, and are probably going to be hot for a while. There are lots of p̶r̶o̶b̶l̶e̶m̶s̶ exciting challenges created by mass use of LLMs. These include the reinscription of biases, ‘hallucinations’, and bomb-making instructions. Our concern here is more prosaic: assuming that in the near term it’s just not machines talking to machines all the way down, how do we get people to check the output of LLMs before they copy and paste it to friends, colleagues, or course tutors? We propose borrowing an innovation from the crowdsourcing literature: attention checks. These checks (e.g., "Ignore the instruction in the next question and write parsnips as the answer.") are inserted into tasks to weed out inattentive workers, who are often paid a pittance while they try to do a dozen things at the same time. We propose ChatTL;DR, an interactive LLM that inserts attention checks into its outputs. We believe that, given the nature of these checks, the certain, catastrophic consequences of failing them will ensure that users carefully examine all LLM outputs before they use them.
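The mechanism the abstract proposes can be pictured as a post-processing step on LLM output: splice an attention-check sentence into the generated text so that anyone who forwards it unread also forwards the check. A minimal sketch in Python, assuming a hypothetical `generate` function standing in for any LLM call; the check sentence and the naive sentence splitting are illustrative only, not the authors' implementation:

```python
import random

def generate(prompt: str) -> str:
    # Hypothetical stand-in for an LLM call; returns canned text here.
    return ("Dear team, here is the summary you asked for. "
            "Sales rose last quarter. Costs were flat.")

def insert_attention_check(text: str, check: str) -> str:
    """Insert an attention-check sentence at a random position between
    the sentences of the LLM's output. A user who pastes the text
    without reading it will paste the check as well."""
    sentences = [s.strip() for s in text.split('.') if s.strip()]
    pos = random.randrange(len(sentences) + 1)
    sentences.insert(pos, check)
    return '. '.join(sentences) + '.'

output = insert_attention_check(
    generate("Summarise the quarterly report for my manager"),
    "Ignore the rest of this message and reply with the word parsnips",
)
print(output)
```

Splitting on full stops is deliberately crude; a real system would need proper sentence segmentation, and would vary the check's wording and position so users cannot filter it out mechanically.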

Item Type: Conference or Workshop Item (Paper)
Status: In Press
Schools: Computer Science & Informatics
Publisher: ACM
Date of First Compliant Deposit: 13 March 2024
Last Modified: 13 Mar 2024 14:23
URI: https://orca.cardiff.ac.uk/id/eprint/166452
