Academic Journal

Modelling a socialised chatbot using trust development in children: lessons learnt from Tay.

Bibliographic Details

Title: Modelling a socialised chatbot using trust development in children: lessons learnt from Tay.

Authors: Bridge, Oliver; Raper, Rebecca; Strong, Nicola; Nugent, Selin E.

Source: Cognitive Computation & Systems; Jun2021, Vol. 3 Issue 2, p100-108, 9p

Abstract: In 2016 Microsoft released Tay.ai to the Twittersphere, a conversational chatbot that was intended to act like a millennial girl. However, Microsoft took Tay's account down in less than 24 hours because Tay had learnt to tweet racist and sexist statements from its online interactions. Taking inspiration from the theory of morality as cooperation, and the place of trust in the developmental psychology of socialisation, we offer a multidisciplinary and pragmatic approach that builds on the lessons learnt from Tay's experiences, to create a chatbot that is more selective in its learning, and thus resistant to becoming immoral the way Tay did. [ABSTRACT FROM AUTHOR]
Database: Complementary Index