
    Alexa, Emotions, Privacy and GDPR

    HCI 2018

    Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI 2018)

    Belfast, UK, 4 - 6 July 2018


    Eoghan Furey & Juanita Blue



    We exist in a world where emotional expression is a central facet of what makes us human. It allows us to interact richly with others and aids us in functioning as a society. Affective computing, also known as artificial emotional intelligence, is the area of study that seeks to enable the development of systems and devices with the capacity to understand and replicate these human affects. The race is on to develop intelligent computing systems that can mimic human interaction and convince humans that they too are human, building a sense of trust. The Amazon Echo and its intelligent personal assistant, "Alexa", is currently one of the most popular and pervasive of these intelligent devices.

    The human name given to this technological entity alludes to its human-like conversational abilities. However, most adult humans would quickly establish that "Alexa" is not a real person. This is intimated through the automated voice, but is further demonstrated by a lack of recognition and display of any emotion. The advent of voice-controlled devices (VCDs) like Alexa raises a plethora of significant privacy considerations. This paper examines the privacy concerns relating to the ability of these devices to gather data on an individual's emotional state. This is conducted with consideration for the new General Data Protection Regulation (GDPR) introduced in May 2018.

