How can society get the best out of AI technology when it permeates our private lives and touches the most intimate dimensions of human experience? According to Andrew McStay, a new class of data, ‘intimate data’, is needed to fill a lacuna in the regulation of this kind of ‘emotional AI’.
Emotional Artificial Intelligence (emotional AI) refers to technologies that use affective computing and artificial intelligence techniques to sense, learn about and interact with human emotional life. Such technologies are becoming increasingly present in everyday objects and practices: examples include digital assistants, in-car systems and border control. Emotional AI is also being used to regulate and optimize the ‘emotionality’ of spaces such as workplaces, hospitals, prisons, classrooms, travel infrastructures, restaurants, and retail and chain stores.
Such uses of emotional AI raise profound societal questions. According to the author, data privacy regulation has over-emphasized identification while neglecting non-identifying soft biometric data about emotional life. He therefore calls for a new class of data, ‘intimate data’: data that is sensitive without being personal. He also argues for a group-based, collective understanding of privacy.
According to McStay, there is nothing innately wrong with emotional AI; these technologies can serve, assist and entertain if they are built and deployed in ways that respect the wishes of individuals and groups. But for that to happen, ‘intimate data’ needs serious attention from Europe’s regulators.