Spotify's Mood-Recognising AI Patent: A Legal Analysis of Emotion AI in the Digital Media Industry

by Prattay Lodh†


Introduction

Artificial intelligence (AI) is projected to contribute nearly USD 16 trillion1 to the global economy by 2030, driving the fourth industrial revolution2, and is becoming increasingly profitable, personal, and pervasive. Recently, Spotify secured a patent for a mood-recognising AI designed to detect background noise in order to enhance a user's music-listening experience. This article discusses the legality of emotional artificial intelligence systems (EAIS) while commenting on Spotify's patent. In doing so, it attempts to shed light on the issues arising from the ambiguous legal status of EAIS around the world. It further compares the EU's General Data Protection Regulation (GDPR), Singapore's Personal Data Protection Act, 2012 and India's Information Technology (IT) Act, 20003 and its subsidiary rules to highlight the need for a robust countrywide privacy law after the withdrawal of the Personal Data Protection (PDP) Bill, 20194.

Emotion AI and surrounding issues

An emotion AI system is an application of affective computing in which the AI attempts to perceive and "feel into" human emotional life. It is a young branch of artificial intelligence in which computers analyse non-verbal cues from people, such as body language, facial gestures, movements, and voice tonality, to infer their emotions. The technology is already used extensively in fields such as marketing, customer care, and healthcare.
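
To make the mechanics concrete, the sketch below shows, under heavy simplification, how such a system might infer an emotional label from two crude prosodic features of a voice signal (short-term energy and zero-crossing rate). The features, thresholds, labels, and synthetic signals are illustrative assumptions for this article, not any vendor's actual pipeline:

```python
import numpy as np

def extract_features(signal):
    """Two crude prosodic features sometimes used as proxies in affect
    detection: short-term energy and zero-crossing rate (per sample)."""
    energy = float(np.mean(signal ** 2))
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal)))) / 2)
    return energy, zcr

def guess_emotion(energy, zcr):
    """Toy decision rule: loud, high-pitched speech is labelled
    'excited'; anything else 'calm'. Thresholds are illustrative."""
    return "excited" if energy > 0.1 and zcr > 0.04 else "calm"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 16000)
    # Synthetic stand-ins for recorded speech: a loud 440 Hz tone and a
    # soft 110 Hz tone, each with a little background noise.
    excited = 0.8 * np.sin(2 * np.pi * 440 * t) + 0.02 * rng.standard_normal(16000)
    calm = 0.2 * np.sin(2 * np.pi * 110 * t) + 0.02 * rng.standard_normal(16000)
    for name, sig in (("loud/high", excited), ("soft/low", calm)):
        e, z = extract_features(sig)
        print(f"{name}: energy={e:.3f}, zcr={z:.3f} -> {guess_emotion(e, z)}")
```

Production systems replace these hand-set thresholds with trained models over far richer features, but the privacy concern is the same: the input is intimate behavioural signal.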

In commercial advertising, emotional AI's ability to generate appealing advertisements is one of its most compelling5 business promises. Here, a user of the service begins receiving advertisements for products and services they have been wanting or thinking about. This occurs when a device owner unknowingly "accepts" being influenced by using the company's services. The company then, based on the user's mood, location, conversations, and similar signals, displays an advertisement calculated to appeal to them.

This raises an ethical problem. It can be contended that a user's intimate information should not be obtained without consent. Emotional information such as conversations, moods and expressions is potent material: emotion detection may influence a user's interactions6 with their public surroundings. Additionally, it has the potential to incentivise the exchange of personal data for media content and internet advertising even in public spaces.

There are also concerns about EAIS's capacity for manipulation, and about profiling mistakes or malice that may trigger negative emotions in users. For instance, in the US, researchers found7 that a widely used algorithm for follow-up healthcare coordination programmes systematically deprioritised black patients in favour of others.

Spotify's patent, entitled "Identification of Taste Attributes from an Audio Signal"8, covers a "method of processing the provided audio signal, which includes speech content and background noise" and then "identifying playable content, based on the processed audio signal content". In layman's terms, the patent allows Spotify to recognise not only its users' voices but also background sounds they are unaware of9. Combined with its geo-tracking patent10, environment-recognition patent11 and personality-tracking technology12, the app could track conversations and ambient noises, such as TV audio or birds chirping, and recommend music to match the environment.
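
The patent describes this method only at a functional level. The skeleton below is a speculative illustration of the two claimed steps, processing an audio signal into derived attributes and then mapping those attributes to playable content. Every name, data structure, and mapping in it is a hypothetical stand-in for this article, not Spotify's implementation:

```python
from dataclasses import dataclass

@dataclass
class ProcessedAudio:
    # Hypothetical attributes derived from the "provided audio signal";
    # the patent describes the method functionally, not this data model.
    speech_mood: str   # e.g. inferred from voice tonality
    environment: str   # e.g. inferred from background noise

def process_audio_signal(raw_audio: bytes) -> ProcessedAudio:
    """Stand-in for the claimed 'processing' step; a real system would
    run speech-analysis and acoustic-scene models here."""
    return ProcessedAudio(speech_mood="relaxed", environment="outdoors")

# Illustrative mapping from derived attributes to playable content.
PLAYLISTS = {
    ("relaxed", "outdoors"): "Acoustic Morning",
    ("energetic", "gym"): "Workout Hits",
}

def identify_playable_content(attrs: ProcessedAudio) -> str:
    """Stand-in for the claimed 'identifying playable content' step."""
    return PLAYLISTS.get((attrs.speech_mood, attrs.environment), "Daily Mix")

if __name__ == "__main__":
    attrs = process_audio_signal(b"...captured microphone audio...")
    print(identify_playable_content(attrs))   # -> Acoustic Morning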

With public sentiment largely against the development13 of this kind of AI, Spotify's move raises significant questions in the domain of privacy law surrounding EAIS.

Laws surrounding emotion AI

A.  Around the world

Currently, there is no global consensus on EAIS. However, lawmakers broadly agree14 that if the answers to the following questions are affirmative, an AI that uses the collected data for public advertising may be in breach (a minimal sketch of this test follows the list):

(i) Whether an individual is identifiable from the data collected?

(ii) Whether the data collection and code generation by the company targets a specific person?

(iii) Whether the data collection and code generation “singles out” the individual in some manner?
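
Read together, the three factors amount to a conjunctive checklist, and the toy function below encodes that reading. It is an illustration of the commentary cited above, not a statutory test:

```python
def may_breach(identifiable: bool,
               targets_specific_person: bool,
               singles_out_individual: bool) -> bool:
    """Affirmative answers to all three questions suggest a potential
    breach; this conjunctive reading is an assumption drawn from the
    commentary, not a codified rule."""
    return identifiable and targets_specific_person and singles_out_individual

print(may_breach(True, True, True))   # True  -> potential breach
print(may_breach(True, True, False))  # False -> test not satisfied
```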

In Spokeo Inc. v. Robins15, the US Supreme Court established the doctrine of "concrete harm", which requires a plaintiff to demonstrate that they experienced "a concrete and specific infringement of their legally protected interest" that was "real or impending, not conjectural or speculative". In the case of EAIS, however, it becomes particularly difficult to demonstrate such "concreteness". For instance, in LabMD, Inc., In re16, the Federal Trade Commission's (FTC) case was dismissed following a trial before an Administrative Law Judge because the FTC had failed to demonstrate that LabMD's failure to adopt adequate cybersecurity policies either caused or was likely to cause injury to consumers.

Europe's GDPR and Singapore's Personal Data Protection Act attempt to shift the status quo in favour of the user. However, both legislations fail to address EAIS on paper because of their broad-based, general approaches. For instance, Article 2217 of the EU GDPR establishes a general right not to be subject to decisions based solely on automated processing that have a substantial impact on the individual. However, it does not include18 a right to object to these decisions; rather, it assumes that an AI-generated decision is lawful so long as the user does not contest it. Furthermore, the term "emotions" appears nowhere in the legislation.

For the case in point, i.e. Spotify: the company's official privacy policy19 claims alignment with Article 1520 of the EU GDPR. The company states that it collects personal information under a "legal basis to fulfil various purposes". These purposes include the distribution of users' profile data, street address data and voice data to third-party advertisers. Additionally, there is no fixed timeframe21 for data retention: user data can be kept for as long as "necessary".

B.  Existing privacy issues in India aggravated by withdrawal of Personal Data Protection Bill, 2019

In India, after the landmark Puttaswamy judgment22, the right to privacy became part of "life and liberty" under Article 2123 of the Constitution. However, there exists a vacuum of overarching data protection legislation in India, especially after the withdrawal24 of the Personal Data Protection Bill, 2019. As of today, the Information Technology Act, 2000 (IT Act) and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 201125 govern the legal relationship between companies and users regarding the privacy of their data.

Without a robust law to govern these issues, and with courts deciding differently from case to case, there is no consensus on how to deal with such breaches. For instance, in Vasunathan v. Registrar General26, the Karnataka High Court recognised the petitioner's right to be forgotten and decreed the removal of the petitioner's personal information from a judgment, but in Dharamraj Bhanushankar Dave v. State of Gujarat27, the Gujarat High Court dismissed an identical prayer.

Effective legal remedies, enforcement of crucial fundamental rights and measures prohibiting companies from unlawfully gathering personal data are all missing, and these are precisely the aspects a robust data protection legislation is bound to address. Additionally, the Constitution does not provide writ remedies against private entities, since they do not constitute "State" within the meaning of Article 1228. This means that under Indian law there is currently very little redress available when a private entity breaches a citizen's right to privacy.

Furthermore, Section 43-A29 of the IT Act allows individuals to seek compensation for damages from "body corporates" that, among other things, fail to implement reasonable security practices and thereby cause wrongful loss to a person. However, the more substantial obligations under the privacy rules (such as obtaining approval, processing data only for limited purposes, limiting data retention, sharing data only with explicit permission, disclosing recipients to a user, and offering a grievance redressal mechanism) apply only to "sensitive personal data", which covers only defined kinds of information such as passwords, health information, financial information, or biometric data, thereby leaving out other types of data, like home addresses, phone numbers, religious beliefs and gender identity, with which a user identifies today.

Conclusion

The worldwide emotion analytics market is estimated to expand to USD 25 billion by 2023 at a compound annual growth rate (CAGR) of 17% over the forecast period 2017-2023. Even though emotion AI systems are highly profitable for companies, they are deeply personal and pervasive for users. The lack of a robust legal infrastructure, coupled with any coder's malicious intent, can lead to several negative externalities. The Spotify patent could reduce its users to inputs for algorithms that predict which emotion can be evoked by playing which kind of music. Lawmakers around the globe must reach a consensus on laying out basic foundations in this niche field of technology. In India, the withdrawal of the PDP Bill has already put the country on the back foot, while cybersecurity violations multiply daily. It will be interesting to witness what the legal arena holds for emotion AI and its stakeholders in the future.
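
As a quick sanity check on the market figures quoted above (assuming the 17% rate compounds annually over the six years from 2017 to 2023), the projection implies a 2017 base of roughly USD 9.7 billion:

```python
# Back-of-the-envelope check of the market projection quoted above.
target_2023 = 25.0   # USD billion, figure quoted in the text
cagr = 0.17          # 17% compound annual growth rate
years = 6            # 2017 -> 2023
implied_2017_base = target_2023 / (1 + cagr) ** years
print(f"Implied 2017 market size: USD {implied_2017_base:.1f} billion")  # ~9.7
```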


†Student, Postgraduate Diploma in IPR Law, NLSIU, Bangalore. The author can be reached at prattay.lodh@yahoo.com or lodh.prattay@gmail.com.

1. PricewaterhouseCoopers, "Sizing the Prize: What's the Real Value of AI for your Business and how can you Capitalise?", available at <https://www.pwc.com/gx/en/issues/analytics/assets/pwc-ai-analysis-sizing-the-prize-report.pdf>, last accessed on 20-8-2022.

2. Brinda Sapra, “AI for All: How India can become an Artificial Intelligence Superpower”, NextBillion, available at <https://nextbillion.net/india-artificial-intelligence-superpower/> dated 1-11-2019.

3. Information Technology Act, 2000.

4. Personal Data Protection Bill, 2019.

5. Jeffrey L. Vagle, “Cybersecurity and Moral Hazard”, 23 Stan Tech L Rev 67, Stanford University, available at <https://www-cdn.law.stanford.edu/wp-content/uploads/2020/03/2020-03-06_Vagle_Final.pdf>, last accessed on 23-8-2022.

6. Terry A. Maroney, “Law and Emotion: A Proposed Taxonomy of an Emerging Field”, PMID: 16786403, National Library of Medicine, available at <https://pubmed.ncbi.nlm.nih.gov/16786403/> dated April 2006.

7. Shraddha Chakradhar, “Widely Used Algorithm for Follow-Up Care in Hospitals is Racially Biased, Study Finds”, Stat News, available at <https://www.statnews.com/2019/10/24/widely-used-algorithm-hospitals-racial-bias/> dated 24-10-2019.

8. Stéphane Hulaud, Stockholm (SE) 12-1-2021, “Identification of Taste Attributes from an Audio Signal”, US10891948B2, United States Patent Hulaud, available at <https://patentimages.storage.googleapis.com/5c/0f/84/8b53c2903a82ba/US10891948.pdf>.

9. Matt Wille, “This Musician and Activist Wants to Stop Spotify Spying on its Users”, Input Magazine, available at <https://www-inputmag-com.cdn.ampproject.org/c/s/www.inputmag.com/culture/evan-greer-surveillance-capitalism-activist-musician-wants-to-stop-spotify-spying-on-its-users/amp> dated 4-8-2021.

10. Brian Boyle, Dublin (IE); Dave Lynch, Dublin (IE); George Boyle, Dublin (IE); Brendan O'Driscoll, Dublin (IE); Craig Watson, Dublin (IE); Aidan Sliney, Dublin (IE); 26-1-2017, "A System and Method of Tracking Music or other Audio Metadata from a Number of Sources in Real-Time on an Electronic Device", available at <https://patentimages.storage.googleapis.com/06/99/e3/66f066b54bee71/US20170024399A1.pdf>.

11. Pontus Persson, Stockholm (SE); 8-9-2020, "Methods and Systems for Overseeing and Playback of Audio Data Received from Distance Sources", available at <https://www.musicbusinessworldwide.com/files/2020/09/SpotKaraPatent-1.pdf>.

12. Clay Gibson, New York, NY (US); Will Shapiro, New York, NY (US); Santiago Gil, Portland, OR (US); Ian Anderson, New York, NY (US); Margreth Mpossi, Stamford, CT (US); Oguz Semerci, New York, NY (US); Scott Wolf, Brooklyn, NY (US); 6-10-2020, "Methods and Systems for Personalising User Experience Based on Personality Traits", US 10,798,214 B2, United States Patent, available at <https://www.musicbusinessworldwide.com/files/2020/10/Spot-personality-patent.pdf>.

13. Andrew McStay, “The Right to Privacy in the Age of Emotional AI”, OHCHR, available at <https://www.ohchr.org/sites/default/files/Documents/Issues/DigitalAge/ReportPrivacyinDigitalAge/AndrewMcStayProfessor_of_Digital_Life,_BangorUniversityWalesUK.pdf>, last accessed on 30-8-2022.

14. Jennifer S. Bard, "Developing a Legal Framework for Regulating Emotion AI", University of Florida Levin College of Law, 27 BU J Sci & Tech L (2020).

15. 2016 SCC OnLine US SC 98 : 578 US (2016).

16. 2015 WL 7575033 (FTC 13-11-2015) (initial decision).

17. EU General Data Protection Regulation, 2016, Art. 22.

18. Panel for the Future of Science and Technology, The Impact of the General Data Protection Regulation (GDPR) on Artificial Intelligence, European Parliament, available at <https://www.europarl.europa.eu/RegData/etudes/STUD/2020/641530/EPRS_STU(2020)641530_EN.pdf> dated 20-6-2020.

19. Spotify AB, "Personal Data we Collect", Spotify Privacy Policy, available at <https://www.spotify.com/in-en/legal/privacy-policy/#3-personal-data-we-collect-about-you>, last accessed on 30-8-2022.

20. EU General Data Protection Regulation, 2016, Art. 15.

21. EU General Data Protection Regulation, 2016, Art. 15.

22. K.S. Puttaswamy v. Union of India, (2019) 1 SCC 1.

23. Constitution of India, Art. 21.

24. Lok Sabha, Bill for Withdrawal, Supplementary List of Business, <http://164.100.47.194/Loksabha/Business/ListofBusiness.aspx> dated 3-8-2022.

25. Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011.

26. 2017 SCC OnLine Kar 424.

27. 2017 SCC OnLine Guj 2493.

28. Constitution of India, Art. 12.

29. Information Technology Act, 2000, S. 43-A.
