
Issues in Interdisciplinarity 2019-20/Truth in Cambridge Analytica Scandal


This Wikibook chapter explores how the truth about the Facebook–Cambridge Analytica data scandal is portrayed in different disciplines and why these contrasting truths create an interdisciplinary issue.

Case Study: The Facebook-Cambridge Analytica Scandal


The scandal became widely publicized in 2018, although the political consulting firm Cambridge Analytica (CA) had already been reported in 2015 to be harvesting data from Facebook profiles to sway the 2016 United States presidential race in favour of Republican candidate Ted Cruz.[1] Upon further investigation, CA was found to have influenced more than 200 elections worldwide,[2] including Donald Trump's 2016 campaign,[3] in which the data of up to 87 million Facebook profiles was compromised.[4]

The acquisition of the data


In 2013, the data scientist Aleksandr Kogan developed a personality quiz app called "thisisyourdigitallife", which paid people to answer 120 questions[5] but first required users to grant access to their Facebook activity.[6] Kogan was researching the link between Facebook activity and personality. Later that year, Kogan was contacted by SCL Group, the parent company of CA, with a business proposal: SCL would pay Kogan to recruit more survey takers[5] if he provided them with the gathered data. SCL was interested in the app because it collected data not only from its users but also from their friends,[6] including likes, posts, comments, location, and even private messages.[7] An estimated 270,000 people took the survey,[8] compromising the private data of up to 87 million people.

Under its 2013 Terms of Service, Facebook's Application Programming Interface (API) allowed developers to access the data of a given user and of that user's friends.[9] Kogan's 2013 operations were legal, since Facebook users were deemed to have agreed to terms and conditions that exposed them to such data extraction. In April 2014, however, Facebook released a new API version that limited the scope of data available to new app developers, while giving existing apps a one-year enforcement delay to redesign their software for the new API.[10] Kogan therefore retained the right to keep harvesting data under the 2013 API for one more year. When that period expired in 2015, "thisisyourdigitallife" ceased operations but was allowed to keep all the data it had harvested.[10] In his initial agreement with Facebook, Kogan explicitly stated that he might sell the data,[11] and Facebook knowingly agreed, even though this conflicted with sections of its own Platform Policy.[10] The positive and objective truth from the discipline of law is therefore that the data collected by Kogan was gathered and sold to CA legally.
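To give a sense of how broad this pre-2014 access was, the Python sketch below shows how a single user access token could pull not only the quiz taker's own profile and likes but also their friends' likes. It is a minimal, hypothetical illustration only: the function name and variables are invented, the endpoint paths reflect the long-deprecated Graph API v1.0 as it appeared once versioning was introduced, the friends_* permissions it relies on have since been removed, and none of this is Kogan's actual code.

```python
import requests

# Base URL as it looked once Facebook introduced API versioning in 2014;
# before that, the same endpoints were served unversioned.
GRAPH = "https://graph.facebook.com/v1.0"

def harvest(user_token):
    """Hypothetical sketch: with one user access token, a v1.0-era app that
    had been granted the now-removed friends_* permissions could read the
    user's profile and likes, list their friends, and then read each
    friend's likes as well."""
    profile = requests.get(f"{GRAPH}/me",
                           params={"access_token": user_token}).json()
    likes = requests.get(f"{GRAPH}/me/likes",
                         params={"access_token": user_token}).json()
    friends = requests.get(f"{GRAPH}/me/friends",
                           params={"access_token": user_token}).json()

    # Friend-level access: the same token could be reused to read data
    # about people who never installed the app themselves.
    friend_likes = {}
    for friend in friends.get("data", []):
        resp = requests.get(f"{GRAPH}/{friend['id']}/likes",
                            params={"access_token": user_token}).json()
        friend_likes[friend["id"]] = resp.get("data", [])

    return profile, likes.get("data", []), friend_likes
```

This one-token-to-many-profiles pattern is what let roughly 270,000 survey takers expose the data of up to 87 million people.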

[Image: The 1948 version of the Universal Declaration of Human Rights]

Human Rights


Article 12 of the Universal Declaration of Human Rights declares the right to privacy. Gathering users' personal data without their informed consent clearly violated this principle, as it took advantage of the users' unawareness and trust. Facebook was therefore fined £500,000[12] by the UK Information Commissioner's Office for breaching the first Data Protection Principle (DPP1) of the Data Protection Act 1998, because the data collection, although legal under Facebook's own terms, was carried out unfairly.

The exploitation of the data


Psychology

[Image: The OCEAN model]

To influence voters, CA quantified five personality traits for each of the 87 million profiles, following the OCEAN (Big Five personality traits) model:[6] Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism.[13]

Once the psychographic information was obtained, CA targeted selected voters on the theme of job creation: a person scoring high in Openness was shown an ad about gaining experience through work,[6] whereas a highly neurotic person was shown an ad about the security and emotional stability a job provides.[6] Micro-targeting voters in this way is far more effective than mass distribution of a single advertisement because each message appeals to the individual.
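A minimal Python sketch of this kind of trait-based ad selection follows. The threshold and the ad copy are invented for illustration; only the general logic (different trait scores trigger differently framed ads on the same theme) comes from the sources above.

```python
# Hypothetical illustration of trait-based ad selection on the job-creation
# theme; the threshold value and the ad texts are invented.
def pick_job_ad(ocean_scores, threshold=0.7):
    """ocean_scores: dict mapping OCEAN traits to scores in [0, 1]."""
    if ocean_scores.get("openness", 0.0) >= threshold:
        return "Gain new experience through work."
    if ocean_scores.get("neuroticism", 0.0) >= threshold:
        return "A steady job brings security and emotional stability."
    return "More jobs for our community."  # generic fallback message

print(pick_job_ad({"openness": 0.9, "neuroticism": 0.2}))  # openness-targeted ad
print(pick_job_ad({"openness": 0.1, "neuroticism": 0.8}))  # neuroticism-targeted ad
```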

Data Science

[Image: A simplified artificial neural network]

To convert the raw data into individual OCEAN profiles, CA used data mining, a process that predicts unknown attributes from patterns in known data. CA began by building a training data set (the survey takers' Facebook activity paired with their questionnaire answers), from which a model could be derived for the target data set (the Facebook data on which they wanted to make predictions).[6] Using artificial neural networks, the CA team mathematically linked the Facebook activity of the roughly 270,000 survey participants to their self-declared personality traits, continually adjusting the networks' connection weights to improve the predictions.[6] Once the model was trained, they applied it to the target data set,[6] producing OCEAN personality predictions of remarkable accuracy for all 87 million profiles.
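As a rough illustration of this train-then-predict pipeline, the Python sketch below uses scikit-learn's MLPRegressor as a stand-in for CA's neural networks. The array sizes, feature layout, and random data are placeholders, not the real data set; only the overall structure (fit on survey takers, predict on everyone else) follows the description above.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: rows are users, columns are Facebook-activity features
# (e.g. one column per page like); targets are the five OCEAN scores that
# survey takers reported about themselves. Sizes are toy values.
rng = np.random.default_rng(0)
X_train = rng.random((270, 50))   # survey takers' activity features
y_train = rng.random((270, 5))    # their self-declared OCEAN scores
X_target = rng.random((870, 50))  # the wider population to be profiled

# A small feed-forward network stands in for CA's models: its connection
# weights are adjusted iteratively to minimise prediction error on the
# training set, mirroring the weight-optimisation step described above.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 32),
                                   max_iter=2000, random_state=0))
model.fit(X_train, y_train)

# Apply the trained model to users who never took the survey.
ocean_predictions = model.predict(X_target)  # shape: (n_users, 5)
```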

CA used another algorithm that generated custom Facebook ads for each individual,[6] as Facebook's extremely lenient Advertisement Policy allowed such political targeting to take place.[14]

Ethics, Politics, and Media


From the viewpoint of democratic ideals, the manipulative use of social platforms to micro-target voters is unethical because it violates the fundamental "free and fair" character of democracy and gives wealthy campaigners the ability to indirectly suppress freedom of choice. This, however, is a subjective truth. From a consequentialist and autocratic point of view, manipulating people in order to guide society towards the autocratic "greater good" is justified. The media outcry of 2018 brings to light the democratic bias that shapes our current definitions of what is right and what is wrong.

Ethics, as a discipline and as a concept, is socially constructed. A constructivist approach to the scandal therefore helps explain why some people, as advocates of democracy, believe it was unethical.

From the perspective of the objective and positivist truth of international law, CA's exploitation of personal data in 2016 carries no legal consequences, as it was conducted lawfully. The data mining for psychographic profiles was legal because CA owned the raw data, and the micro-targeting of voters through Facebook was legal because Facebook's advertising policy permits it.[15] Although CA's involvement in the 2014 and 2016 elections may have violated US election law, which forbids foreign involvement in elections, the issue has not been taken to court.[16]

Conclusion


To understand the Cambridge Analytica scandal, it is imperative to take an interdisciplinary approach, as the disciplines of law, data science, psychology, and politics all come into play. However, the objective and positivist truth provided by law clashes with the constructivist and subjective truth provided by ethics. Legally speaking, nothing illicit took place: personal information was accessed through a loophole in Facebook's API. Using psychology and data science, CA was able to turn that data into a tool of information warfare and, as a consequence, influence over 200 elections worldwide. From a modern-day ethical perspective, however, this was a malicious and predatory act that undermined the fundamental values of democracy and violated human rights. The scandal thus exemplifies an interdisciplinary issue surrounding the concept of truth: the law maintains that nothing wrong happened, while modern-day ethics clearly condemns it.

References

  1. Davies H. Ted Cruz campaign using firm that harvested data on millions of unwitting Facebook users [Internet]. The Guardian. 2015 December [cited 25 November 2019]. Available from: https://www.theguardian.com/us-news/2015/dec/11/senator-ted-cruz-president-campaign-facebook-user-data
  2. The global reach of Cambridge Analytica [Internet]. BBC News. 2018 March [cited 25 November 2019]. Available from: https://www.bbc.co.uk/news/world-43476762
  3. Lewis P, Hilder P. Leaked: Cambridge Analytica's blueprint for Trump victory [Internet]. The Guardian. 2018 March [cited 25 November 2019]. Available from: https://www.theguardian.com/uk-news/2018/mar/23/leaked-cambridge-analyticas-blueprint-for-trump-victory
  4. Kozlowska H. The Cambridge Analytica scandal affected nearly 40 million more people than we thought [Internet]. Quartz. 2018 April [cited 25 November 2019]. Available from: https://qz.com/1245049/the-cambridge-analytica-scandal-affected-87-million-people-facebook-says/
  5. Wong J, Lewis P, Davies H. How academic at centre of Facebook scandal tried – and failed – to spin personal data into gold [Internet]. The Guardian. 2018 April [cited 1 December 2019]. Available from: https://www.theguardian.com/news/2018/apr/24/aleksandr-kogan-cambridge-analytica-facebook-data-business-ventures
  6. Hern A. Cambridge Analytica: how did it turn clicks into votes? [Internet]. The Guardian. 2018 May [cited 1 December 2019]. Available from: https://www.theguardian.com/news/2018/may/06/cambridge-analytica-how-turn-clicks-into-votes-christopher-wylie
  7. Kelion L. Facebook: Cambridge Analytica data had private messages [Internet]. BBC News. 2018 April [cited 1 December 2019]. Available from: https://www.bbc.co.uk/news/technology-43718175
  8. Reynolds M. Cambridge Analytica: Academic Dr Aleksandr Kogan claims he has been made a 'scapegoat' [Internet]. Express.co.uk. 2018 April [cited 1 December 2019]. Available from: https://www.express.co.uk/news/world/935252/cambridge-analytica-academic-aleksandr-kogan-scapegoat-privacy-app-quizzes
  9. Wagner K. Here's how Facebook allowed Cambridge Analytica to get data for 50 million users [Internet]. Vox. 2018 March [cited 2 December 2019]. Available from: https://www.vox.com/2018/3/17/17134072/facebook-cambridge-analytica-trump-explained-user-data
  10. Information Commissioner's Office. Monetary Penalty Notice to the Facebook Companies. Wilmslow; 2018 October. p. 1-18. Available from: https://ico.org.uk/media/action-weve-taken/mpns/2260051/r-facebook-mpn-20181024.pdf
  11. Poulsen K. Oops! Mark Zuckerberg Surprised to Learn the Terms of Service for 'Your Digital Life' [Internet]. The Daily Beast. 2018 April [cited 2 December 2019]. Available from: https://www.thedailybeast.com/oops-mark-zuckerberg-surprised-to-learn-the-terms-of-service-for-your-digital-life
  12. Waterson J. UK fines Facebook £500,000 for failing to protect user data [Internet]. The Guardian. 2018 October [cited 1 December 2019]. Available from: https://www.theguardian.com/technology/2018/oct/25/facebook-fined-uk-privacy-access-user-data-cambridge-analytica
  13. The Big Five Personality Traits Model (OCEAN Model) [Internet]. Cleverism. 2019 March [cited 4 December 2019]. Available from: https://www.cleverism.com/big-five-personality-traits-model-ocean-model/
  14. Matsakis L. Facebook's Targeted Ads Are More Complex Than It Lets On [Internet]. Wired. 2018 April [cited 3 December 2019]. Available from: https://www.wired.com/story/facebooks-targeted-ads-are-more-complex-than-it-lets-on/
  15. Facebook advertising targeting options [Internet]. Facebook for Business. 2019 [cited 8 December 2019]. Available from: https://www.facebook.com/business/ads/ad-targeting
  16. Siddiqui S. Cambridge Analytica's US election work may violate law, legal complaint argues [Internet]. The Guardian. 2018 March [cited 4 December 2019]. Available from: https://www.theguardian.com/uk-news/2018/mar/26/cambridge-analytica-trump-campaign-us-election-laws