Lentis/Social Media Mining

Overview

Social Media Mining

Social media mining is the process of collecting and analyzing user-generated data from social media platforms to discern patterns. It is often used to conduct research or to inform advertisers. Social media mining can reveal common opinions on subjects, segment large populations into groups, and identify changes in individuals and groups over time. Worldwide social media use is expected to continue to increase,[1] as will the volume of user-generated data. As more data is generated, collected, and analyzed, different interest groups will be able to make more informed decisions.
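
The paragraph above describes mining at a high level. A minimal sketch of one such task, counting the most common terms in a small batch of posts, is shown below. The posts, stopword list, and function name are invented for illustration only and do not correspond to any platform's API or to a method described in this chapter.

```python
# A minimal sketch of one social media mining task: finding the most common
# terms in a batch of user-generated posts. The posts are invented
# placeholders; a real pipeline would pull them from a platform export or API.
import re
from collections import Counter

posts = [
    "Loving the new privacy settings, finally some control!",
    "Privacy settings are confusing, who reads the terms anyway?",
    "Ads feel way too targeted lately...",
]

STOPWORDS = {"the", "a", "an", "are", "is", "some", "who", "way", "too", "new"}

def term_frequencies(texts):
    """Tokenize each post and count how often each non-stopword term appears."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z']+", text.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts

# The most frequent terms give a rough picture of what users are talking about.
print(term_frequencies(posts).most_common(5))
```

A real analysis would replace simple term counts with sentiment scoring, topic modeling, or network analysis, but the collect-then-aggregate pattern is the same.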

Facebook

Facebook is a social media site that allows users to connect and share content online. Although it was created for college students in 2004, Facebook grew to become the world's largest social media platform, with over 2.2 billion monthly active users as of 2018.[2] In 2008, presidential candidates began using Facebook to promote their campaigns by efficiently connecting with large numbers of voters.

Cambridge Analytica

Cambridge Analytica was a London-based data analytics and consulting firm with political and commercial divisions. It was founded in 2013 and discontinued operations in May 2018. The company claimed to use data to change audience behavior, and said it held up to 5,000 data points on over 87 million Americans.[3] Cambridge Analytica is most widely known for its role in Donald Trump's 2016 presidential campaign. Its controversial methods of data collection were made public shortly before it discontinued operations.

Key Events of the Cambridge Analytica Scandal

Data Collection

In late 2015, Facebook learned that information from more than 50 million users had been harvested by an app. The app, called This Is Your Digital Life, was developed by Aleksandr Kogan, a research associate at Cambridge University. Through this app, more than 500,000 users were paid to take personality tests. Facebook's platform policy at the time allowed apps to collect friends' data in order to improve the user experience, so the app was also able to collect information from the test takers' Facebook friends. After learning about the harvesting, Facebook changed its platform policy to make friends' information off-limits to third-party apps. It also banned the app and demanded and received certification that the data had been destroyed. However, Kogan had already sold the data to Cambridge Analytica.[4]
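
The friends-data policy is what turned a few hundred thousand consenting test takers into tens of millions of harvested profiles. A rough, purely illustrative calculation is sketched below: the test-taker count comes from the paragraph above, while the average friend count and overlap factor are assumed round numbers, not reported figures.

```python
# Rough sketch of the fan-out effect of the friends-data policy.
# Only `test_takers` comes from the text; the other inputs are assumptions.
test_takers = 500_000      # users who installed the app (from the text above)
avg_friends = 200          # assumed average friends per user (illustrative)
unique_share = 0.5         # assumed fraction of friends not counted more than once

reachable = test_takers + test_takers * avg_friends * unique_share
print(f"Profiles reachable under the friends policy: ~{reachable:,.0f}")
# With these assumptions the total lands around 50 million profiles,
# the order of magnitude Facebook identified in late 2015.
```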

2016 Presidential Races

During the Republican primary for the 2016 presidential election, Ted Cruz relied heavily on Cambridge Analytica's services, spending $5.8 million. The Cruz campaign said that Cambridge Analytica had assured it that all of the data was acquired legally and within the boundaries of regulations, and that the claims made by the media in 2015 were false.[5] Shortly thereafter, the Trump campaign acquired the services of Cambridge Analytica. The company used various algorithms to target 10,000 different ads to different audiences; these ads were viewed billions of times in the months leading up to the election. Cambridge Analytica later reported that one of these ads, an interactive graphic detailing "10 inconvenient truths about the Clinton Foundation," had an average engagement time of four minutes, and called it "the most successful thing we pushed out."[6]

Appearances Before Lawmakers

In April 2018, Zuckerberg appeared before Congress and spent 10 hours answering questions from 91 different lawmakers regarding Facebook's data privacy practices and its relationship with Cambridge Analytica. While careful not to call the events a data breach, Zuckerberg took responsibility for what occurred and apologized for his handling of the situation. He also defended Facebook by arguing that users own their data and have complete control over what information they share with Facebook.[7] After the hearings concluded, lawmakers were criticized for their lack of basic social media knowledge and their unwillingness to follow through on tough questions.
One month later, Zuckerberg appeared before the European Parliament, where he had 22 minutes to respond to many detailed questions. While MEPs were perceived as asking more difficult questions, the inquiry's format did not allow Zuckerberg to provide in-depth answers, and he ran out of time before he could respond to many questions that had not been addressed in the congressional hearings.[8]

Participants

Facebook Executives

While Facebook originated as a student directory for college students, it rapidly evolved into a platform meant to "bring the world closer together."[9][10] After the company went public, however, Mark Zuckerberg and the other executives had a responsibility to make money for their shareholders, and the value of the company is directly tied to Facebook's public perception. While Facebook may claim to value social good and philanthropy, its profit motive becomes clearer when it allows other companies to gather user data, often at the expense of its users' privacy. Since Facebook's value is driven in part by its users' opinions, Facebook executives face a dilemma regarding the extent to which their users' information can be mined.

Cambridge Analytica Executives

Cambridge Analytica's executives, in contrast to Facebook's, did not show the same regard for individuals' privacy rights. Since they were hired directly by clients to "change audience behavior," they were inherently less concerned with their own public image than with legal consequences. Christopher Wylie, a co-founder of Cambridge Analytica who left the company in 2014, said publicly that "rules don't matter for them. For them, this is a war, and it's all fair," further illustrating the executives' money-driven agenda.[11]

American Facebook Users

This case shed light on the importance of privacy to the average American Facebook user. The value users derive from Facebook is typically entertainment-driven, but they also value the information they put online and expect Facebook to offer a certain level of privacy. Some users rely on Facebook as a source of news; when the news shown to each user is designed to influence rather than inform, those users lose some of their ability to make informed decisions based on unbiased sources.

Trump 2016 Presidential Campaign

The Trump campaign team was motivated entirely by electing Donald Trump to office. This effort came at the expense of Americans' privacy, as demonstrated when the team paid for the services of Cambridge Analytica with the understanding that the firm held user information that could be leveraged to change voter behavior.

Generalizable Lessons

Understanding Terms of Service

According to Deloitte (2017), 91% of Americans consent to legal terms and conditions without reading them.[12] The complexity of the language and the length of the terms are the biggest deterrents: Americans would have to spend 244 hours annually to read all of the privacy policies they agree to within a year (McDonald and Cranor, 2008).[13] It can therefore be assumed that the majority of Facebook users had not read Facebook's terms of use when the Cambridge Analytica scandal became public. These users were not aware that their information could be collected and mined by third parties in pursuit of those parties' agendas. This lack of knowledge contributed to a massive public outcry that caused Facebook to lose roughly $100 billion in stock value within days.
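
A back-of-the-envelope version of how an estimate like the 244-hour figure can be produced is sketched below. The inputs are illustrative approximations in the spirit of the cited study, not the exact figures from McDonald and Cranor (2008).

```python
# Rough reconstruction of a "time to read every privacy policy" estimate.
# All three inputs are assumed round numbers for illustration.
sites_per_year = 1_460     # approx. unique sites an average user visits in a year
words_per_policy = 2_500   # approx. length of a typical privacy policy, in words
reading_speed = 250        # approx. careful-reading speed, words per minute

hours = sites_per_year * words_per_policy / reading_speed / 60
print(f"Estimated time to read every policy encountered: ~{hours:.0f} hours/year")
# With these rough inputs the total lands near the 244 hours cited above.
```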

After the scandal became public, Paul Grewal (2018) claimed that there had not been a data breach because "Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent."[14] A week later, following the negative public reaction, Zuckerberg (2018) issued a statement taking responsibility for the scandal and acknowledging that there had been a "breach of trust between Facebook and the people who share their data with us and expect us to protect it."[15] Companies must consider public reaction as part of the potential consequences of their terms of service and recognize that a significant proportion of users have not read them.

While users may not have had anything to directly hide in their "like" profiles, they were still targeted and influenced by ads, and their data was used to help a political candidate pursue an agenda many Americans strongly oppose. Public outcry clearly played an important role, but was the reaction caused by the mere fact that users' data was mined, or by the agenda it was used to pursue? According to Deloitte (2017), "more than 80 percent of consumers believe that companies use their personal data and 78 percent believe that their personal data is shared with third parties."[12] It can therefore be inferred that most users already believed their data was being used, and that the public discomfort was driven by fear of how the data was used rather than by what the data could reveal about them.

Users of Facebook and participants in Kogan's app accepted terms of service that allowed their personal information to be mined to varying degrees. Despite this acceptance, the Cambridge Analytica scandal caused a significant public uproar, with the hashtag #DeleteFacebook trending on Twitter. By March 26, 2018, after news broke that over 87 million users' information had been collected, Facebook's stock had plummeted 24%, which equated to over $134 billion in lost market value.[16] From this incident, we can better understand that while companies may have legal protection to perform certain acts via their terms of service, public opinion and backlash can do significantly more damage to a company's value than a few lost legal battles.

Further Research and Chapter Extension

Regulation

In 1988, the UN Human Rights Committee foresaw privacy issues regarding digital information, stating that the gathering and storage of digital personal information "must be regulated by law."[17] Nevertheless, it took 30 years until the European Union's General Data Protection Regulation (GDPR) came into effect in May 2018. The GDPR "was designed to modernize laws that protect the personal information of individuals." Prior to the GDPR, the digital personal information space was largely unregulated in the United States, with some regulation in Europe that consistently struggled to keep up with technological advancement.[18]

Social Media Mining in Other Countries

British lawmakers have published evidence indicating that Vote Leave, the official Brexit Leave campaign, benefited from work by Cambridge Analytica. This claim has been confirmed by Christopher Wylie, the whistleblower who revealed the Trump campaign's use of Cambridge Analytica's data. Vote Leave spent £3.9m on services from AggregateIQ (AIQ), a firm linked to SCL Group, the parent company of Cambridge Analytica. AIQ and Cambridge Analytica have denied that the two companies collaborated during the Brexit campaign, but Wylie (2018) asserts that both companies shared a database and software during the EU referendum campaign.[19]

References

  1. Statista (2017). Number of social network users worldwide from 2010 to 2021 (in billions). https://www.statista.com/statistics/278414/number-of-worldwide-social-network-users/
  2. Statista (2018). Number of monthly active Facebook users worldwide as of 3rd quarter 2018 (in millions). https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/
  3. Cambridge Analytica (2018). CA Political. https://webcache.googleusercontent.com/search?q=cache:uSCCZ4p2O4UJ:https://cambridgeanalytica.org/+&cd=12&hl=en&ct=clnk&gl=us
  4. The Guardian (2018). 50 million Facebook profiles harvested for Cambridge Analytica in Major Data Breach https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
  5. Svitek, P., & Samsel, H. (2018, Mar 20). Ted Cruz says Cambridge Analytica told his presidential campaign its data use was legal.
  6. Lewis, P., & Hilder, P. (2018, March 23). Cambridge Analytica's blueprint for Trump victory.
  7. The Washington Post (2018, April 10). Transcript of Mark Zuckerberg’s Senate Hearing. https://www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/
  8. BBC (2018, May 22). Zuckerberg’s European Parliament Testimony Criticised. https://www.bbc.com/news/technology-44210800
  9. McGirt, E. (2017, Oct 05). Facebook's Mark Zuckerberg: Hacker. Dropout. CEO.
  10. Constine, J. (2017, June 22). Facebook changes mission statement to 'bring the world closer together'.
  11. Rosenberg, M., Confessore, N., & Cadwalladr, C. (2018, Mar 17). How Trump Consultants Exploited the Facebook Data of Millions.
  12. a b Deloitte (2017). Global Mobile Consumer Survey: US edition. https://www2.deloitte.com/content/dam/Deloitte/us/Documents/technology-media-telecommunications/us-tmt-2017-global-mobile-consumer-survey-executive-summary.pdf
  13. McDonald and Cranor (2008). The Cost of Reading Privacy Policies. http://lorrie.cranor.org/pubs/readingPolicyCost-authorDraft.pdf
  14. Grewal, P. (2018). Suspending Cambridge Analytica and SCL Group From Facebook. https://newsroom.fb.com/news/2018/03/suspending-cambridge-analytica/
  15. Zuckerberg, M. (2018). Update on Cambridge Analytica. https://www.facebook.com/zuck/posts/10104712037900071
  16. Mirhaydari, A. (2018, May 10). Facebook stock recovers all $134B lost after Cambridge Analytica data scandal.
  17. US Should Create Laws to Protect Social Media Users' Data. (2018, Apr 05).
  18. Burgess, M. (2018, November 08). What is GDPR? The summary guide to GDPR compliance in the UK.
  19. Wylie, C. (2018) The Great British Brexit Robbery: How Our Democracy Was Hijacked. https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy