Professionalism/Google Street View and Wi-Spy

From Wikibooks, open books for an open world

Introduction

Google launched Street View in 2007 to provide street-level panoramic views in major cities, complementing satellite and aerial photography [1]. Imagery is collected using specialized vehicles equipped with nine cameras facing in all directions, and areas not accessible by car have been mapped using snowmobiles, tricycles, divers, boats, and camera operators on foot. As of 2012, Google Street View had collected 20 petabytes of imagery covering 5 million miles of roads in 39 countries. This breadth and depth of coverage has led to novel applications, including cross-cultural education [2], wildlife photography, virtual sightseeing, and immersive tours of businesses [1]. The service raises new privacy challenges because it publishes high-resolution images of people, homes, and vehicles. Google uses automatic feature recognition to blur faces and license plates before publishing, although the algorithms sometimes fail, for example on people with distinctive facial hair.

While collecting imagery on the ground, Google also has many opportunities to gather non-image data that is inaccessible to aerial and satellite photography. Its vehicles pass through hundreds of thousands of WiFi hotspots while recording their own position at all times, so with specialized software they can collect information to improve WiFi-based positioning services. Additionally, many WiFi networks are unencrypted, enabling anyone in range of the wireless router to use the connection or intercept information being sent across the network. When first investigated in 2010 by a small German regulatory agency, Google denied intercepting such information, stating that it “does not collect or store payload data” [3]. However, in what became known as the "Wi-Spy" scandal, it was soon revealed that Google had been collecting hundreds of gigabytes of payload data since 2007 in 30 countries, and that much of this information was personal and sensitive. For example, the FCC cited first names, email addresses, physical addresses, and a conversation between two married individuals planning an extra-marital affair. A few weeks later, Google’s Senior VP of Engineering and Research Alan Eustace admitted, “We are acutely aware that we failed badly here” [4].
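The technical distinction matters here: on an unencrypted network, an 802.11 data frame’s payload travels in the clear, and any receiver in radio range can read it. A passive collector only needs to inspect the frame-control field’s "Protected Frame" flag to know whether a payload is readable. The sketch below is a minimal illustration of that check based on the standard 802.11 frame-control layout; it is not Google’s code, and the function name is invented for illustration:

```python
def is_unencrypted_data_frame(frame: bytes) -> bool:
    """Return True if an 802.11 frame is a data frame whose
    Protected Frame flag is clear, i.e. whose payload is
    readable by any receiver in radio range."""
    if len(frame) < 2:
        return False
    frame_control, flags = frame[0], frame[1]
    # Bits 2-3 of the first frame-control byte hold the frame
    # type; type 2 is a data frame.
    is_data = (frame_control >> 2) & 0b11 == 2
    # Bit 0x40 of the flags byte is the Protected Frame flag,
    # set when the payload is encrypted (WEP/WPA/WPA2).
    is_protected = bool(flags & 0x40)
    return is_data and not is_protected
```

A passive sniffer in promiscuous mode sees every such frame; nothing distinguishes an intercepted payload from one received legitimately, which is why the FCC's later analogy to listening to a public radio broadcast was even arguable.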

Initially, Google represented the data collection software as the work of a rogue engineer, referred to in the FCC investigation as “Engineer Doe.” In April 2012, the engineer was revealed to be Marius Milner, a “hacker” who “know[s] more than [he] want[s] to about Wi-Fi,” according to his LinkedIn page [5]. However, depicting his actions as those of an individual at odds with the company is problematic: management signed off on his detailed specifications before implementation, and outside analysts noted that his actions were consistent with an “engineering mindset” prevalent at Google: “grab the data first and worry about filtering it later” [5].

Role of Corporate Culture

The Wi-Spy case illustrates that corporate cultures and practices can facilitate ethics violations. At Google, several factors contributed to the failure. First, any engineer on the project could modify the source code without notifying others on the team.[6]:16 Second, the project managers claimed that they did not read the documentation pertaining to the detection of wireless networks.[6]:16 Milner stated that the packet-sniffing details would need to be discussed with the product’s legal counsel, but the review was never conducted.[6]:11 Finally, Google repeatedly blocked requests by the FCC to furnish emails and documentation pertinent to the case.[6]:2

The fact that any engineer on the Street View project could modify code without notifying colleagues or supervisors, provided the changes were thought to be beneficial, is in large part what allowed the ethics breach to occur. According to the engineers interviewed about the project, no one but Milner knew that packet data was being collected from unencrypted networks.[6]:16 A second engineer was assigned to review and verify Milner’s code line by line, but he also claimed not to know what the code was used for, stating that his job was limited to verifying proper syntax and correct operation before the code was checked into Google’s source repository.[6]:17 With so little oversight, a “rogue employee,” as Google insists Milner was, could do virtually anything and remain undetected.

The fact that the managers did not read the documentation Milner provided is also highly problematic. They claimed that they had “pre-approved” all of Milner’s work, giving him free rein over his component of the project, and so did not need to verify it.[6]:16 Interestingly, Milner made no attempt to obfuscate his activities: he expressly mentioned in the documentation that he was capturing data frames from packets on unencrypted networks. He even noted that, while he did not believe his packet-snooping plans were problematic, they should be discussed with the project’s legal counsel.[6]:10-11 This never happened. Again, the lack of oversight is striking: any employee whose work has been “pre-approved” could design applications that do virtually anything.

The corporate culture at Google also contributed to this ethical failure. According to Al Hilwa, an analyst at IDC, the organization has a “collect data first, filter it later” mentality [5] that encourages employees to record as much data and as many metrics about users as possible, without considering whether the data is even useful or should be collected at all. Milner himself wrote in his documentation that “[a]nalysis of the gathered data [was] a nongoal (though it [would] happen).”[6]:11 Furthermore, Google is very secretive and self-protecting; the company repeatedly disregarded or incompletely responded to FCC requests for emails and documentation. From the FCC report:

“Google’s document production included no e-mails, and the Company admitted that it had ‘not undertaken a comprehensive review of email or other communications’ because doing so ‘would be a time-consuming and burdensome task.’ Google also failed to identify any of the individuals responsible for authorizing its collection of Wi-Fi data or any employees who had reviewed or analyzed Wi-Fi communications collected by the Company.”[6]:9

When emails were finally furnished, only five were provided. Their contents are quite telling; several comprise a conversation between a Street View project manager and Milner. Milner wrote, “We store the whole body of all non-encrypted data frames. One of my to-do items is to measure how many HTTP requests we’re seeing.” After Milner performed some basic analysis on the collected data, he told the manager, “You might recall asking me about URLs seen over Wi-Fi from the [StreetView] cars… I got round to running a quick mapreduce. Out of 300M wi-fi packets, there were 70K HTTP requests for 32K unique URLs.” The manager replied, “Are you saying that these are URLs that you sniffed out of Wifi packets that we recorded while driving?” Milner replied, “[T]he data was collected during daytime when most traffic is at work (and likely encrypted),”[6]:15 which is an elaborate way of saying “yes.”
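The “quick mapreduce” Milner describes is a routine aggregation: map each captured packet to a URL if it carries an HTTP request, then reduce by counting requests per URL. The miniature below is a hypothetical reconstruction of that shape, not Google’s pipeline; the toy packet format and the `extract_http_url` helper are invented for illustration:

```python
from collections import Counter

def extract_http_url(packet: bytes):
    """Hypothetical map step: pull the requested URL out of a raw
    payload if it looks like an HTTP GET; otherwise return None."""
    parts = packet.split()
    if packet.startswith(b"GET ") and len(parts) >= 2:
        return parts[1].decode(errors="replace")
    return None

def count_urls(packets):
    """Reduce step: count HTTP requests per URL across all packets,
    mirroring the shape of a map/reduce job."""
    urls = (extract_http_url(p) for p in packets)
    return Counter(u for u in urls if u is not None)

# Toy run over a handful of "captured" payloads:
packets = [
    b"GET /index.html HTTP/1.1",
    b"GET /index.html HTTP/1.1",
    b"GET /mail HTTP/1.1",
    b"\x00\x01binary noise",
]
counts = count_urls(packets)
# Total HTTP requests: sum(counts.values()); unique URLs: len(counts)
```

At Google's scale the same logic would be sharded across machines, which is what made answering the manager's question a casual afternoon task rather than a deliberate analysis project.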

Some of the strategies Google employed in handling the Wi-Spy case recall Arthur Andersen’s policy of plausible deniability. So long as Google as a corporation can claim ignorance of ethics violations, it can portray failings as the work of “rogue engineers.” According to The New York Times, Milner has said that depicting his actions as the work of a rogue “requires putting a lot of dots together” [5], suggesting that others were involved but that the organization denied their involvement.

Fines for Ethics Violations

This case also illustrates the difficulty of holding companies accountable for ethics violations. While the FCC found that the Wi-Spy program did not violate the law, it fined Google $25,000 for interfering with its investigation [7]. This represents merely 6 seconds of Google’s profits. Ultimately, Google was fined $7 million, to be paid to the attorneys general of the states that took part in the multistate settlement [8]. While this is still a small amount, the plaintiffs considered it a symbolic victory. Connecticut Attorney General George Jepsen claimed that the agreement would "ensure that Google will not use similar tactics in the future to collect personal information" [8]. However, even while the settlement was being negotiated, Google was fined $22.5 million for disregarding users’ browser privacy settings [9], which shows how little impact the case had on Google’s operations.

The small effect of ethics penalties on corporations is observable in many cases. In 2013, the global bank HSBC was fined $1.9 billion for a money-laundering scandal [10]. The bank had failed to monitor hundreds of billions of dollars in wire transfers because it had dismantled its auditing program to save costs, allowing Mexican drug cartels to move billions of dollars through HSBC’s services [10]. Despite the fine, HSBC still made $13 billion in profit that year, and its gains from the illegal transactions were never taken into account [11].

These corporate cases are even starker when compared with those of individuals convicted of similar offenses. The hacker Andrew “weev” Auernheimer was sentenced in 2013 to pay $73,000 in restitution and to spend 41 months in prison for stealing personal information from AT&T [12]. Auernheimer had accessed the information through a publicly viewable web page on the AT&T site and disclosed the details [12]. The rhetoric in his case was a world away from the FCC’s Wi-Spy ruling, in which the agency compared Google’s activities to listening to a public radio broadcast; prosecutors in the Auernheimer case likened his actions to blowing up a nuclear plant in New Jersey [13]. Although Auernheimer was released in 2014, after his conviction was vacated due to an error in his trial’s proceedings, he still spent 10 months in prison. No one from Google was imprisoned for the collection of personal data, nor were any HSBC employees for laundering money. Legal rulings, it seems, hold individuals and corporations to very different ethical standards.

References

  1. About Street View – Google Maps. Retrieved from https://www.google.com/maps/about/behind-the-scenes/streetview/
  2. Austen, I. (2012, August 22). Coming Soon, Google Street View of a Canadian Village You’ll Never Drive To. The New York Times. Retrieved from http://www.nytimes.com/2012/08/23/business/an-inuit-village-too-remote-for-cars-gets-street-view.html
  3. Fleischer, P. (2010, April 27). Data collected by Google cars. Retrieved from http://googlepolicyeurope.blogspot.com/2010/04/data-collected-by-google-cars.html
  4. Eustace, A. (2010, May 14). WiFi data collection: An update. Retrieved from http://googleblog.blogspot.com/2010/05/wifi-data-collection-update.html
  5. Lohr, S., & Streitfeld, D. (2012, April 30). Engineer in Google’s Street View Is Identified. The New York Times. Retrieved from http://www.nytimes.com/2012/05/01/technology/engineer-in-googles-street-view-is-identified.html
  6. Federal Communications Commission. (2012). Notice of Apparent Liability for Forfeiture (No. DA 12-592). Washington, D.C. Retrieved from http://www.scribd.com/doc/91652398/FCC-Report-on-Google-Street-View-personal-data-mining
  7. Betancur, K. Google fined $25,000 for impeding FCC investigation.
  8. Romm, T. Google settles state Wi-Spy probe for $7M.
  9. Rajagopalan, M. Is $22.5 Million a Big Enough Penalty for Google?
  10. Smythe, C. HSBC Judge Approves $1.9B Drug-Money Laundering Accord.
  11. Straus, R. HSBC pays 204 of its bankers more than £1m after profits dip to £13.7bn with UK arm hit by PPI and rate-swap mis-selling.
  12. Zetter, K. AT&T Hacker ‘Weev’ Sentenced to 3.5 Years in Prison. Wired.
  13. Masnick, M. Prosecutors Admit They Don't Understand What Weev Did, But They're Sure It's Like Blowing Up A Nuclear Plant. Techdirt.