Professionalism/Amazon, Rekognition, and Law Enforcement
Computer Vision
Amazon’s Rekognition software is an application of the field of AI known as computer vision.[1] In this field, computers are trained to interpret the visual world around them. Deep learning models are trained on databases of labeled reference images or videos, which serve as comparators for a target image or video; the models use these databases to identify objects and respond accordingly. Computer vision is a recent field, made possible by the proliferation of image-capturing devices and increases in computing power and data capacity. It can also be used to detect and identify people, including their movement paths and other details. The technology has widespread applications in the medical, manufacturing, and security industries.
Amazon Rekognition
Amazon’s computer vision service, including facial recognition, is known as Amazon Rekognition.[2] Running on Amazon Web Services (AWS), Amazon's cloud computing platform, it offers a scalable suite of computer vision features to both large and small customers. Capabilities include object, scene, and activity detection; facial recognition; facial analysis; pathing; and unsafe (NSFW) content detection. It can analyze and filter enormous amounts of data in short periods of time, and it can also extract text from images. Its computer vision algorithms can be run against either user-created or Amazon-generated databases, allowing customers to tune the system's accuracy to their particular use cases. Law enforcement agencies are among Rekognition's biggest customers. Fast recognition of individuals can be used in cases such as kidnappings as well as in tracking criminals, and the software is more reliable for repeat offenders, for whom the database of images is larger.
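To illustrate the kind of interface this suite exposes, the sketch below calls two Rekognition operations through the boto3 AWS SDK. It is a minimal illustration, not any customer's actual deployment; the bucket, file, and region names are hypothetical placeholders.

```python
# Minimal sketch of two Rekognition features via the boto3 AWS SDK.
# Bucket and object names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Object and scene detection on an image stored in S3.
labels = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "street-scene.jpg"}},
    MaxLabels=10,
    MinConfidence=80,
)
for label in labels["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))

# Face comparison between a reference photo and a target photo.
matches = rekognition.compare_faces(
    SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "reference-face.jpg"}},
    TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "crowd-photo.jpg"}},
    SimilarityThreshold=80,
)
for match in matches["FaceMatches"]:
    print("similarity:", round(match["Similarity"], 1))
```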
Law Enforcement
Washington County Sheriff's Office
Amazon has been working to modify the Rekognition technology to better suit police forces, and it has sold the technology to the Washington County Sheriff’s Office in Oregon. The police use it to identify persons of interest. Before the technology, persons of interest were identified by on-duty police officers; if a particular officer was off-duty when a facial recognition breakthrough occurred, the process slowed down. Amazon’s Rekognition technology allows faces to be run through a database to find a match much faster than before.
Amazon’s Rekognition technology is not always accurate. Amazon recommends that police forces obtain at least a 95% match before proceeding with an arrest. The databases used by Rekognition must be expanded to increase the likelihood that the software will make a correct match.
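The sketch below shows how such a threshold can be applied when searching a face collection with Rekognition, discarding any match below 95% similarity. The collection ID and image names are hypothetical; this is an illustration, not how any particular agency operates.

```python
# Illustrative sketch of applying a 95% similarity threshold with
# Rekognition's search_faces_by_image. Names are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.search_faces_by_image(
    CollectionId="example-booking-photos",   # hypothetical face collection
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "suspect.jpg"}},
    FaceMatchThreshold=95,                   # discard matches below 95% similarity
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    face = match["Face"]
    print(face.get("ExternalImageId"), round(match["Similarity"], 1))
```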
BodyWorn
BodyWorn is a facial recognition technology that utilizes the body cameras worn by police officers.[3] The body cameras capture footage that is immediately sent to the precinct, which can then run it through facial recognition software in hopes of finding a match. This shortens the time it takes a police investigation to find a person of interest.
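BodyWorn's own pipeline is proprietary, but the general workflow of searching uploaded footage against a watchlist can be sketched with Amazon Rekognition's video face search API. The bucket, file, and collection names below are invented for illustration only.

```python
# Hypothetical sketch: search uploaded body-camera footage against a face
# collection using Rekognition Video. All names are placeholders.
import time
import boto3

rekognition = boto3.client("rekognition")

job = rekognition.start_face_search(
    Video={"S3Object": {"Bucket": "example-bucket", "Name": "bodycam-clip.mp4"}},
    CollectionId="example-persons-of-interest",
    FaceMatchThreshold=95,
)

# Poll until the asynchronous video analysis job finishes.
while True:
    result = rekognition.get_face_search(JobId=job["JobId"])
    if result["JobStatus"] != "IN_PROGRESS":
        break
    time.sleep(5)

# Print the timestamp and similarity of each face match found in the footage.
for person in result.get("Persons", []):
    for match in person.get("FaceMatches", []):
        print(person["Timestamp"], "ms:", round(match["Similarity"], 1))
```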
Celebrity Mismatching
Amazon’s Rekognition online dashboard offers a Celebrity Recognition feature. Users may upload photos of celebrities and have them matched with a stated degree of certainty. Uploaded photos that are closer to a portrait yield higher certainty: a portrait of Tony Bennett[4] was matched with higher certainty (91%) than a photo of him celebrating a tournament win over Purdue (67%).[5]
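The same feature is available programmatically through Rekognition's celebrity recognition call, sketched below; the local file name is a placeholder for whatever photo a user uploads.

```python
# Sketch of the celebrity recognition call behind the dashboard feature.
# The local file name is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("uploaded_photo.jpg", "rb") as image_file:
    response = rekognition.recognize_celebrities(Image={"Bytes": image_file.read()})

# Each recognized face comes back with a name and a match confidence.
for celebrity in response["CelebrityFaces"]:
    print(celebrity["Name"], round(celebrity["MatchConfidence"], 1))
for face in response["UnrecognizedFaces"]:
    print("unrecognized face at", face["BoundingBox"])
```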
Some celebrities look somewhat alike, but Rekognition is still able to distinguish them. Rekognition correctly labeled an uploaded photo of Katy Perry and Zooey Deschanel side by side with certainty at or above 80%.[6] Although the two look similar, there are thousands of pictures of them on the internet. Actor Will Ferrell and drummer Chad Smith also look similar, but Rekognition distinguished them with one hundred percent certainty.[7] Like Perry and Deschanel, these two are pictured widely across the web. Rekognition compares uploaded photos to databases of pre-labeled celebrity images, so it is no surprise that these images were labeled correctly: the uploaded images were taken directly from Google, and some of the images in the comparison databases may be the exact ones that were uploaded. For an exact match, one hundred percent certainty makes sense.
Rekognition may succeed at comparing popular images, but that success may depend on the celebrity's popularity. The algorithm has promising applications, yet it may rely on features that are not always present. Ferrell and Smith were distinguished correctly, but small features differentiate them: the two have different hairlines, tend to dress differently, and often wear different facial hair, and Smith often wears a backwards hat whereas Ferrell does not. In an uploaded image of Ferrell and Smith wearing the same clothes, Rekognition labeled both men as Chad Smith.[8] Given their similar facial structure, wearing the same clothes and hats may remove some of the differences between the two celebrities and produce inaccurate results.
Politician Mismatching
The American Civil Liberties Union (ACLU) conducted a study in which images of lawmakers were run through the Rekognition software. The software misidentified 28 lawmakers, and a disproportionate share of those misidentified were African-American and Latino. In some cases, the lawmakers were matched with images of people who had previously been arrested.[9]
Three of the lawmakers who were misidentified contacted Jeff Bezos to voice their concerns. They emphasized that the 5% error rate among known lawmakers indicates that the Rekognition software has issues and should not be sold to law enforcement any time soon. They requested additional information on how Amazon tests the Rekognition technology and who Amazon's government customers are to ensure that injustices are not occurring due to the inaccuracies of the technology.[9]
ACLU
The American Civil Liberties Union (ACLU) wrote a letter to Amazon’s CEO, Jeff Bezos, expressing its concerns regarding Rekognition. In the letter, the ACLU discussed the software's frequent misidentification of people of color and its fear that the government could use the software to remove freedom from already over-policed communities of color. The letter warns that Rekognition would allow the government to “continuously track immigrants as they embark on new lives” and “identify political protesters captured by officer body cameras.”[10]
One major issue the ACLU had with Amazon was its connection to U.S. Immigration and Customs Enforcement (ICE). Amazon has been in contact with ICE regarding Rekognition,[10] and the ACLU argues that if ICE has the Rekognition technology at its disposal, it will be much easier for the agency to target and separate families living in the U.S.[10] The software could theoretically draw on a database of immigrants in the U.S. and compare any individual it encounters against that database.
The ACLU pointed to other companies to show that facial recognition technology is highly flawed. The letter notes that both Google and Microsoft have acknowledged the risks of facial recognition software and do not intend to market it until those risks are mitigated.[10] Other organizations signed the letter to demonstrate solidarity against selling Rekognition to law enforcement, including the Muslim Justice League and the Center on Policy Initiatives.
Failure to Comply with Standards
The National Institute of Standards and Technology (NIST) is a government organization that provides standards for technology in the US. NIST sets standards for certain classes of algorithms, including facial recognition and gender classification.[11] Its ongoing Face Recognition Vendor Test (FRVT)[12] evaluates 127 algorithms from 45 vendors to set facial recognition accuracy standards and verify that all of them perform accordingly. Although the Rekognition service has been available since 2016, Amazon is still not a NIST vendor.[13] Backed by a reputable computing infrastructure, users had little reason to suspect Amazon’s service was inaccurate; the company's brand name drove user trust in the system even though it went unverified against NIST standards for three years.
Free Comparison Databases
Social media and content sharing sites (Facebook, Instagram, Twitter, etc.) allow free data to be harvested to populate recognition comparison databases. Nothing stops someone from saving Facebook photos and building their own database of face images to compare against. Camera footage run against such databases may produce skewed results: comparing footage to databases of mugshots produces results only for those previously convicted, while the alternative is to compare footage against images of random people in hopes of finding a match. Georgetown researchers found that about half of all American adults have their face in a comparison database.[14] These databases may produce inaccurate results. Security camera footage is often blurry, which obscures facial features; when such footage is compared with these databases, a match may occur regardless of its validity, and innocent people may be drawn into an investigation or convicted of a crime they did not commit, tarnishing their reputation.
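The ease of assembling such a database can be sketched with Rekognition's face collection API: a folder of saved photos is indexed into a searchable collection in a few calls. The folder and collection names below are hypothetical, and the sketch only illustrates the point that building a comparison database requires little effort.

```python
# Sketch of how a private comparison database could be assembled by indexing
# a folder of saved photos into a Rekognition face collection.
# Folder and collection names are hypothetical.
import os
import boto3

rekognition = boto3.client("rekognition")
rekognition.create_collection(CollectionId="example-scraped-faces")

photo_dir = "saved_photos"   # e.g. images saved from social media
for filename in os.listdir(photo_dir):
    with open(os.path.join(photo_dir, filename), "rb") as image_file:
        rekognition.index_faces(
            CollectionId="example-scraped-faces",
            Image={"Bytes": image_file.read()},
            ExternalImageId=os.path.splitext(filename)[0],  # label each face by file name
            MaxFaces=1,
        )
```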
Algorithmic Justice League
Letter to Bezos
The Algorithmic Justice League (AJL) was started by Joy Buolamwini in response to the flaws in the Rekognition service.[15] The AJL wrote to Bezos in 2018 to explain that the service has several problems recognizing minorities.[16] Some of the features used to identify people (skin color, hair color, etc.) disadvantage minority groups.[17] In gender classification, even guessing gives a 50% chance of labeling an image correctly; facial recognition has far lower odds, with only a small chance that the compared face is in a database at all. Law enforcement would not be using this technology merely to classify males and females, but to establish identities. The AJL urged Bezos to stop working with law enforcement, since the technology can do more harm than good when people are misidentified.[16]
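The binary gender label the letter criticizes is exposed through Rekognition's facial analysis call, sketched below; the image name is a placeholder, and the sketch only shows the kind of output being evaluated, not the AJL's test procedure.

```python
# Sketch of Rekognition's binary gender classification via facial analysis.
# The bucket and image names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "portrait.jpg"}},
    Attributes=["ALL"],   # request demographic estimates, not just bounding boxes
)

for face in response["FaceDetails"]:
    gender = face["Gender"]
    # A binary label: even random guessing would be right about 50% of the time.
    print(gender["Value"], round(gender["Confidence"], 1))
```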
Safe Face Pledge
The AJL started the Safe Face Pledge, which forbids selling facial analysis technology for police use and requires that any government use be fully transparent, including unbiased databases that represent all races and genders equally.[18] The AJL urged Amazon to sign the pledge, stop working with law enforcement, and work with NIST to produce accurate results. As of May 2019, Amazon continues to work with law enforcement. It is not a NIST vendor, claiming that the Rekognition infrastructure cannot be downloaded because it is part of AWS, though Amazon says it is open to working with NIST to develop better benchmarks and testing for external APIs.[19]
References
[edit | edit source]- ↑ "Computer Vision: What it is and why it matters". www.sas.com. Retrieved Oct 14, 2022.
- ↑ https://aws.amazon.com/rekognition/
- ↑ "Body Worn Camera & Evidence Management Tools | Utility, Inc". Retrieved Oct 14, 2022.
- ↑ https://virginiasports.com/images/2018/8/22/Bennett_Tony_Mug.jpg?width=300
- ↑ https://sportshub.cbsistatic.com/i/2019/03/31/9e8e3806-9213-42b6-993e-37faa84f072e/tony.jpg
- ↑ https://i0.heartyhosting.com/radaronline.com/wp-content/uploads/2010/12/splitspl160060003_-_1.jpg?fit=1380%2C880&ssl=1
- ↑ https://images.complex.com/complex/image/upload/c_limit,w_680/fl_lossy,pg_1,q_auto/s1bq6dxkxf9bkgrxitck.jpg
- ↑ https://media.fromthegrapevine.com/assets/images/2015/4/will-ferrell-chad-smith.jpg.480x0_q71_crop-scale.jpg
- ↑ a b Singer, Natasha (Jul 26, 2018). "Amazon's Facial Recognition Wrongly Identifies 28 Lawmakers, A.C.L.U. Says". Retrieved Oct 14, 2022 – via NYTimes.com.
- ↑ a b c d "Coalition Letter to Amazon Urging Company Commit Not to Release Face Surveillance Product". American Civil Liberties Union. Retrieved Oct 14, 2022.
- ↑ "About NIST". NIST. Jul 10, 2009. Retrieved Oct 14, 2022 – via www.nist.gov.
- ↑ https://nvlpubs.nist.gov/nistpubs/ir/2018/NIST.IR.8238.pdf
- ↑ https://www.nist.gov/sites/default/files/documents/2018/06/21/frvt_report_2018_06_21.pdf
- ↑ "Half of All American Adults are in a Police Face Recognition Database, New Report Finds". Retrieved Oct 14, 2022.
- ↑ "Algorithmic Justice League - Unmasking AI harms and biases". www.ajl.org. Retrieved Oct 14, 2022.
- ↑ a b https://uploads.strikinglycdn.com/files/e286dfe0-763b-4433-9a4b-7ae610e2dba1/RekognitionGenderandSkinTypeDisparities-June25-Mr.%20Bezos.pdf?id=125030
- ↑ Buolamwini, Joy (Dec 12, 2018). "Amazon's Symptoms of FML — Failed Machine Learning — Echo the Gender Pay Gap and Policing Concerns". Retrieved Oct 14, 2022.
- ↑ "Pledge". Safe Face Pledge. Retrieved Oct 14, 2022.
- ↑ https://aws.amazon.com/blogs/machine-learning/thoughts-on-recent-research-paper-and-associated-article-on-amazon-rekognition/