
Amazon Rekognition

From Wikipedia, the free encyclopedia
Amazon Rekognition
Developer(s): Amazon, Amazon Web Services
Initial release: 30 November 2016[1]
Type: Software as a service
Website: aws.amazon.com/rekognition

Amazon Rekognition is a cloud-based software as a service (SaaS) computer vision platform that was launched in 2016. It has been sold to, and used by, a number of United States government agencies, including U.S. Immigration and Customs Enforcement (ICE) and Orlando, Florida police, as well as private entities.

Capabilities

Rekognition provides a number of computer vision capabilities, which can be divided into two categories: algorithms that are pre-trained on data collected by Amazon or its partners, and algorithms that a user can train on a custom dataset.

As of July 2019, Rekognition provides the following computer vision capabilities.[1][2]

Pre-trained algorithms

  • Celebrity recognition in images[3][4]
  • Facial attribute detection in images, including gender, age range, emotions (e.g. happy, calm, disgusted), whether the face has a beard or mustache, whether the face has eyeglasses or sunglasses, whether the eyes are open, whether the mouth is open, whether the person is smiling, and the location of several markers such as the pupils and jaw line (see the usage sketch after this list).[5][6]
  • People Pathing enables tracking of people through a video. An advertised use-case of this capability is to track sports players for post-game analysis.[1][7]
  • Text detection and classification in images[8][9]
  • Unsafe visual content detection[10]
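
Several of the pre-trained capabilities listed above are exposed as operations of the Rekognition API, which can be called through the AWS SDKs. The following is a minimal sketch, using the AWS SDK for Python (boto3), of the facial attribute detection and text detection operations; the file name, region, and printed fields are illustrative assumptions rather than details taken from the sources cited above.

    import boto3

    # Create a Rekognition client (the region is an arbitrary example)
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read an example image from disk (the file name is a placeholder)
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # Facial attribute detection: Attributes=["ALL"] requests age range, emotions,
    # beard/mustache, eyeglasses/sunglasses, smile, and facial landmark locations
    faces = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in faces["FaceDetails"]:
        print(face["AgeRange"], face["Gender"]["Value"], face["Emotions"][0]["Type"])

    # Text detection: returns detected words and lines with their locations
    text = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in text["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])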

Algorithms that a user can train on a custom dataset

  • SearchFaces enables users to import a database of images with pre-labeled faces, to train a machine learning model on this database, and to expose the model as a cloud service with an API. The user can then post new images to the API and receive information about the faces in those images. The API can be used to expose a number of capabilities, including identifying faces of known people, comparing faces, and finding similar faces in a database (see the sketch following this list).[11][12]
  • Face-based user verification[1]
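
A minimal sketch of the SearchFaces workflow described above, using the AWS SDK for Python (boto3), is shown below. The collection name, file names, and ExternalImageId label are hypothetical placeholders; the sketch assumes suitable AWS credentials and omits error handling.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Create a collection that will hold the indexed face vectors
    rekognition.create_collection(CollectionId="example-collection")

    # Index a pre-labeled face into the collection
    with open("known_person.jpg", "rb") as f:
        rekognition.index_faces(
            CollectionId="example-collection",
            Image={"Bytes": f.read()},
            ExternalImageId="known_person",
        )

    # Post a new image and receive information about matching faces
    with open("query.jpg", "rb") as f:
        result = rekognition.search_faces_by_image(
            CollectionId="example-collection",
            Image={"Bytes": f.read()},
            MaxFaces=5,
        )
    for match in result["FaceMatches"]:
        print(match["Face"]["ExternalImageId"], match["Similarity"])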

History and use

2017

In late 2017, the Washington County, Oregon Sheriff's Office began using Rekognition to identify suspects' faces. Rekognition was marketed as a general-purpose computer vision tool, and an engineer working for Washington County decided to use the tool for facial analysis of suspects.[12][13] Rekognition was offered to the department for free,[14] and Washington County became the first US law enforcement agency known to use Rekognition. In 2018, the agency logged over 1,000 facial searches; according to The Washington Post, by 2019 the county was paying about $7 a month for all of its searches.[15] The relationship was unknown to the public until May 2018.[14] In 2018, Rekognition was also used to help identify celebrities during a royal wedding telecast.[16]

2018

In April 2018, it was reported that FamilySearch was using Rekognition to enable their users to "see which of their ancestors they most resemble based on family photographs".[17] In early 2018, the FBI also began using it as a pilot program for analyzing video surveillance.[16]

In May 2018, it was reported by the ACLU that Orlando, Florida was running a pilot using Rekognition for facial analysis in law enforcement,[18] with that pilot ending in July 2019.[19] After the report,[20][21] on June 22, 2018, Gizmodo reported that Amazon workers had written a letter to CEO Jeff Bezos requesting he cease selling Rekognition to US law enforcement, particularly ICE and Homeland Security.[21] A letter was also sent to Bezos by the ACLU.[20] On June 26, 2018, it was reported that the Orlando police force had ceased using Rekognition after their trial contract expired, reserving the right to use it in the future.[20] The Orlando Police Department said that they had "never gotten to the point to test images" due to old infrastructure and low bandwidth.[14]

In July 2018, the ACLU released a test showing that Rekognition had falsely matched 28 members of Congress with mugshot photos, disproportionately Congresspeople of color. 25 House members afterwards sent a letter to Bezos expressing concern about Rekognition.[22] Amazon responded that the test had used Rekognition's default confidence threshold of 80 percent, whereas it recommends that law enforcement only rely on matches rated at 99 percent confidence.[23] According to The Washington Post, Oregon instead has officers pick a "best of five" result rather than adhering to that recommendation.[15]
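
The confidence threshold at issue corresponds to parameters such as SimilarityThreshold (for the CompareFaces operation) and FaceMatchThreshold (for face search), which default to 80 percent. The following sketch, with hypothetical image files and not a reconstruction of the ACLU's actual test, illustrates how raising the threshold to the recommended 99 percent restricts which matches are returned.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("query.jpg", "rb") as f:        # placeholder file names
        query = f.read()
    with open("mugshot.jpg", "rb") as f:
        mugshot = f.read()

    # Default setting: matches at or above 80 percent similarity are returned
    loose = rekognition.compare_faces(
        SourceImage={"Bytes": query},
        TargetImage={"Bytes": mugshot},
    )

    # Amazon's recommendation for law enforcement: require 99 percent similarity
    strict = rekognition.compare_faces(
        SourceImage={"Bytes": query},
        TargetImage={"Bytes": mugshot},
        SimilarityThreshold=99.0,
    )

    print(len(loose["FaceMatches"]), len(strict["FaceMatches"]))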

In September 2018, it was reported that Mapillary was using Rekognition to read the text on parking signs (e.g. no stopping, no parking, or specific parking hours) in cities.[9]

In October 2018, it was reported that Amazon had earlier that year pitched Rekognition to U.S. Immigration and Customs Enforcement (ICE).[22][24] Amazon defended government use of Rekognition.[23]

On December 1, 2018, it was reported that 8 Democratic lawmakers had said in a letter that Amazon had "failed to provide sufficient answers" about Rekognition, writing that they had "serious concerns that this type of product has significant accuracy issues, places disproportionate burdens on communities of color, and could stifle Americans' willingness to exercise their First Amendment rights in public."[25]

2019

In January 2019, MIT researchers published a peer-reviewed study asserting that Rekognition had more difficulty identifying dark-skinned women than did comparable services from IBM and Microsoft.[16] In the study, Rekognition misidentified darker-skinned women as men 31% of the time, but made no mistakes for light-skinned men.[14] Amazon said the report was based on "misinterpreted results" of the research obtained with an improper "default confidence threshold."[16]

In January 2019, Amazon's shareholders "urged Amazon to stop selling Rekognition software to law enforcement agencies." Amazon in response defended its use of Rekognition, but supported new federal oversight and guidelines to "make sure facial recognition technology cannot be used to discriminate."[26] In February 2019, it was reported that Amazon was collaborating with the National Institute of Standards and Technology (NIST) on developing standardized tests to improve accuracy and remove bias with facial recognition.[27][28]

In March 2019, a group of prominent AI researchers sent an open letter to Amazon, with around 50 signatures,[16] criticizing the sale of Rekognition to law enforcement.[26]

In April 2019, the Securities and Exchange Commission told Amazon that it had to allow shareholders to vote on two proposals seeking to limit Rekognition. Amazon had argued that the proposals were an "insignificant public policy issue for the Company" not related to Amazon's ordinary business, but its appeal was denied.[16] The vote was set for May.[15][29] The first proposal was tabled by shareholders.[30] On May 24, 2019, 2.4% of shareholders voted to stop selling Rekognition to government agencies, while a second proposal calling for a study into Rekognition and civil rights received 27.5% support.[31]

In August 2019, the ACLU again used Rekognition on members of government, with 26 of 120 lawmakers in California flagged as matches to mugshots. Amazon stated the ACLU was "misusing" the software in the tests, by not dismissing results that did not meet Amazon's recommended accuracy threshold of 99%.[32] By August 2019, there had been protests against ICE's use of Rekognition to surveil immigrants.[33]

In March 2019, Amazon announced a Rekognition update that would improve emotional detection,[15] and in August 2019, "fear" was added to emotions that Rekognition could detect.[34][35][36]

2020

In June 2020, Amazon announced it was implementing a one-year moratorium on police use of Rekognition, in response to the George Floyd protests.[37]

2024

In January 2024, the Department of Justice disclosed that the FBI was initiating the use of Amazon Rekognition.[38] The DOJ's AI inventory revealed that the FBI's "Project Tyr" aims to customize Rekognition to identify nudity, weapons, explosives, and other information from lawfully acquired media.[39]

Controversy regarding facial analysis

Racial and gender bias

In 2018, MIT researchers Joy Buolamwini and Timnit Gebru published a study called Gender Shades.[40][41] In this study, a set of images was collected, and faces in the images were labeled with face position, gender, and skin tone information. The images were run through SaaS facial recognition platforms from Megvii (Face++), IBM, and Microsoft. In all three of these platforms, the classifiers performed best on male faces (with error rates on female faces being 8.1% to 20.6% higher than error rates on male faces), and they performed worst on dark-skinned female faces (with error rates ranging from 20.8% to 30.4%). The authors hypothesized that this discrepancy is due principally to Megvii, IBM, and Microsoft having more light-skinned males than dark-skinned females in their training data, i.e. dataset bias.
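
The subgroup comparison in Gender Shades amounts to computing a separate misclassification rate for each combination of gender and skin-tone label. A schematic sketch of that calculation is below; the records are made-up toy values for illustration, not data from the study.

    from collections import defaultdict

    # Toy records for illustration only: (subgroup label, true gender, predicted gender)
    records = [
        ("darker_female", "female", "male"),
        ("darker_female", "female", "female"),
        ("lighter_male", "male", "male"),
        ("lighter_male", "male", "male"),
    ]

    errors = defaultdict(int)
    totals = defaultdict(int)
    for subgroup, truth, predicted in records:
        totals[subgroup] += 1
        if predicted != truth:
            errors[subgroup] += 1

    # Per-subgroup error rate, the quantity compared across platforms in the study
    for subgroup in totals:
        print(subgroup, errors[subgroup] / totals[subgroup])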

In January 2019, researchers Inioluwa Deborah Raji and Joy Buolamwini published a follow-up paper that repeated the experiment a year later on the latest versions of the same three SaaS facial recognition platforms, plus two additional platforms: Kairos and Amazon Rekognition.[42][43] While the systems' overall error rates improved over the previous year, all five of the systems again performed better on male faces than on dark-skinned female faces.

References

  1. ^ a b c d Lardinois, Frederic (2016-11-30). "Amazon launches Amazon AI to bring its machine learning smarts to developers". TechCrunch. Retrieved 2019-07-21.
  2. ^ "What Is Amazon Rekognition?". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
  3. ^ "What is the Celebrity Recognition API? Is that the same or different than doing a face search?". AWS. Retrieved 2019-07-21.
  4. ^ Lardinois, Frederic (2016-06-08). "Amazon Rekognition can now recognize celebrities". TechCrunch. Retrieved 2019-07-21.
  5. ^ "Detecting Faces in an Image". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
  6. ^ "Amazon Rekognition launches enhanced face analysis". Planet Biometrics. 2019-03-19. Retrieved 2019-07-21.
  7. ^ "People Pathing". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
  8. ^ "Detecting Text". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
  9. ^ a b O'Brien, Chris (2018-09-13). "Mapillary will use Amazon Rekognition in effort to ease urban parking crunch". Venture Beat. Retrieved 2019-07-21.
  10. ^ "Detecting Unsafe Content". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
  11. ^ "Searching Faces in a Collection". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
  12. ^ a b "Amazon's facial-recognition technology is supercharging Washington County police". Oregon Live. Retrieved 2019-07-21.
  13. ^ "Amazon Rekognition Customers". AWS. Retrieved 2019-07-21.
  14. ^ a b c d Glaser, April (July 19, 2019). "How to Not Build a Panopticon". Slate. Retrieved August 27, 2019.
  15. ^ a b c d Harwell, Drew (April 30, 2019). "Oregon became a testing ground for Amazon's facial-recognition policing. But what if Rekognition gets it wrong?". The Washington Post. Retrieved August 27, 2019.
  16. ^ a b c d e f Pasternack, Alex (April 4, 2019). "Amazon says face recognition fears are "insignificant." The SEC disagrees". Fast Company. Retrieved August 27, 2019.
  17. ^ "Amazon Rekognition Improves Accuracy of Real-Time Face Recognition and Verification". AWS. 2018-04-02. Retrieved 2019-07-21.
  18. ^ Brandom, Russell (2018-05-22). "Amazon is selling police departments a real-time facial recognition system". The Verge. Retrieved 2019-07-21.
  19. ^ Statt, Nick (2019-07-18). "Orlando police once again ditch Amazon's facial recognition software". The Verge. Retrieved 2019-07-21.
  20. ^ a b c Zhou, Marrian (June 26, 2018). "Orlando stops using Amazon's controversial facial recognition tech". CNET. Retrieved August 27, 2019.
  21. ^ a b Keane, Sean (June 22, 2018). "Amazon employees protest sale of face recognition software to police". CNET. Retrieved August 27, 2019.
  22. ^ a b Singh Guliani, Neema (October 24, 2018). "Amazon Met With ICE Officials to Market Its Facial Recognition Product". ACLU. Retrieved August 27, 2019.
  23. ^ a b Statt, Nick (November 8, 2018). "Amazon told employees it would continue to sell facial recognition software to law enforcement". The Verge. Retrieved August 27, 2019.
  24. ^ Day, Matt (October 23, 2018). "Amazon Officials Pitched Their Facial Recognition Software to ICE". The Seattle Times. Retrieved August 27, 2019.
  25. ^ Boyce, Jasmin (December 1, 2018). "Lawmakers demand answers from Amazon on facial recognition tech". NBC News. Retrieved August 27, 2019.
  26. ^ a b Crist, Ry (March 19, 2019). "Amazon's Rekognition software lets cops track faces: Here's what you need to know". CNET. Retrieved August 27, 2019.
  27. ^ Lacy, Lisa (February 19, 2019). "Amazon Rekognition May Finally Be Audited and Ranked Alongside Other Vendors". Adweek. Retrieved August 27, 2019.
  28. ^ Hale, Kori (March 12, 2019). "Auditing Amazon's 'Rekognition' A.I. Could Remove Bias". Forbes. Retrieved August 27, 2019.
  29. ^ Singer, Natasha (May 5, 2019). "Amazon Faces Investor Pressure Over Facial Recognition". The New York Times. Retrieved August 27, 2019.
  30. ^ Whittaker, Zack (May 20, 2019). "Amazon under greater shareholder pressure to limit sale of facial recognition tech to the government". TechCrunch. Retrieved August 27, 2019.
  31. ^ Dastin, Jeffrey (May 24, 2019). "Amazon facial recognition ban won just 2% of shareholder vote". Reuters. Retrieved August 27, 2019.
  32. ^ Wehner, Mike (August 14, 2019). "Amazon's facial recognition system flags dozens of California lawmakers as criminals". BGR. Retrieved August 27, 2019.
  33. ^ Protalinski, Emil (August 16, 2019). "ProBeat: Breakthrough or BS, Amazon's Rekognition is dangerous". VentureBeat. Retrieved August 27, 2019.
  34. ^ Menegus, Bryan (August 13, 2019). "Amazon Rekognition Can Now Identify the Emotion It Provokes in Rational People". Gizmodo. Retrieved August 27, 2019.
  35. ^ Crowe, Michael (August 15, 2019). "Amazon says facial recognition can detect fear, raising concern for some privacy advocates". King5. Retrieved August 27, 2019.
  36. ^ Mihalcik, Carrie (August 15, 2019). "Amazon's Rekognition software can now spot fear". CNET. Retrieved August 27, 2019.
  37. ^ "We are implementing a one-year moratorium on police use of Rekognition". June 10, 2020. Retrieved June 19, 2020.
  38. ^ mbracken (2024-01-25). "Justice Department discloses FBI project with Amazon Rekognition tool". FedScoop. Retrieved 2024-04-12.
  39. ^ "Artificial intelligence | Digital Watch Observatory". Retrieved 2024-04-17.
  40. ^ Buolamwini, Joy; Gebru, Timnit (2018). "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification" (PDF). Proceedings of Machine Learning Research.
  41. ^ Quach, Katyanna (2018-02-13). "Facial recognition software easily IDs white men, but error rates soar for black women". The Register. Retrieved 2019-07-21.
  42. ^ Raji, Inioluwa Deborah; Buolamwini, Joy (2019-01-27). "Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products" (PDF). AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society.
  43. ^ Wiggers, Kyle (2019-01-24). "MIT researchers: Amazon's Rekognition shows gender and ethnic bias (updated)". Venture Beat. Retrieved 2019-07-21.