The Pixel 4 and Pixel 4 XL will launch with facial recognition technology that promises to be quite accurate. However, this sophistication may rest on a very questionable practice: a company hired by Google allegedly used, without consent, images of homeless people, university students and dark-skinned people to train the technology.
It is important for the company to have an extensive and diverse image bank to keep its facial recognition technology from failing in ways that suggest racial discrimination, for example.
But it is also important that image collection procedures follow ethical and moral principles. This is where Google may have failed, albeit unintentionally: an investigation by the New York Daily News indicates that Randstad, the agency hired to record faces, allegedly did not inform participants of the purpose of the procedure.
The report states that Randstad hired temporary workers to approach homeless people in Atlanta, students at various universities, and attendees of a festival in Los Angeles, among others, seeking volunteers willing to take part in a survey.
So far, nothing unusual. However, several workers told the New York Daily News that they were instructed to approach mainly dark-skinned people and not to tell participants that their faces would be recorded.
To convince the people approached, the workers used pitches such as “play with your cell phone for a few minutes and get a gift card” and “test this app and get $5”. Some were also instructed to say that the purpose of the approach was to test a Snapchat-like selfie app.
The most important instruction was not to reveal that the participant’s face was being recorded. If anyone became suspicious, workers were to deny that a recording was being made and, if necessary, to speed up the conversation before the person could refuse to participate.
Workers were also reportedly instructed to target homeless people because “they are less likely to say anything to the media”. University students were encouraged targets as well: since they generally have tight budgets, they tend to join paid surveys more readily.
In July, Google confirmed to The Verge that it conducts “field research” to scan faces in order to train its facial recognition technology. The company also says the scanned faces are assigned an abstract identity and are not linked to the email addresses participants provide at the time of the approach.
Also according to Google, facial data is kept for 18 months, but any participant can request that the information be deleted before then.
The problem is this: if people did not know their faces were being scanned – many would not even have realized there was a consent form – how can they request that the data be deleted?
For now, Google says it is taking the complaints against Randstad seriously and will investigate them. The company also stated that the reported approaches, if confirmed, violate its conditions for research with volunteers.
It is not clear, however, what measures the company will take if the irregularities are confirmed.