The iPhone X, released on Nov. 3, features a sleek new design and advanced capabilities such as Face ID, the TrueDepth camera, Animoji and more. The wave of futuristic features in the iPhone X has been praised, though not without a few noticeable glitches. One of the most popular features, Face ID, is all the craze, as it’s used to unlock your phone, activate Animoji and make purchases easier.
Apple claims that the X’s facial recognition is top-notch, since your face is a biometric: a measurable characteristic. Biometrics include fingerprints, speech waveforms, facial features and the iris of the eye; Face ID relies on the facial ones. There’s a very simple scale used to determine accuracy: the system computes a score between 0 and 1, where a score near 1 means the scanned features closely match the enrolled face and a score near 0 means it is not the same person. With that said, the threshold for allowing access to the phone varies with the action being attempted.
Reaching a score of about 0.7 typically unlocks the phone, but if you are making an expensive purchase within the phone, the threshold could be raised to 0.9 for obvious security reasons.
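To make the idea of action-dependent thresholds concrete, here is a minimal sketch, not Apple’s actual Face ID implementation: it compares two made-up face encodings with cosine similarity and applies a stricter cutoff for purchases than for unlocking. The vectors, function names and threshold values are all illustrative assumptions drawn from the scores described above.

```python
# Hypothetical sketch of threshold-based face matching.
# All encodings and thresholds are illustrative, not Apple's real values.
import math

def cosine_similarity(a, b):
    """Score near 1 means the two face encodings closely match."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Stricter actions demand a higher match score (values are illustrative).
THRESHOLDS = {
    "unlock_phone": 0.7,
    "approve_purchase": 0.9,
}

def is_allowed(enrolled, candidate, action):
    return cosine_similarity(enrolled, candidate) >= THRESHOLDS[action]

enrolled = [0.9, 0.1, 0.4]
same_person = [0.88, 0.12, 0.41]   # nearly identical encoding
stranger = [0.1, 0.9, 0.2]         # very different encoding

print(is_allowed(enrolled, same_person, "unlock_phone"))   # True
print(is_allowed(enrolled, stranger, "unlock_phone"))      # False
```

The point of the two thresholds is that a borderline match might be good enough to open the lock screen but not good enough to authorize a payment.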
In recent news, a woman named Yan proved that the chance of the facial recognition failing is certainly not “one in a million,” as Apple previously stated.
— LADbible (@ladbible) December 15, 2017
After realizing that her co-worker was able to unlock her phone using Face ID, she hastily contacted Apple, which brushed off her experience. She then took her co-worker to the nearest Apple store and demonstrated the issue to the employees, who claimed it was an internal malfunction and offered her a new phone. The most surprising thing of all was that nothing was truly wrong with her old phone: her co-worker was able to unlock the new one, too.
This was an immediate conversation starter on Twitter, where users pointed out two things: there is bias in the algorithms being used all around the world, and technology labs lack diversity in their makeup.
An MIT grad student named Joy Buolamwini, who works in facial recognition, has come across this problem many times.
“Over time, you can teach a computer how to recognize other faces, however, if the training sets aren’t really that diverse, any face that deviates too much from the established norm will be harder to detect.”
After watching her video, I realized that we can’t blame the algorithms themselves for being biased; computers learn everything they know from what they are given as input. Who’s teaching them these things?
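Buolamwini’s point about non-diverse training sets can be sketched with a toy example, entirely made up and not any real system: a detector “trained” on a narrow cluster of face encodings treats anything far from the training average as undetectable.

```python
# Toy illustration of the training-set diversity problem.
# All vectors and the threshold are invented for this sketch.

def mean_vector(faces):
    """Average the training encodings into one learned 'norm'."""
    n = len(faces)
    return [sum(f[i] for f in faces) / n for i in range(len(faces[0]))]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# A non-diverse training set: every example clusters near one "norm".
training_faces = [[0.9, 0.1], [0.92, 0.08], [0.88, 0.12]]
norm = mean_vector(training_faces)

def detects(face, threshold=0.3):
    """Faces close to the learned norm are detected; outliers are missed."""
    return distance(face, norm) <= threshold

print(detects([0.9, 0.1]))   # near the norm -> detected
print(detects([0.2, 0.8]))   # deviates from the norm -> missed
```

A face that “deviates too much from the established norm,” in Buolamwini’s words, simply falls outside the region the detector learned to recognize; a more diverse training set would widen that region.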
One of the reasons for more ethnic diversity in tech. Devices can't be biased, but if the creators don't account for their own biases it shows up in things like Asian women being indistinguishable to iPhones and black hands not triggering sensors in soap machines. https://t.co/b0A2IgrsSS
— Simply TC (@BienSur_JeTaime) December 16, 2017
The issue of diversity in technology was only recently brought to my attention, and I had never realized how poorly recognition technology serves people of color. This seems to fall under the stereotypical ideology that “all (insert race) look the same.” If it isn’t a stereotypical view, then it is a lack of consideration in general.
In the video below, a black man tries to use a soap dispenser, but the sensor doesn’t recognize the color of his skin; once he puts a white paper towel and his friend’s hand under the sensor, it responds immediately. This goes back to the algorithms programmed into the devices we use every day.
If you have ever had a problem grasping the importance of diversity in tech and its impact on society, watch this video pic.twitter.com/ZJ1Je1C4NW
— Chukwuemeka Afigbo (@nke_ise) August 16, 2017
If products were tested on all different ethnic groups and their needs taken into consideration, as Apple claims they are, would this glitch have been noticed and fixed before the product was sold?