There is obviously a problem there, but there are actually loads of different problems layered into it. Fundamentally, there is a many-to-one issue that results from the loss of information when you pixelate: lots of distinct faces collapse to the same low-res image. I'm no expert on AI, but if you train on a dataset that's merely representative of the population, the most likely reconstruction may always be a majority-group face, so the model may never produce faces from a minority group.
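To make the many-to-one point concrete, here's a rough numpy sketch (the sizes and the block-averaging "pixelate" function are made up for illustration, not anything from the actual system): two different patches that pixelate to exactly the same low-res image, so nothing downstream can tell which one it started from.

    import numpy as np

    rng = np.random.default_rng(0)

    def pixelate(img, block=4):
        # Downsample by averaging non-overlapping block x block tiles.
        h, w = img.shape
        return img.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

    a = rng.random((8, 8))

    # Build a second, different image that pixelates identically:
    # shuffling pixels *within* each 4x4 tile preserves every tile mean.
    b = a.copy()
    for i in range(0, 8, 4):
        for j in range(0, 8, 4):
            tile = b[i:i+4, j:j+4].ravel()
            rng.shuffle(tile)
            b[i:i+4, j:j+4] = tile.reshape(4, 4)

    print(np.allclose(pixelate(a), pixelate(b)))  # True: same low-res output
    print(np.allclose(a, b))                      # False: different originals

Any upsampler handed only pixelate(a) has to pick one of the many originals consistent with it, and that pick is where the training-set prior sneaks in.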
Facial characteristics can (under some models) be represented as a deviation from a gender/ethnic norm. So if you try to infer ethnicity and gender first, you may get a higher likelihood of a correct outcome, but then there's the issue of an unknown lighting source, which makes skin tone really hard to read from the pixels alone.
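A toy sketch of that "deviation from a norm" idea, using purely synthetic vectors (the dimension and the group means are invented, this is not a real face model): the residual you have to explain is only small if you've guessed the right group first, which is the intuition behind inferring gender/ethnicity before reconstructing.

    import numpy as np

    rng = np.random.default_rng(1)
    dim = 64  # hypothetical face-embedding dimension

    # Made-up group "norms" (mean face vectors for two assumed groups).
    group_means = {
        "group_a": rng.normal(0.0, 1.0, dim),
        "group_b": rng.normal(0.5, 1.0, dim),
    }

    # A sample face drawn near group_b's norm.
    face = group_means["group_b"] + rng.normal(0.0, 0.1, dim)

    # The deviation is tiny relative to the correct norm, large otherwise.
    for name, mean in group_means.items():
        print(name, np.linalg.norm(face - mean))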
Essentially this is a great example of something that humans are good at and computers aren't (yet). I suspect, however, that the Barack Obama example is fooling us about the size of that disparity, because his face triggers a whole load of contextual information that our recognition draws on and that computers have no access to.
AI, everybody. It's totally fine and unbiased in any way, shape, or form.
https://www.vice.com/en_us/article/7kpxyy/this-image-of-a-white-barack-obama-is-ais-racial-bias-problem-in-a-nutshell