  • I don't think it has much to do with the coder's bias. As I understand it, most of the current issues come from the AI trying to replicate the existing pool of successful applicants, so it just keeps feeding square white men into the interview pipeline.

  • Exactly - the output of an AI is at best as good as the data it was trained on. So if your training data contains biases, those will show up in the output as well. Avoiding bias is becoming a big field in itself.

    For a semi-funny, semi-concerning example that isn't even complicated (unlike hiring decisions): "HP computers are racist"
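    The "biased data in, biased output out" point can be sketched with a toy example (hypothetical numbers, and a deliberately naive "model" that just memorizes historical outcome rates - real systems are subtler, but the mechanism is the same):

    ```python
    # Hypothetical historical hiring data: candidates from two equally
    # qualified groups, but past decisions favored group "A".
    # The bias lives in the labels, not in the code.
    history = [("A", True)] * 80 + [("A", False)] * 20 \
            + [("B", True)] * 30 + [("B", False)] * 70

    def train(data):
        """Predict the majority historical outcome for each group."""
        hires, totals = {}, {}
        for group, hired in data:
            totals[group] = totals.get(group, 0) + 1
            hires[group] = hires.get(group, 0) + hired
        return {g: hires[g] / totals[g] >= 0.5 for g in totals}

    model = train(history)
    print(model)  # {'A': True, 'B': False} -- the historical skew, reproduced
    ```

    Nothing in the code mentions race, gender, or any attribute at all; the model faithfully learns whatever pattern the training data contains, skew included.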
