  • Exactly - any output of an AI system is at most as good as the data it was trained on. So if your training data contains biases, those biases will be present in the output as well. Avoiding bias is becoming a big field in its own right.

    For a semi-funny, semi-concerning example that isn't even complicated (the way hiring decisions would be): "HP computers are racist"
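    A minimal sketch of the point above, with entirely made-up numbers: if historical hiring labels favor one group, even the simplest possible "model" (just counting outcomes per group) reproduces that bias exactly. The dataset and groups here are hypothetical, purely for illustration.

    ```python
    from collections import defaultdict

    # Hypothetical past hiring decisions: (group, hired) pairs.
    # The bias lives in the labels, not in the learning algorithm.
    training_data = (
        [("A", 1)] * 80 + [("A", 0)] * 20 +   # group A: hired 80% of the time
        [("B", 1)] * 30 + [("B", 0)] * 70     # group B: hired 30% of the time
    )

    # "Train" by counting: estimate P(hired | group) from the data.
    counts = defaultdict(lambda: [0, 0])  # group -> [hired_count, total_count]
    for group, hired in training_data:
        counts[group][0] += hired
        counts[group][1] += 1

    def predict(group):
        hired, total = counts[group]
        return hired / total

    # The model faithfully reproduces the historical bias.
    print(predict("A"))  # 0.8
    print(predict("B"))  # 0.3
    ```

    A fancier model trained on the same labels would land in the same place; no amount of modeling sophistication removes a bias that is baked into the training data itself.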
