People just don't get LLMs ("AI").
They have no concept of truth. They just produce text that is statistically similar to other text that exists in the world. That's it. That's all they do. It doesn't push anything at you. You ask it for some randomly generated text that is similar to other text in the world, and it randomly generates some text.
If you're playing Monopoly and you roll a die and land on someone's hotel and go bankrupt, the die hasn't done anything wrong.
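The "statistically similar text" point can be made concrete with a toy sketch. This is not how a real LLM works internally (real models use neural networks over token probabilities, not word-pair counts), but it captures the core claim: the program only knows which words tend to follow which in its training data, and it samples the next word at random from those statistics. Truth never enters into it.

```python
import random
from collections import defaultdict

# Tiny "training corpus" standing in for the world's text.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Record which words have been observed to follow each word.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length=8, seed=0):
    """Randomly generate text that is statistically similar to the corpus."""
    random.seed(seed)
    word, out = start, [start]
    for _ in range(length):
        options = follows.get(word)
        if not options:
            break  # no observed continuation; stop
        word = random.choice(options)  # sampled by frequency, not by truth
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

The output is fluent-looking word salad: every word pair in it occurred somewhere in the corpus, but nothing checked whether the sentence says anything true. Scaling this idea up (with far better statistics) is the point being made above.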