AI is literally trained to get the right answer, not to actually perform the steps that get you to the answer. It’s like the people who trained dogs to carry explosives and run under tanks (the Soviet anti-tank dogs of WWII): they thought the training was going great until the first battle, when the dogs ran under their own tanks instead of the enemy’s, because those were the tanks they had been trained on.
It’s not trained to get the right answer. It’s trained to predict which sequence of words tends to follow another sequence of words, and then a little randomness (the sampling temperature) is added on top to make the output a bit creative. So if you ask it to make a legal document, it has been trained on millions of legal documents, and it knows exactly what sequences of words are likely. But it has no concept of whether those words are “correct”. It’s basically making a movie-prop legal document: it will look really good on camera, but it should never be taken into court.
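To make that “sequence of words that tends to follow another” idea concrete, here’s a toy sketch in Python. The bigram table and its probabilities are invented for illustration, and a real LLM learns a vastly richer distribution over tokens rather than words, but the sampling loop has the same shape: look up what tends to come next, add a little temperature noise, repeat.

```python
import random

# Invented toy "learned" statistics: given the last word, how likely is each
# next word? A real model learns this kind of distribution from training data
# instead of a hand-written table.
BIGRAMS = {
    "the":    {"party": 0.5, "court": 0.5},
    "party":  {"shall": 1.0},
    "court":  {"shall": 1.0},
    "shall":  {"pay": 0.6, "notify": 0.4},
    "pay":    {"the": 1.0},
    "notify": {"the": 1.0},
}

def sample_next(word: str, temperature: float = 1.0) -> str:
    """Sample the next word; higher temperature = flatter, noisier choices."""
    candidates = BIGRAMS[word]
    # Raising each probability to 1/temperature is the usual way temperature
    # reshapes a distribution: T > 1 adds "creativity", T < 1 plays it safe.
    weights = [p ** (1.0 / temperature) for p in candidates.values()]
    return random.choices(list(candidates), weights=weights, k=1)[0]

def generate(start: str, length: int = 10, temperature: float = 1.2) -> str:
    words = [start]
    for _ in range(length):
        words.append(sample_next(words[-1], temperature))
    return " ".join(words)

print(generate("the"))
# Prints fluent legalese-shaped text like "the court shall pay the party shall ..."
# Nothing in the loop ever asks whether the clause is legally *correct*,
# only whether it is statistically likely.
```

Even in this toy, correctness never enters the loop anywhere; the only thing being optimized is likelihood, which is exactly why the output is a convincing prop rather than a usable document.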
Holy shit, that’s what they get for being so evil that they trained dogs as suicide bombers.