I actually think this is kind of a case of overfitting. The AI is factoring extra data into its analysis that isn't actually one of the important variables.
> This difference was not in the training data, and so the AI fell over at what appears to be a cellular anomaly it had never seen before.

The difference was in the training data; it was just less common.
It's like someone who knows about swans, and knows about black birds, seeing a black swan for the first time and saying "I don't know what bird that is". The model is assuming the whiteness is important when it isn't.
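To make the "whiteness" point concrete, here's a minimal toy sketch (my own example, nothing to do with the actual model under discussion). In this made-up training sample, colour happens to separate the classes perfectly while the genuinely meaningful feature (neck length) is slightly noisy, so a decision tree latches onto colour:

```python
# Toy example of a model keying on a spurious feature.
# Features per bird: [is_white, has_long_neck].
from sklearn.tree import DecisionTreeClassifier

X_train = [
    [1, 1],  # white, long neck    -> swan
    [1, 1],  # white, long neck    -> swan
    [1, 0],  # white, neck tucked  -> swan  (the "real" feature is noisy)
    [0, 0],  # black, short neck   -> goose
    [0, 1],  # black, longish neck -> goose
]
y_train = ["swan", "swan", "swan", "goose", "goose"]

# Colour splits the training set perfectly, so the tree uses it
# even though it is incidental rather than causal.
clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# First black swan the model has ever seen: right shape, "wrong" colour.
print(clf.predict([[0, 1]]))  # -> ['goose']: colour was treated as the signal
```

The black swan *was* representable from the training features; the model just learned that the rare-in-training feature value mattered more than it does.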