Let me be clear that this is just an idea with no substantiation, but given that the user explicitly identifies their young age and, you know, the creepy rest of it, could it literally be that the AI interprets the situation as "I need instances in my training data where someone compliments the appearance of a child in a 'romantic'* context (etc.)", and the training data it has for that is, predictably, mostly pedo shit?
*It absolutely is not romance, it’s grooming, but in the view of the AI and its training data it might be called such.
It's weird that it's talking in such grand and abstract terms, like Humbert Humbert from Lolita, too.
Almost as if it's sucked up the book, had a child ask a question like that, and gone "ah, I know just the trick."