• Passerby6497@lemmy.world
    15 days ago

    Ah, well then, if he tells the bot not to hallucinate and to validate its output, there's no reason not to trust it. After all, you told the bot not to, and we all know self-regulation works flawlessly every time.