ugjka to Technology@lemmy.world · English · 1 year ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange) · 297 comments
@laurelraven@lemmy.blahaj.zone · English · 45 points · 1 year ago
But it’s also told to be completely unbiased! That prompt is so contradictory I don’t know how anyone or anything could ever hope to follow it.
@SkyezOpen@lemmy.world · English · 26 points · 1 year ago
Reality has a left-wing bias. The author wanted unbiased (read: right-wing) responses unencumbered by facts.
@jkrtn@lemmy.ml · English · 16 points · 1 year ago
If one wants a Nazi bot, I think loading it with doublethink is a prerequisite.