chobeat@lemmy.ml to Technology@lemmy.world · English · 5 months ago
ChatGPT advises women to ask for lower salaries, study finds (thenextweb.com)
cross-posted to: [email protected]
genevieve@sh.itjust.works · edited · 4 months ago
deleted by creator

chaosCruiser@futurology.today · 5 months ago
Demand for these services was clearly taken into account in the salary.

hansolo@lemmy.today · 5 months ago
You're a baby made out of sugar? What an incredible job. I guess that explains being in the Gulf region; it doesn't rain much there. Otherwise you'd melt.

Takapapatapaka@tarte.nuage-libre.fr · 5 months ago
Is that a pick-up line? Can we flirt on lemmy?

hansolo@lemmy.today · 5 months ago
No, sorry, we can't flirt. You are only allowed to send blast DMs calling yourself the Fediverse Chick/Dude/Person.

Pieisawesome@lemmy.dbzer0.com · 5 months ago
And if you tried this 5 more times for each, you'd likely get different results. LLM providers introduce "randomness" (called temperature) into their models. Via the API you can usually modify this parameter, but I don't know if you can do the same in the chat UI.
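For context on what that parameter actually does: temperature rescales the model's logits before sampling, so low values make the output nearly deterministic and high values flatten the distribution. A minimal self-contained sketch in plain Python with toy logits (not any provider's actual code):

```python
import math
import random

def sample_with_temperature(logits, temperature, rng):
    """Pick an index from `logits` after temperature scaling.

    Low temperature sharpens the distribution (almost always returns
    the argmax); high temperature flattens it (more random picks).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling from the resulting distribution.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

rng = random.Random(0)
toy_logits = [2.0, 1.0, 0.5]
# Near-zero temperature: effectively always the top choice.
print([sample_with_temperature(toy_logits, 0.01, rng) for _ in range(5)])
# High temperature: answers start to vary between runs.
print([sample_with_temperature(toy_logits, 5.0, rng) for _ in range(5)])
```

Hosted APIs expose this as a request parameter (e.g. a `temperature` field on a chat-completion call), which is why repeating the same prompt at the default setting can give different salary figures each time.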
What model is this?