Of course the company that acknowledges that its technology is used for emotional and psychological support is going to blame those who use it for such purposes. Plus, falling back on the ToS means either they don't know how to prevent such outcomes or they don't want to.
I think it's a little bit of both. They benefit greatly from people being addicted to their product, and "fixing" a neural network is fucking hard.