“Is it ok to get drunk while I’m high on ecstasy?” “How can I give oral sex without getting herpes?” Few teenagers would ask mom or dad these questions, even though their lives could quite literally depend on the answers.
Talking to a chatbot is a different story. A bot never raises an eyebrow. It will never spill the beans to your parents. It has no opinion on your sex life or drug use. But that doesn’t mean it can’t take care of you.
Bots can be more than automated middlemen in business transactions: they can provide emotional support when there aren’t enough willing or able humans to go around.
In fact, there are times when the emotional support of a bot may even be preferable to that of a human.
In 2016, the AI startup X2AI built a psychotherapy bot capable of adjusting its responses to the emotional state of its patients. The bot, Karim, is designed to help grief- and PTSD-stricken Syrian refugees, for whom the demand for (and price of) therapy vastly outstrips the supply of qualified therapists.
In X2AI’s test runs with Syrian users, the company noticed that technologies like Karim offer something humans cannot:
For those in need of counseling but concerned with the social stigma of seeking help, a bot can be comfortingly objective and non-judgmental.
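To make the idea of emotion-conditioned responses concrete, here is a minimal sketch of the general technique the description of Karim implies: classify the emotional tone of an incoming message, then select a reply accordingly. Everything here is invented for illustration; X2AI’s actual system is proprietary, and a real bot would rely on a trained emotion classifier rather than keyword matching.

```python
# Hypothetical sketch of emotion-conditioned response selection.
# None of these names or word lists come from X2AI; a production
# system would use a trained classifier, not keyword matching.

EMOTION_KEYWORDS = {
    "grief": {"loss", "died", "gone", "miss"},
    "fear": {"scared", "afraid", "unsafe", "panic"},
}

RESPONSE_TEMPLATES = {
    "grief": "I'm so sorry. Would you like to tell me about them?",
    "fear": "You're safe here. Can you describe what's worrying you?",
    "neutral": "I'm listening. What's on your mind?",
}

def classify_emotion(message: str) -> str:
    """Crude stand-in for an emotion classifier: keyword overlap."""
    words = set(message.lower().split())
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if words & keywords:
            return emotion
    return "neutral"

def respond(message: str) -> str:
    """Pick a reply template based on the detected emotional state."""
    return RESPONSE_TEMPLATES[classify_emotion(message)]

print(respond("I miss my brother so much"))   # -> grief template
print(respond("I feel unsafe at night"))      # -> fear template
```

Even in this toy form, the design choice is visible: the bot’s tone is a function of the user’s detected state, not a fixed script, which is what lets a system like Karim meet a grieving user differently from an anxious one.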
Bzz is a Dutch chatbot created specifically to answer questions about drugs and sex. When surveyed teens were asked to compare Bzz with searching for answers online or calling a hotline, Bzz won. Teens got their answers faster with Bzz than by searching on their own, and they saw their conversations with the bot as more confidential because no human was involved and no tell-tale evidence was left in a search history.
Because chatbots can so efficiently gain trust and persuade people to confide personal, even illicit, information in them, the ethical obligations of such bots are critical, yet still poorly defined.
Source: Quartz