
Even the man who makes ChatGPT says you shouldn't use chatbots as therapists


You might not want to tell your deepest, darkest secrets to a chatbot like ChatGPT. You don't have to take my word for it. Take it from the man behind the most popular artificial intelligence model on the market.

Sam Altman, CEO of ChatGPT maker OpenAI, raised the issue in an interview last week with podcast host Theo Von. He suggested that your conversations with AI should have protections similar to those you have with your doctor or lawyer. At one point, Von said that one of the reasons he's hesitant to use some AI tools is that he "doesn't know who's going to have" his personal information.


Altman said: "I think it makes sense to really want privacy clarity before you use it a lot, and legal clarity."

More and more AI users are treating chatbots like therapists, doctors or lawyers, and that has created a serious privacy problem for them. There are no confidentiality rules, and the actual mechanics of what happens to those conversations are startlingly unclear. Of course, there are other problems with using AI as a therapist or confidant, like how bots can give terrible advice or how they can reinforce stereotypes or stigma. (My colleague Nelson Aguilar has compiled a list of 11 things you should never do with ChatGPT and why.)

Altman clearly understands the issues here, and seems at least a little troubled by them. "People use it, young people especially, use it as a therapist, a life coach: I'm having these relationship problems, what should I do?" he said. "Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it."

The question came up during a part of the conversation about whether there should be more rules or regulations around AI. Rules that stifle AI companies and the technology's development are unlikely to gain favor in Washington these days; President Donald Trump's AI Action Plan, released this week, expressed a desire to regulate the technology less, not more. But rules protecting privacy might find favor.

Read more: AI Essentials: 29 ways you can make gen AI work for you, according to our experts

Altman seemed most concerned about the lack of legal protections for companies like his that would keep them from being forced to turn over private conversations in lawsuits. OpenAI has objected to a request to retain user conversations during its lawsuit with The New York Times over copyright infringement and intellectual property issues. (Disclosure: Ziff Davis, CNET's parent company, has filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

"So if you go talk to ChatGPT about your most sensitive stuff and then there's a lawsuit or whatever, we could be required to produce that," Altman said. "I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever."


Be careful what you tell AI about yourself

For you, the issue isn't so much that OpenAI might have to turn over your conversations in a lawsuit. It's a question of whom you trust with your secrets.

William Agnew, a researcher at Carnegie Mellon University who was part of a team that evaluated how chatbots perform in handling therapy-like questions, told me recently that privacy is a paramount concern when confiding in AI tools. The uncertainty around how the models work, and how your conversations are kept from appearing in other people's chats, is reason enough to be hesitant.

"Even if these companies are trying to be careful with your data, these models are well known to regurgitate information," Agnew said.

If ChatGPT or another tool regurgitates information from your therapy session or from the medical questions you asked, it could surface when your insurance company, or anyone else with an interest in your personal life, asks the same tool about you.

"People should really think more about privacy and know that almost everything they say to these chatbots is not private," Agnew said. "It will be used in all sorts of ways."


