Students are increasingly relying on chatbots, but at what cost?


By Tara García Mathewson, CalMatters

"Two
Students take notes during the English class of Dr. Adam Kaisserman at the Canyons College in Santa Clarita on May 6, 2025. A photo from Jul Hotz for CalMatters

This story was originally published by CalMatters. Sign up for their newsletters.

Students already have less incentive to talk to their teachers – or even their classmates. Chatbots like ChatGPT, Gemini and Claude have given them a new path to self-sufficiency. Instead of asking a professor for help with a paper topic, students can go to a chatbot. Instead of forming a study group, students can ask AI for help. These chatbots give them quick answers, on their own timeline.

For students juggling school, work and family responsibilities, that ease can feel like a lifesaver. And any one decision to turn to a chatbot here and there isn't a big deal in isolation. But every time a student decides to ask a chatbot a question instead of a professor, adviser or peer, it is one less opportunity to build or strengthen a relationship – and the human connections students make on campus are among the most important benefits of college.

Julia Freeland-Fisher studies how technology can help or hinder students' success at the Clayton Christensen Institute. She said the consequences of turning to chatbots for help can be complicated.

“Over time, this means that students have fewer people in their corner who can help them in other moments of struggle – who can help them in ways a bot may not be capable of,” she said.

As colleges further embed ChatGPT and other chatbots into campus life, Freeland-Fisher warns that lost relationships could become a pernicious unintended consequence.

Asking for help

Christian Alba said he has never turned to AI to do an assignment for him. Alba, 20, attends College of the Canyons, a community college north of Los Angeles, where he studies business and history. And while he hasn't asked ChatGPT to write any papers for him, he has turned to the technology when a blank page and a blinking cursor felt overwhelming. He asked for an outline. He asked for ideas for starting an introduction. He asked for advice on what to prioritize first.

“It’s hard to just start something fresh from your mind,” Alba said. “I’m not going to lie. It’s a useful tool.” Alba has wondered, though, whether turning to a chatbot with these kinds of questions amounts to over-reliance on AI. But Alba, like many others in higher education, worries mainly about AI use as it relates to academic integrity rather than social capital. And that’s a problem.

Jean Rhodes, a professor of psychology at the University of Massachusetts Boston, has spent decades studying how students seek help on campus and how the relationships formed during those interactions ultimately benefit students in the long run. Rhodes doesn’t begrudge students integrating chatbots into their workflows, as many of their teachers have, but she worries that students will get worse answers to even simple-sounding questions, such as “How do I change my major?”

A chatbot can direct a student to the registrar’s office, Rhodes said, but if the student had asked an adviser, that person might have asked important questions – why the student wants the change, for example – which could lead to a deeper conversation about the student’s goals and obstacles.

“We understand the broader context of students’ lives,” Rhodes said. “They are smart, but they are not wise, these tools.”

Rhodes and one of her former doctoral students, Sarah Schwartz, created a program called Connected Scholars to help students understand why it is valuable to talk to professors and have mentors. The program has helped students improve their networking skills and understand what people gain from their networks throughout their lives – namely, social capital.

Connected Scholars is offered as a semester-long course at UMass Boston, and a forthcoming book examines its results over the last decade, finding that students who take the course are three times more likely to graduate. Over time, Rhodes and her colleagues have found that the key to the program’s success is helping students get past their reluctance to ask others for help.

Students will make many excuses for not asking for help, Rhodes said, ticking off a list of them: “I don’t want to stand out,” “I don’t want people to realize I don’t fit in here,” “That person won’t answer,” “I shouldn’t intrude,” “I’ll be a bother.” “If you can get past that and help them recognize the value of reaching out, it’s pretty amazing what happens.”

Connections are key

Seeking human help doesn’t just leave students with an answer to one problem; it gives them a connection to another person. And that person, in turn, can become a friend, mentor or business partner – a “strong tie,” as social scientists describe the people central to someone’s network. They could also become a “weak tie,” someone a student may not see often but who, importantly, could still offer a job lead or crucial social support one day.

Daniel Chambliss, a retired sociologist from Hamilton College, emphasized the value of relationships in his book How College Works, co-authored with Christopher Takacs. In the course of their research, the pair found that the key to a successful college experience came down to relationships – in particular, two or three close friends and one or two trusted adults. Hamilton College goes out of its way to make sure students can form these relationships, structuring work-study jobs to bring students into campus offices and around faculty and staff, creating spots for students of varying athletic ability on sports teams, and more.

“We understand the broader context of students’ lives. They are smart, but they are not wise, these tools.”

Jean Rhodes, professor of psychology at the University of Massachusetts Boston

Chambliss worries that AI-powered chatbots make it too easy to avoid the interactions that can lead to important relationships. “We are suffering from epidemic levels of loneliness in America,” he said. “This is a really major problem, historically speaking. It’s very unusual and it’s deeply bad for people.”

As students increasingly turn to artificial intelligence for help and even casual conversation, Chambliss predicts it will leave people even more isolated: “This is one more place where they won’t have a personal relationship.”

In fact, a recent study by researchers from the MIT Media Lab and OpenAI found that the heaviest users of ChatGPT – power users – are lonelier and more isolated from human interaction.

“What scares me about this is that Big Tech would love for all of us to be power users,” Freeland-Fisher said. “That’s baked into the fabric of a tech company’s business model.”

Yenia Pacheco is preparing to re-enroll at Long Beach City College for her final semester after more than a year off. The last time she was on campus, ChatGPT existed but wasn’t widely used. She knows she is returning to a college where ChatGPT is deeply embedded in student life, as well as in the lives of faculty and staff, but Pacheco expects to return to her old habits – she will go to her professors’ office hours and stick around after class to ask them questions. She sees the value.

She understands why others may not. Today’s high school students, she has noticed, aren’t used to talking to adults or building mentor-style relationships. At 24, she knows why those relationships matter.

“A chatbot,” she said, “will not give you a letter of recommendation.”

This article was originally published on CalMatters and is republished under a Creative Commons Attribution-NonCommercial-NoDerivatives license.
