What was the last thing you asked an artificial intelligence chatbot to do for you?

Perhaps you asked it for an essay outline, an answer to a difficult question, an in-depth analysis of a large amount of data, or simply a check that your cover letter matched the job description.

But some experts are raising the alarm that delegating these tasks to artificial intelligence could mean our brains are doing less of the work. This, they say, could harm our ability to think critically and solve problems.

Earlier this year, the Massachusetts Institute of Technology (MIT) published a study showing that people who used ChatGPT to write essays showed less activity in brain networks associated with cognitive processing while performing the task.

The researchers stated that their study highlights "the urgent need to explore a possible decline in learning abilities."

The study involved 54 people, all recruited from MIT and surrounding universities. Their brain activity was recorded using electroencephalography (EEG), a method that involves placing electrodes on the scalp.

Some of the “prompts” used by participants included requests for summarizing essay questions, finding sources, and improving grammar and style.

AI was also used to generate and formulate ideas, although some users felt that artificial intelligence was not particularly good at this.

“AI makes finding answers very easy”

In another study, Carnegie Mellon University and Microsoft, the company behind Copilot, found that people's problem-solving skills can weaken if they become overly dependent on artificial intelligence.

They surveyed 319 white-collar workers who used AI tools for their work at least once a week, asking them how they applied critical thinking while using them.

The researchers analyzed about 900 examples of tasks given to artificial intelligence, ranging from analyzing data to extract new insights to checking whether a document met certain criteria.

The study found that the higher the users' confidence in the tool's ability to perform a task, the lower the "engagement in critical thinking."

"While GenAI can increase employee efficiency, it can hinder critical engagement with work and potentially lead to long-term dependence on the tool, as well as weakening independent problem-solving skills," the study concludes.

UK schoolchildren were also surveyed in a study published in October by Oxford University Press (OUP). The results showed that six in ten of them thought AI had negatively impacted their ability to do schoolwork.

So, with the massive explosion in the use of artificial intelligence, are our cognitive abilities really at risk?

According to Dr. Alexandra Tomescu, a specialist in generative artificial intelligence at OUP and a co-author of the schoolchildren study, the answer is not that simple.

“Our research shows that nine out of ten students say that AI has helped them develop at least one school-related skill, whether it's problem-solving, creativity or repetition,” she explains.

“But at the same time, about a quarter say that using AI has made it too easy to get their work done for them… So it’s a nuanced picture.”

She adds that many students want more guidance on how to use artificial intelligence correctly.

OpenAI, whose ChatGPT has more than 800 million active users each week according to CEO Sam Altman, has published a list of 100 “prompts” for students, designed to help them get the most out of the technology.

But Professor Wayne Holmes, a researcher in artificial intelligence and education at University College London (UCL), says this is not enough.

He calls for much more academic research on the effects of AI tools on the learning process before pupils and students are widely encouraged to use them.

"Today there is no independent, large-scale evidence for the effectiveness of these tools in education, nor for their safety, or even for the idea that they have a positive impact," he tells the BBC.

Better results, but poorer learning?

Prof. Holmes also points to research on what is known as cognitive atrophy, in which a person's abilities weaken through reliance on AI.

He cites the case of radiologists using AI tools to interpret X-rays before making a diagnosis.

A Harvard Medical School study published last year found that AI assistance improved the performance of some clinicians but harmed that of others, for reasons that researchers still don't fully understand.

The authors called for more research into how humans interact with artificial intelligence, in order to find uses that "enhance human performance, rather than impair it."

Prof. Holmes expresses concern that students, both at school and university, may become overly dependent on AI to do their work and, as a result, fail to develop the fundamental skills that education provides.

An essay may get better grades thanks to the help of artificial intelligence, but the main question remains: Do students end up understanding less?

As he sums it up, "Their products are better, but the actual learning is weaker."

On the other hand, Jayna Devani, head of international education at OpenAI, the company that owns ChatGPT and has struck a deal with the University of Oxford, says the company is "very aware of this debate."

"We definitely don't think students should use ChatGPT to do work," she tells the BBC.

According to her, the tool works better as a tutor than as a simple source of answers. She gives the example of using ChatGPT in “study mode”, where the student can have a dialogue with the chatbot.

The student enters a question they are struggling with, and the chatbot breaks it down into parts, helping them understand it.

The example she gives is that of a student working late at night on a topic they don't fully understand.

“If you have a presentation to give and it's midnight, you're not going to email your university tutor for help. I think the potential is really there for ChatGPT to accelerate learning, if used properly,” she says.

However, Prof. Holmes insists that any student using AI tools should be aware of how their reasoning works and how the companies that provide them handle the data. He stresses that the results should always be verified.

“I never tell my students that they shouldn't use AI… But what I try to say is that we need to understand it well, so that we can make good decisions.”

© BalkansWeb