How are colleges handling students' use of artificial intelligence?

BOSTON - There are a lot of myths, predictions, and half-truths about artificial intelligence: It's coming for your job. It's too risky.

Generative AI, harnessed through programs like ChatGPT, is a popular technology that can be used to create new content, including audio, code, images, text, simulations, and videos.

In fact, AI can write entire academic essays, outlines, bibliographies – you name it.

"AI is a great tool to augment what you do but we have to have humans checking the work," said Patty Patria, the Chief Information Officer at Babson College. "If we are not checking the work that AI is generating, then we are not using it properly."

A recent survey by BestColleges revealed that 56% of college students admit completing assignments using AI technology.

When WBZ-TV surveyed random students at Boston University and Northeastern, many said they, too, use it for schoolwork.

"It's pretty convenient to give you a broad view," one Northeastern student said.

"I've been using ChatGPT for most of my assignments and it works really well," a BU student said.

"I sometimes use it as a background resource to get information on a subject and then I go from there," another added.

There are obvious concerns: plagiarism, inaccurate information, and students not learning how to write their own papers or do their own work.

"We have seen that our library has received more requests for interlibrary loans, for books or journals that don't even exist because somebody looks at a reference that was generated by a generative AI model that has, you know, journals and books in there that don't exist," Tilman Wolf, the Senior Vice Provost of Academic Affairs at UMass Amherst explained. 

But the popularity of the technology makes it hard for local universities to ignore, and as a result, most are currently in the process of developing AI policies or strategies for students and faculty.

"I think the best thing to do is to leave it up to the faculty members to decide what makes sense to them. The really important thing is to articulate what these expectations are by the faculty members, so that the students know what to do," Wolf explained.

"We're at the beginning but I think pretty ahead of other high learning institutions in terms of creating an AI model," added Patty Patria of Babson. The college hopes to have a formal policy on the books by the end of the academic year.

Some considerations and recommendations universities are offering professors:

  1. Forbid the use of AI.
  2. Discourage AI use – but require proper credit when AI is used.
  3. Encourage – and even assign – AI augmented work.
  4. Decide whether AI can be used on a case-by-case basis for assignments.

"Generative AI models are not going to go away," Tilman Wolf said. "And I think the important thing is that we create awareness on our campus, what they can and cannot do, and that we think about how we can be transparent about where we use them and where we don't use them, and that we train our students so that they are prepared for the workforce where they can use these tools in an appropriate manner."

If you have a question you'd like us to look into, please email questioneverything@cbsboston.com.   

