I agree with much of this. One of the biggest problems with AI in academic science, in my opinion, is trainees who will NOT learn how to read, synthesize, or create on their own if they lean on AI tools. Then they will never be able to make the kinds of educated decisions about how to use AI that you describe in writing this post!
It's tricky. In my experience, many and maybe most students are wary of AI outputs and want to do the real work. I've been incorporating it into class activities, and we discuss what it gets right, what it gets wrong, and where the limits of the technology are. Most students recognize it's not a panacea. But there are always students who view classes as transactional and don't have much desire to learn. They lean heavily on AI. I'm not sure that's a solvable problem.
In that regard, I guess it's not that different from the essays they've always been able to purchase or the roster of old exams kept by a fraternity!