The rapid rise of ChatGPT and other generative AI tools has disrupted education and changed the way students study and learn.
Students around the world are turning to chatbots to help with homework, but the capabilities of artificial intelligence are blurring the lines of what they should and shouldn’t be used for.
The widespread adoption of the technology in so many other areas of life also adds to the confusion about what counts as academic dishonesty.
Here are some do’s and don’ts when using AI for academic purposes.
Don’t just copy and paste
Chatbots are so good at giving detailed written answers to questions that it’s easy to take their work as your own.
But in case it’s not already clear: AI should not be used as a substitute for doing the work, and it cannot replace your ability to think critically.
Never copy and paste information from a textbook or someone else’s essay and pass it off as your own. The same principle applies to chatbot replies.
“AI can help you understand concepts and generate ideas, but it doesn’t replace your own thinking and effort,” the University of Chicago says in its guidance on the use of generative AI. “Always create original work and use AI tools for guidance and clarity, not to do the work for you.”
So don’t hesitate to put pen to paper, or fingers to keyboard, and do the writing yourself.
“Using an AI chatbot to write your text, whether it’s an explanation, summary, topic idea, or initial outline, will result in less learning and poorer performance on subsequent exams and attempts to apply that knowledge,” says Yale University’s Poorvu Center for Teaching and Learning.
Use AI as a learning aid
Experts say AI can serve as a tutor or study companion, so try asking a chatbot to explain difficult concepts or to brainstorm ideas such as essay topics.
Casey Cuny, a high school English teacher in California, advises his students to use ChatGPT to quiz themselves before a test.
He instructs students to upload class notes, study guides, slideshows and other class materials to the chatbot, and to tell it which textbooks and chapters the test will focus on.
Students then prompt the chatbot: “Based on all the material provided, quiz me one question at a time, and then create a study plan covering everything I got wrong.”
Cuny displays AI guidelines in a traffic-light format on classroom screens. Green-light uses include brainstorming, getting feedback on a presentation, and doing research. Red-light, or prohibited, uses include asking an AI tool to revise a thesis statement, a draft, or an essay. A yellow light means a student isn’t sure whether AI is allowed, in which case they’re told to come and ask.
Or try using ChatGPT’s voice dictation feature, says Sohan Choudhury, CEO of Flint, an AI-powered education platform.
“Just do a brain dump of exactly what you got and what you didn’t get” on a topic, he said. “You can talk for five minutes about what you do or don’t understand about a topic. You can throw out a random analogy and they’ll come back with something tailored to you based on that.”
Check your school’s AI policy
As AI shakes up academia, educators are under pressure to shape technology policy.
In the United States, approximately 24 states have issued AI guidance for schools, but adoption has been uneven.
It’s worth checking what your school or university says about AI. Some institutions have broad policies that apply across the whole organization.
The University of Toronto’s stance is that “students will not be allowed to use generative AI in their courses unless explicitly permitted by the instructor,” and students should review their course syllabuses to see the do’s and don’ts.
Many other institutions don’t have blanket rules.
The State University of New York at Buffalo “does not have a universal policy,” according to its online guidance for instructors. “Instructors have academic freedom to decide what tools students can and cannot use to achieve the learning objectives of their courses. This includes artificial intelligence tools such as ChatGPT.”
Don’t hide your use of AI from teachers
AI is no longer the educational bogeyman it once was.
There is a growing recognition that AI is here to stay, that it could disrupt many industries and professions, and that the next generation of workers will have to learn how to use the technology.
So students shouldn’t be afraid to discuss their use of AI with teachers, because transparency prevents misunderstandings, Choudhury said.
“Two years ago, a lot of teachers’ stance was: that’s it, just don’t bring up AI at all in this class,” he said. But three years after ChatGPT’s debut, “a lot of teachers understand that kids are using it, so they’re much more open to having conversations rather than having a blanket policy.”
Teachers are aware that students hesitate to ask whether they’re allowed to use AI for fear of being labeled cheaters. But it’s important to ask, because it’s very easy to unknowingly cross the line, said Rebecca Fitzsimmons, AI faculty advisory chair at Carnegie Mellon University’s Heinz College of Information Systems and Public Policy.
“Students often don’t realize when they’ve crossed the line between using AI tools to help them revise content they created and having AI produce that content,” said Fitzsimmons, who helped draft detailed new guidelines for students and faculty seeking clarity.
The University of Chicago says students must cite AI if it is used to come up with an idea, summarize a piece of writing, or help draft a paper.
“Where appropriate, acknowledge this in your work,” the university says. “Just like citing a book or website, giving credit to AI when applicable helps maintain transparency.”
And don’t forget ethics
Educators want students to use AI in ways that are consistent with their school’s values and principles.
The University of Florida says students should familiarize themselves with the school’s honor code and academic integrity policy to “ensure that the use of AI is consistent with ethical standards.”
The University of Oxford says AI tools must be used “responsibly and ethically” and in line with academic standards.
“We must always maintain a critical approach to using AI tools, and use the output they produce with honesty, integrity, and transparency,” its guidance reads.
