How universities can take steps to integrate AI effectively into teaching - Joe Ferraro
Many universities have made clear their expectations around AI use in assessments and examinations. The University of York, for example, views the use of generative AI as cheating or plagiarism and reserves the right to “treat generative AI use as academic misconduct.”
The general consensus is that students should not use AI to generate assessment answers unless they have been explicitly instructed to do so.
However, despite having these basic rules and policies in place, institutions still have a long way to go in ensuring that staff and students are AI literate and in introducing AI monitoring to assess students’ use of the tools.
This is not the first time in recent years that educators have had to try to balance the human element of teaching and learning with technology.
During the pandemic, many universities had to make a quick pivot to online and asynchronous learning, and student engagement suffered. Classes held over Zoom made it more challenging to connect with students on an individual level.
And before that, from the 1990s into the 2000s, the arrival of real-time collaboration tools, such as online discussion forums, video conferencing, and document sharing, pushed the education sector to shift from traditional teaching methods to online resources and e-learning to satisfy student demand.
The internet revolutionised access to knowledge, aided remote learning communication, boosted in-class collaboration and changed information sharing. As a result, educational institutions reinvented themselves to stay relevant.
But during the transition to the internet, misinformation and information overload also became commonplace, so it seems we have come full circle. The expectation then was that teachers would become guides through the maze of information, encouraging a critical approach and creating opportunities for students to express opinions and verify knowledge.
When it comes to AI, the same issue is arising in terms of creative thinking, sensible use of information and evaluation skills in higher education. Today’s teachers need to get students to go further, thinking beyond the initial responses presented by AI models and developing skills to identify biases and evaluate the credibility of sources. That’s why all 24 Russell Group universities recently reviewed their policies and created guiding principles for generative AI.
However, institutions can take other steps to keep students engaged and integrate AI effectively. For example, they can encourage comparative analysis between human and AI-generated text, assign research that requires engagement with local issues, invite external speakers to share expertise and provide opportunities for internships.
Many of the debates that happened a decade ago are reminiscent of what is happening now. Back then, an academic report said: “It is necessary to help children understand that technology is only a tool to help in our lives and cannot be a lifestyle.”
This pointer still stands, and teachers, from primary education to university, are responsible for teaching students about the ethical implications of heavily relying on AI for decision-making and its limitations, like privacy concerns.
Redesigning education post-internet also wasn’t only about the “tools used at school,” but a profound “change in the philosophy of education.”
Being in a similar situation now, it’s time for educators to be asking thought-provoking questions like: What skills do students need to engage with generative AI sensibly? How can teachers prioritise learning over grades? How can we move toward experiential learning?
Today’s students are not simply creating information, but synthesising and analysing information that is readily available.
Institutions can also look to external partners to find ways to incorporate AI into assignment processes.
Some online tools encourage faculty to have students examine AI writing, review the pitfalls, and act as editors or fact-checkers.
There’s also social annotation: an innovative conversation layer over the web, which allows students and educators to use collaborative annotation and discussion to think critically about AI-related content.
Higher education institutions have an important role: to capitalise on the opportunities of AI and make students career-ready while simultaneously protecting academic integrity.
The key is to balance AI-driven automation and education with human guidance and mentorship.
Joe Ferraro is VP of revenue at Hypothesis.