A discussion from a LUNCH & LEARN webinar on Institutional AI Policies and Guidelines

Published On: 18 June 2025

The panel presentations at the LUNCH & LEARN webinar of 22 May, titled “Institutional AI Policies and Guidelines in South Africa for Learning and Teaching,” sparked rich discussion and an exchange of ideas on how diverse institutional approaches to Generative AI are taking shape.

Co-facilitated by Dr Nicola Pallitt, Senior Lecturer and Educational Technology Specialist, Centre for Higher Education Research, Teaching and Learning (CHERTL), at Rhodes University, and Ms Elizabeth Booi (Head of Business Intelligence and Data Architect, University of the Western Cape), the webinar made room for attendees’ questions and comments after three panellists’ presentations on their universities’ approaches to Generative AI.  The panellists represented the University of Johannesburg, the North-West University and the University of the Western Cape. 

Below are the key takeaways, organised by speaker contributions.

Reducing overwhelm: Dr Gloria Castrillón, Senior Director – Division for Teaching Excellence, University of Johannesburg

Dr Castrillón began by acknowledging how rapidly AI has reshaped higher education. “How do we keep up?”, a question raised by Professor Anné H. Verhoef and echoed by Dr Juliet Stoltenkamp, reflects anxiety across institutions. Dr Castrillón shared that UJ’s strategy relies on peer-to-peer learning rather than solely on top-down “expert” workshops. She noted:

“AI makes us feel stupid, so we neutralise that by having academics share experiences with colleagues. If you want to do X, here’s how someone you work with is doing it.”

This human-centred approach gives staff a safe space to learn together, ask questions, and observe real-life examples before experimenting themselves.

De-emphasising detection tools: Ms Janet Bruce-Brand, Lecturer, School of Accounting, Economics and Finance, University of KwaZulu-Natal, and Dr Gloria Castrillón, Senior Director of the Division for Teaching Excellence, University of Johannesburg

Ms Janet Bruce-Brand emphasised what several experts, including international AI scholars, echoed: AI-detection tools are unreliable. She commented:

“You cannot rely accurately on those tools. They will not stand up legally. We must de-emphasise detection and focus on equipping students with skills to face this technology.”

Dr Castrillón agreed:

“I think we’re wasting too much time on detection. Instead, obtain a stress-free writing sample from each student so you can compare styles rather than chase phantom AI use.”

Both stressed that detection tools should be a last resort, not the primary line of defence. Instead, academic integrity should focus on transparent dialogue, baseline writing samples, and trust-building.

Balancing Bans and Pedagogy: Dr Juliet Stoltenkamp, Director, Centre for Innovative Education & Communication Technologies (CIECT), University of the Western Cape

Dr Stoltenkamp reminded participants that blanket prohibitions often backfire. She shared examples of lecturers who explicitly forbid AI, yet students continue using it. In one history course, a lecturer gave students AI-generated text alongside scholarly articles, asking them to critically compare positions. Dr Stoltenkamp observed:

“Some lecturers still say, ‘I don’t want students to use AI.’ That is their prerogative. But if students will use it anyway, the wiser choice is to teach them how to use it responsibly.”

She also noted that UWC continues to implement detection tools where needed, but only as part of a broader strategy that includes robust pedagogical design and clear ethical guidelines.

AI as a Teaching Tool: Dr Gloria Castrillón, Senior Director of the Division for Teaching Excellence, University of Johannesburg

Shifting from prohibition to active integration, Dr Castrillón described how UJ uses AI within assessments. Instead of one-size-fits-all tests, UJ encourages the use of online assessments that adapt in real time. An item bank, combined with AI logic, identifies knowledge gaps and delivers tailored follow-up questions, ensuring students master concepts before advancing. Dr Castrillón likened AI to a calculator:

“It’s no different from a calculator. If we trust calculators for advanced math, we can trust AI for targeted learning interventions. We just have to design our activities so AI becomes an intelligent assistant rather than a cheat sheet.”

Reframing Plagiarism and AI: Ms Janet Bruce-Brand, Lecturer, University of KwaZulu-Natal

Ms Janet Bruce-Brand clarified that AI output is not plagiarism per se, since plagiarism involves copying another person’s work. She explained:

“Using AI doesn’t inherently break copyright laws. The risk is inadvertently repeating someone else’s phrasing or ideas. If you do that, that is plagiarism. So you can still use Turnitin’s similarity indices—but you must interpret them carefully, knowing AI isn’t human.”

She urged academics to help students navigate this nuance: AI-generated text might flag high similarity, but that doesn’t automatically mean misconduct; only close, unattributed copying does.

Linguistic Equity and AI Literacy: Ms Elizabeth Booi, Head of Business Intelligence and Data Architect, University of the Western Cape & Dr Gloria Castrillón, Senior Director of the Division for Teaching Excellence, University of Johannesburg

In response to a question about non-native English speakers, Ms Booi warned that forbidding AI use could deepen linguistic inequities. Many students rely on AI for translation and editing. Booi argued:

“AI literacy should no longer be optional; it must be embedded into curricula, teaching practices and institutional processes. Otherwise, students who already face language barriers will be further disadvantaged.” 

She further said that AI literacy must become a foundational skill, taught alongside digital literacy and academic writing. She advocated for dedicated resources—structured workshops, embedding AI tasks in assessments, and ensuring staff and students co-develop guidelines. Failure to do so, she warned, risks widening the digital divide, where those with resources outperform under-resourced peers.

Dr Castrillón added that UJ’s writing centre and certain courses now instruct students to leverage AI for grammar and style. Students learn to ask AI, “What’s wrong with my text?” rather than “Fix this for me,” preserving critical-thinking skills and academic voice.

Cultivating Exemplars: Dr Hanelie Adendorff, Higher Education Academic Advisor, Stellenbosch University, and Associate Professor Jacqueline Batchelor, Vice-Dean: Teaching and Learning, University of Johannesburg

Dr Adendorff described how Stellenbosch’s workshops highlight concrete practices shared by academics themselves. At their final assessment workshop, participants pitched their AI uses, and the team drafted short case studies with AI assistance before circulating them campus-wide. “Seeing actual, local examples makes the concept less abstract,” she explained.

Associate Professor Jacqueline Batchelor shared an AI-assessment instrument developed under UNESCO’s Digital Learning Week. Hosted on a blog, it offers levels of AI integration (low to high) and references frameworks like the AI-Creator Indicator. Batchelor urged colleagues to scale these exemplars into research:

“We need to move from informal sharing to scholarly study of our AI practices. Document what you do, test its impact and publish it.”

Developing Prompt Literacy: Professor Michael van Wyk, Professor in Economics Education, College of Education, University of South Africa

Professor van Wyk highlighted a critical but often overlooked skill: writing effective prompts. Through his research, he developed a “prompt-literacy” taxonomy (Context, Output, Role, Example). In his short learning programme (SLP), students submit AI prompts to learn how phrasing shapes output quality. He emphasised that prompt literacy is as fundamental as reading:

“If you cannot read, you cannot write a good prompt. Teaching students how to craft clear prompts—so they receive accurate, relevant responses—is a 21st-century skill we cannot ignore.”

Next Steps and Resources: Dr Nicola Pallitt, Senior Lecturer and Educational Technology Specialist, CHERTL, Rhodes University & Dr Hanelie Adendorff, Higher Education Academic Advisor, Stellenbosch University

Dr Pallitt invited attendees to contribute local exemplars to an open, collaboratively curated bibliography of AI resources. She explained that even short, two-page case studies, detailing a single AI-based assessment or classroom activity, can be highly valuable.

Dr Adendorff encouraged colleagues to share practical teaching ideas and podcasts via that bibliography. Both emphasised that collecting and disseminating these examples fosters a community of practice, where institutions learn from one another rather than reinventing solutions.

Key Takeaways

From the discussion, several themes emerged as crucial for embedding AI effectively in higher education. First, peer-to-peer learning proved essential in reducing the sense of overwhelm: colleagues feel more comfortable learning from someone they know than from an external “expert,” and seeing real examples from their environment helps demystify AI. In parallel, participants agreed that over-reliance on detection tools is counterproductive; instead of chasing flags in Turnitin or other detectors, institutions should focus on transparent dialogue, baseline writing samples and trust-building measures to uphold academic integrity. 

At the same time, AI must be reframed as a pedagogical ally rather than a threat—many universities are already using adaptive assessments powered by AI item banks to personalise learning, much like a calculator supports advanced mathematics.

A second cluster of insights addressed equity and skill development. AI literacy needs to be embedded as a foundational competence, taught alongside digital literacy and academic writing, to prevent deepening existing divides among students who rely on translation and editing tools. Equally important is prompt literacy: without the ability to craft clear, context-rich prompts, students receive unreliable or irrelevant AI outputs. 

Finally, the group emphasised the need to document and share local exemplars—short case studies, assessment instruments and frameworks—and commit to ongoing, semi-annual reviews of policies and guidelines. By capturing real-world practices and fostering a community of collaboration, institutions can ensure that AI enhances rather than undermines teaching, learning and research.

Mduduzi Mbiza is a commissioned writer for Universities South Africa.