Institutional AI Policies and Guidelines in South Africa for Learning and Teaching

Published On: 18 June 2025

Generative AI tools such as ChatGPT have become common on South African campuses, but their rapid uptake has prompted responses concerning governance, training and assessment design. On 22 May 2025, the USAf Community of Practice on Digital Education in Learning and Teaching (DELT CoP) facilitated a LUNCH & LEARN webinar titled “Institutional AI Policies and Guidelines in South Africa for Learning and Teaching.”

The discussion featured invited panellists from three universities. Professor Anné H. Verhoef, Director at the Artificial Intelligence Hub & Professor in Philosophy, Faculty of Humanities, represented North-West University (NWU). Dr Juliet Stoltenkamp, Director at the Centre for Innovative Education & Communication Technologies, represented the University of the Western Cape (UWC), and Dr Gloria Castrillón, Senior Director for Teaching Excellence, represented the University of Johannesburg (UJ). 

They each discussed the processes and stakeholders involved in crafting the AI policy or guidelines in use at their university. They also discussed the relevance of context in policy making and implementation, how lecturers and students are applying the policies and guidelines in practice, and the challenges that students, educators and institutions encountered during implementation, particularly in terms of ethical dilemmas.

The webinar was facilitated by Dr Nicola Pallitt, Senior Lecturer and Educational Technology Specialist at the Centre for Higher Education Research, Teaching and Learning (CHERTL) at Rhodes University, and Ms Elizabeth Booi, Head of Business Intelligence and Data Architect at the University of the Western Cape.

Why was this webinar held?

Ms Elizabeth Booi stated, “We want to know what it is that we are putting in place to allow for the use of these new technologies within our environment, and we also want to find out what tensions remain as we develop regulations and governance that can promote AI use”.

The webinar noted that some universities have had a policy in place for some time, others rely on more responsive guidelines, and others still are in the process of developing theirs.

During their presentations, panellists considered the processes and stakeholders involved in crafting their policies or guidelines, highlighting how each institution included faculty representatives, librarians, IT specialists, administrators and student voices in policy drafting.

They addressed the relevance of context, including institutional culture, existing governance structures and local technological capacity, in shaping the writing and implementation of their institutional AI policies and/or guidelines. Speakers shared how these policies and guidelines have moved beyond theory, demonstrating how lecturers and students are already using them in practice, whether through department-level workshops on prompt design or through AI modules integrated into core courses.

Finally, panellists reflected on the challenges encountered during implementation, particularly ethical dilemmas. These included questions about equitable access to paid AI tools, uncertainty over detection methods, and the emotional resistance felt by both students and educators when confronting rapid change.

Who was it for?

This session addressed academic developers, learning technology specialists, department heads, policymakers, and anyone responsible for AI governance or supporting its practical use. 

Professor Anné H. Verhoef of NWU described the institution’s AI Steering Committee, which brought together representatives from teaching and learning, the library and IT—and, critically, included student voices from the outset.

Dr Juliet Stoltenkamp of UWC stressed that at her university, “AI is everybody’s business,” involving academics, multimedia trainers, career-guidance officers and administrative staff. 

At UJ, Dr Gloria Castrillón explained that rather than creating another policy, “The university decided not to have a policy … a policy is a slow thing to change in a university … we didn’t know where this generative AI … was going … so the decision was actually to integrate generative AI … into the various policies in which it was relevant”, ensuring that Senate committees, faculty boards and library services all play a part.

These three institutional responses to Generative AI offered blueprints that committees, teaching centres and administrative units across the country could adapt.

What was discussed?

The North-West University perspective

North-West University began its AI governance work in early 2024, when its University Management Committee set up an AI Steering Committee under the IT division. That group drafted a framework policy urging staff and students to “embrace AI ethically and responsibly.” By year-end, however, it became clear that a single committee could not keep pace with rapidly evolving tools. “AI will change everything—even the fundamentals of how we assess and supervise,” said Professor Anné H. Verhoef.

Under his leadership, three developments emerged. Firstly, a concise, agile policy has been drafted and is now under institutional review. It opens with a motivation section that explains “why” the university needs AI governance, rather than merely listing prohibitions. Secondly, AI guidelines have been integrated into NWU’s academic integrity policy, allowing all ethical-use rules to sit alongside traditional misconduct clauses. Thirdly, the Hub is rolling out two targeted training programmes. A student Short Learning Programme (SLP) on “AI for Academic and Career Success” was launched in February 2025, guiding learners through tool-exploration exercises and culminating in a personalised career-planning certificate. A lecturer SLP on redesigning assessments to include AI ethically is scheduled for July 2025, with development guided by the Perkins AI Assessment Scale.

Throughout, campus-wide roadshows and faculty-level feedback sessions ensured that the policies reflect real pedagogical concerns rather than generic templates: “We did not just ask our deans; we invited faculty members who showed genuine interest,” Professor Verhoef explained.

The University of the Western Cape approach

At the University of the Western Cape, the focus is on collective ownership. In late 2024, a cross-campus task team comprising academics, professional support staff, IT specialists and student representatives drafted practical AI guidelines rather than a standalone policy document. Those guidelines were refined in a Senate Academic Planning Committee workshop, incorporating feedback from every constituency.

Dr Juliet Stoltenkamp emphasised the importance of “pre-engagement” workshops for each faculty member. A second-year pharmacy lecturer, for instance, introduced students to AI-powered pharmaceutical research assistants, demonstrating how to craft precise prompts, critically evaluate outputs and integrate findings responsibly into lab reports. Building on these workshops, Stoltenkamp’s centre integrated AI literacy into all ICT and multimedia training packages. It launched a self-paced online module on ethical use, citation practices and tool overviews. To sustain engagement, her team circulates weekly “snippets” via email and blog—step-by-step guides such as “Here’s how to generate an image illustrating the water cycle” and “Follow these steps to cite ChatGPT outputs correctly.” These bite-sized updates ensure busy staff and students receive continuous, practical guidance without feeling overwhelmed.

How the University of Johannesburg did it

The University of Johannesburg opted for integration over addition. In January 2024, it published a ‘practice’ note and two sets of guidelines for the use of AI, one tailored for staff and one for students—and simultaneously integrated AI considerations into existing governance structures. The Deputy Vice-Chancellors for Teaching and Learning, Research and Innovation, and ICT collaborated to embed AI rules within teaching-and-learning policies, the academic misconduct code and research integrity guidelines.

Dr Gloria Castrillón highlighted UJ’s peer-to-peer showcase series, where lecturers demonstrate concrete examples of AI-enhanced pedagogy: coding assignments critiqued via AI in computer science, multi-draft problem statements refined through AI in engineering, and AI-driven data visualisations in business modules. To complement these, UJ offers two SLPs: a free open course on AI ethics and large-language-model literacy, and Moodle-embedded modules for first- and final-year students that blend AI awareness with research skills, academic writing and career planning. The university’s annual Teaching Innovation Fund further incentivises pilot projects that harness AI, with grant recipients presenting their innovations in peer-to-peer sessions held online.

Making sense of it all

These webinar insights map directly onto existing university workflows. In curriculum and programme design, faculties should revisit learning outcomes to include AI-literacy competencies. NWU plans to embed AI modules within core first-year courses, ensuring all graduates possess critical evaluation skills and ethical-use frameworks. UJ integrates AI discussions into its programme approval process, requiring any new course proposal to demonstrate how AI tools will support specific learning objectives.

Assessment redesign offers another immediate application. Rather than viewing AI as a threat, institutions can leverage rubrics such as the Perkins AI Assessment Scale to guide instructors in evaluating AI-augmented student work. In UWC’s pharmacy pre-engagement sessions, assessments demand not only AI-generated analyses but also student critiques of reliability and ethical implications. UJ’s practice notes encourage multi-draft assignments, where students submit successive versions—some AI-augmented, some manual—to demonstrate both tool proficiency and independent thought. This transforms assessment into a pedagogical opportunity rather than a policing exercise.

Centralised resource hubs within Learning Management Systems (LMSes) form a third application point. Each institution can create a dedicated AI portal linking to tool guides, citation manuals and exemplar prompts. UWC’s weekly “snippets” could be pooled into a searchable repository, while NWU’s AI Hub envisions an online dashboard tracking policy updates, training schedules and roadshow recordings. A single point of reference helps staff and students find answers quickly.

Professional development pathways must also evolve. Short learning programmes—whether NWU’s SLPs, UWC’s self-paced module or UJ’s open SLP—provide structured, flexible online professional development opportunities that can be embedded into postgraduate diplomas, orientation sessions and faculty mentorship schemes. Meanwhile, teaching innovation funds like UJ’s encourage grassroots experimentation, with small grants enabling departments to pilot AI tools in their own contexts.

Finally, governance and review cycles must be formalised. Senate committees, IT governance boards and research ethics panels should incorporate AI oversight into regular agendas, with built-in review dates and agile amendment processes. NWU’s AI Hub, for example, reports quarterly to its University Management Committee, ensuring policies remain aligned with technological advances. Diverse stakeholder engagement must continue through steering committees and task teams that include faculty, librarians, IT staff, administrators and students. Ongoing feedback mechanisms—roadshows, focus groups, online surveys—allow institutions to track user experiences and refine their guidelines in response not only to user needs but also to changes in the GenAI landscape.

The session made clear how AI policies are integrally connected to institutional contexts and values. Policy decisions and governance structures shape curriculum design, assessment strategies and professional development programmes. Indeed, although the focus of the session was on teaching and learning, the implications of AI impact the very core mission of South African universities.

Mduduzi Mbiza is a commissioned writer for Universities South Africa.