RESEARCH ARTICLE (Open Access)

Teaching and assessment of the future today: higher education and AI

Melissa M. Lacey A * and David P. Smith A

A Sheffield Hallam University, Howard Street, Sheffield, S1 1WB, UK.




Mel Lacey is a microbiologist with over a decade’s teaching experience at Sheffield Hallam University. She is an active researcher in both the Accessibility of Science and Molecular Microbiology groups. Mel’s research spans bacterial biofilms, novel antibacterial agents and their applications, the microbiome and environmental microbiology through to pedagogy around inclusive practices and public engagement.



David Smith is a National Teaching Fellow and Professor of Bioscience Education, teaching Molecular Bioscience and Biochemistry. He is a Senior Fellow of the Higher Education Academy. David has been research-active in the field of biosciences for over 20 years focusing on the molecular basis of neurodegeneration in diseases such as Alzheimer’s and Parkinson’s and pedagogy in Higher Education.

* Correspondence to: m.lacey@shu.ac.uk

Microbiology Australia 44(3) 124-126 https://doi.org/10.1071/MA23036
Submitted: 7 June 2023  Accepted: 30 June 2023   Published: 14 July 2023

© 2023 The Author(s) (or their employer(s)). Published by CSIRO Publishing on behalf of the ASM. This is an open access article distributed under the Creative Commons Attribution 4.0 International License (CC BY).

Abstract

Artificial intelligence (AI), once a subject of science fiction, is now a tangible, disruptive force in teaching and learning. In an educational setting, generative large language models (LLMs), such as OpenAI’s ChatGPT, perform and supplement tasks that usually require human thought, such as data analysis, understanding complex ideas, problem-solving, coding and producing written outputs. AI advances are moving quickly. Since the emergence of ChatGPT (built on GPT-3.5) in November 2022, we have witnessed the arrival of other progressive language models, such as OpenAI’s GPT-4, Google’s Bard AI and Microsoft’s Bing AI. Most recently, these systems have gained the ability to access real-time information and analyse images, and they are becoming directly embedded in many applications.

AI in an educational setting

Many generative artificial intelligence (AI) platforms are open access and easy to use through intuitive chatbot interfaces. Their outputs are presented in convincing, human-like language and can greatly increase productivity when used well. These models employ machine-learning algorithms to generate text that mirrors human communication, a feature with numerous implications for pedagogical practice. Preventing AI from affecting our teaching and assessments is an impossible task. Instead, we need to embrace the change, think about how to enhance our teaching and learning with AI, and be aware of how it can be used and where its limitations sit.1

The speed at which AI is changing and improving leads to three immediate questions:

  1. How will students use this technology?

  2. How can we embed AI in our teaching, learning and assessment?

  3. How will we as educators keep up with what is happening?

In this article, we discuss some potential positives and negatives of AI technology and how these present themselves in today’s learning spaces.

How will students use AI technology?

Academics and students now have ready access, through several AI programs, to applications that can help them create essays, blogs, video transcripts, reflections and workflows, and summarise peer-reviewed publications. When constructed well, AI prompts can enlighten the user, giving targeted and relevant insights into real-time information.

One of the most compelling applications of such AI in education is its role as a ‘virtual tutor’. Students can interact conversationally with an intelligent system that can provide explanations, offer guidance, answer queries and give feedback. Such interactions can be particularly useful for individual study, allowing each student to receive personalised support at their own pace and on their own schedule. AI can also generate new scenarios or problems for students to solve, making every learning experience unique and engaging. AI also has a role to play in inclusive practice, particularly for dyslexic and neurodivergent students: students with such needs can struggle with long, complex written documents, and AI can summarise or rephrase dense scientific literature in a format that is more useful for that specific student.
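To make the ‘virtual tutor’ idea concrete, the short Python sketch below shows what one such interaction might look like under the hood. It is a minimal sketch only: it assumes access to OpenAI’s chat completions API with a key held in the OPENAI_API_KEY environment variable, and the model name, abstract text and prompt wording are illustrative rather than prescriptive.

# Minimal 'virtual tutor' sketch: ask an LLM to rephrase a dense abstract in
# plain language and pose a follow-up question. Assumes an OpenAI API key in
# the OPENAI_API_KEY environment variable; all text here is illustrative.
import os
import requests

ABSTRACT = (
    "Quorum sensing co-ordinates biofilm formation in Pseudomonas aeruginosa "
    "through hierarchical las, rhl and pqs signalling circuits..."
)

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4",  # illustrative model name
        "messages": [
            {"role": "system",
             "content": "You are a patient tutor for first-year microbiology students."},
            {"role": "user",
             "content": "Explain the following abstract in plain language, then ask me "
                        "one question to check my understanding:\n" + ABSTRACT},
        ],
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])

The same pattern could equally be pointed at summarising a long reading for a student who finds dense documents difficult; only the system and user messages need to change.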

Should we teach students how to use AI?

AI literacy will become an essential digital skill, and our role as educators is to introduce our students to this new area, providing them with the language and thinking needed to utilise this technology at a graduate level.2 Prompt engineering, the craft of constructing effective queries for large language models (LLMs), is a key part of this skill development, as the information gained from an AI is only as good as the questions put to it. One potential future approach could be to assess not the AI-written product but the students’ prompt engineering, so that they learn to probe a given topic more deeply and gain meaningful outputs. AI tools can then be used to generate expanded notes on the learning materials and help students understand complex topics. Effective use of AI can increase students’ productivity and result in a high standard of work through improvements in spelling, grammar and structure.3
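As a hedged illustration of what ‘good’ prompt engineering can involve, the sketch below contrasts the vague one-line question a student might type with a structured prompt that states a role, task, audience, constraints and output format. The helper function and field names are our own illustrative choices, not part of any particular AI tool.

# Illustrative sketch of prompt engineering: the same request asked vaguely
# and then as a structured prompt. Function and field names are illustrative.

def build_prompt(role, task, audience, constraints, output_format):
    """Assemble a structured prompt from its component parts."""
    lines = [
        f"You are {role}.",
        f"Task: {task}",
        f"Audience: {audience}",
        "Constraints:",
        *[f"- {c}" for c in constraints],
        f"Output format: {output_format}",
    ]
    return "\n".join(lines)

vague_prompt = "Tell me about antibiotic resistance."

engineered_prompt = build_prompt(
    role="a microbiology lecturer",
    task="explain how efflux pumps contribute to antibiotic resistance",
    audience="second-year undergraduate bioscience students",
    constraints=["describe the general mechanism rather than specific papers",
                 "keep the explanation under 300 words"],
    output_format="three short paragraphs followed by two self-test questions",
)

print(engineered_prompt)

Assessing the engineered prompt, and how the student refined it, rather than the generated essay is one way of making the student’s thinking visible.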

A novel challenge is how to get students to value the skill of writing when AI can do it for them. We should also teach our students the limitations of the technology and its biases, equipping them to evaluate the outputs and their use. AI will not make a bad writer good, but it will make a good writer better. Those who can write and have a good understanding of AI will have a life-long skill that translates beyond university and essay writing to increased efficiency and effectiveness in the graduate world.4

It is also critical to discuss with our students the ethical use of AI. Who owns the copyright, for example, and what is happening to the data they input? It is also worth noting that current LLMs have an inherent bias within them due to the training sets used in their creation. Academics should be equally mindful of the digital divide that can be created through promoting AI use; not all students have the required skill sets or the monetary means to access the latest models.

Will AI mean that my students will cheat in my assessments?

Often, the first topic of conversation with colleagues about AI is assessment: ‘Will my students use AI to cheat?’, ‘Will I be able to tell?’ and ‘How can I make my assessment AI-proof?’5,6

When considering these questions, it is important to remember that students have been cheating since assessments began. That is not to say that academic integrity is unimportant (it is one of the cornerstones of education), but rather to put into context that AI is simply another tool that students who are inclined to cheat could use.

It is tempting to assume that a technology-based problem needs a technology-based solution. If a student uses AI to write an essay, can we detect it using AI or another tool? This approach has had reasonable success in detecting basic plagiarism and collusion using anti-plagiarism software such as Turnitin; however, more nuanced collusion and plagiarism by those with a good understanding of how the software works can often go undetected. The same arms race will play out between AI-detecting software and AI as has happened between plagiarism and anti-plagiarism software, just a lot quicker! AI-detecting software may help to catch some AI-based cheating but should certainly not be relied on.

The only way to make an assessment entirely AI-proof is to hold it in person with no access to a computer or the internet, for example in-person exams, phase tests, oral exams and practical exams. These assessments are excellent for determining a student’s knowledge and skills, but they only assess a subset of the skills we hope our students will acquire during their time in higher education. Our students, however, will graduate into a world where AI is embedded, and we would do well to prepare them for it.

How to design assessments in an AI world

When designing or updating an assessment, it is important to reflect on what you want from it in terms of the knowledge, understanding, skills and behaviour you wish to instil in your students in the lead-up to the assessment. When assessing students’ core knowledge, a closed-book face-to-face assessment is still an option, but it comes with issues around equality, diversity, inclusion and fair access. If you wish to create an assessment built around authentic scenarios, then including AI and testing problem-solving skills on topics that students value is an appropriate strategy.7,8

Given that AI can create convincing written outputs, it is becoming increasingly clear that, from an assessment perspective, the student’s individual input needs to be documented and recorded alongside that of the AI. Educators need to consider the student’s contribution to the final product and how they used AI outputs to help develop it. Students may be required to produce drafts and to document their search and prompt strategies. There are also benefits for academic integrity in having an assessment that builds towards the final article, where you can track who created it and how the student justifies their contribution to the work. Sweeney recently presented a paper at the 9th International Conference on Higher Education Advances that embraces these ideas, proposing that, instead of a formulaic academic essay, students produce something more open-ended and tailored to the individual.9 Under these proposals, assessments are restructured into a reflective account focused on analysing and critiquing the source materials.9
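One lightweight way to capture that documentation is a prompt log kept alongside the drafts. The Python sketch below is illustrative only: the file name, fields and example entry are our own assumptions about what a student might record and submit with the final piece.

# Illustrative prompt log: each AI interaction is recorded with a timestamp,
# the tool used, the prompt and a note on how the output fed into the draft.
# File name and field names are assumptions, not a prescribed format.
import json
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("prompt_log.jsonl")

def log_interaction(tool, prompt, how_output_was_used):
    """Append one AI interaction to the student's prompt log."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "tool": tool,
        "prompt": prompt,
        "how_output_was_used": how_output_was_used,
    }
    with LOG_FILE.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(entry) + "\n")

log_interaction(
    tool="ChatGPT",
    prompt="Summarise the main arguments for and against phage therapy.",
    how_output_was_used="Used the summary to plan my own paragraph, then rewrote it in my own words.",
)

A log of this kind, submitted with the drafts, lets the marker see both the prompt strategy and the student’s own contribution to the final piece.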

How do educators keep up to date?

How do we as educators keep up to date in this fast-moving field when many of us do not have a passion for IT, our time is taken up with other responsibilities and we may not have the headspace to develop an understanding of this technology? Several online self-paced courses exist, such as Google’s Introduction to Generative AI (see https://www.cloudskillsboost.google/course_templates/536).10 Many of the support and practice-sharing networks in place during the COVID-19 pandemic have become busy again, with colleagues from across institutes and fields sharing knowledge and practice around AI; it is these communities of practice that will enable us to learn and adapt to the next step change in learning and teaching. Many of the ideas discussed and skills developed during the COVID-19 pandemic apply directly to teaching, learning and assessment in an AI world. It is not our role to be experts but to learn with and alongside our students. Consider what you would do if you were in your students’ place: how would the technology help you? Educational institutes will be places of learning and support, through both grassroots and whole-institute guidance and practice sharing. Publishers and learned societies have a role in sharing good practice and up-to-date thinking in the form of publications, blogs, conferences and workshops. This learning and skills development will not only be passed on to our student cohorts but also used to improve our own practice and teaching, with applications in marking, report writing and even session planning.11–13

Summary

AI should be seen as a new tool rather than a threat to our practice. AI will change how students study and will affect our teaching, learning and assessment. We need to focus on teaching our students how to use this tool ethically and on creating assessments that encompass AI skills, seeing proficient use of AI as a graduate attribute. The higher education teaching community has dealt with fast change before in an inclusive and supportive manner, and we can do it again; at least this time it is not in a global pandemic!

Data availability

No data have been generated linked to this manuscript.

Conflicts of interest

The authors declare that they have no conflicts of interest.

Declaration of funding

This article did not receive any specific funding.

References

1  Sallam M (2023) ChatGPT utility in healthcare education, research, and practice: systematic review on the promising perspectives and valid concerns. Healthcare 11, 887.

2  Chen X et al. (2022) Two decades of artificial intelligence in education. Educ Technol Soc 25, 28-47.

3  Brew M et al. (2023) Towards developing AI literacy: three student provocations on AI in higher education. Asian J Distance Educ 18, 1-11.

4  Mhlanga D (2023) Open AI in education, the responsible and ethical use of ChatGPT towards lifelong learning. SSRN [Preprint]. 10.2139/ssrn.4354422

5  King MR, ChatGPT (2023) A conversation on artificial intelligence, chatbots, and plagiarism in higher education. Cell Mol Bioeng 16, 1-2.

6  Rudolph J et al. (2023) ChatGPT: bullshit spewer or the end of traditional assessments in higher education? J Appl Learn Teach 6, 342-364.

7  Newton PM (2023) ChatGPT performance on MCQ-based exams. EdArXiv, 21 February 2023 [Preprint].

8  Nikolic S et al. (2023) ChatGPT versus engineering education assessment: a multidisciplinary and multi-institutional benchmarking and analysis of this generative artificial intelligence tool to investigate assessment integrity. Eur J Eng Educ 48, 559-614.

9  Sweeney S (2023) Academic dishonesty, essay mills, and artificial intelligence: rethinking assessment strategies. In 9th International Conference on Higher Education Advances (HEAd’23), 19–22 June 2023, Valencia, Spain. Editorial Universitat Politècnica de València. 10.4995/HEAd23.2023.16181

10  Vergadia P (2023) Seven new no-cost generative AI training courses to advance your cloud career. Google Cloud Blog, 21 June 2023. https://cloud.google.com/blog/topics/training-certifications/new-google-cloud-generative-ai-training-resources

11  Firat M (2023) How Chat GPT can transform autodidactic experiences and open education. OSF Preprints, 12 January 2023 [Preprint].

12  Hill-Yardin EL et al. (2023) A Chat(GPT) about the future of scientific publishing. Brain Behav Immun 110, 152-154.

13  Naumova EN (2023) A mistake-find exercise: a teacher’s tool to engage with information innovations, ChatGPT, and their analogs. J Public Health Policy 44, 173-178.