The landscape of bioscience education has changed dramatically over the last few years, first due to the COVID-19 pandemic and now with the explosion of artificial intelligence (AI). David Smith, Professor of Bioscience Education at Sheffield Hallam University, discusses some of the bigger questions around the use of AI in bioscience education and assessment.

Here, we are thinking about generative AI (GenAI) tools such as the popular ChatGPT/GPT-4, which generate text and images in response to a prompt, rather than AI tools like AlphaFold2, which has done amazing things in protein structure prediction. Depending on how the GenAI technology is used, it can either replicate and circumvent the thinking process or, more appropriately, supplement and support students in their educational journey. Used as a support, GenAI tools can help individuals generate ideas, stimulate new directions of thinking, suggest structures and remove the daunting blank page when starting to write. Well-crafted prompts can be utilised for skills development, for interview practice or to support reflection on the skills developed in the laboratory environment.

Prompt for interview practice: "In this conversation, you will take on the role of an interviewer. You are looking to hire an undergraduate placement student for 1 year. The company is a pharmaceutical company looking for a bench scientist. Ask questions that would be suitable for this role. You will ask the question, and I will then give you my answer. You will then give feedback".

GenAI tools like ChatPDF and Consensus can also simplify research articles, pulling out the main points and making them more accessible to students and those who are not experts in the field. Overall, the potential challenges and opportunities depend on how the tool is used.

Although there are tools available that claim to determine the likelihood of a text having been generated by GenAI, their results should be used with caution: they only provide a probability score rather than, for example, the direct match that plagiarism detection tools provide. These detection tools are also not always accurate and can flag human-written text as AI-generated, a problem reported particularly for text written by neurodivergent students. That said, AI-generated text does have a distinct writing style and voice, which can be easier to spot when the AI is used without examples or guiding prompts. In these cases, the output tends to be generic and nonspecific; it lacks the individuality that students bring and is devoid of spelling errors but prone to inaccuracy. Conversely, when examples or multi-shot prompts are used (where the model is given worked examples of the task within the prompt rather than being retrained), detecting whether the text is AI-generated becomes much harder. In short, detectability depends on how the AI is being used: if it has been primed with the user's own written examples, differentiating between human- and AI-generated text can be very challenging.
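To make this concrete, a multi-shot prompt of the kind described above might look something like the example below; the wording and topic are invented purely for illustration.

Illustrative multi-shot prompt: "Here are two paragraphs from essays I have written previously: [paste example 1] [paste example 2]. Using the same tone, vocabulary and sentence structure, write a 300-word discussion of how competitive enzyme inhibitors are used as drugs".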

Using GenAI within assessment can be challenging due to the digital divide that the technology has created: students who can afford access to high-end tools have an advantage over those who cannot. To address this issue, assessments should not mandate the use of GenAI tools; there are ethical, environmental and accessibility reasons for this. Instead, assessments should be designed with GenAI in mind, considering how it can supplement the learning rather than replace it (Generative AI in Assessment), for example by focusing on the critical analysis of the information and the act of creation rather than assessing only the final article. Mandating a GenAI tool should only be considered if all students have equitable access to that tool and it complies with your institution's data protection policies.

Numerous productivity tools can assist us in our work. For instance, while recording this article, I am using GPT-4's speech-to-text feature to dictate the content. Once I finish speaking, the tool processes and edits the text into a written format with correct punctuation (great for dyslexics like myself). GenAI tools have also proven to be an excellent way to rephrase written content. For example, I have used them to convert my academic-sounding exam questions and assessment briefs into language my students can easily understand. Similarly, some of my colleagues have used GenAI to transcribe their Panopto videos and lecture captures. They then use these transcripts to create instant notes for their students based on the delivered content, making sure the material is relevant and tailored to their course. By providing the GenAI model with examples of past assessment briefs and rubrics, you can reword these to make them accessible to your students or to fit your current assessment needs. GenAI tools also offer a vast range of creative possibilities: they can generate case studies to share with your students, and each student can potentially have their own case study to work on.
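As an illustration of the rewording use described above, a prompt along the following lines could be adapted; the module details are placeholders to be replaced with your own material.

Prompt for rewording an assessment brief: "Here is the assessment brief and rubric for a second-year bioscience module: [paste brief and rubric]. Rewrite them in plain, student-friendly language, keeping every requirement and marking criterion the same, and finish with a short summary of exactly what students need to submit and when".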

Research tools such as Consensus and Research Rabbit are great ways to remain up to date with the literature by searching through research questions rather than keywords. ConnectedPapers will generate networks of papers and citations so you can see how the body of literature fits together. GenAI tools have also proven helpful in debugging code, and I have found GPT-4 particularly useful in data analysis: I used dummy data within GPT-4 and then exported the generated code to R for analysis.
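As a rough sketch of that workflow, the snippet below shows the kind of simple analysis script a GenAI tool might draft against dummy data before it is checked and run locally in R; the dataset, variable names and values are invented for the example.

# Illustrative sketch only: dummy data mimicking enzyme activity under two conditions
set.seed(42)
dummy_data <- data.frame(
  treatment = rep(c("control", "inhibitor"), each = 20),
  activity  = c(rnorm(20, mean = 100, sd = 10),
                rnorm(20, mean = 85, sd = 10))
)

# Inspect the data, compare the two groups and visualise the result
summary(dummy_data)
t.test(activity ~ treatment, data = dummy_data)
boxplot(activity ~ treatment, data = dummy_data,
        ylab = "Enzyme activity (arbitrary units)")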

Focusing on staff training and development is crucial to ensuring the efficient use of GenAI as a support tool. Various websites, including AI-forge, put together by Dr Nigel Francis from Cardiff University and me, offer ideas and guidance on incorporating this technology into teaching practice, specifically around prompt engineering. The simplest way to get going is to create an account and use it as you believe your students would: ask it about a topic you are an expert on and critique the outputs, then ask about a topic you know nothing about. This first-hand experience will provide insight into its usage from the student perspective.

From the student perspective, integrating GenAI into your curriculum provides a unique opportunity to discuss with your students the ethical considerations, data integrity and environmental impact of AI technology. Through this integration, you can delve into how the models function and what they generate. By exploring the ethics of GenAI, how it uses data, how model data is collected and the potential copyright issues, you can initiate a conversation about the ethical use of GenAI in coursework and assessments (Using Generative Artificial Intelligence: A Student Guide). Engaging in an open dialogue with your students about their use of AI and how AI tools align with university regulations and assessment requirements can also help them understand what is and is not acceptable.