AI in UK Further Education

Artificial intelligence (AI) can be defined as “machines that can complete cognitive tasks” (Brynjolfsson & McAfee, 2014, p. 91). Recent developments in generative AI, which creates text from prompts using large language models trained on vast amounts of text data, have prompted a re-examination of the use of AI in education (Cotton et al., 2023). The launch of OpenAI’s ChatGPT in November 2022 has particularly driven conversation due to its ease of use and conversational style. Claims have been made that it can pass several accredited exams, including UK medical exams (Maitland et al., 2024). However, the use of AI tools is not new to education. AI has been embedded in tools used by educators for many years (Zawacki-Richter et al., 2019); for example, spell checking in Microsoft Word is based on AI. In wider society, Apple’s Siri and Amazon’s Alexa are examples of AI-based digital assistants. JISC (2024a) claim that generative AI has the power to transform education, and commercial companies such as TeacherMatic have produced ChatGPT-based tools that enable educators to produce educational content such as multiple-choice quizzes, lesson plans, and marking rubrics, which they claim will save teacher time and reduce teacher workload (Sheppard, 2023).

In addition, this technology has many possible benefits for students. For example, it could help with accessibility, organise work, offer suggestions for starting points, compose writing frames, and generate plausible content. Indeed, one academic paper (Cotton et al., 2023) used ChatGPT to generate a large proportion of its text.

The reaction in education has been mixed. On one hand, some educators have embraced the opportunities offered by generative AI (Meakin, 2024), while others are sceptical (Selwyn, 2024). There has been much discussion around the use of AI tools by students completing assessments and whether this use amounts to academic malpractice. This needs to be considered in the light of the fact that 58% of students in one survey reported using ChatGPT (FE News, 2023). However, the survey also pointed out that students were using ChatGPT as a study aid rather than a straightforward plagiarism tool, and only 4% of those surveyed admitted that they copied and pasted material directly from ChatGPT.

Technological solutions for detecting student use of GenAI have been launched (Turnitin, n.d.), with some educators and exam bodies relying on the technology to detect academic misuse of the tools. However, it has been shown that this technology is unreliable, produces false positives (Weber-Wulff et al., 2023), and discriminates against those who have English as a second language (Rospigliosi, 2023), irrespective of the claims made by the manufacturers of these tools. There has been a mixed reception to these tools among educational establishments, with some relying on them (University of Lincoln, 2024), while others have turned them off (McMahon & Hutt, 2023).

Within the education sector in the UK, much advice has been given to both the school sector and the higher education sector. The Russell Group (2023) has produced guidance for universities which takes a progressive stance on AI adoption and use; however, at the time of writing, no such guidance has been produced by the Association of Colleges for the FE sector.

As a manager in FE, I want to understand how to support my college in responding appropriately to these generative AI developments. I want to understand teachers’ feelings about, and the impact on them of, GenAI. The implementation of new technology in educational settings can be complex. UNESCO states that pedagogy can be transformed by integrating and accepting digital technologies within education (UNESCO, 2018). This can only be achieved if the digital literacy of educators and the technical resources of the institution are sufficient. Training in digital use has only recently been introduced in initial teacher training in FE (ETF, 2023) and, as Bećirović (2023) points out, many teachers’ digital literacy and competence have not developed beyond the level reached during their own training. The successful integration of digital tools in education is often due to the individual’s enthusiasm for digital technology (Beetham & Sharpe, 2019).

In considering the impact of AI on FE, I will use the Technology Acceptance Model (TAM; Davis, 1989). This considers the factors which affect technology acceptance and is derived from both the theory of reasoned action and the theory of planned behaviour. Granić and Marangunić (2019) state that perceived usefulness is the key factor underlying technology acceptance, which in turn is affected by perceived ease of use. Generative AI is a disruptive innovation in education, challenging our existing practices and systems. Bower and Christensen’s (1995) theory of disruptive innovation suggests that incumbents often fail to exploit innovation. As a relatively newly formed college group, we have the opportunity to embrace GenAI to improve the delivery of education to our community.

There is little academic literature to date evaluating the impact of AI from a manager’s perspective. Therefore, I will analyse the guidance given to educators in my college group. As an FE manager, I will assess the tensions and agreements within the guidance, and its relevance for my staff. It is appropriate to start with an analysis of the guidance, as it has not yet been extensively implemented in any context, because of how new the technology is and how slow institutional responses have been. Considering whether the guidance from different stakeholders aligns is important, to examine whether the advice is consistent.

The key advice given to staff in our college group comprises the JISC guide, Principles for the use of AI in FE colleges (JISC, 2024b), and the JCQ guidance, Artificial Intelligence (AI) Use in Assessments: Protecting the Integrity of Qualifications (JCQ, 2024). JISC is the UK agency for digital and technical support in further and higher education. They conduct research, provide advice, and offer digital services for these sectors, and lead on the digital transformation of further and higher education in the UK. JCQ is the Joint Council for Qualifications, representing the major awarding bodies in the UK. They are primarily concerned with maintaining examination integrity through policy and the standardisation of exam procedures. As such, this advice and guidance may not be consistent, as the two bodies are trying to achieve different goals.

I will now consider the guidance, first examining guidance on staff using AI to support their own work, and then guidance on staff supporting students’ use of AI tools.

The JISC advice takes an innovative approach to the use of AI. It looks at how staff may use the tool to support their own work. It discusses the limitations and biases of the technology and suggests that students should be informed when AI is used to create learning resources or in supporting marking.

“Education institutions should also be open and transparent, ensuring the learners understand when AI is used to create learning resources, support learning delivery or within the assessment and monitoring process. Learners will also be informed how their personal data is being processed using AI tools.” (JISC, 2024b. 1.2)

It notes that there appears to be some evidence that the tools reduce workload by automating some educational tasks.

“Alongside making tasks quicker, activities that were challenging before because of time constraints become possible.  Examples include improved differentiation for learners, using AI to create resources in multiple ways and using AI to create formative assessment resources and materials.” (JISC, 2024b. 3.1)

The guidance seeks to set out a fair and open approach to AI use, whilst acknowledging that the use of AI is only going to increase, not only in education but in the wider world, and places the responsibility on colleges to equip students with the skills required to progress in an AI-enabled world.

“AI is evolving at a rapid pace, and therefore, while teaching learners to use the AI tools of today is valuable, this needs to be supplemented to include a broader AI literacy, to enable learners to critically evaluate tools of the future.” (JISC, 2024b. 2.1)

The guidance also points to appropriate use of AI in assignments – it places the responsibility on colleges to ensure that students are informed about what constitutes appropriate AI use for any assignment.

“All colleges will provide clear guidance to learners on appropriate use of AI in their assignments. This includes general principles and guidance, along with more specific guidance at assessment level.” (JISC, 2024b. 5.2)

It discusses the use of AI detection tools but warns that these tools can be discriminatory and can produce false positives. As such, it suggests that those using these tools in colleges should be trained to understand the limitations of AI detection tools.

“There is a risk AI detection can unfairly discriminate and can compound existing bias.” (JISC, 2024b. 5.3)

However, the guidance does suggest that assignments should be made relevant for an AI world, that colleges should design assignments with this in mind, and that they should look at ways in which authentic assessment can be maintained whilst ensuring that students develop the necessary AI skills.

“…help them become AI Citizens equipped to navigate the use of AI in their everyday lives.” (JISC, 2024b 2.3)

“…move towards a consistent approach for the use of AI in assessments, with the aim of making assessments authentic and relevant to an AI enhanced workplace and society …” (JISC, 2024b 2.4)

The JCQ guidance is concerned with maintaining the integrity of assessments, treating AI use in assessments as potential malpractice. It does not acknowledge that AI can assist in the development of skills and knowledge, and clearly states that students must not rely on AI tools to help them develop knowledge and skills. The guidance emphasises that students’ work must be their own, without AI assistance, and warns against using AI-generated content as it may be inaccurate or include false information such as fabricated references. There is an acknowledgement that AI might be used in the earlier stages of coursework, but it does not give clear guidance on the exact extent of what it considers reasonable. It does, however, echo the JISC guidance on being aware of the potential biases of AI-generated content, and emphasises that AI-generated information may be incorrect.

The general message coming from the JCQ guidance is to restrict the use of AI. This is in contrast to JISC, which looks at ways to develop the integration of AI into assessment. This restriction is further reinforced by the suggestion that AI tools should be used to detect possible use of AI in submitted work, together with a range of indicators of AI use. However, whilst JCQ suggest that this is part of a holistic approach to detecting AI use, it has been criticised by Advance HE’s Charles Knight (2024) as relying on “vibes”; indicators such as inconsistent grammar, spelling, or language use could be caused by the use of an AI tool, but they are not conclusive proof that a tool has been used. This can lead to false accusations of AI use and could discriminate against students with lower standards of English proficiency (Perkins et al., 2024).

The emphasis on the use of automatic detection tools, most notably Turnitin, is again misplaced. Even though it is suggested with caution by JISC in its guidance, subsequent JISC guidance discourages its use: “institutions therefore shouldn’t rely on AI detection” (Webb, 2023b). Furthermore, studies into the use of AI detection tools confirm this. Weber-Wulff et al. (2023) state that both publicly available and commercial detection tools are “neither accurate nor reliable.” Price and Sakellarios (2023) warn that AI detectors generate more false positives for non-native English speakers than for native ones, while Sadasivan et al. (2024) point out that AI detection can be defeated by paraphrasing the generated text.

The two documents’ approaches to educators’ use of AI also differ in tone. It is encouraged in the JISC guidance, which acknowledges the possibilities that AI can offer and actively encourages discussion of how colleges can integrate these tools into their teaching and assessment, whilst at the same time ensuring that students are prepared for an AI-enabled world. The JCQ document, however, sees AI use as a risk to assessments. It places the emphasis on the educator’s responsibility to ensure that students understand the JCQ stance on what constitutes AI misuse. JCQ believe that students must not use AI tools in the production of assessed work. There is no discussion of how AI could be integrated into assessment practice to produce better, more relevant assessment, even though the validity of assessments is related to their authenticity. Understanding AI use merely as a risk diminishes the educational gain that might be achieved by incorporating AI.

As guidance for educators and managers within my institution, these documents, with their inherent contradictions, make the possibilities of AI adoption problematic. As a manager, I wish to incorporate AI technologies within the department. The possibilities of AI to enhance learning are great; however, due to the rapidly changing landscape of AI use and adoption, guidance for staff should help them navigate the change to an AI-enabled world.

The JISC document is helpful as it gives guiding principles to use when considering integrating AI into the curriculum. The JCQ document gives rules to be adhered to when conducting assessments as currently designed. AI is a disruptor of current assessment practices, and as such this is the point where we need to consider what assessment is for and how it represents the skills required by students within my sector. Assessments need to be valid and authentic for an AI-enabled world. However, there is no discussion around this coming from JCQ as the representative of the awarding bodies.

Within my department, most of my students are apprentices or preparing for work in engineering. The wider engineering workplace is rapidly adopting AI, and students need to be equipped with AI skills (UKESF, 2023). Students need to understand what AI is doing in their specific area of engineering. The world is rapidly becoming AI-enabled, and students need the tools to navigate these rapid developments. This aligns with the JISC guidance and is a progressive approach to AI adoption in education.

The engineering apprenticeship is assessed in several different ways. Direct observation and professional discussions in the workplace are clearly AI-proof, as the apprentice is actively working in the environment, producing or repairing engineering machinery. This method is reliable and authentic for assessment. Apprentices must also sit short-answer paper exams and online multiple-choice question papers, both under JCQ exam conditions. These are AI-proof assessments, as they assess the candidate’s recall of knowledge under traditional exam conditions.

The area where AI may be used is in the production of job reports. Within the apprenticeship, for every assessment the candidate must produce a reflective technical account relating to a task on which they have been examined. This is to contextualise learning at college within their specific engineering practice. This is an area where generative AI could be used to produce the report, which could be seen as plagiarism in line with the JCQ guidance. Conversely, generative AI could be used to help the student produce a better-quality written report and to help them frame, explore, and develop their reflective technical account. However, this would then be outside the scope of the JCQ guidance.

The issues raised in the JCQ document concerning assessment need to be addressed. The document describes plagiarism by the use of generated text within a submitted assessment, and the use of generated text is not allowed in these summative assessments.

When ChatGPT 3.5 was asked a question from the Level 3 OAL Food and Drink Engineering portfolio, the response produced would have failed due to technical errors and failure to fulfil the learning outcomes. ChatGPT 3.5 has not been trained on the specific technical data required and therefore cannot produce realistic content; this aligns with the findings of Ji et al. (2023). Hence, AI used in this manner is of little use in meeting the learning outcomes, and the JCQ rules have little effect here, as the assessment would fail on content irrespective of who, or what, produced it.

However, a better use of AI would be to help formulate a writing frame or a starting point for students to develop their work from. This would be useful and would allow the student to think about the writing in a logical way. It can also support students who have difficulty writing, with help for spelling and grammar. As an example, I have produced this in the appendix.

However, this represents a grey area in the JCQ documents; the AI is not being used to generate the content – the student must provide the correct technical information but is supported in presenting it in a clear and logical way. This is an appropriate use of the technology. The student must have the knowledge, and as such the assessment is valid and an authentic representation of that knowledge. The AI has been used to support the structure and the English language. As we are assessing technical knowledge, its use should not attract accusations of AI plagiarism.

Staff also need training in digital and AI skills; indeed, this is now being included in the initial teacher training specifications in England (Department for Education, 2023). However, staff in post may have digital literacy training needs, and these should be addressed by institutions. At my college, a huge emphasis has been placed on the digital expectations of teachers. Courses are digitally enhanced using the VLE, and the focus is to use this extensively in future. Staff will need the required skills to allow this, and training programmes are being developed in line with JISC recommendations.

Generative AI can help teachers to produce content. Studies have shown that GenAI tools have the potential to automate the creation of teaching materials, marking, lesson planning, and so on, thereby saving time and reducing workload. The UK government has invested £2 million to provide AI resources to educators, such as support for lesson planning, report writing, and content creation (GOV.UK, 2023). The direction from government is to support the progressive use of AI tools within education.

Next, I will gather and analyse qualitative evidence from FE educators on their knowledge of, and attitudes towards, AI usage, evaluating the usefulness of the guidance provided and comparing it with my own viewpoint. It is important to take into account the lived experience of skilled and experienced educators as they navigate this new technology. It will be interesting to see whether the guidance aligns with educators’ viewpoints and is useful to them. As a manager, I need to produce supportive guidance that is meaningful to my staff and aligns with their values. Therefore, considering qualitative responses to both AI and the guidance will allow me to investigate the best way of supporting my staff in navigating AI.

Methodology

There were two phases of data collection. The first was part of an evaluation of TeacherMatic, a GenAI tool based on ChatGPT 3.5, and included views on AI and AI tools generally. The second specifically considered educators’ responses to the JISC and JCQ guidance.

In the first phase, 22 staff (9 female) gave short answers to an online survey. Participants provided consent. Demographic data such as age, gender, and length of teaching experience were gathered. Participants were asked open-ended questions about their prior use of digital pedagogical tools and their familiarity and experience with generative AI technologies. Finally, they were asked five open-ended questions about the AI tool. These questions aimed to capture initial perceptions and potential biases about the use of AI in education.

The second phase of qualitative data collection was conducted by sending questions about the JISC and JCQ guidance to teaching staff, staff developers, and the Director of Quality. A deductive analytical approach was taken, examining the alignment between the values of the staff and the guidance that has been issued.

I analysed the data using reflexive thematic analysis (RTA; Braun & Clarke, 2021). RTA can help in understanding the thoughts and feelings of educators in depth and as such is an appropriate analytical approach. It allows for the bias and perspective that the researcher brings to the process, while also being flexible and allowing the capture of the themes of perceived ease of use and usefulness, which are relevant to TAM. It also allows the capture of what educators think about AI and why they think that way.

Findings

Four themes were identified in the qualitative responses: positive attitudes to technology, can we trust GenAI, students are already using AI, and impact on assessment.

Theme 1. Positive attitudes to technology

Educators were generally positive towards the adoption of AI technology, with views ranging from the entirely enthusiastic “I love it” to acknowledging the potential benefits but requiring “a clear standardised policy”. A common positive was the hope that the technology would aid the production of teaching materials and planning, and reduce workload. One respondent said it “can help to plan more interesting exercises” and “swiftly create lesson plans and learning activities”. Another highlighted the positive benefits of AI in education, saying it can help with “personalised training, feedback and improved assessment processes.” There is clearly a desire to use AI technology, given its potential usefulness, and a clear wish for it to be incorporated as an additional tool to help with the teaching and assessment process.

Theme 2. Can we trust GenAI?

This theme was evident in the responses from both data sets. Concerns were raised over the accuracy and correctness of AI-generated text, with one respondent noting that the “information can sometimes be incorrect, and it is important to remember this”, while another stated that the “content requested was not current legislation”. Another was similarly concerned about the sources of information and their accuracy: “I am still unsure where the information these platforms have gained from and is it correct?”

Another area of trust concerned the ethical implications of the content created, such as the “potential risk associated with its use, particularly deep fakes and ownership”. Another educator pointed out that we need to be concerned “about the biases in AI algorithms”.

Theme 3. Students are already using AI

Educators were aware that students are using AI: “students are increasingly using AI in various aspects of their academic lives”. The use of AI was not limited to the production of study aids or as a research tool. Some students had been running their assessment work through AI to “make it sound cleverer”, whilst claiming that by doing this they would “learn bigger words”. Others had been subjected to malpractice allegations due to generating content for assessment. It is interesting to note that in several cases these were students who did not have English as a first language, which could have been due to the biases of AI detection software, in line with the findings of Price and Sakellarios (2023).

Theme 4.  Impact on assessment.

In this theme there was some discussion of the impact on assessment, with one respondent welcoming the JCQ guidance as “baseline guidance that can be applied to the context of a variety of assessments” and “guidance on how to inform, guide and tackle around misuse”. Others thought that AI would impact assessment integrity due to the “possibility of cheating facilitated by AI-powered tools.”

However, others thought that AI would change the nature of assessment and that this could “inspire teams to look at authentic assessment in more depth rather than simply doing what they’ve always done.” Going beyond this, one respondent noted that “I think the way we assess is wrong” and that “we need to focus on skills and behaviours development”. This AI-prompted reconsideration of what assessment is for, and why we assess, was seen as a good thing, as it “should make awarding bodies and educators rethink what we are actually assessing”.

Throughout the themes there is an acceptance of AI. The barrier to entry is low, as AI generators are readily accessible to both students and staff. The adoption of AI is therefore due to its ease of use and its usefulness in creating content. This indicates that the acceptance of AI as a new technology is in line with TAM.

Conclusion

As a manager, I cannot stop the march of AI. Instead, we will need to critically consider how and why we assess students, and reconsider the purpose of assessment. I can see the educational possibilities of AI for both staff and students. Within my disciplines, many of the practical aspects of the work will not be immediately affected by AI. We will need to ensure we are assessing process – something which is already a key part of vocational qualifications – and rely less on the assessment of artefacts for valid and reliable assessment. AI is the disruptor of the educational assessment hegemony; however, wherever there is disruption there is also opportunity. As Brynjolfsson and McAfee (2011) point out, we need to learn to “race with the machines” rather than fight them. Our focus should be on the development of skills and behaviours rather than the regurgitation of facts. The punitive JCQ rules, whilst perhaps appropriate as guidance for existing written assessments such as A levels, are not suitable for the wider needs of vocational education, where skills fluency is a more appropriate metric.

My students will be working in an AI-enabled world. Without the critical thinking skills necessary to engage with and analyse the output of AI, they will struggle to work effectively in their disciplines. As Ji et al. (2023) have pointed out, GenAI often produces incorrect information. However, within engineering it is imperative that precise information is provided, as it is often safety critical. As such, critical thinking skills should be taught and reinforced to help students understand that they must assess whether the output of AI generation is correct. AI education needs to be placed in the wider context of ethical use, to ensure that students are taught about the inherent biases of AI arising from how, where, and by whom it has been trained (Webb, 2023a).

AI challenges us to move from the assessment and learning methods of the past to ones that are relevant for the future of the UK as a global AI superpower (DCMS, 2021).

References

Bećirović, S. (2023). Digital Competence of Teachers and Students. In S. Bećirović (Ed.), Digital Pedagogy (pp. 39-50). Springer Briefs in Education. Springer. https://doi.org/10.1007/978-981-99-0444-0_4 

Beetham, H., & Sharpe, R. (Eds.). (2019). Rethinking Pedagogy for a Digital Age: Principles and Practices of Design (3rd ed.). Routledge. https://doi.org/10.4324/9781351252805  

Bower, J. L., & Christensen, C. M. (1995). Disruptive technologies: Catching the wave. Harvard Business Review, 73(1), 43–53.

Brynjolfsson, E., & McAfee, A. (2011). Race against the machine: How the digital revolution is accelerating innovation, driving productivity, and irreversibly transforming employment and the economy. Digital Frontier Press.

Brynjolfsson, E., & McAfee, A. (2014). The second machine age: Work, progress, and prosperity in a time of brilliant technologies. WW Norton & Company.

Cotton, D. R. E., Cotton, P. A., & Shipway, J. R. (2023). Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International. https://doi.org/10.1080/14703297.2023.2190148

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–339.

Department for Digital, Culture, Media & Sport. (2021, September 22). New ten-year plan to make the UK a global AI superpower. [Press release]. https://www.gov.uk/government/news/new-ten-year-plan-to-make-britain-a-global-ai-superpower

Department for Education. (2023). Generative artificial intelligence (AI) in education. https://www.gov.uk/government/publications/generative-artificial-intelligence-in-education/generative-artificial-intelligence-ai-in-education

Department for Education. (2023, May 7). Diploma in Teaching (Further Education and Skills) qualification framework. https://www.gov.uk/government/publications/further-education-initial-teacher-education/diploma-in-teaching-further-education-and-skills-qualification-framework

ETF. (2023). Qualification framework for the full teaching qualification for the further education and skills sector. https://www.et-foundation.co.uk/wp-content/uploads/2024/01/Qualification-Framework-for-Diploma-in-Teaching-FES-Dec23-update.pdf

FE News. (2023, March 13). PebblePad survey finds UK students using AI tools to support studies, not to outsource writing. https://www.fenews.co.uk/student-view/pebblepad-survey-finds-uk-students-using-ai-tools-to-support-studies-not-to-outsource-writing/

GOV.UK (2023, October 30). New support for teachers powered by Artificial Intelligence. https://www.gov.uk/government/news/new-support-for-teachers-powered-by-artificial-intelligence#:~:text=The%20Government%20is%20investing%20up,personalised%20AI%20lesson%2Dplanning%20assistant.

Granić, A., & Marangunić, N. (2019). Technology acceptance model in educational context: A systematic literature review. British Journal of Educational Technology, 50(5), 2572–2593. https://doi.org/10.1111/bjet.12864 

Ji, Z., Lee, N., Frieske, R., Yu, T., Su, D., Xu, Y., Ishii, E., Bang, Y., Madotto, A., & Fung, P. (2023). Survey of hallucination in natural language generation. ACM Computing Surveys, 55(12), 1-38. https://doi.org/10.1145/3571730

JISC. (2024a) Generative AI – a primer. Version 1.3. https://repository.jisc.ac.uk/9407/1/Jan_2024_generative_AI_primer-final.pdf

JISC. (2024b). Principles for the use of AI in FE colleges. https://www.jisc.ac.uk/further-education-and-skills/principles-for-the-use-of-ai-in-fe-colleges

Joint Council for Qualifications. (2024). Artificial Intelligence (AI) Use in Assessments: Protecting the Integrity of Qualifications. https://www.jcq.org.uk/wp-content/uploads/2024/04/AI-Use-in-Assessments_Feb24_v6.pdf

Knight, C., [@Charlesknight]. (2024, April 5) The more I think about it – it’s wild that people think ensuring academic integrity in an age of AI comes down to individuals using “vibes”. https://twitter.com/Charlesknight/status/1776155966523752526

Maitland, A., Fowkes, R., & Maitland, S. (2024). Can ChatGPT pass the MRCP (UK) written examinations? Analysis of performance and errors using a clinical decision-reasoning framework. BMJ Open, 14, e080558. https://doi.org/10.1136/bmjopen-2023-080558

McMahon, M., Hutt, I. (2023). Use of AI (Artificial Intelligence) detection tools in Turnitin. https://www.staffnet.manchester.ac.uk/bmh/about-fbmh/news-and-events/news/display/?id=29633

Meakin, L. A. (2024). Embracing Generative AI in the Classroom Whilst Being Mindful of Academic Integrity. In S. Mahmud (Ed.), Academic Integrity in the Age of Artificial Intelligence. IGI Global. https://doi.org/10.4018/979-8-3693-0240-8.ch004

OAL. (2018). Level 3 Diploma in Food and Drink Engineering Maintenance 603/2374/7. Qualification Handbook. Occupational Awards Limited.

OpenAI. (2024). ChatGPT (May 1 version) [Large language model]. https://chat.openai.com/chat

Perkins, M., Roe, J., Postma, D., McGaughran, J., & Hickerson, D. (2024). Detection of GPT-4 Generated Text in Higher Education: Combining Academic Judgement and Software to Identify Generative AI Tool Misuse. Journal of Academic Ethics, 22, 89–113. https://doi.org/10.1007/s10805-023-09492-6

Price, G., & Sakellarios, M. (2023). The Effectiveness of Free Software for Detecting AI-Generated Writing. International Journal of Teaching, Learning and Education, 2(6), 31–38. https://doi.org/10.22161/ijtle.2.6.4

Rospigliosi, P. A. (2023). Artificial intelligence in teaching and learning: What questions should we ask of ChatGPT? Interactive Learning Environments, 31(1), 1-3. https://doi.org/10.1080/10494820.2023.2180191

Russell Group. (2023, July 4). New principles on use of AI in education. https://russellgroup.ac.uk/news/new-principles-on-use-of-ai-in-education/

Sadasivan, V. S., Kumar, A., Balasubramanian, S., Wang, W., & Feizi, S. (2024). Can AI-Generated Text be Reliably Detected? arXiv. https://arxiv.org/abs/2303.11156

Selwyn, N. (2024). Constructive Criticism? Working with (Rather than Against) the AIED Back-Lash. International Journal of Artificial Intelligence in Education, 34, 84–91. https://doi.org/10.1007/s40593-023-00344-3

Sheppard, P. (2023, August 8). TeacherMatic Pilot – Using generative AI to reduce workload. JISC Artificial Intelligence. https://nationalcentreforai.jiscinvolve.org/wp/2023/08/08/teachermatic-pilot-using-generative-ai-to-reduce-workload/

Turnitin. (n.d.). Turnitin’s AI writing detection available now. Turnitin. Retrieved May 1, 2024, from https://www.turnitin.co.uk/solutions/topics/ai-writing/

University of Lincoln. (2024). Artificial Intelligence (AI) Detection in Turnitin. https://digitaleducation.lincoln.ac.uk/resources-hub/artificial-intelligence-ai-detection-in-turnitin/

UNESCO. (2018). UNESCO ICT Competency Framework for Teachers (Version 3). United Nations Educational, Scientific and Cultural Organization.

UKESF. (2023). Future engineering skills in the age of artificial intelligence. UK Electronics Skills Foundation. https://www.ukesf.org/wp-content/uploads/AI-report-interactive-March-2023.pdf

Webb, M. (2023a, January 26). Exploring the potential for bias in ChatGPT. JISC Artificial Intelligence. https://nationalcentreforai.jiscinvolve.org/wp/2023/01/26/exploring-the-potential-for-bias-in-chatgpt/

Webb, M. (2023b, September 18). AI Detection – Latest Recommendations. JISC Artificial Intelligence. https://nationalcentreforai.jiscinvolve.org/wp/2023/09/18/ai-detection-latest-recommendations/

Weber-Wulff, D., Anohina-Naumeca, A., Bjelobaba, S., Foltýnek, T., Guerrero-Dib, J., Popoola, O., Šigut, P., & Waddington, L. (2023). Testing of Detection Tools for AI-Generated Text. arXiv. https://arxiv.org/pdf/2306.15666.pdf

Zawacki-Richter, O., Marín, V. I., Bond, M., Gouverneur, F., & Hunger, I. (2019). Systematic review of research on artificial intelligence applications in higher education—where are the educators? International Journal of Educational Technology in Higher Education, 16(1), 39. https://doi.org/10.1186/s41239-019-0171-0

Appendix 1

Questions from first survey:

  1. From reading the information on the webpage, what do you hope that TeacherMatic will be able to help you with?
  2. How do you feel about using a tool such as TeacherMatic to support your work? Please include both positive and negative feelings where relevant.
  3. How do you think the student experience will be affected by your use of TeacherMatic? Please be as specific as possible.
  4. What do you think students will think of teachers using tools such as TeacherMatic?
  5. What concerns do you have about using TeacherMatic? Please be as specific as you can.

Appendix 2

Questions from second survey:

  1. Do the documents make sense?
  2. Are they applicable to your context?
  3. What from the documents could/would you use?
  4. Are you aware of students using AI?
  5. Do you think AI is a threat to assessment integrity?
  6. Where are you on a continuum of enthusiast to sceptic?

Appendix 3

OpenAI ChatGPT responses (OpenAI, 2024).

You

Explain the application and use of mechanical systems in food processing / manufacturing:

ChatGPT

Mechanical systems play a crucial role in food processing and manufacturing by automating various tasks, improving efficiency, ensuring product quality, and maintaining safety standards. Here’s a breakdown of their application and use:

  1. Mixing and Blending: Mechanical mixers and blenders are employed to uniformly mix ingredients in food processing. Whether it’s mixing dough for bread or blending ingredients for sauces, mechanical systems ensure consistency and quality.
  2. Cutting and Slicing: Mechanical cutting and slicing machines are used to precisely cut fruits, vegetables, meats, and other food items to desired shapes and sizes. This ensures uniformity in the final product and increases production speed.
  3. Grinding and Milling: Machines like grinders and mills are utilized to grind grains, spices, and other ingredients into powders or pastes. They provide consistent texture and particle size, essential for various food products.
  4. Cooking and Heating: Mechanical systems such as ovens, steamers, and fryers are used for cooking and heating food items. These systems offer precise control over temperature and cooking time, ensuring food safety and quality.
  5. Packaging: Packaging machinery is employed to pack food products into containers, pouches, or bags. These machines can handle different packaging materials and ensure proper sealing to maintain freshness and extend shelf life.
  6. Sorting and Inspection: Mechanical sorting and inspection systems are used to remove defective or foreign materials from food products. These systems employ sensors, cameras, and automated mechanisms to detect and eliminate contaminants, ensuring product safety.
  7. Conveying and Transporting: Conveyor belts and automated transport systems are used to move raw materials and finished products throughout the processing plant. They facilitate smooth workflow, reduce manual handling, and minimize the risk of contamination.
  8. Cleaning and Sanitizing: Mechanical cleaning systems, such as industrial washers and sanitizers, are crucial for maintaining hygiene in food processing facilities. They ensure equipment and surfaces are thoroughly cleaned and sanitized to prevent foodborne illnesses.
  9. Quality Control: Mechanical systems are integrated with sensors and monitoring devices to continuously monitor various parameters such as temperature, pressure, and moisture levels during processing. This enables real-time quality control and ensures adherence to product specifications.
  10. Waste Management: Mechanical systems are also utilized for waste management in food processing plants. This includes equipment for recycling, composting, and waste disposal, helping facilities to minimize environmental impact and adhere to sustainability goals.

In summary, mechanical systems in food processing and manufacturing play a vital role in streamlining operations, ensuring product quality and safety, and meeting industry regulations and standards. They enable efficient, hygienic, and cost-effective production of a wide range of food products for consumers worldwide.

You

Produce a writing frame for a reflective account about the application and use of mechanical systems in food processing / manufacturing including: cams and followers, chain and belt drives, clutches and brakes, gears and gear drives, levers and linkage mechanisms, pumps, transmission shafts and valves

ChatGPT

Title: Reflective Account: Application and Use of Mechanical Systems in Food Processing/Manufacturing

Introduction:

  • Brief overview of the importance of mechanical systems in food processing/manufacturing.
  • Introduction to the specific mechanical systems to be discussed: cams and followers, chain and belt drives, clutches and brakes, gears and gear drives, levers and linkage mechanisms, pumps, transmission shafts, and valves.
  • Explanation of the purpose of this reflective account: to explore personal experiences and insights regarding the application and effectiveness of these mechanical systems in the food industry.

Body:

  1. Cams and Followers:
     1. Description of how cams and followers are utilized in food processing/manufacturing machinery.
     2. Personal experiences with machinery employing cams and followers.
     3. Reflection on the effectiveness and efficiency of cams and followers in specific applications within the food industry.
  2. Chain and Belt Drives:
     1. Explanation of the role of chain and belt drives in transferring power within food processing machinery.
     2. Personal anecdotes or observations related to the use of chain and belt drives.
     3. Reflection on the reliability and maintenance requirements of chain and belt drives in the food manufacturing environment.
  3. Clutches and Brakes:
     1. Overview of how clutches and brakes are applied in food processing equipment for control and safety.
     2. Personal instances where clutches and brakes have been pivotal in ensuring smooth operation or preventing accidents.
     3. Reflection on the importance of proper functioning and maintenance of clutches and brakes for operational efficiency and worker safety.
  4. Gears and Gear Drives:
     1. Description of the role of gears and gear drives in food processing machinery for speed reduction, torque amplification, and motion control.
     2. Personal experiences working with machinery incorporating gears and gear drives.
     3. Reflection on the advantages and limitations of gears and gear drives in food processing applications.
  5. Levers and Linkage Mechanisms:
     1. Explanation of how levers and linkage mechanisms are utilized for mechanical advantage and motion control in food processing equipment.
     2. Examples of specific applications of levers and linkage mechanisms in the food industry.
     3. Reflection on the versatility and effectiveness of levers and linkage mechanisms in optimizing processes and equipment design.
  6. Pumps:
     1. Overview of the importance of pumps in fluid handling and transfer within food processing facilities.
     2. Personal encounters with pump systems in the food manufacturing context.
     3. Reflection on the significance of pump reliability and efficiency for maintaining production continuity and product quality.
  7. Transmission Shafts:
     1. Description of the role of transmission shafts in transmitting power and rotational motion in food processing machinery.
     2. Personal observations regarding the use of transmission shafts in food manufacturing equipment.
     3. Reflection on the challenges and considerations associated with the design and maintenance of transmission shaft systems.
  8. Valves:
     1. Explanation of the function of valves in regulating flow, pressure, and direction of fluids in food processing operations.
     2. Personal instances where valves have been critical for process control and product quality assurance.
     3. Reflection on the importance of valve selection, positioning, and maintenance for efficient and hygienic food processing.

Conclusion:

  • Summary of key insights gained from reflecting on the application and use of mechanical systems in food processing/manufacturing.
  • Final thoughts on the significance of these mechanical components for ensuring efficiency, safety, and quality in the food industry.
  • Consideration of potential areas for improvement or innovation in mechanical systems to address evolving challenges and demands in food processing.