AI Meets Education | Protecting Privacy, Empowering Learning

The adoption of AI in schools to support students on their educational journey has accelerated rapidly. From lesson planning and report writing to helping with homework, supporting CVs, and even creating art and music – there’s an AI tool for nearly every aspect of school life. Our sponsor SchoolPro TLC share their insights here on this phenomenon that is changing all our lives.

AI in education is rapidly evolving, designed to help students prepare for GCSEs and A Levels, understand the world around them, and navigate life beyond school. But it is not only students using these tools – teachers, administrators, school leaders, and even parents and carers are exploring AI to reduce workloads, save time, boost productivity, and escape repetitive administrative tasks.

AI is revolutionary in its ability to lift the burden of everyday chores, transforming hard work into something more engaging – even fun. But at what cost? This surge in AI use raises an important question: is AI our friend or foe? Do the benefits truly outweigh the risks, and should we be concerned about the long-term implications?

FRIEND

1. Great for Learning
AI can be useful for personalised learning, tailoring educational content to meet the specific requirements of each student. Used in this way, AI can enhance engagement with and understanding of the subjects being studied.

2. Removing the Tedium from Routine Tasks
AI can take over time-consuming tasks like grading, lesson planning, scheduling, report writing, and attendance tracking. By automating these routine duties, teachers are freed up to focus on what truly matters: teaching, building relationships with students, and delivering more personalised support in the classroom.

3. No Time Constraints
Unlike schools and educational institutions, AI is available 24/7. This not only supports teachers but also provides students with access to learning tools and support anytime, anywhere.

4. Providing Teachers with Support
AI offers teachers access to a wide pool of teaching resources, enabling them to enhance their instructional strategies. This support can lead to more engaging lessons, personalised learning experiences, and ultimately improved student outcomes.

5. Enhancing Future Career Prospects for Students
Integrating AI into education helps students develop the skills they will need to enter a job market in which AI technologies will play a significant role.

These are but a few of the benefits of using AI in the educational environment.

FOE

With every benefit a product or service offers, there often comes a downside, and AI tools are no exception.

1. Data Privacy Risks
When using AI, users may inadvertently enter personal or sensitive data, which is then processed in ways they do not control. This raises important concerns about how that data is stored, used, and protected. Information entered into AI systems may be used to train and improve the applications over time, making data privacy and security critical considerations.

2. AI Psychosis
A startling new trend is emerging in our increasingly digital world: people are forming deep emotional bonds with AI systems like ChatGPT – spending hours interacting, confiding, and even building relationships with these tools. This growing dependence is being dubbed “AI Psychosis” across media and social platforms. Whilst not a clinical diagnosis, the term reflects a concerning shift in human behaviour – one where reliance on AI begins to blur the lines between reality and artificial companionship.

3. Reliance on Content
Content delivered by an AI tool cannot always be relied upon for accuracy. The quality of the output depends on many factors, including how the AI tool weighs the sources from which it learned its material.

4. Sharing of Responses
You have used AI to help write a report and found it incredibly useful. Naturally, you want to share it with a colleague so that they can benefit and use it as a template for their own report writing. There is nothing wrong with that – or is there? Recent reports suggest that when you share AI chatbot responses, such as those from ChatGPT, the content could potentially be indexed by search engines like Google, making it publicly searchable. This raises important privacy and confidentiality concerns. Sharing data in this way does not always happen automatically; it may depend on the chatbot’s privacy settings. It is therefore recommended to “check your settings” before using chatbot tools to ensure unnecessary sharing does not occur.

Friend or Foe? Building Safe and Positive AI Experiences

There is no doubt AI will bring vast benefits for schools. Students will prosper, teachers will have more time to spend on teaching and interacting with students, administrators will be freed from mundane tasks to take on more meaningful projects, and money will be saved. So how do we reap the benefits while keeping students, teachers, and the school itself safe from privacy risks and breaches? By thinking “HARP”.

H: Human Intervention
It is important not to rely solely on AI-generated information. AI is not always accurate, so human oversight is essential before trusting or using any AI response. If in doubt, verify the information against trusted sources you have used previously to ensure its reliability.

A: Age Appropriate
When introducing AI tools in the classroom, ensure they are age-appropriate and aligned with students’ educational needs. Monitor how students use AI chatbots or other applications: recently leaked documents from Meta’s GenAI Content Risk Standards revealed that some AI systems may engage children in conversations that are romantic or sensual. It is vital to remain vigilant and guide students towards safe, relevant, and appropriate use of AI technology.

Talk openly with students about the safety risks associated with using AI tools. Encourage them to limit the amount of time they spend interacting with AI to reduce the risk of developing dependency. Educating students on balanced and mindful AI use helps promote healthier, safer habits.

Foster an environment of digital literacy and critical thinking when using AI tools. Remind teachers and students to evaluate outputs and not take responses at face value – “Don’t Believe The Misinformation”.

R: Risk Assessments and Policy
Before implementing AI tools, ensure that your school or Trust has conducted thorough risk assessments that weigh the educational benefits against potential privacy and security concerns. Review and update existing policies and procedures to explicitly address AI use, outlining the safeguards and measures in place to protect data privacy and reduce associated risks.

Like any other third-party data processor, AI tools must meet UK GDPR standards. Before using them, ensure they have strong security measures and clear data-handling policies, and that they comply with privacy laws. Treat AI with the same scrutiny.

P: Privacy and Security Settings
Set Your AI Tool to Private: Many AI chatbots allow you to adjust privacy settings. Use the toggle to set your chats to private, especially if you plan to share conversations with colleagues. This helps protect your data from being publicly accessible.

Delete Chats and Turn Off Memory: Regularly delete your chat history and disable memory features to limit the amount of data the AI collects about you. AI systems build profiles based on your interactions – such as your interests and question patterns – that could reveal sensitive information like religious, political, or social preferences, even if you do not explicitly provide personal details.

Practise Online Safety: Treat AI tools like any other online platform. Consider what security measures are in place to protect your information. Review the company’s privacy policies to understand how your data is used – especially whether it is used to train and improve AI models.

Report Concerns: Have clear guidance on how to report any inappropriate AI content or misuse.

By Tanya Clark, SchoolPro TLC

We thank the SchoolPro TLC Team for these helpful guidelines as a reminder to keep safe when using AI.
You can see here the programme for our 2025 SWIFT Summer Conference TODAY, Thursday 19 June 2025, at the Future Skills Centre in Exeter. This year’s conference is set to be memorable and momentous, focusing on relevant educational issues and bringing together the best of regional and national expertise: Artificial Intelligence | Trust Leadership | Diversity, Equity and Inclusion | Pupil, Parent and Staff Engagement | SW Regions Group | Resilience and more.
Every year we build on the success of the previous conference, and the feedback speaks for itself: “This is the best conference I have attended. Every session was brilliant. I left feeling inspired.” 66% of delegates Strongly Agreed that the conference made a positive impact on their understanding of educational practices. 71% of delegates Strongly Agreed that the conference provided a high-quality experience. 88% of delegates rated the conference as Very Good. With special thanks to our sponsors and exhibitors:
Elementa Support Services | Exeter Supply Partnership | Educatering | ONVU Learning | SchoolPro TLC | Whole School SEND

Further to the previous article on the Use of Generative AI in MATs and Schools from our sponsor SchoolPro TLC, we encourage you to review this further guidance to ensure you are AI-safe in your School and Multi Academy Trust, starting with this checklist.

General AI Best Practices
Quick Staff Guide
What is AI and How Can It Be Used in Schools? Artificial Intelligence (AI) can support teaching, reduce workload, and improve efficiency. When used responsibly, it can:
Key Safety Tips
How to Talk to Pupils About AI
Who to Contact for AI Support For any AI-related concerns, training needs, or Data Protection questions, contact your School’s IT or Data Protection Lead, or your SchoolPro TLC Data Protection Officer (DPO). Have you embarked yet on your Artificial Intelligence (AI) journey? The use of AI in schools is rapidly growing, offering numerous benefits, such as enhanced efficiency, personalised learning, and improved decision-making. However, AI also presents challenges, including Data Protection risks, ethical considerations, the risk of bias, and concerns over transparency. Given the rapid advancements in AI and the growing reliance on these technologies in education, it is crucial for MATs and schools to establish clear policies that balance innovation with safeguarding concerns. Our sponsor, SchoolPro TLC, provides some helpful guidance here and a framework for the responsible use of AI in schools, ensuring compliance with UK GDPR, recommendations from the Information Commissioner’s Office (ICO), the Department for Education (DfE), and guidance from Ofsted. What is Generative AI? Generative AI refers to AI systems that can create new content, such as text, images, video, or audio. Unlike traditional AI, which follows explicit programming to complete specific tasks, generative AI uses machine learning to create original outputs from input data. The UK Government and the ICO define AI as technology that mimics cognitive functions associated with human intelligence, such as learning and problem-solving. AI is increasingly used in MATs and schools for both educational and administrative purposes, raising questions about responsible implementation, data security, and the ethical implications of its use. Open vs Closed AI Systems Understanding the distinction between open and closed AI systems is essential when assessing risk and implementing AI within educational settings:
Can Open AI Systems Be Configured as Closed? Some AI tools, such as Google Gemini, Microsoft Copilot, and other cloud-based AI models, are generally considered open AI systems by default. However, they can be configured to function as closed systems, depending on their settings and the environment in which they are deployed. For example, within a Google Workspace for Education environment, Google Gemini can be configured to:
In such cases, an AI tool that is generally open in a public setting may be functionally closed within a well-managed, restricted environment. Schools should consult their IT lead or Data Protection Officer (DPO) to determine whether an AI tool is configured to meet Data Protection requirements before use. MATs and schools should assess AI applications before use to determine their suitability based on these classifications and apply appropriate safeguards, such as data minimisation and access controls. Scope of AI in MATs and Schools Pupil Usage AI has the potential to enhance learning through activities such as:
However, students must be educated on the ethical use of AI, particularly in avoiding over-reliance and plagiarism. Acceptable Use Agreements should explicitly outline permissible and prohibited AI use. Staff Usage Teachers and administrators can potentially use AI for activities such as:
Staff must verify AI-generated content for accuracy and must not input personal or sensitive data into generative AI tools without prior assessment. Governors and Leadership Governors and senior leadership teams play a crucial role in overseeing AI implementation, ensuring compliance with Data Protection laws, and updating policies as AI capabilities evolve. Core Principles for AI Use Transparency MATs and schools must conduct Data Protection Impact Assessments (DPIAs) when AI tools process personal data. DPIAs help identify risks and establish mitigating strategies to protect sensitive student and staff information. Schools should also be transparent about how they use generative AI tools, ensuring that staff, students, Governors, parents, and carers understand how their personal data is processed. Accountability Roles and responsibilities for AI use must be clearly defined, and schools should:
Compliance with Data Protection Legislation Schools must ensure that AI tools comply with UK GDPR and their Data Protection Policies. To protect data when using generative AI tools, schools should:
AI and Data Protection in Schools AI use must comply with UK GDPR and the Data Protection Act 2018 in order to safeguard personal data. Schools reserve the right to monitor AI usage to prevent misuse and ensure compliance with academic integrity policies. Data Privacy and Protection The use of personal data in AI tools must be handled with extreme caution. Schools and MATs should adopt the following principles:
Additionally, some generative AI tools collect and store additional data, such as:
Schools must review and disclose how any data collected by generative AI tools is processed and stored in their Privacy Notice. Ofsted Expectations for AI Use in Education Ofsted does not directly inspect the quality of AI tools but considers their impact on safeguarding, educational quality, and decision-making within schools. Schools must ensure:
Leaders are responsible for ensuring that AI enhances education and care without negatively affecting outcomes. Integration into Policies and Agreements To ensure compliance, transparency, and ethical AI use, schools and MATs should update their existing policies to include provisions for AI. We have drafted recommended text to add to key policies and privacy notices in order to support this process. This information forms part of our AI Guidance pack for schools and is included in the following document: 2 - Generative AI in MATs and Schools - Policy Updates. Report by Soton Soleye and Ben Craig, SchoolPro TLC Disclaimer
SchoolPro TLC Ltd (2025) SchoolPro TLC guidance does not constitute legal advice. SchoolPro TLC is not responsible for the content of external websites.
© COPYRIGHT 2022 SOUTH WEST INSTITUTE FOR TEACHING SWIFT. ALL RIGHTS RESERVED | Website by brightblueC