South West Institute for Teaching (SWIFT) | News
15/9/2025

Artificial Intelligence | Friend or Foe?

AI Meets Education | Protecting Privacy, Empowering Learning
 
The adoption of AI in schools to support students on their educational journey has accelerated rapidly.
From lesson planning and report writing to helping with homework, supporting CVs, and even creating art and music – there’s an AI tool for nearly every aspect of school life.
 
Our sponsor SchoolPro TLC shares its insights here on this phenomenon that is changing all our lives.

AI in education is rapidly evolving, with tools designed to help students prepare for GCSEs and A Levels, understand the world around them, and navigate life beyond school.

But it is not only students using these tools – teachers, administrators, school leaders, and even parents and carers are exploring AI to reduce workloads, save time, boost productivity, and escape repetitive administrative tasks.
 
AI is revolutionary in its ability to lift the burden of everyday chores, transforming hard work into something more engaging – even fun. But at what cost?
 
This surge in AI use raises an important question: Is AI our friend or foe?
Do the benefits truly outweigh the risks, and should we be concerned about the long-term implications? 

FRIEND

1. Great for Learning 

AI can be useful for personalised learning, tailoring educational content to the specific requirements of each student. Used in this way, it can enhance engagement with, and understanding of, the subjects being studied.
 
2. Removing the Tedium from Routine Tasks
AI is able to take over time-consuming tasks like grading, lesson planning, scheduling, report writing, and attendance tracking. By automating these routine duties, teachers are freed up to focus on what truly matters: teaching, building relationships with students, and delivering more personalised support in the classroom.
 
3. No Time Restrictions
Unlike schools and educational institutions, AI is available 24/7. This not only supports teachers, but also gives students access to learning tools and support anytime, anywhere.
 
4. Providing Teachers with Support
AI offers teachers access to a wide pool of teaching resources, enabling them to enhance their instructional strategies. This support can lead to more engaging lessons, personalised learning experiences, and ultimately, improved student outcomes.
 
5. Enhancing Future Career Prospects for Students
Integrating AI into education helps students develop the skills they will need to enter a job market in which AI technologies will play a significant role.

These are but a few of the benefits of using AI in the educational environment. 

FOE

With every benefit a product or service offers, there often comes a downside, and the use of AI tools is no exception.

1. Data Privacy Risks
When using AI, users may inadvertently enter personal or sensitive data, which is then processed in ways they do not control. This raises important concerns about how the data is stored, used, and protected. Information entered into AI systems may also be used to train and improve the applications over time, making data privacy and security critical considerations.

2. AI Psychosis 
A startling new trend is emerging in our increasingly digital world: people are forming deep emotional bonds with AI systems like ChatGPT – spending hours interacting, confiding, and even building relationships with these tools. This growing dependence is being dubbed “AI Psychosis” across media and social platforms. Whilst not a clinical diagnosis, the term reflects a concerning shift in human behaviour – where reliance on AI begins to blur the lines between reality and artificial companionship.

3. Reliance on Content 
The content produced by an AI tool cannot always be relied upon for accuracy. The output depends on many factors, including the sources the tool has learned from and how it weighs that material.

4. Sharing of Responses 
You have used AI to help write a report and found it incredibly useful. Naturally, you want to share it with a colleague so that they can benefit and use it as a template for their own report writing. There is nothing wrong with that – or is there?

Recent reports suggest that when you share AI chatbot responses, such as those from ChatGPT, the content could potentially be indexed by search engines like Google, making it publicly searchable. This raises important privacy and confidentiality concerns.
 
Sharing data in this way is not always automatic and may depend on the chatbot's privacy settings. It is therefore recommended that you check your settings before using chatbot tools, to ensure unnecessary sharing does not occur.
Friend or Foe? Building Safe and Positive AI Experiences
There is no doubt that AI will bring vast benefits for schools. Students will prosper, teachers will have more time for teaching and interaction with students, administrators will be freed from mundane tasks to take on more meaningful projects, and money will be saved. So how do we reap the benefits while keeping students, teachers and the school itself safe from privacy risks and breaches? By thinking "HARP".

H: Human Intervention
It is important not to rely solely on AI-generated information.

AI is not always accurate, so human oversight is essential before trusting or using any AI response. If in doubt, verify the information with trusted sources you have used previously to ensure its reliability.
A: Age Appropriate
When introducing AI tools in the classroom, ensure they are age-appropriate and aligned with students’ educational needs. Monitor how students use AI chatbots or other applications, as recent leaked documents from Meta’s GenAI Content Risk Standards revealed that some AI systems may engage children in conversations that are romantic or sensual. 
It is vital to remain vigilant and guide students towards safe, relevant, and appropriate use of AI technology.

Talk openly with students about the safety risks associated with using AI tools. Encourage them to limit the amount of time they spend interacting with AI to reduce the risk of developing dependency. Educating students on balanced and mindful AI use helps promote healthier, safer habits.

Foster an environment of digital literacy and critical thinking when using AI tools. Remind teachers and students to evaluate outputs and not take responses at face value – "Don't Believe The Misinformation".
 
R: Risk Assessments and Policy
Before implementing AI tools, ensure that your school or Trust has conducted thorough risk assessments that weigh the educational benefits against potential privacy and security concerns. Review and update existing policies and procedures to explicitly address AI use, outlining the safeguards and measures in place to protect data privacy and reduce associated risks.

Like any third-party data processor, AI tools must meet GDPR standards. Before using them, ensure they have strong security measures and clear data-handling policies, and that they comply with privacy laws. Treat AI tools with the same scrutiny you would apply to any other processor.
 
P: Privacy and Security Settings
Set Your AI Tool to Private: Many AI chatbots allow you to adjust privacy settings. Use the toggle switch to set your chats to private, especially if you plan to share conversations with colleagues. This helps protect your data from being publicly accessible.

Delete Chats and Turn Off Memory: Regularly delete your chat history and disable memory features to limit the amount of data the AI collects about you. AI systems build profiles based on your interactions – such as your interests and question patterns – that could reveal sensitive information like religious, political, or social preferences, even if you do not explicitly provide personal details.

Practice Online Safety: Treat AI tools like any other online platform. Consider what security measures are in place to protect your information. Review the company’s privacy policies to understand how your data is used – especially whether it’s leveraged to train and improve AI models.

Report Concerns: Have clear guidance on how to report any inappropriate AI content or misuse.

By Tanya Clark, SchoolPro TLC

We thank the SchoolPro TLC Team for these helpful guidelines as a reminder to keep safe when using AI.
Find out more about SchoolPro TLC services for schools here.
SchoolPro TLC DPO support here.
Contact SchoolPro TLC here.

12/6/2025

SWIFT Summer Conference 2025 | THE PROGRAMME

You can see here the programme for our 2025 SWIFT Summer Conference TODAY, Thursday 19 June 2025 at the Future Skills Centre in Exeter:
Click here to see the programme and then enjoy your day!
This year's conference is set to be memorable and momentous, focusing on relevant educational issues and bringing together the best of regional and national expertise:

Artificial Intelligence | Trust Leadership | Diversity, Equity and Inclusion | Pupil, Parent and Staff Engagement | SW Regions Group | Resilience and more.
  • Network in person with like-minded colleagues.
  • Take time out of your school routine to reflect.
  • Be part of the conversation.
  • Professional development that counts.
  • Return to school re-energised.
  • Meet the SWIFT Central Team in person.
  • Enjoy a lovely lunch.  
Every year we build on the success of the previous conference, and the feedback speaks for itself:

​"This is the best conference I have attended. Every session was brilliant. I left feeling inspired.”

66% of delegates strongly agreed that the conference made a positive impact on their understanding of educational practices.
71% of delegates strongly agreed that the conference provided a high-quality experience.
88% of delegates rated the conference as Very Good.
SWIFT Summer Conference programme
Book here | final few places!
With special thanks to our sponsors and exhibitors: 
Elementa Support Services
Exeter Supply Partnership
Educatering
ONVU Learning
SchoolPro TLC
Whole School SEND


7/5/2025

Further Guidance on AI in Schools with SchoolPro TLC

Further to the previous article on the use of Generative AI in MATs and Schools from our sponsor SchoolPro TLC, we encourage you to review this further guidance to ensure you are AI-safe in your school and Multi Academy Trust, starting with the checklist below.
 
General AI Best Practices
  • Verify all AI-generated content for accuracy before use.
  • Use AI tools to enhance learning and reduce workload, not to replace professional judgment.
  • Stick to school-approved AI tools that comply with data protection policies.
  • Maintain human oversight—never rely solely on AI for assessments or decisions.
  • Be transparent—let students and staff know when AI has been used in content creation.
  • Train staff and pupils on AI’s risks, limitations, and ethical considerations.

Data Protection and Security
  • Never input personal, sensitive, or pupil data into AI tools unless explicitly approved.
  • Always check whether an AI tool is open or closed before using it.
  • If using AI for decision-making (e.g., profiling students), conduct a Data Protection Impact Assessment (DPIA).
  • Update policies, privacy notices, and acceptable use agreements (AUA) as needed.
  • Consult the SchoolPro TLC Data Protection Officer (DPO) if in doubt.

Teaching and Pupil Engagement
  • Encourage students to use AI as a learning tool (e.g., research, brainstorming) rather than for completing assignments.
  • Educate pupils on responsible AI use, plagiarism risks, and fact-checking information.
  • Monitor AI’s impact in classrooms — ensure it aligns with safeguarding and educational goals. 
Quick Staff Guide
What is AI and How Can It Be Used in Schools?
Artificial Intelligence (AI) can support teaching, reduce workload, and improve efficiency. When used responsibly, it can:
  • Assist with lesson planning, assessment design, and report writing.
  • Automate routine admin tasks (e.g., scheduling, summarising data).
  • Provide personalised learning support for students, including SEND adaptations.
However, AI must be used with caution to avoid data breaches, bias, misinformation, and over-reliance.

Key Safety Tips
  • Check before you trust: AI makes mistakes—fact-check all outputs.
  • Protect student data: Never enter personal or sensitive information into AI tools unless specifically approved.
  • Understand AI bias: AI models can reinforce biases—review content carefully.
  • Use approved tools: Stick to school-approved, closed AI systems whenever possible.
  • Update policies: Ensure AI use is reflected in privacy notices, AUAs, and safeguarding policies.

How to Talk to Pupils About AI
  • AI is a tool, not a replacement: Students should use AI to support learning, not to do their work for them.
  • Plagiarism risks: AI-generated text needs proper citation—copying AI work is academic misconduct.
  • Misinformation awareness: AI can make up facts—students must verify sources before using AI-generated content.
  • Think critically: Encourage students to question AI responses and improve their digital literacy.

Who to Contact for AI Support
For any AI-related concerns, training needs, or Data Protection questions, contact your school's IT or Data Protection Lead, or your SchoolPro TLC Data Protection Officer (DPO).
Contact your SchoolPro TLC Data Protection Officer (DPO) here.
Find more information about SchoolPro TLC here.

2/4/2025

Guidance on the use of Generative AI in MATs and Schools from SchoolPro TLC

Have you embarked yet on your Artificial Intelligence (AI) journey?

The use of AI in schools is rapidly growing, offering numerous benefits, such as enhanced efficiency, personalised learning, and improved decision-making.

​However, AI also presents challenges, including Data Protection risks, ethical considerations, the risk of bias, and concerns over transparency.

Given the rapid advancements in AI and the growing reliance on these technologies in education, it is crucial for MATs and schools to establish clear policies that balance innovation with safeguarding concerns.
 
Our sponsor, SchoolPro TLC, provides some helpful guidance here and a framework for the responsible use of AI in schools, ensuring compliance with UK GDPR and with recommendations and guidance from the Information Commissioner’s Office (ICO), the Department for Education (DfE), and Ofsted.

What is Generative AI?
Generative AI refers to AI systems that can create new content, such as text, images, video or audio. Unlike traditional AI, which follows explicit programming to complete specific tasks, generative AI uses machine learning to create original outputs from input data.

The UK Government and the ICO define AI as technology that mimics cognitive functions associated with human intelligence, such as learning and problem-solving. AI is increasingly used in MATs and schools for both educational and administrative purposes, raising questions about responsible implementation, data security and the ethical implications of its use.

Open vs Closed AI Systems
Understanding the distinction between open and closed AI systems is essential when assessing risk and implementing AI within educational settings:

  • Open AI Systems | These include publicly available AI models (e.g., ChatGPT, Google Gemini) that continuously learn from user inputs. They may store, share, or learn from the information entered, including personal or sensitive data. Schools should avoid entering identifiable information into these tools to protect personal and special category data.
  • Closed AI Systems | These are proprietary AI solutions controlled by an organisation (e.g., school-specific AI tools integrated into a school’s Learning Management System). Closed systems offer greater security and compliance as external parties cannot access the data input. If a school uses closed AI tools to process personal data, this must be included in the school’s Privacy Notice.
Can Open AI Systems Be Configured as Closed?
Some AI tools, such as Google Gemini, Microsoft Copilot, and other cloud-based AI models, are generally considered open AI systems by default. However, it is possible that they can be configured to function as closed systems depending on their settings and the environment in which they are deployed.

For example, within a Google Workspace for Education environment, Google Gemini can be configured to:
  • Operate within a restricted school domain, preventing data from being shared externally.
  • Be managed through Google Admin Console, where IT teams can disable data collection and adjust privacy settings.
  • Restrict AI usage to pre-approved applications, ensuring compliance with school policies.

In such cases, an AI tool that is generally open in a public setting may be functionally closed within a well-managed, restricted environment. Schools should consult their IT lead or Data Protection Officer (DPO) to determine whether an AI tool is configured to meet Data Protection requirements before use.
MATs and schools should assess AI applications before use to determine their suitability based on these classifications and apply appropriate safeguards, such as data minimisation and access controls. 
Scope of AI in MATs and Schools
Pupil Usage

AI has the potential to enhance learning through activities such as:
  • Personalised tutoring
  • Research support
  • Critical thinking development
  • Adaptive learning platforms

However, students must be educated on the ethical use of AI, particularly in avoiding over-reliance and plagiarism. Acceptable Use Agreements should explicitly outline permissible and prohibited AI use.

Staff Usage
Teachers and administrators can potentially use AI for activities such as:
  • Lesson planning
  • Curriculum development
  • Report writing (without identifiable student data)
  • Student performance analysis
  • Administrative tasks such as scheduling and resource management

Staff must verify AI-generated content for accuracy and must not input personal or sensitive data into generative AI tools without prior assessment.

Governors and Leadership
Governors and senior leadership teams play a crucial role in overseeing AI implementation, ensuring compliance with Data Protection laws, and updating policies as AI capabilities evolve.
Core Principles for AI Use
Transparency
MATs and schools must conduct Data Protection Impact Assessments (DPIAs) when AI tools process personal data. DPIAs help identify risks and establish mitigating strategies to protect sensitive student and staff information.
 
Schools should also be transparent about how they use generative AI tools, ensuring that staff, students, Governors, parents, and carers understand how their personal data is processed.

Accountability
Roles and responsibilities for AI use must be clearly defined, and schools should:

  • Assign AI oversight responsibilities to senior leaders.
  • Implement AI governance committees where appropriate.
  • Ensure staff are trained in AI risk management and Data Protection.
 
Compliance with Data Protection Legislation
Schools must ensure that AI tools comply with UK GDPR and their Data Protection Policies.
To protect data when using generative AI tools, schools should:

  • Seek advice from their Data Protection Officer (DPO) and IT lead before using AI tools.
  • Verify whether an AI tool is open or closed before use.
  • Ensure no identifiable information is entered into open AI tools.
  • Acknowledge or reference AI use in academic work. 
  • Fact-check AI-generated results for accuracy before use.
AI and Data Protection in Schools 
AI use must comply with UK GDPR and the Data Protection Act 2018 in order to safeguard personal data. Schools reserve the right to monitor AI usage to prevent misuse and ensure compliance with academic integrity policies.

Data Privacy and Protection
The use of personal data in AI tools must be handled with extreme caution.
Schools and MATs should adopt the following principles:

  • Avoid Using Personal Data in AI Tools | It is recommended that personal data is not entered into AI applications unless absolutely necessary.
  • Strictly Necessary Use | If personal data must be used within an AI system, the school or MAT must ensure:
    • Full compliance with UK GDPR and school data privacy policies.
    • Appropriate safeguards such as anonymisation or pseudonymisation are in place (see the short illustrative sketch after this list).
    • Clear documentation of the processing, including a completed DPIA.
  • Transparency in Automated Decision-Making | Schools must be open about any use of AI in decision-making or profiling, ensuring pupils, parents, and staff understand how their data is processed.
  • Legal Basis for AI Data Processing | If AI tools process personal data, the appropriate legal basis should be identified and any relevant actions implemented as a result before use.
  • Security Measures | AI-generated data should be protected using encryption, access controls, and secure storage.
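
To illustrate the pseudonymisation safeguard mentioned above, here is a minimal sketch of what such a step might look like before any text leaves the school’s own systems. It assumes a hypothetical pupil roster and placeholder scheme of our own for the example; it is not SchoolPro TLC’s implementation, and a real safeguard would need to cover far more identifiers (dates of birth, addresses, SEND details) and be agreed with your DPO.

```python
# Illustrative sketch only (not part of SchoolPro TLC's guidance): pseudonymise
# pupil names before any text is sent to an external AI tool, and restore them
# locally afterwards. The roster and placeholder format are hypothetical.

# Hypothetical pupil roster; in practice this would come from the school's MIS.
PUPIL_NAMES = ["Amelia Jones", "Oliver Patel", "Sophie Clark"]


def pseudonymise(text: str) -> tuple[str, dict[str, str]]:
    """Replace known pupil names with stable placeholders; return the text
    plus a key kept locally so names can be restored after the AI response."""
    key: dict[str, str] = {}
    for i, name in enumerate(PUPIL_NAMES, start=1):
        if name in text:
            placeholder = f"[PUPIL_{i}]"
            text = text.replace(name, placeholder)
            key[placeholder] = name
    return text, key


def restore(text: str, key: dict[str, str]) -> str:
    """Swap placeholders back for real names once the output is back inside
    the school's own systems."""
    for placeholder, name in key.items():
        text = text.replace(placeholder, name)
    return text


if __name__ == "__main__":
    draft = "Amelia Jones has made strong progress in maths this term."
    safe_text, key = pseudonymise(draft)
    print(safe_text)  # -> "[PUPIL_1] has made strong progress in maths this term."
    # ...send safe_text to the approved AI tool, then restore names locally...
    print(restore(safe_text, key))
```

The key point of this design is that the mapping between placeholders and real names never leaves the school; only the pseudonymised text is shared with the AI tool.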
 
Additionally, some generative AI tools collect and store additional data, such as:
  • Location
  • IP address
  • System and browser information
 
Schools must review and disclose how any data collected by generative AI tools is processed and stored in their Privacy Notice.
Ofsted Expectations for AI Use in Education
Ofsted does not directly inspect the quality of AI tools but considers their impact on safeguarding, educational quality, and decision-making within schools.

Schools must ensure:
  • Safety, Security, and Robustness: AI solutions used in schools must be secure and protect user data, with mechanisms to identify and rectify bias or errors.
  • Transparency: Schools must be clear about how AI is used and ensure that AI-generated suggestions are understood.
  • Fairness: AI tools should be ethically appropriate, addressing bias related to small groups and protected characteristics.
  • Accountability: Schools must ensure clear roles and responsibilities for monitoring and evaluating AI.
  • Contestability and Redress: Staff must be empowered to override AI suggestions, ensuring human decision-making remains central. Complaints regarding AI errors must be appropriately addressed.

Leaders are responsible for ensuring that AI enhances education and care without negatively affecting outcomes.

Integration into Policies and Agreements
To ensure compliance, transparency, and ethical AI use, schools and MATs should update their existing policies to include provisions for AI. We have drafted recommended text to add to key policies and privacy notices in order to support this process. This information forms part of our AI Guidance pack for schools and is included in the following document:
2 - Generative AI in MATs and Schools - Policy Updates. 

Report by Soton Soleye and Ben Craig, SchoolPro TLC
Contact SchoolPro TLC for further AI guidance.
Find out more about SchoolPro TLC services to schools here.
References
Generative artificial intelligence (AI) and data protection in schools | GOV.UK
Generative Artificial Intelligence (AI) in Education | GOV.UK
Information Commissioner’s Office response to the consultation series on generative AI | ICO
Ofsted's approach to artificial intelligence (AI) | GOV.UK
Disclaimer
SchoolPro TLC Ltd (2025)
SchoolPro TLC guidance does not constitute legal advice.
SchoolPro TLC is not responsible for the content of external websites.

