Now live: it's the final UPDATE of 2025 and our ruby edition.
To start us off, Executive Director Martin Smith aptly considers "the collective goodwill that exists within the profession around supporting one another to develop" - not least in our role as Teaching School Hubs. Plus a special Happy Christmas message and a reminder of all the lovely aspects of being a teacher and leader at this time of year. It is proving to be very popular, so remember to sign up for the Spring Leadership Forum in January, with keynote speaker Harry Fletcher-Wood presenting on improving teaching, an intro to the CODE Maths Hub with Laura Clitheroe and a foreword by Martin. If you are thinking about understanding engagement amongst low-income White children in England's schools, you can find out more in the report by The Engagement Platform, along with some suggested support ideas. Read up too on the latest webinar led by the South West English Hubs on the Department for Education's Writing Framework, published earlier this year. Keeping it relevant, there is more guidance from the Education Endowment Foundation on metacognition and self-regulated learning, shared by Devon Research School. Equip yourself with the three new classroom tools to support you in putting the evidence into practice. If you are looking to refine your timetabling tools, our sponsor SchoolPro TLC share their year-round timetabling approach to strengthening this essential work within school, and they are available to support you if you need that helping hand. Faye Steele, our Senior Administrator for the SWIFT Appropriate Body Service, is our December interviewee, and how good it is to know that she is at the helm of this integral service with her "relentless unwavering determination to procure the information we need on time and to meet our deadlines and tracking." We are only as good as our staff and systems!
If you are deliberating about taking the bold step to change your contracted catering to in-house, read on as our sponsor Educatering share all the benefits of keeping your own kitchen staff, behind-the-scenes support, compliance and allergen management, staff training, bespoke menu development - and SO much more that makes sense for a fully supported service. Looking ahead to January, if you need any teacher cover, our sponsor Exeter Supply Partnership remind us why not all supply services are the same and the benefits of working with them. For a start, this not-for-profit organisation with true heart puts people first - and that means supporting local schools and teachers. We are nearing the shortest day and if you need a reading boost, remember to check out Mr T’s instructional series (AKA Christopher Tribble, Headteacher at Honiton Primary School). We like an edifying read! We are glad to work with you, support you and hope that you will end the Autumn Term on an uplifting note with the end of term Christmas celebrations. See you in 2026!
2/12/2025

The Year-Round Timetable | A Strategic Approach that Strengthens Schools with SchoolPro TLC

A timetable shapes the structure and rhythm of a thriving school. With a background as teachers and school leaders, our sponsor SchoolPro TLC have a working understanding of timetables and share here some of their expertise. Every school has its own moving parts - pressures, priorities, staffing profiles, and students' needs - which means adaptability is essential and rigid rules rarely work. At SchoolPro Timetabling and Curriculum, we follow a clear annual cycle that keeps the timetabling process strategic, structured, and never a last-minute scramble. We work alongside Schools and Trusts of all sizes to design timetables that are efficient, robust, and aligned with strategic priorities. Our approach is underpinned by the expertise of Mark Hodges, our Director of Timetabling and Curriculum, who brings 25 years in education - including more than a decade in senior leadership - and over ten years supporting schools nationally to maximise curriculum efficiency and build timetables that genuinely support teaching, learning, and long-term planning.

September to October | Reviewing and Refining
As the new academic year begins, the focus is on evaluating how the timetable is bedding in and includes reviewing:
At this stage, we also reflect on exam outcomes and staffing changes to determine whether elements of the curriculum model need adjustment. For example, if a subject has underperformed in exams, seen a decline in student uptake, or is affected by staffing challenges, discussions are held to determine whether its timetable allocation should be adjusted for the coming year. These conversations feed into strategic discussions with leadership teams and help shape decisions.

November to December | Designing the Model

This phase centres on consultation with SLT and Heads of Department and together, we review:
Changes like moving to a two-week timetable, adding periods, or adjusting lesson lengths are decided early to ensure the year's timetable runs smoothly. We then draft a curriculum plan in the timetabling software to identify potential recruitment needs or staffing adjustments. Once agreed, this becomes the foundation for the year's timetable.

January to February | Setting the Foundations

With a clear curriculum model established, we move into preparing the systems that underpin effective scheduling and this includes:
Digitising the options process wherever possible saves significant time and reduces errors - allowing schools to focus on informed choices rather than administrative tasks.

March to April | Building Class Structures

Draft class structures are shared with Heads of Department for checking and refinement. Drawing on our team's strong understanding of staff performance, strengths and how the timetable supports school improvement priorities, we allocate staff to groups in a considered and strategic way. These allocations are reviewed and confirmed by SLT before being shared with teachers, who can provide feedback or request adjustments. This stage ensures that curriculum intent and staffing strengths align before formal scheduling begins.

May to June | Constructing the Timetable

This is where the full timetable comes together and the process involves:
Compromises are sometimes necessary, but the goal is always to produce a timetable that balances efficiency, equity and educational value.

July | Finalising and Releasing

Once a stable model is achieved, we move into rooming and sharing the draft timetable with staff. Feedback is reviewed and reasonable adjustments are made, with the aim of issuing final timetables at least two weeks before the end of term to support transition into the new academic year. This final stage brings together the reflection, consultation, and modelling carried out throughout the process. Once confirmed, the timetable is transferred into the MIS, ensuring the school is fully prepared for the new academic cycle.

Supporting Schools All Year Round

Timetabling is far more than a summer task - it is a strategic, iterative process that demands reflection, communication and expertise. At SchoolPro Curriculum, we support schools with:
If your school would benefit from support with next year's timetable, the expert SchoolPro Curriculum Team would be delighted to help you.

AI Meets Education | Protecting Privacy, Empowering Learning

The adoption of AI in schools to support students on their educational journey has accelerated rapidly. From lesson planning and report writing to helping with homework, supporting CVs, and even creating art and music – there's an AI tool for nearly every aspect of school life. Our sponsor SchoolPro TLC share their insights here on this phenomenon that is changing all our lives. AI in education is rapidly evolving, designed to help students prepare for GCSEs and A Levels, understand the world around them, and navigate life beyond school. But it is not only students using these tools – teachers, administrators, school leaders, and even parents and carers are exploring AI to reduce workloads, save time, boost productivity, and escape repetitive administrative tasks. AI is revolutionary in its ability to lift the burden of everyday chores, transforming hard work into something more engaging – even fun. But at what cost? This surge in AI use raises an important question: Is AI our friend or foe? Do the benefits truly outweigh the risks, and should we be concerned about the long-term implications?

FRIEND

1. Great for Learning
AI can be useful for personalised learning, tailoring educational content to meet the specific requirements of the student. Using AI in this way will help enhance engagement and understanding of the subjects being studied.

2. Removing the Tedium from Routine Tasks
AI is able to take over time-consuming tasks like grading, lesson planning, scheduling, report writing, and attendance tracking. By automating these routine duties, teachers are freed up to focus on what truly matters: teaching, building relationships with students, and delivering more personalised support in the classroom.

3. No Time Restraints
Unlike schools and educational institutions, AI is available 24/7. This not only supports teachers, but also provides students with access to learning tools and support anytime, anywhere.

4. Providing Teachers with Support
AI offers teachers access to a wide pool of teaching resources, enabling them to enhance their instructional strategies. This support can lead to more engaging lessons, personalised learning experiences, and ultimately, improved student outcomes.

5. Enhancing Future Career Prospects for Students
With AI integrated into education, students will develop the skills they need to enter a job market where AI technologies will play a significant role.

These are but a few of the benefits of using AI in the educational environment.

FOE

With every benefit a product or service offers, there often comes a downside, and the use of AI tools is no exception.

1. Data Privacy Risks
When using AI, users may inadvertently enter personal or sensitive data, which is then processed in ways they do not control, and this raises important concerns about how this data is stored, used, and protected. The information entered into AI systems may be used to train and improve the applications over time, making data privacy and security critical considerations.

2. AI Psychosis
A startling new trend is emerging in our increasingly digital world: people are forming deep emotional bonds with AI systems like ChatGPT – spending hours interacting, confiding, and even building relationships with these tools. This growing dependence is being dubbed "AI Psychosis" across media and social platforms. Whilst not a clinical diagnosis, the term reflects a concerning shift in human behaviour – where reliance on AI begins to blur the lines between reality and artificial companionship.

3. Reliance on Content
The content delivered by an AI tool cannot always be relied upon for accuracy.
The end content depends on many factors, including how the AI tool weighs the sources of its learned material and information.

4. Sharing of Responses
You have used AI to help write a report and found it incredibly useful. Naturally, you want to share it with a colleague so that they can benefit and use it as a template for their own report writing. There is nothing wrong with that – or is there? Recent reports suggest that when you share AI chatbot responses, such as those from ChatGPT, the content could potentially be indexed by search engines like Google, making it publicly searchable. This raises important privacy and confidentiality concerns. Sharing data this way is not always automatic but may depend on the chatbot's privacy settings. Therefore, it is recommended to "check your settings" before using chatbot tools to ensure unnecessary sharing does not occur.

Friend or Foe? Building Safe and Positive AI Experiences

There is no doubt AI will bring about vast benefits for schools. Students will prosper, teachers will have more time to spend on teaching and interaction with students, administrators will be freed from mundane tasks to take on more meaningful projects, and money will be saved. So how do we reap the benefits while keeping students, teachers and even the school safe from privacy risks and breaches? By thinking "HARP".

H: Human Intervention
It is important not to rely solely on AI-generated information. AI is not always accurate, so human oversight is essential before trusting or using any AI response. If in doubt, verify the information with trusted sources you have used previously to ensure its reliability.

A: Age Appropriate
When introducing AI tools in the classroom, ensure they are age-appropriate and aligned with students' educational needs.
Monitor how students use AI chatbots or other applications, as recently leaked documents from Meta's GenAI Content Risk Standards revealed that some AI systems may engage children in conversations that are romantic or sensual. It is vital to remain vigilant and guide students towards safe, relevant, and appropriate use of AI technology. Talk openly with students about the safety risks associated with using AI tools. Encourage them to limit the amount of time they spend interacting with AI to reduce the risk of developing dependency. Educating students on balanced and mindful AI use helps promote healthier, safer habits. Foster an environment of digital literacy and critical thinking when using AI tools. Remind teachers and students to evaluate outputs and not take responses at face value – "Don't Believe The Misinformation".

R: Risk Assessments and Policy
Before implementing AI tools, ensure that your school or Trust has conducted thorough risk assessments that weigh the educational benefits against potential privacy and security concerns. Review and update existing policies and procedures to explicitly address AI use, outlining the safeguards and measures in place to protect data privacy and reduce associated risks. Exactly like any third-party data processor, AI tools must meet GDPR standards. Before using them, ensure they have strong security measures, clear data handling policies, and comply with privacy laws. Treat AI with the same scrutiny.

P: Privacy and Security Settings
Set Your AI Tool to Private: Many AI chatbots allow you to adjust privacy settings. Use the toggle switch to set your chats to private, especially if you plan to share conversations with colleagues. This helps protect your data from being publicly accessible.
Delete Chats and Turn Off Memory: Regularly delete your chat history and disable memory features to limit the amount of data the AI collects about you.
AI systems build profiles based on your interactions – such as your interests and question patterns – that could reveal sensitive information like religious, political, or social preferences, even if you do not explicitly provide personal details.
Practise Online Safety: Treat AI tools like any other online platform. Consider what security measures are in place to protect your information. Review the company's privacy policies to understand how your data is used – especially whether it's leveraged to train and improve AI models.
Report Concerns: Have clear guidance on how to report any inappropriate AI content or misuse.

By Tanya Clark, SchoolPro TLC

We thank the SchoolPro TLC Team for these helpful guidelines as a reminder to keep safe when using AI.
Further to the previous article on the Use of Generative AI in MATs and Schools from our sponsor SchoolPro TLC, we encourage you to review this further guidance to ensure you are AI safe in your School and Multi Academy Trust, starting with this checklist.

General AI Best Practices
Quick Staff Guide
What is AI and How Can It Be Used in Schools? Artificial Intelligence (AI) can support teaching, reduce workload, and improve efficiency. When used responsibly, it can:
Key Safety Tips
How to Talk to Pupils About AI
Who to Contact for AI Support

For any AI-related concerns, training needs, or Data Protection questions, contact your School's IT or Data Protection Lead, or your SchoolPro TLC Data Protection Officer (DPO).

Have you embarked yet on your Artificial Intelligence (AI) journey? The use of AI in schools is rapidly growing, offering numerous benefits, such as enhanced efficiency, personalised learning, and improved decision-making. However, AI also presents challenges, including Data Protection risks, ethical considerations, the risk of bias, and concerns over transparency. Given the rapid advancements in AI and the growing reliance on these technologies in education, it is crucial for MATs and schools to establish clear policies that balance innovation with safeguarding concerns. Our sponsor SchoolPro TLC provides some helpful guidance here and a framework for the responsible use of AI in schools, ensuring compliance with UK GDPR, recommendations from the Information Commissioner's Office (ICO), the Department for Education (DfE), and guidance from Ofsted.

What is Generative AI?

Generative AI refers to AI systems that can create new content, such as text, images, video or audio. Unlike traditional AI, which follows explicit programming to complete specific tasks, generative AI uses machine learning to create original outputs from input data. The UK Government and the ICO define AI as technology that mimics cognitive functions associated with human intelligence, such as learning and problem-solving. AI is increasingly used in MATs and schools for both educational and administrative purposes, raising questions about responsible implementation, data security and the ethical implications of its use.

Open vs Closed AI Systems

Understanding the distinction between open and closed AI systems is essential when assessing risk and implementing AI within educational settings:
Can Open AI Systems Be Configured as Closed?

Some AI tools, such as Google Gemini, Microsoft Copilot, and other cloud-based AI models, are generally considered open AI systems by default. However, they can be configured to function as closed systems depending on their settings and the environment in which they are deployed. For example, within a Google Workspace for Education environment, Google Gemini can be configured to:
In such cases, an AI tool that is generally open in a public setting may be functionally closed within a well-managed, restricted environment. Schools should consult their IT lead or Data Protection Officer (DPO) to determine whether an AI tool is configured to meet Data Protection requirements before use. MATs and schools should assess AI applications before use to determine their suitability based on these classifications and apply appropriate safeguards, such as data minimisation and access controls.

Scope of AI in MATs and Schools

Pupil Usage

AI has the potential to enhance learning through activities such as:
However, students must be educated on the ethical use of AI, particularly in avoiding over-reliance and plagiarism. Acceptable Use Agreements should explicitly outline permissible and prohibited AI use.

Staff Usage

Teachers and administrators can potentially use AI for activities such as:
Staff must verify AI-generated content for accuracy and must not input personal or sensitive data into generative AI tools without prior assessment.

Governors and Leadership

Governors and senior leadership teams play a crucial role in overseeing AI implementation, ensuring compliance with Data Protection laws, and updating policies as AI capabilities evolve.

Core Principles for AI Use

Transparency

MATs and schools must conduct Data Protection Impact Assessments (DPIAs) when AI tools process personal data. DPIAs help identify risks and establish mitigating strategies to protect sensitive student and staff information. Schools should also be transparent about how they use generative AI tools, ensuring that staff, students, Governors, parents, and carers understand how their personal data is processed.

Accountability

Roles and responsibilities for AI use must be clearly defined, and schools should:
Compliance with Data Protection Legislation

Schools must ensure that AI tools comply with UK GDPR and their Data Protection Policies. To protect data when using generative AI tools, schools should:
AI and Data Protection in Schools

AI use must comply with UK GDPR and the Data Protection Act 2018 in order to safeguard personal data. Schools reserve the right to monitor AI usage to prevent misuse and ensure compliance with academic integrity policies.

Data Privacy and Protection

The use of personal data in AI tools must be handled with extreme caution. Schools and MATs should adopt the following principles:
Additionally, some generative AI tools collect and store additional data, such as:
Schools must review and disclose, in their Privacy Notice, how any data collected by generative AI tools is processed and stored.

Ofsted Expectations for AI Use in Education

Ofsted does not directly inspect the quality of AI tools but considers their impact on safeguarding, educational quality, and decision-making within schools. Schools must ensure:
Leaders are responsible for ensuring that AI enhances education and care without negatively affecting outcomes.

Integration into Policies and Agreements

To ensure compliance, transparency, and ethical AI use, schools and MATs should update their existing policies to include provisions for AI. We have drafted recommended text to add to key policies and privacy notices in order to support this process. This information forms part of our AI Guidance pack for schools and is included in the following document: 2 - Generative AI in MATs and Schools - Policy Updates.

Report by Soton Soleye and Ben Craig, SchoolPro TLC

Disclaimer
SchoolPro TLC Ltd (2025). SchoolPro TLC guidance does not constitute legal advice. SchoolPro TLC is not responsible for the content of external websites.
SWIFT News
SPONSORED BY
Join us, be a part of our SWIFT community
© COPYRIGHT 2022 SOUTH WEST INSTITUTE FOR TEACHING SWIFT. ALL RIGHTS RESERVED | Website by brightblueC
VIEW OUR PRIVACY NOTICES | VIEW OUR COURSE T&CS