Sparx is on a mission to improve the world through learning via a new, scientifically proven approach to teaching and learning. Evidence and data are at the heart of what we do. Our Educational Research Team has been investigating how young learners learn and measuring our impact since 2010. By completing trials that are the first of their kind in education, we’re leading the way in proving and continuously building on the efficacy of EdTech. Our product design and the rich data it generates enable us to carefully target and measure student learning.
Here is our research and evidence – a range of randomised controlled trials, surveys, questionnaires, blogs and more.
Maths confidence is strongly associated with student progress and attainment, as well as maths enjoyment and perceived importance of maths.
In June 2019, around 4,000 Year 7 and Year 8 students from 14 schools sat their end of year assessments. We analysed the results and mapped their distribution onto the national 2018 GCSE distribution to predict each student’s indicative GCSE maths grade.
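One way to read that mapping, as a sketch of the general technique rather than Sparx’s exact method, is quantile mapping: a student’s percentile rank within the assessment cohort is matched against the national cumulative grade distribution. In the sketch below, the grade proportions, the `indicative_grade` helper and the cohort scores are all illustrative assumptions.

```python
import numpy as np

# Hypothetical national GCSE grade distribution (grades 1-9, made-up proportions)
national_props = {1: 0.04, 2: 0.08, 3: 0.13, 4: 0.20, 5: 0.18,
                  6: 0.15, 7: 0.12, 8: 0.07, 9: 0.03}
grades = sorted(national_props)
# Cumulative share of students at or below each grade
cum = np.cumsum([national_props[g] for g in grades])

def indicative_grade(score, cohort_scores):
    """Map a raw assessment score to an indicative grade via percentile rank."""
    percentile = np.sum(np.asarray(cohort_scores) <= score) / len(cohort_scores)
    # First grade whose cumulative national share covers this percentile
    idx = int(np.searchsorted(cum, percentile, side="left"))
    return grades[min(idx, len(grades) - 1)]

# Synthetic cohort of end-of-year scores, and one student's indicative grade
cohort = np.random.default_rng(0).normal(55, 15, size=4000)
print(indicative_grade(70, cohort))
```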
In December 2019, we asked thousands of UK teachers two questions through Teacher Tapp. The results show that 79% of teachers want to see clear proof that EdTech works in the classroom.
The EEG brings together leading UK EdTech companies who share a belief that there needs to be a step-change in the level of evidence available about EdTech. The EEG believes schools and trusts need to be able to easily assess the value and impact of EdTech products, services and platforms.
Compared with doing no homework, students who completed 15 minutes of Sparx homework made 83% more progress, and made a further 23% more progress for every additional 15 minutes of Sparx homework thereafter.
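Taken at face value, that claim can be written as a simple function of weekly homework time. The sketch below is one reading of the arithmetic, assuming the reported effects add onto the no-homework baseline; it is not Sparx’s published model.

```python
def relative_progress(minutes: float) -> float:
    """Progress relative to doing no homework (1.0 = no-homework baseline).

    Assumes the effects add on the baseline: +83% for the first 15 minutes,
    then +23% per further 15-minute block. An illustrative reading only.
    """
    if minutes < 15:
        return 1.0
    extra_blocks = (minutes - 15) / 15
    return 1.0 + 0.83 + 0.23 * extra_blocks

for m in (0, 15, 30, 60):
    print(f"{m:>3} min -> {relative_progress(m):.2f}x the no-homework progress")
```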
In January 2020, we commissioned a survey with over 1,000 primary and secondary school teachers. Over 70% of teachers surveyed agree that students have less access to extra-curricular activities because of teacher workload, with nearly half of respondents seeking mental health support due to job-related stress.
In an increasingly noisy marketplace, it is becoming more and more difficult for schools to judge which education technologies to buy into. From homework products to behaviour management solutions, there is likely an EdTech company promising to solve all of your problems. We know that EdTech can work, but how can you be sure that what you’re using is beneficial?
Use this checklist to help you choose the right EdTech for you and your school.
Compared with national norms, Sparx students made 67% more progress in Year 7 and a further 63% more progress in Year 8.
Progress was relatively consistent across prior ability groupings, with the lowest ability students making the same progress as higher ability students in Year 7 and slightly more progress in Year 8. This contrasts sharply with the national picture, where lower initial attainment was associated with lower progress.
In the Sparx cohort there was no evidence of progress being negatively affected by socioeconomic status or gender; this again is in contrast to the national data, which shows that boys and students on free school meals make slower progress.
Teachers save on average five hours per week (based on a teacher with 10 classes per week) when using Sparx Maths.
For any randomised controlled trial, getting the trial design right is critical, as it determines the reliability of the evidence that will be obtained.
We need to encourage an evidence-based design culture to ensure that EdTech is built around measurable impact.
AI, AR, Arrrghhhh… Just show me the evidence.
From Prime Minister Boris Johnson’s pledge to fund a £250m AI lab for the NHS, to the Department for Education’s recently launched ‘Artificial Intelligence (AI) horizon scanning group’, you could be forgiven for thinking that AI is being lauded as a panacea to some of the most pressing issues society faces.
As an EdTech professional very much engaged in exploring AI in personalised learning technologies, perhaps I should welcome such movements wholeheartedly. But, while any such focus on investigating how to improve learning is positive, my worry is that such expert groups sometimes miss the most important facet of their remit – interrogating empirical evidence.