Who are we?
Baselinetests.com provides schools with an efficient and trusted means of gathering baseline data - securely, online, with no fuss.
Our team has been instrumental in delivering groundbreaking data projects such as the SMID Report & the ASCL Toolkit, transforming the way we consider KS4 data. Now, working collaboratively with experts in the fields of curriculum, transition and assessment creation, we bring schools a solution to the KS2 data conundrum. Accurate, reliable baseline data is our specialism.
Our Solution: KS2 Baseline - Scaled Score
Robust Test Development
If designed with expertise and care, assessments should differentiate students based on how well they know the material and techniques being tested. Essentially, if a test is too difficult or too easy, it’s useless. Our experts ensure that tests are pitched at the requisite level to provide maximum validity, giving schools confidence in the data those tests provide.
Naturally, assuming the answer choices are plausible, increasing the number of options increases the difficulty of a question. However, without careful planning it’s easy to overdo things. Four choices provide a sound base on which to build validity, particularly when questions include ‘lure’ or ‘distractor’ options.
Educators often like to include options such as ‘none of the above’ and ‘all of the above’ on tests, but the general consensus of research is that it’s best to avoid them. In short, the potential benefits are small, and they can easily become detrimental to assessment.
To conclude, we design our assessments with care, in line with recommendations from the learning literature. Our expert assessment team, led by Brian Speed, aims to furnish schools with tests that provide challenge while allowing students to largely succeed. The feedback element, too, should not be underestimated: the diagnostic software within the platform will allow your school to plan and allocate resources most effectively.
A mixture of examiners, teachers and software developers.
Creator of the ASCL Toolkit, among other groundbreaking technologies promoting the effective use of data in schools, Steve has lent his expertise to this project. Ever mindful of the need to manage teacher workload, he was adamant that marking, analysis and reporting be automated. Steve’s knowledge and insights have proved invaluable in building the platform to meet your baseline data needs.
Brian is an expert mathematician who has written several best-selling maths textbooks. He is also an educational consultant, a chief examiner for AQA and a KS2 senior marker. Brian’s knowledge and experience as an assessment creator have ensured that the integrity of our tests is something in which schools can have the utmost confidence.
Andrew is an experienced Assistant Head Teacher, KS2 Leader, English Lead and Y6 Teacher. A successful educational author, Andrew’s interest in E-learning has manifested itself in several projects: creating educational apps, developing educational resources and striving to make vocabulary a worthy priority in schools. Andrew launched Vocabulary Ninja in 2017 and is the author of seven best-selling books on classroom-based vocabulary and reading comprehension instruction.
Maths teacher, SATs authenticator, SLT member, governor and GCSE marker. Currently thriving in supporting schools with teacher development to improve student outcomes.
Designer, teacher and E-Learning entrepreneur - fuelled by combining his skills to improve student outcomes.
Anthony is now an Education Consultant with a particular interest in data and how it can support transition. He is an advocate of using data only if it truly informs teaching and learning.
Entrepreneur and logistics problem-solver. Works across companies and schools to make the most efficient use of personnel, data and systems.