
Moving to Digital Assessment

Technical and operational factors for achieving your digital transformation goals

Simon Trevers, Head of Strategy and Propositions at Inspera, shares practical advice from his 15 years of experience helping organisations move from paper to on-screen exams.

The transition from traditional paper-based exams to digital assessment represents a pivotal strategic shift for awarding organisations. This move is no longer a question of if, but when and how.

Digital assessment offers transformative opportunities to enhance the validity, reliability, and accessibility of qualifications. On-screen exams can deliver greater efficiency, richer data insights, and an assessment experience more aligned with modern educational and professional environments.

However, the path to successful adoption is complex and requires careful strategic planning. It involves significant considerations, from technology investment and security protocols to change management and stakeholder communication. This article outlines the key opportunities and challenges, and a framework of critical technical and operational considerations, to ensure a successful and sustainable transition.

By proactively addressing these factors, organisations can mitigate risks and fully harness the potential of digital assessment to future-proof their qualifications and maintain their competitive edge.

1. Set your goals for digital assessment

Education and professional development are undergoing a profound digital transformation that requires a re-evaluation of traditional assessment. Before embarking on this journey, the critical first step is to define the desired outcome.

A clear vision, shared across the organisation, is the foundation of a successful digital transformation. Potential strategic goals for adopting digital assessment include:

Enhancing efficiency. Streamlining assessment production through single-source authoring for both paper and on-screen.

Boosting quality assurance. Leveraging e-marking workflows and the precision of automated marking for objective questions.

Improving candidate experience. Offering authentic, real-world scenarios that are only possible through on-screen delivery.

Adopting modern assessment models. Using a mix of formative, diagnostic, and summative assessments to provide ongoing insights into learner progression.

Strengthening security. Moving away from the risks of physical exam papers towards a model where tests can be unique to each candidate.

Increasing flexibility. Enabling assessment delivery anywhere, anytime, with or without physical invigilation, for a wide range of contexts from high-stakes, closed-book exams to open-book formative assignments.

Differentiating your products. Introducing innovative, digitally-delivered assessments that stand apart from competitors.

2. Realising the benefits of digital assessment

A strategic transition unlocks benefits that can enhance every stage of the assessment lifecycle.

Enhanced validity and authenticity

Digital platforms enable the use of innovative item types and test models that provide a more authentic measure of a candidate's competence. This is achieved through:

  • Advanced question types. Moving beyond multiple-choice to include simulations, case studies, and interactive tasks, often augmented with tools like basic and scientific calculators.
  • Portable Custom Interactions (PCI). This industry standard is portable across different technology platforms, and can be used to develop unique, custom interactions that allow candidates to actively demonstrate specific skills.
  • A spectrum of test construction models. The journey of digital transformation allows for an evolution in test design. Organisations can progress from Fixed Forms (for both paper and on-screen) to more sophisticated models as their digital item bank matures. This includes Multiple Fixed Forms, Item Randomisation, Linear-on-the-Fly Testing (LOFT), which assembles a unique test for each candidate based on specific rules, and Computerised Adaptive Testing (CAT), which adjusts the difficulty of questions in real-time based on a candidate's performance.
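To make the contrast between these models concrete, the short sketch below assembles a LOFT-style test that is unique to each candidate yet still satisfies a simple blueprint, and shows a naive adaptive step that nudges item difficulty up or down after each response. The item bank, blueprint, and difficulty values are invented for illustration, and a production CAT engine would use a calibrated ability estimate (for example from item response theory) rather than this simple heuristic.

```python
import random

# Hypothetical item bank: each item carries the metadata a real bank would hold
# (topic, difficulty estimate), invented here purely for illustration.
ITEM_BANK = [
    {"id": f"ITM-{i:03d}", "topic": topic, "difficulty": round(random.uniform(-2, 2), 2)}
    for i, topic in enumerate(["algebra", "geometry", "statistics"] * 20)
]

def assemble_loft_test(candidate_id: str, blueprint: dict[str, int]) -> list[dict]:
    """LOFT-style assembly: build a unique form per candidate that still
    satisfies blueprint rules (here, a fixed number of items per topic)."""
    rng = random.Random(candidate_id)          # seeded so the form is reproducible
    form = []
    for topic, count in blueprint.items():
        pool = [item for item in ITEM_BANK if item["topic"] == topic]
        form.extend(rng.sample(pool, count))
    rng.shuffle(form)
    return form

def next_adaptive_item(administered: list[dict], last_correct: bool) -> dict:
    """Naive CAT step: move towards harder items after a correct answer and
    easier items after an incorrect one. A real CAT engine would use an IRT
    ability estimate rather than this simple heuristic."""
    current = administered[-1]["difficulty"]
    target = current + 0.5 if last_correct else current - 0.5
    remaining = [item for item in ITEM_BANK if item not in administered]
    return min(remaining, key=lambda item: abs(item["difficulty"] - target))

if __name__ == "__main__":
    form = assemble_loft_test("candidate-001", {"algebra": 4, "geometry": 3, "statistics": 3})
    print([item["id"] for item in form])
    print(next_adaptive_item([form[0]], last_correct=True)["id"])
```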

Improved security and integrity

Digital platforms offer robust, multi-layered security features. A privacy-by-design approach should incorporate:

  • Role-Based Access Control (RBAC), ensuring users can only access content and functions appropriate to their role.
  • Comprehensive lockdown. Utilising a secure browser that can detect and terminate unauthorised applications, prevent the use of secondary monitors, and monitor clipboard activity, while retaining flexibility for different invigilation models and resilience to disruption during a test.
  • Flexible invigilation models. Supporting both in-person invigilation and remote options, such as ‘record and review’ remote invigilation that maintains security even with intermittent connectivity.
  • Plagiarism and authorship detection. Integrating tools to check for originality and validate authorship of submitted work.
  • Detailed audit logs. Maintaining a full digital record of every event, from authoring and test-taker keystrokes to marking decisions.
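The sketch below illustrates the first and last of these controls in miniature: a role-to-permission mapping (RBAC) and an append-only audit trail that records every access decision with a timestamp. The roles, permissions, and event fields are assumptions made for illustration and do not describe any particular platform's security model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative role-to-permission mapping (RBAC); real platforms define their own.
ROLE_PERMISSIONS = {
    "author":      {"item.create", "item.edit"},
    "marker":      {"response.view", "response.mark"},
    "invigilator": {"session.monitor"},
    "admin":       {"item.create", "item.edit", "response.view", "response.mark",
                    "session.monitor", "results.publish"},
}

@dataclass
class AuditEvent:
    user: str
    action: str
    allowed: bool
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

AUDIT_LOG: list[AuditEvent] = []   # append-only record of every access decision

def authorise(user: str, role: str, action: str) -> bool:
    """Check the action against the user's role and log the decision either way."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    AUDIT_LOG.append(AuditEvent(user=user, action=action, allowed=allowed))
    return allowed

if __name__ == "__main__":
    authorise("j.smith", "marker", "response.mark")     # permitted
    authorise("j.smith", "marker", "results.publish")   # refused, but still logged
    for event in AUDIT_LOG:
        print(event)
```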

Greater efficiency and scalability

A modern, cloud-native Software-as-a-Service (SaaS) platform provides the foundation for efficiency and scale. Key attributes include:

  • High availability. Look for platforms with a proven uptime of 99.99% or higher, built on contemporary, dynamically scalable architecture (e.g. hosted on ISO 27001 certified services such as AWS) with automated load balancing and geographical redundancy.
  • Automated and on-screen human marking. Auto-marking objective questions, and giving human markers efficient on-screen tools and workflows (such as rubrics and dynamic answer keys), speeds up the entire process and ensures the quality and consistency of results, whatever form the assessment was submitted in (see the sketch after this list).
  • Single-source authoring. Authoring content once in a central digital item bank and deploying it to multiple formats (paper and on-screen) is a cornerstone of efficiency. It allows you to build a proven, high-quality digital asset over time and to transition to digital assessment at a pace that suits your needs.
  • On-demand testing. The efficiency gains allow for more frequent, flexible testing windows, increasing convenience for candidates – particularly important for formative assessment, and increasingly so for summative, high-stakes assessments.
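As a concrete illustration of the auto-marking attribute above, the sketch below scores a candidate's objective responses against an answer key, including partial credit for a multiple-response item and a tolerance for a numeric answer. The question types and scoring rules are simplified assumptions, not a description of any platform's marking engine.

```python
# Minimal auto-marking of objective questions against an answer key.
# Question types and scoring rules are simplified for illustration.

ANSWER_KEY = {
    "Q1": {"type": "single_choice", "correct": "B", "marks": 1},
    "Q2": {"type": "multi_response", "correct": {"A", "C", "D"}, "marks": 3},
    "Q3": {"type": "numeric", "correct": 9.81, "tolerance": 0.01, "marks": 2},
}

def mark_response(question_id: str, response) -> float:
    key = ANSWER_KEY[question_id]
    if key["type"] == "single_choice":
        return float(key["marks"]) if response == key["correct"] else 0.0
    if key["type"] == "multi_response":
        # One mark per correct option selected, minus wrong selections, floored at zero.
        correct = len(set(response) & key["correct"])
        incorrect = len(set(response) - key["correct"])
        return max(0.0, min(key["marks"], correct - incorrect))
    if key["type"] == "numeric":
        within = abs(float(response) - key["correct"]) <= key["tolerance"]
        return float(key["marks"]) if within else 0.0
    raise ValueError(f"Unknown question type: {key['type']}")

def mark_script(responses: dict) -> float:
    """Total auto-marked score for one candidate's objective questions."""
    return sum(mark_response(qid, resp) for qid, resp in responses.items())

if __name__ == "__main__":
    candidate = {"Q1": "B", "Q2": ["A", "C"], "Q3": "9.815"}
    print(mark_script(candidate))   # 1 + 2 + 2 = 5.0
```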

Richer data and analytics

Digital assessment generates a wealth of data. Every interaction can be captured and analysed to provide invaluable insights.

  • Monitor item bank health. Track item performance over time (for example, facility and discrimination indices; see the sketch after this list) to ensure the validity and reliability of the entire bank.
  • Support pre-testing and standard setting. Use powerful cross-test metrics to calibrate new items and set standards with psychometric confidence.
  • Inform the authoring process. Feed analytics back to content creators to build and retain a bank of proven, effective items.
  • Provide actionable reports. Utilise built-in reporting and automated notification systems to keep stakeholders informed.
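To give a flavour of the item-level monitoring described above, the sketch below computes two classical statistics from a small matrix of scored responses: the facility (proportion correct) and a point-biserial discrimination for each item. It assumes dichotomously scored items and invented data, and is a minimal illustration of the analysis a platform would automate at scale, not a full psychometric toolkit.

```python
from statistics import mean, pstdev

# Rows are candidates, columns are items; 1 = correct, 0 = incorrect (dichotomous scoring).
RESPONSES = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]

def item_statistics(responses: list[list[int]]) -> list[dict]:
    """Classical item analysis: facility and point-biserial discrimination per item."""
    totals = [sum(row) for row in responses]
    stats = []
    for col in range(len(responses[0])):
        scores = [row[col] for row in responses]
        facility = mean(scores)
        # Point-biserial: correlation between the item score and the total score.
        sd_item, sd_total = pstdev(scores), pstdev(totals)
        if sd_item == 0 or sd_total == 0:
            discrimination = 0.0   # everyone answered identically, or no spread in totals
        else:
            covariance = mean(s * t for s, t in zip(scores, totals)) - facility * mean(totals)
            discrimination = covariance / (sd_item * sd_total)
        stats.append({"item": col + 1, "facility": round(facility, 2),
                      "discrimination": round(discrimination, 2)})
    return stats

if __name__ == "__main__":
    for row in item_statistics(RESPONSES):
        print(row)
```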

Access to data and analytics is essential when moving from paper to digital assessment. If you are not transitioning entire qualifications to digital in one move, consider how you will ensure comparability between the paper and digital versions of an assessment, so that you can ease the transition and maintain both delivery modes.

Increased accessibility and inclusivity

A core principle of modern digital assessment is ensuring an inclusive experience. Platforms must be designed to meet the relevant Web Content Accessibility Guidelines (WCAG) and offer built-in tools such as integrated screen readers, spell-check, and robust multilingual support for both the user interface and individual questions.

Usability is equally important. A solution that tailors the experience and user interface to each stakeholder group lowers barriers to adoption and gives test takers the best possible chance to perform, without having to ‘learn’ the platform each time they take a test.

3. Critical success factors

A successful transition is built on a foundation of careful planning and strategic execution.

Adopt a phased and pilot-led approach

A ‘big bang’ might not suit your situation. A phased rollout, beginning with smaller-scale pilots, Proofs of Concept (POCs), trials, and practice tests, allows an organisation to test its systems, gather feedback, and build confidence in a controlled environment – minimising risk and maximising value.

Prioritise change management and stakeholder engagement

Digital transformation is a cultural change, not just a technical one. Success depends on:

  • A clear vision. The vision, and the comprehensive project plan that delivers it, must be communicated across the entire organisation.
  • Stakeholder buy-in. Engage all stakeholders (staff, examiners, centres, candidates) at all levels, clearly communicating the benefits to them, such as an improved test-taker experience, new technology-enhanced questions, and better performance insights.
  • Training and support. Ensure all stakeholders understand what they need to do and why, providing the necessary training to build competence and confidence.

Develop a robust content strategy

It is crucial to consider how you will migrate existing content and build your future item bank. Look for a platform that supports:

  • The IMS Question and Test Interoperability (QTI) standard. QTI is key to the portability of content between platforms (a minimal example follows this list).
  • Flexible migration tools. The platform should offer both APIs for automated, large-scale transfers and user-friendly manual upload tools for smaller or varied content sources.
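To show what content portability looks like in practice, the sketch below uses Python's standard library to serialise a single multiple-choice item in a QTI-style XML structure. The element and attribute names follow QTI 2.x conventions, but the snippet is simplified and not schema-validated; a real migration would rely on the platform's own QTI import and export tools and APIs.

```python
import xml.etree.ElementTree as ET

def build_choice_item(item_id: str, prompt: str, choices: dict[str, str], correct: str) -> str:
    """Serialise one multiple-choice item as simplified, QTI-style XML.
    Element names follow QTI 2.x conventions but this is illustrative, not schema-validated."""
    item = ET.Element("assessmentItem", identifier=item_id, title=prompt[:40],
                      adaptive="false", timeDependent="false")

    # Declare the expected response and its correct value.
    response = ET.SubElement(item, "responseDeclaration", identifier="RESPONSE",
                             cardinality="single", baseType="identifier")
    correct_el = ET.SubElement(ET.SubElement(response, "correctResponse"), "value")
    correct_el.text = correct

    # The visible question: a prompt and its answer options.
    body = ET.SubElement(item, "itemBody")
    interaction = ET.SubElement(body, "choiceInteraction", responseIdentifier="RESPONSE",
                                shuffle="true", maxChoices="1")
    ET.SubElement(interaction, "prompt").text = prompt
    for identifier, text in choices.items():
        ET.SubElement(interaction, "simpleChoice", identifier=identifier).text = text

    return ET.tostring(item, encoding="unicode")

if __name__ == "__main__":
    print(build_choice_item(
        item_id="ITM-001",
        prompt="Which gas makes up most of the Earth's atmosphere?",
        choices={"A": "Oxygen", "B": "Nitrogen", "C": "Carbon dioxide", "D": "Argon"},
        correct="B",
    ))
```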

Choose the right technology partner

Selecting a vendor is a critical decision. You are choosing a long-term partner, not just a supplier. A true partner acts as a trusted advisor, committed to your success. Key characteristics to look for include:

  • A true SaaS model. A partner offering a modern, cloud-native SaaS solution ensures you benefit from a single code-base, continuous enhancements, and no hidden hosting fees.
  • Proven reliability. The partner should demonstrate high availability (e.g. >99.99% uptime) and a proven understanding of high-stakes assessment delivery at a global scale.
  • Commitment to co-development. Look for a partner willing to work collaboratively to enhance features to meet your specific needs, such as refining marking workflows or adding new assessment tools.
  • A flexible commercial model. The partnership should be cost-effective, avoiding large upfront costs and minimum volume commitments. Look for volume-based discounts that scale with your success.
  • Openness to other partners. The ability to work with other providers to ensure all your needs are met, enabling a best-of-breed approach to delivering your digital assessment goals.
  • Comprehensive support. Support capabilities that meet your needs in assessment management and delivery, wherever you or your test takers are in the world.

Consider integration capabilities

  • A digital assessment solution will most likely be used alongside your wider assessment technology infrastructure and will need to leverage your existing investments in booking systems, user management, results publishing processes and more.
  • A wide range of APIs empowers your organisation to deploy the service in a way that meets your needs, giving you flexibility in how you use the platform at implementation and into the future (see the sketch after this list).
  • Considering integration from the start of your project will inform how you bring digital assessment authoring, delivery, monitoring and marking capabilities into your wider workflow. Workflows will likely differ by assessment type, from formative and practice assessments to summative exams, and from fixed-form to automatically constructed tests.
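The sketch below shows the general shape of such an integration: pulling confirmed bookings from an existing booking system and registering those candidates with an assessment platform over REST APIs (using the widely available requests library). The endpoint paths, payload fields, and authentication are hypothetical placeholders, not any vendor's actual interface.

```python
import requests

# Hypothetical endpoints and token: placeholders for your booking system and
# assessment platform APIs, not any vendor's actual interface.
BOOKING_SYSTEM_URL = "https://bookings.example.org/api/v1"
ASSESSMENT_PLATFORM_URL = "https://assessment.example.org/api/v1"
API_TOKEN = "replace-with-a-real-credential"

def fetch_bookings(exam_window: str) -> list:
    """Pull confirmed bookings for an exam window from the booking system."""
    response = requests.get(
        f"{BOOKING_SYSTEM_URL}/bookings",
        params={"window": exam_window, "status": "confirmed"},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

def register_candidates(test_id: str, bookings: list) -> None:
    """Push each booking to the assessment platform as a scheduled candidate."""
    for booking in bookings:
        payload = {
            "test_id": test_id,
            "candidate_id": booking["candidate_id"],
            "centre_id": booking["centre_id"],
            "scheduled_start": booking["start_time"],
        }
        response = requests.post(
            f"{ASSESSMENT_PLATFORM_URL}/registrations",
            json=payload,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=30,
        )
        response.raise_for_status()

if __name__ == "__main__":
    register_candidates("TEST-2025-MATHS-01", fetch_bookings("2025-06"))
```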

Plan how to build your item bank

It is important to consider how you will build an item bank that best realises the benefits of digital assessment delivery. Depending on the size of your existing item bank, and the processes you currently follow to create new items and maintain existing ones, a gradual progression through assessment models may be the best approach.

Unless you already have a large item bank, moving directly to ‘linear on the fly’ tests constructed automatically for each test taker is likely to be a big step. But there are ways to work towards that goal as you build your item bank: using different fixed forms for different cohorts, randomising item order, and potentially introducing branching logic as the bank grows (a minimal sketch of this stepping-stone approach follows).
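As a minimal sketch of that stepping-stone approach, the example below deterministically assigns a cohort to one of several pre-built fixed forms and then randomises item order per candidate with a reproducible seed. The form contents and identifiers are invented for illustration; a real platform would handle this assignment and randomisation for you.

```python
import hashlib
import random

# Hypothetical fixed forms: named collections of item identifiers, built in advance.
FIXED_FORMS = {
    "FORM_A": ["ITM-001", "ITM-002", "ITM-003", "ITM-004", "ITM-005"],
    "FORM_B": ["ITM-006", "ITM-007", "ITM-008", "ITM-009", "ITM-010"],
}

def assign_form(cohort: str) -> str:
    """Deterministically map a cohort (e.g. a centre or sitting) to one fixed form."""
    digest = hashlib.sha256(cohort.encode()).hexdigest()
    forms = sorted(FIXED_FORMS)
    return forms[int(digest, 16) % len(forms)]

def personalised_item_order(candidate_id: str, form_name: str) -> list:
    """Randomise item order per candidate, seeded so the order can be reproduced
    later for marking, audit, or resuming an interrupted test."""
    items = list(FIXED_FORMS[form_name])
    random.Random(f"{form_name}:{candidate_id}").shuffle(items)
    return items

if __name__ == "__main__":
    form = assign_form("centre-1234-june-sitting")
    print(form, personalised_item_order("candidate-001", form))
```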

Conclusion: Plan for the future

The move to digital assessment is an essential step for awarding organisations to remain relevant and competitive. The benefits are compelling, but the journey requires a clear vision and meticulous planning. By starting with your strategic goals, learning from the success of others, and focusing on the critical success factors of a phased rollout, stakeholder engagement, and a technology partnership, you can navigate the complexities of this change successfully.

Viewing this transition not as a mere technological upgrade but as a strategic transformation will empower senior leaders to guide organisations towards a future where assessment is more secure, authentic, and effective.

The views and opinions expressed in this article are those of the author and do not necessarily reflect the policy or position of AQA Global Assessment Services.

Start with a Digital Readiness Audit

If you want to modernise your exams, start with an objective review of your current capabilities to see how ready you are for digital delivery.

Our experts will conduct an in-depth audit and produce a bespoke plan for filling any gaps, to give you the clarity and confidence you need to begin your on-screen exams roll out.

