Industry · 2026-04-07 · 8 min read

CBSE's Twice-a-Year Class 10 Exams Are Doubling the Evaluation Burden

CBSE's new two-attempt policy for Class 10 board exams starting 2026 means double the answer books, double the correction work, and no proportional increase in evaluator capacity. How digital evaluation offers a path forward.

A Policy Designed for Students, Not for Evaluators

In June 2025, CBSE officially notified that Class 10 board examinations would be conducted twice per year starting from the 2025-26 academic session. The first cycle ran from February 17 to March 11, 2026. The second cycle is scheduled to begin on May 15, 2026.

The intent is student-friendly: consistent with National Education Policy 2020 recommendations, students can appear in both attempts and count the better score as their final result. Stress reduction, reduced exam anxiety, more learning opportunities — the goals are sound.

But the policy quietly creates a problem that policymakers did not fully address: every teacher who evaluates board papers will now evaluate twice. Every evaluation centre will process twice the volume. Every logistical chain — scanning, dispatch, correction, tabulation, result processing — runs twice within a calendar year.

The Arithmetic of Double Evaluation

CBSE's 2026 board examinations involve approximately 46 lakh students for Class 10 and Class 12 combined. Class 10 alone accounts for roughly 22-24 lakh students nationally. With multiple subjects per student, the number of answer scripts in a single cycle runs into several crore.

When teachers describe the situation, the numbers become concrete. A typical evaluator assigned to board examination duty corrects between 25 and 30 answer books per day over a 10-12 day evaluation window. Under the old single-cycle system, this was an annual obligation that teachers planned around. Under the new two-cycle system, the same obligation repeats within 60 days of the first round concluding.
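The scale involved can be sanity-checked with a back-of-envelope calculation using the figures above. All inputs below are midpoints of the article's own estimates, not official CBSE data:

```python
# Back-of-envelope estimate of Class 10 evaluation workload per cycle.
# All inputs are rough figures cited in the article, not official CBSE data.

students_class10 = 23_00_000      # midpoint of the ~22-24 lakh estimate
subjects_per_student = 5          # assumed typical externally examined subjects
books_per_evaluator_day = 27      # midpoint of 25-30 answer books per day
window_days = 11                  # midpoint of the 10-12 day window

scripts_per_cycle = students_class10 * subjects_per_student
evaluator_days = scripts_per_cycle / books_per_evaluator_day
evaluators_needed = evaluator_days / window_days

print(f"Scripts per cycle:  {scripts_per_cycle:,}")
print(f"Evaluator-days:     {evaluator_days:,.0f}")
print(f"Evaluators needed:  {evaluators_needed:,.0f}")
```

On these assumptions, a single cycle generates on the order of a crore answer scripts and requires tens of thousands of trained evaluators working the full window. A second cycle does not change any of these per-cycle numbers; it simply demands the same mobilisation twice within months.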

Teachers across CBSE-affiliated schools have flagged three specific concerns:

  • Curriculum continuity: Evaluation duty pulls teachers away from classroom instruction during active academic terms. Two rounds mean two interruptions.
  • Quality degradation: When evaluation and teaching overlap, the cognitive load on evaluators rises, and rushed correction raises the probability of marking inconsistencies.
  • Planning paralysis: School administrators cannot schedule curriculum, internal assessments, or remedial classes reliably when teachers may be called for evaluation duty twice in the same semester.

One school principal quoted in media coverage of the policy change put it directly: "Two rounds of board exams mean twice the correction work, twice the coordination, and no real break for lesson planning."

CBSE's Response: OSM for Class 12, But Not Class 10

CBSE has recognised part of this problem. For Class 12 answer sheets in 2026, CBSE has implemented On-Screen Marking (OSM) — a digital evaluation system where answer sheets are scanned, uploaded to a central portal, and evaluated by teachers on their computer screens at their own school premises.

This removes the need for teachers to travel to evaluation centres. It allows evaluation to be distributed across more evaluators simultaneously. It automates mark tabulation, eliminating totalling errors.

But Class 10 answer sheets continue to be evaluated through the traditional physical method in 2026. Teachers still travel. Bundles still move. Totalling is still manual.

The asymmetry matters: the very students who will now appear for exams twice — Class 10 students — are the ones whose papers will be evaluated through the older, more burdensome system. If CBSE eventually extends the two-attempt policy to Class 12 as well (which NEP 2020 broadly envisages), the evaluation strain will compound further.

Why Scaling Traditional Evaluation Does Not Work

The instinctive response to needing more evaluation capacity is to assign more evaluators. CBSE has already moved in this direction — 2026 guidelines reduce the per-evaluator answer book quota to complete evaluation within 8-10 days instead of the previous 10-12 day window, distributing the load across more teachers.

But this solution has structural limits:

Qualified evaluator pool is finite. Board evaluators must be trained, subject-qualified, and CBSE-approved. Expanding the pool takes time and structured capacity-building programs. You cannot simply add evaluators the way you add temporary warehouse staff.

Evaluation centres have physical constraints. Running more evaluators requires more space, more infrastructure, more supervisory personnel, and more administrative overhead.

Quality oversight does not scale linearly. In physical evaluation, quality checks — moderation, random rescrutiny, head examiner reviews — require dedicated human oversight at each step. Doubling evaluators does not automatically double the quality assurance capacity.

The cost structure scales with volume. Paper transport, centre rentals, evaluator honoraria, supervisory staff — every element of the traditional evaluation chain is proportional to volume. Double the scripts, and costs roughly double.

What Digital Evaluation Changes About This Equation

On-screen marking breaks the linear relationship between evaluation volume and resource requirements in two important ways.

First, evaluators work from where they are. When answer sheets are digitized and distributed through a portal, an evaluator in Lucknow can evaluate a script from a school in Agra without anyone transporting anything. The effective evaluator pool expands to anyone with an internet connection and an approved computer — nationally, not locally.

Second, quality assurance is built into the software layer. Modern OSM platforms can enforce minimum time-per-script thresholds (flagging answers marked suspiciously fast), require double valuation for high-stakes scripts, route borderline cases automatically to senior evaluators, and log every marking action with a timestamp. These controls operate at the system level, not through additional human supervisors.
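To make the idea of software-layer quality assurance concrete, here is a minimal sketch of how such per-script checks might look. The thresholds, field names, and rules are illustrative assumptions, not taken from CBSE's or any vendor's actual OSM system:

```python
# Minimal sketch of system-level QA checks an on-screen marking platform
# might apply to each marking event. Thresholds and field names are
# illustrative assumptions, not from any actual OSM implementation.

from dataclasses import dataclass, field
from datetime import datetime, timezone

MIN_SECONDS_PER_SCRIPT = 120        # flag scripts marked suspiciously fast
BORDERLINE_MARK = 33                # scores near the pass mark get a second look

@dataclass
class MarkingEvent:
    script_id: str
    evaluator_id: str
    marks_awarded: int
    seconds_spent: int
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def qa_actions(event: MarkingEvent) -> list[str]:
    """Return the follow-up actions this marking event triggers."""
    actions = []
    if event.seconds_spent < MIN_SECONDS_PER_SCRIPT:
        actions.append("flag: marked faster than minimum time threshold")
    if abs(event.marks_awarded - BORDERLINE_MARK) <= 2:
        actions.append("route: borderline score, send for double valuation")
    # Every event is logged with its timestamp regardless of flags.
    actions.append(f"log: {event.script_id} by {event.evaluator_id} "
                   f"at {event.timestamp.isoformat()}")
    return actions

event = MarkingEvent("SCR-00421", "EV-107", marks_awarded=32, seconds_spent=95)
for action in qa_actions(event):
    print(action)
```

The point of the sketch is that these rules run on every script automatically; adding a second examination cycle doubles the data flowing through them, not the number of supervisors needed to enforce them.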

For CBSE's twice-yearly Class 10 examination challenge specifically, digital evaluation offers a path to handling double the volume without doubling the logistical burden — because the logistical burden per script drops dramatically when physical transport and physical presence are removed from the chain.

The Infrastructure Question

CBSE's infrastructure requirements for OSM — a computer lab with static IP, Windows 8 or above, minimum 4 GB RAM, 2 Mbps internet, uninterrupted power supply — are modest by 2026 standards. Most secondary schools in urban and semi-urban areas already have labs that meet or exceed these specifications.

The challenge is less technical than procedural: training evaluators, establishing protocols, and ensuring that evaluation technology is reliable enough that teachers trust it. CBSE's Class 12 OSM rollout in 2026 is, in effect, the proof of concept. If it runs smoothly, the case for extending digital evaluation to Class 10 will become difficult to resist.

What Other Boards Are Watching

CBSE's twice-yearly examination is being watched closely by state boards exploring similar reforms under NEP 2020 implementation plans. Karnataka, Tamil Nadu, and Maharashtra have all conducted internal reviews of two-attempt models at the state level.

If state boards with even larger student volumes than CBSE move toward multiple examination cycles, the evaluation infrastructure problem will become acute far beyond the CBSE system. State boards that have not yet invested in digital evaluation infrastructure will face the same strain CBSE is navigating now — except at larger scale, with fewer resources.

Madhya Pradesh's Parallel Experiment in Accountability

The Madhya Pradesh Board of Secondary Education (MPBSE) has taken a different approach to evaluation quality in 2026 — introducing a financial penalty of Rs 200 per confirmed marking error for evaluators. With 40,000 answer books being evaluated in Indore alone, the board is betting that financial accountability will tighten evaluation standards.

It is a blunt instrument, and its limitations are real: a penalty system cannot catch errors that are never detected, does not address the structural causes of inconsistent marking, and may create perverse incentives for evaluators to mark conservatively rather than accurately. But it signals that boards are aware of the evaluation quality problem and are actively experimenting with solutions.

Digital evaluation's approach to the same problem is systemic rather than punitive. When every mark entry is logged, when double valuation is automatic, and when software flags statistical anomalies, the system catches problems before results are published — not after complaints are filed.

The Policy Direction Is Clear

Whether or not CBSE's Class 12 OSM rollout runs without incident in 2026, the structural logic is inescapable. Twice-yearly examination cycles require evaluation systems that can scale without linear cost increases, that can distribute work geographically, and that can maintain quality standards under compressed timelines.

The question for India's examination infrastructure is not whether digital evaluation becomes the norm. It is how quickly, and how well, the transition is managed.

---

Related Reading

  • CBSE Introduces On-Screen Marking for Class 12: What It Means for Digital Evaluation in India
  • Lessons From Large-Scale Onscreen Marking Rollouts
  • The Hidden Costs of Paper-Based Exam Evaluation

Ready to digitize your evaluation process? See how MAPLES OSM can transform exam evaluation at your institution.