Industry · 2026-05-15 · 6 min read

CBSE OSM 2026: Why Students Say Digital Marking Is Unfair — And What the Data Shows

After CBSE declared Class 12 results with an 85.20% pass rate — down from 88.39% last year — students across India are demanding free re-evaluation, alleging on-screen marking is unfair. A closer look at what the numbers actually reveal.


The Backlash After India's Largest OSM Result

On 13 May 2026, CBSE declared Class 12 results for over 17.68 lakh candidates. The pass percentage came in at 85.20% — a drop of 3.19 percentage points from the 88.39% recorded in 2025. Within hours, social media was flooded with students claiming their marks were unexpectedly low and attributing the drop to CBSE's newly introduced On-Screen Marking system.

Students and parents wrote collectively to the Ministry of Education and the Prime Minister's Office demanding free manual re-evaluation, fee waivers for answer sheet photocopies, and an urgent review of the digital evaluation process. "Our futures are at risk," read one widely shared letter posted across social media platforms.

This is the first major public controversy to emerge from India's most ambitious digital evaluation rollout — nearly 9.8 million answer scripts evaluated digitally for the first time. Understanding it clearly matters, both for students whose marks are in dispute and for the broader adoption of digital evaluation across India's examination system.

What Students Are Claiming

The student grievances cluster around three distinct issues:

Unexpectedly low marks: Students report scores significantly below their anticipated performance based on how they believed the exam went. In previous years, similar gaps were sometimes explained by totalling errors discovered during marks verification — a mechanism that no longer exists under OSM.

High re-evaluation fees: CBSE discontinued post-result marks verification for Class 12 starting with the 2026 cycle. Students must now pay re-evaluation fees for any dispute. Many argue that since answer sheets are already digitised, the cost of a second review should be far lower than under the manual system.

Evaluation quality concerns: Teachers and students reported that evaluators faced screen visibility issues and slow software response times during the evaluation period, raising questions about whether every answer was reviewed with adequate care.

What the Data Actually Shows

Pass Rate Trend

Year   Pass %   Evaluation Mode
2023   87.33%   Manual
2024   87.98%   Manual
2025   88.39%   Manual
2026   85.20%   OSM (first year)

The 3.19-point drop is real. The question is what caused it — and whether OSM is the cause, a contributing factor, or a coincidence.
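As a quick sanity check, the headline figures can be reproduced from the table above. A minimal sketch; the values are copied directly from the table, not pulled from any official source:

```python
# Pass-rate data from the table above (year -> pass %)
pass_rates = {2023: 87.33, 2024: 87.98, 2025: 88.39, 2026: 85.20}

# Year-over-year change in percentage points for the first OSM year
drop_2026 = round(pass_rates[2025] - pass_rates[2026], 2)
print(drop_2026)  # 3.19

# Context: average pass rate across the three manual-evaluation years
manual_avg = round(sum(pass_rates[y] for y in (2023, 2024, 2025)) / 3, 2)
print(manual_avg)  # 87.9
```

The 2026 figure sits about 2.7 points below the three-year manual average, so the drop is larger than ordinary year-to-year variation but not dramatically so.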

Regional Performance

Trivandrum led all CBSE regions at 95.62%, followed by Chennai at 93.84% and Bengaluru at 93.19%. Prayagraj recorded the lowest at 72.43%. This variation mirrors pre-OSM regional performance patterns from previous years, suggesting no uniform digital-marking penalty across geographies.

Gender Performance

Girls recorded 88.86% and boys 82.13%, a 6.73-point gap and the widest in recent years. A substantial gap in girls' favour was a consistent feature of pre-OSM results as well, so the gap itself cannot be attributed to the evaluation mode.

What OSM Changes — and What It Does Not

OSM changes the interface through which evaluators mark answer scripts. It does not change the assessment criteria, the marking schemes, the subjects being evaluated, or whether a candidate answered questions correctly.

What OSM does eliminate is a specific category of error that historically benefited some students: totalling mistakes. In manual evaluation, wrong addition of sub-question marks was among the most common errors caught during post-result marks verification. The verification step — now discontinued — functioned in practice as a mechanism that occasionally upgraded marks when totalling errors were found. Removing that correction step will statistically lower the apparent pass rate in the transition year, even if individual question-level marking is equally rigorous.

This is not the same as OSM causing unfair marking. It is the removal of an accidental upward correction that existed in the manual system.

The Three Sources of the Pass Rate Drop

Analysts examining the 2026 CBSE result attribute the 3.19-point drop to three overlapping factors:

  • Removal of totalling corrections: Post-result verification upgrades no longer apply.
  • First-year evaluator adjustment: Evaluators marking on screen for the first time may have been more conservative or inconsistent than in mature OSM deployments.
  • Examination difficulty variation: The Class 12 2026 paper was assessed by multiple coaching institutes as moderately harder in key subjects compared to 2025.

OSM is one factor, not the sole cause.

Where Legitimate Concerns Exist

The transition to OSM raised genuine operational issues that deserve acknowledgement rather than dismissal.

Evaluator training gap: This was the first year of full-scale Class 12 OSM. Multiple states reported evaluators unfamiliar with the software interface, leading to a slower evaluation pace and potential fatigue-related inconsistencies late in evaluation sessions. Fatigue in on-screen evaluation is a different ergonomic challenge from physical fatigue with paper scripts.

Scan quality and screen legibility: Reading handwritten exam responses on a monitor is cognitively different from reading them on paper. Scan resolution, monitor calibration, and lighting at evaluation centres all affect readability. First-generation deployments often underinvest in this infrastructure layer.

Software performance: Reports of slow interface response times during peak evaluation periods are consistent with what other examination boards experienced in their first OSM deployments. These are solvable engineering problems, but they add pressure in a high-stakes context.

The Re-Evaluation Fee Argument

Students are making a structurally valid point about re-evaluation costs. Digitisation of answer scripts should substantially reduce the marginal cost of a second-opinion evaluation. Under the manual system, retrieving a physical answer book from storage, dispatching it to a second evaluator, and re-totalling the marks had real logistical costs. Under OSM, the script exists as a file: a second evaluator can be assigned to it without any physical handling.

The legitimate ask is not free re-evaluation for everyone, but a rationalised fee structure that reflects actual digital marginal costs, and a faster, more transparent grievance process. This is a valid policy conversation for CBSE and other boards to have as OSM matures. Several examination boards that have adopted digital evaluation internationally have moved to lower per-paper re-evaluation charges precisely because digitisation justified the reduction.

What This Transition Tells Exam Systems About Change Management

The backlash following CBSE's OSM result is partly a change management problem. When evaluation moves from a decades-old process to an unfamiliar digital format, any unexpected outcome gets attributed to the new system, even when multiple other factors are in play. This is predictable. It is also preventable with better communication.

Examination boards planning OSM adoption should:

  • Publish explicit, plain-language explanations of what OSM changes and what it does not, before results are declared
  • Conduct mock OSM evaluation sessions so evaluators develop baseline proficiency before live deployment
  • Establish a digital grievance portal with real-time status tracking and defined response timelines
  • Rationalise re-evaluation fees to reflect the actual lower cost of digital second-marking
  • Communicate directly with students through school principals and college notice boards, not just official website notices that most students will not read until after results

CBSE's 2026 OSM rollout evaluated nearly 10 million answer scripts with a complete digital audit trail, a genuine operational achievement. The first-year challenges in evaluator training, software performance, and student communication are addressable. How examination bodies respond to legitimate concerns will determine whether this transition builds or erodes the institutional trust that digital evaluation ultimately depends on.

Related Reading

  • CBSE Eliminates Marks Verification: What OSM Makes Possible
  • CBSE Class 12 2026 Pass Drop and OSM Rigour Explained
  • What Is On-Screen Marking?

Ready to digitize your evaluation process? See how MAPLES OSM can transform exam evaluation at your institution.