Why CBSE Had to Publicly Defend Its OSM System — and What It Tells Us About Digital Transitions
In late April 2026, CBSE was compelled to issue a public statement refuting media reports of OSM technical glitches. The episode is a case study in how perception management is as critical as technology implementation during digital evaluation rollouts.

The Statement Nobody Expected to Need
On April 29, 2026, the Controller of Examinations at CBSE, Sanyam Bhardwaj, issued a public clarification that should not have been necessary. Media reports had emerged suggesting that the board's newly introduced On-Screen Marking (OSM) system was experiencing technical glitches and that Class 12 results would be delayed.
Bhardwaj described the reports as "far away from facts." The evaluation, he stated, was proceeding "perfectly, better than the previous evaluation." Class 12 results, he confirmed, remained on track for the third week of May 2026.
That a serving Controller of Examinations had to go on record to defend a technology system in the middle of an active evaluation cycle is, by itself, a significant data point. It is not an indictment of the OSM system. It is a lesson in what happens when a first-time, large-scale digital transition encounters an information vacuum.
The Scale of What CBSE Is Running
The numbers are worth stating clearly. Approximately 46 lakh students appeared for CBSE Class 12 board examinations in 2026, across India and 26 other countries. The implementation of OSM means that every single answer sheet written by those students was physically scanned, uploaded to a secure server, and made available for evaluation through a digital portal.
This is not a pilot programme. This is the largest single deployment of on-screen marking for a school-leaving examination anywhere in the world by volume. The investment is estimated at approximately ₹32 crore for scanning infrastructure alone, alongside a comparable commitment to training, technical support, and evaluator onboarding.
When systems of this scale launch, media scrutiny is inevitable. When they launch successfully, the absence of dramatic failures sometimes reads as suspicious rather than reassuring.
How OSM Works and Why Glitches Were Always Unlikely
The OSM workflow CBSE deployed for 2026 follows a sequence that had been tested across multiple state boards and international examination bodies before this implementation.
The system is designed with redundancy at multiple points. Scripts are not stored or marked on the evaluator's local device; scanned images are served from a central server. Partial progress is saved automatically, so if an evaluator's session is interrupted, completed work is not lost. These are standard features of enterprise-grade assessment platforms.
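The autosave behaviour described above can be sketched in a few lines. This is a minimal illustration, not CBSE's actual implementation: all names (`MarkingSession`, `record_mark`, `resume`) are hypothetical, and the "store" stands in for whatever server-side persistence a real platform would use.

```python
class MarkingSession:
    """Toy model of server-side autosave: every mark an evaluator
    enters is persisted immediately, so an interrupted session
    resumes with no completed work lost."""

    def __init__(self, store):
        self.store = store  # server-side persistence (a dict here)

    def record_mark(self, script_id, question, mark):
        # Persist each entry as it happens, not on a final "submit".
        self.store.setdefault(script_id, {})[question] = mark

    def resume(self, script_id):
        # After a dropped connection, reload all marks already saved.
        return dict(self.store.get(script_id, {}))


store = {}
session = MarkingSession(store)
session.record_mark("S001", "Q1", 4)
session.record_mark("S001", "Q2", 3)

# Simulate an interruption: a fresh session against the same store
# recovers everything recorded so far.
recovered = MarkingSession(store).resume("S001")
print(recovered)  # {'Q1': 4, 'Q2': 3}
```

The design point is that the unit of persistence is the individual mark, not the completed script, which is why a momentary network drop at an evaluator's location is a nuisance rather than a data-loss event.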
Evaluator feedback from the 2026 cycle has been strikingly positive. Teachers have reportedly been sending photographs and messages expressing gratitude to the board, describing the experience as superior to physical marking. One widely circulated message from an evaluator described being "fortunate" to participate in what they called a landmark change.
Why Rumours Emerge at the First Deployment
The emergence of glitch rumours around a first-time deployment is not unique to CBSE. It is a pattern that repeats across every major digital transition in examination systems, and understanding the mechanism helps institutions prepare for it.
Information asymmetry: In a distributed evaluation system, thousands of evaluators are simultaneously encountering the platform for the first time. A small percentage will face login delays, browser compatibility issues, or network problems at their location. They may mention these experiences in WhatsApp groups or to local correspondents. These individual friction points are not system failures, but they can be reported as such.
Vested interests: Physical evaluation creates ecosystems — transport vendors who move answer scripts, local coordination logistics, evaluators who prefer familiar processes. OSM removes much of this. Some of those with a stake in the old system have an interest in narratives of digital failure.
Novelty bias: Editors and reporters are more likely to publish stories about new systems failing than about new systems working. A story about 46 lakh answer sheets evaluated without incident is less compelling than a story about technical delays affecting 46 lakh students.
Missing baseline: First-time deployments have no prior track record to point to. This year, if CBSE completes evaluation within the same or a shorter timeframe than 2025 — which all indicators suggest — that data point will exist for future reference.
The Timeline: Then and Now
| Evaluation Metric | Physical Evaluation (2025) | OSM Evaluation (2026) |
|---|---|---|
| Evaluation period | March–April | March–April |
| Result declaration | Late May/June | Third week of May |
| Mark verification requests | Over 14 lakh annually | Not applicable (auto-totalling) |
| Evaluator location requirement | Fixed evaluation centres | Any authorised internet location |
| Carbon copy error risk | Present | Eliminated |
| Post-result revaluation demand | High | Significantly reduced |
The elimination of mark verification requests is particularly significant. CBSE had previously processed over 14 lakh applications for answer script verification annually after results. Under OSM, auto-totalling makes the arithmetic verification service redundant. This is a downstream efficiency for students, institutions, and admission processes, and one that featured in little of the media coverage of the transition.
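The reason auto-totalling removes the arithmetic verification workload is straightforward: the total is computed from the question-wise marks rather than re-added by hand, so there is no transcription step to verify. A toy illustration (all names hypothetical):

```python
def auto_total(question_marks):
    """Compute a script's total directly from per-question marks.
    No manual re-addition means no carbon-copy totalling errors."""
    return sum(question_marks.values())


# Question-wise marks for one (hypothetical) answer script.
script = {"Q1": 4, "Q2": 3, "Q3": 5, "Q4": 0, "Q5": 6}
print(auto_total(script))  # 18
```

Under physical evaluation, a verification request re-checks exactly this addition; when the total is derived programmatically from the same stored marks that appear on the marksheet, the check can never disagree with itself.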
What Institutions Rolling Out Digital Evaluation Can Learn
The CBSE episode has practical implications for universities, autonomous colleges, and state boards planning their own OSM implementations.
Communication must precede technology. Evaluators, head examiners, and institutional coordinators need to understand the system before they encounter it. A user who knows what to expect from a login screen does not interpret a momentary delay as a system failure.
Build a visible feedback channel. The CBSE system apparently receives evaluator feedback through informal channels (WhatsApp messages to officials). Formalising this — a real-time dashboard showing evaluation progress across centres, an evaluator helpdesk with visible response times — transforms anecdote into data.
Plan the narrative, not just the infrastructure. A press briefing that proactively states "evaluation is proceeding as follows, results are expected on this date, here is the helpline" is far cheaper than a reactive clarification issued after rumours have already been published.
The CBSE OSM 2026 implementation appears to be succeeding. The system is on track. Evaluators are satisfied. Results will be declared in the third week of May for 46 lakh students across the world. The fact that a public clarification was required in week six of a ten-week implementation cycle is a communication lesson, not a technology one.
Ready to digitize your evaluation process?
See how MAPLES OSM can transform exam evaluation at your institution.