CBSE's On-Screen Marking Hit by Technical Glitches: Lessons for Every Institution Going Digital
CBSE's March 2026 OSM rollout faced server crashes, login failures, and teacher burnout. Here's what went wrong, why it happened, and the five things every institution must get right before launching digital evaluation.

What Happened
CBSE's On-Screen Marking system for Class 12 board exams went live in March 2026 — the largest single deployment of digital evaluation in Indian education history, covering over 18.5 lakh students, more than 1 crore answer sheets, and tens of thousands of evaluators across the country.
The ambition was right. The execution ran into problems.
Within the first week of evaluation, reports of login failures, server crashes, and stalled marking sessions surfaced from evaluation centres across Punjab, Haryana, and Delhi-NCR.
CBSE also issued an advisory on March 16–17 warning evaluators against sharing evaluation details on social media, threatening legal action. Approximately 20 misinformation incidents were flagged during the exam period.
None of this means digital evaluation is wrong. It means this particular rollout made avoidable mistakes — mistakes that every institution planning digital evaluation should study carefully.
Mistake 1: Insufficient Training Time
CBSE allocated approximately one week for evaluator training on the new OSM portal. For a system that fundamentally changes how teachers interact with answer sheets — from physical paper on a desk to digital images on a screen — one week is not enough.
Why This Matters
Evaluators who have spent years or decades marking physical answer sheets have deeply ingrained workflows. They flip pages, write marks in margins, circle errors, and add up totals by hand. Digital evaluation replaces every one of these actions with a different interface element — navigation buttons, mark entry fields, annotation tools, and submission workflows.
Learning a new interface is not just about understanding what each button does. It is about developing the muscle memory and confidence to work at speed without anxiety. That takes practice — not a single training session.
What Institutions Should Do Instead
Mistake 2: Untested Infrastructure at Scale
CBSE's portal reportedly struggled under the load of thousands of concurrent evaluators. Login failures and server crashes during the mock evaluation session — which is explicitly designed to find these problems — indicate that the infrastructure was not load-tested at production scale.
Why This Matters
Digital evaluation is a concurrent system. Unlike paper evaluation, where each evaluator works independently with physical papers, digital evaluation requires every evaluator to simultaneously connect to a central server, download answer sheet images, and submit marks. The server must handle logins, answer sheet image delivery, mark submissions, and session management for every active evaluator at the same time.
If the infrastructure is sized for 500 concurrent users but 4,000 log in at 9 AM on the first day, the system crashes. This is a predictable failure that load testing prevents.
What Institutions Should Do Instead
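Run a load test at the concurrency you actually expect, before evaluators ever log in. As a minimal sketch of what that can look like (the staging URL, endpoints, and credentials below are hypothetical placeholders, not CBSE's actual API), a script can simulate the 9 AM spike and report how many simulated evaluators fail:

```python
# Minimal load-test sketch: simulate the 9 AM spike of evaluators logging in
# and each pulling one answer-sheet image. URL, endpoints, and credentials
# are hypothetical placeholders for illustration only.
import asyncio
import time
import aiohttp

BASE_URL = "https://osm-staging.example.org"   # hypothetical staging server
PEAK_EVALUATORS = 4000                         # test at expected peak, not the average

async def one_evaluator(session: aiohttp.ClientSession, i: int) -> bool:
    """One simulated evaluator: log in, then download a single answer-sheet image."""
    try:
        async with session.post(f"{BASE_URL}/login",
                                json={"user": f"evaluator{i}", "password": "test"},
                                timeout=aiohttp.ClientTimeout(total=30)) as resp:
            if resp.status != 200:
                return False
        async with session.get(f"{BASE_URL}/booklets/next/image",
                               timeout=aiohttp.ClientTimeout(total=60)) as resp:
            await resp.read()                  # image delivery is the heaviest part
            return resp.status == 200
    except (aiohttp.ClientError, asyncio.TimeoutError):
        return False

async def main() -> None:
    start = time.perf_counter()
    connector = aiohttp.TCPConnector(limit=PEAK_EVALUATORS)
    async with aiohttp.ClientSession(connector=connector) as session:
        results = await asyncio.gather(*(one_evaluator(session, i)
                                         for i in range(PEAK_EVALUATORS)))
    failures = results.count(False)
    print(f"{PEAK_EVALUATORS} concurrent evaluators: "
          f"{failures} failures in {time.perf_counter() - start:.1f}s")

asyncio.run(main())
```

Dedicated tools such as Locust or JMeter do this more thoroughly, but even a script like this, run against a production-sized environment, turns the "predictable failure" into a test result instead of a first-day outage.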
Mistake 3: Rigid Interface Design
Teachers reported that CBSE's OSM portal used a color-coding system for marking that reset on minor errors, requiring them to re-evaluate the same booklet from the beginning. This suggests an interface designed for data integrity without sufficient consideration for evaluator workflow.
Why This Matters
Evaluators are not data entry operators. They are subject matter experts making professional judgments about student responses. The interface should support that judgment — not obstruct it with rigid validation rules that punish minor mistakes.
When an evaluator accidentally enters a wrong mark and the system forces them to restart the entire booklet, three things happen:
What Institutions Should Do Instead
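Treat a wrong entry as a one-keystroke correction, not a reason to restart. As one illustration of that principle (a sketch under assumed names, not CBSE's portal logic), each question's mark can be validated and saved independently, so the evaluator simply overwrites the one entry that was wrong:

```python
# Illustrative sketch of evaluator-friendly mark entry: every question is
# validated and saved on its own, and a wrong entry is fixed in place.
# Class and field names are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class BookletMarks:
    booklet_id: str
    max_marks: Dict[str, float]                    # question id -> maximum marks
    autosave: Callable[[str, str, float], None]    # persists one (booklet, question, mark)
    marks: Dict[str, float] = field(default_factory=dict)

    def enter_mark(self, question_id: str, mark: float) -> None:
        """Record or overwrite one question's mark, auto-saving immediately."""
        limit = self.max_marks[question_id]
        if not 0 <= mark <= limit:
            # Reject only this entry; everything already marked stays intact.
            raise ValueError(f"{question_id}: mark must be between 0 and {limit}")
        self.marks[question_id] = mark             # correction is just re-entry
        self.autosave(self.booklet_id, question_id, mark)

    def pending_questions(self) -> List[str]:
        """Questions still unmarked, so the evaluator always sees what is left."""
        return [q for q in self.max_marks if q not in self.marks]

    def total(self) -> float:
        return sum(self.marks.values())
```

The specific design matters less than the behaviour it guarantees: a slip of the finger costs seconds, never a whole booklet.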
Mistake 4: No Hybrid Fallback
When the OSM portal went down, evaluators had no fallback. They could not switch to paper evaluation temporarily, because the answer sheets were already scanned and in the digital pipeline. They could not continue working offline, because the system required a live server connection. They waited.
Why This Matters
In a system serving lakhs of students with result deadlines, any downtime directly extends the evaluation timeline. If 4,000 evaluators lose 2 hours to a server outage, that is 8,000 person-hours of lost evaluation capacity — equivalent to losing an entire day of evaluation for a mid-sized institution.
What Institutions Should Do Instead
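Build the fallback before you need it. One common pattern, sketched below with an assumed endpoint and payload shape, is an offline-first submission queue: every mark is written to local storage first and synced whenever the server is reachable, so an outage pauses uploads rather than evaluation.

```python
# Illustrative offline-first fallback: marks go into a local SQLite queue
# first, then flush to the central server whenever it is reachable.
# The server URL and payload shape are assumptions for illustration.
import json
import sqlite3
import urllib.error
import urllib.request

SYNC_URL = "https://osm.example.org/api/marks"     # hypothetical endpoint

class OfflineMarkQueue:
    def __init__(self, path: str = "pending_marks.db") -> None:
        self.db = sqlite3.connect(path)
        self.db.execute("CREATE TABLE IF NOT EXISTS pending "
                        "(id INTEGER PRIMARY KEY, payload TEXT NOT NULL)")
        self.db.commit()

    def submit(self, booklet_id: str, question_id: str, mark: float) -> None:
        """Always succeeds locally, so the evaluator keeps working during outages."""
        payload = json.dumps({"booklet": booklet_id, "question": question_id, "mark": mark})
        self.db.execute("INSERT INTO pending (payload) VALUES (?)", (payload,))
        self.db.commit()

    def flush(self) -> int:
        """Push queued marks to the server; stop quietly if it is unreachable."""
        sent = 0
        rows = self.db.execute("SELECT id, payload FROM pending ORDER BY id").fetchall()
        for row_id, payload in rows:
            request = urllib.request.Request(SYNC_URL, data=payload.encode(),
                                             headers={"Content-Type": "application/json"})
            try:
                urllib.request.urlopen(request, timeout=10)
            except (urllib.error.URLError, OSError):
                break                              # server still down; retry later
            self.db.execute("DELETE FROM pending WHERE id = ?", (row_id,))
            self.db.commit()
            sent += 1
        return sent
```

A full implementation would also pre-fetch and cache the next batch of answer-sheet images locally, so downloads as well as uploads survive an outage, and would document exactly who triggers the failover and how evaluators are informed.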
Mistake 5: Communication Failures
CBSE's response to the technical difficulties included a social media advisory threatening legal action against evaluators who shared information about the evaluation process. This approach — addressing communication about problems rather than the problems themselves — eroded trust among evaluators at precisely the moment when trust was most needed.
Why This Matters
Evaluators encountering a new system for the first time need to feel supported, not surveilled. When teachers shared frustrations on social media, it was not because they wanted to undermine CBSE — it was because they were struggling with a system that was not working as promised and they had no other outlet.
What Institutions Should Do Instead
What CBSE Got Right
It is important to acknowledge what CBSE did accomplish: scanning and digitizing more than 1 crore answer sheets and bringing tens of thousands of evaluators onto a single digital platform is a logistical feat no Indian board had attempted at this scale before.
The problems are execution problems, not conceptual problems. Digital evaluation at CBSE's scale is achievable — it requires better preparation, infrastructure, and evaluator support than this rollout delivered.
The Five Non-Negotiables
For any institution planning digital evaluation, here are the five things you must get right:
1. Train Early, Train Often
Start evaluator training 4–6 weeks before live evaluation. Use the production system. Include mock evaluations with real answer sheets. Provide ongoing support during the evaluation window.
2. Load Test at Production Scale
Test with the actual number of concurrent evaluators you expect. Test image delivery, mark submission, and session management under peak load. If the mock evaluation reveals problems, delay the live rollout.
3. Design the Interface for Evaluators
Build the UI around how evaluators actually work. Allow easy mark correction. Auto-save continuously. Minimize clicks and cognitive load. The system should feel faster than paper, not slower.
4. Plan for Failure
Implement offline capability. Document failover procedures. Communicate proactively during outages. Every hour of downtime costs thousands of person-hours of evaluation capacity.
5. Support Your Evaluators
Establish real-time support channels. Acknowledge problems quickly and transparently. Treat evaluator feedback as valuable data. Their experience determines whether your digital evaluation succeeds or fails.
The Bigger Picture
CBSE's OSM rollout will be studied for years — both for its ambition and for its stumbles. The institutions that learn from this experience and prepare accordingly will have smoother transitions. The institutions that assume "we'll figure it out" will repeat the same mistakes.
Digital evaluation is not optional anymore. With 74% of Indian exam boards adopting some form of digital evaluation, the question is not whether to go digital — it is whether to go digital well or go digital badly.
The difference is preparation.
Ready to digitize your evaluation process?
See how MAPLES OSM can transform exam evaluation at your institution.