5 Lessons from Large-Scale On-Screen Marking Rollouts (That CBSE Learned the Hard Way)
CBSE's OSM rollout for 46 lakh students hit technical glitches, training gaps, and teacher pushback. Here are 5 lessons from institutions that have navigated these challenges across multiple evaluation cycles.

Context: CBSE's Bold Move
In February 2026, CBSE announced that all Class 12 board examination answer sheets would be evaluated through On-Screen Marking (OSM). The scale was unprecedented — 46 lakh students across India and 26 countries. The timeline was aggressive — just eight days between the announcement and the start of exams.
What followed was a textbook case study in the difference between deciding to go digital and executing the transition well. Mock evaluations hit technical glitches. Teachers reported login failures, server overload, and insufficient training. Delhi government school teachers formally asked CBSE to suspend OSM for the current year.
None of these problems are unique to CBSE. Every institution that has deployed digital evaluation at scale has faced similar challenges during their first cycle. The difference is in preparation. Here are five lessons that experienced institutions have learned — lessons that would have made CBSE's rollout significantly smoother.
Lesson 1: Your First Cycle Should Not Be Full Scale
What happened at CBSE: OSM was deployed for all Class 12 answer sheets simultaneously — every subject, every school, every student.
What experienced institutions do differently: Start with a subset. Most successful digital evaluation deployments begin with a pilot of limited scope (for example, a single subject, one region, or an internal exam cycle rather than the full board examination).
This allows the institution to identify and fix issues — scanning quality, server capacity, evaluator workflow problems — before they affect the full exam cycle. The pilot serves as a live training exercise for administrators, scanning operators, and evaluators.
The principle: Your first digital evaluation cycle is a learning experience. Make it a controlled learning experience, not a high-stakes one.
Lesson 2: Training Cannot Be Compressed
What happened at CBSE: Training lasted approximately one week in many cases. Teachers reported the training was insufficient, particularly for those less comfortable with technology.
What experienced institutions do differently: Effective evaluator training follows a progression:
Phase 1: Platform Familiarization (2-3 weeks before evaluation)
Phase 2: Mock Evaluation (1-2 weeks before evaluation)
Phase 3: Supported Live Evaluation (first 2-3 days)
Phase 4: Independent Evaluation (remainder of cycle)
The principle: Compress the evaluation timeline, not the training timeline. A week of training for a system that will handle lakhs of students' marks is not adequate preparation.
Lesson 3: Infrastructure Must Be Tested Under Load
What happened at CBSE: The OSM portal experienced login failures and server overload during mock evaluations when many teachers tried to access the system simultaneously.
What experienced institutions do differently:
Capacity Planning
Calculate the expected peak concurrent users. If 10,000 teachers are expected to evaluate simultaneously between 9 AM and 6 PM, the system needs to handle 10,000 concurrent sessions — plus a safety margin for spikes (first-day login surge, post-lunch return, etc.).
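This sizing arithmetic can be sketched in a few lines. The 30% safety margin below is an illustrative assumption, not a CBSE figure; each institution would calibrate it from its own login patterns:

```python
def required_capacity(expected_peak_users: int, safety_margin: float = 0.3) -> int:
    """Concurrent sessions the platform must sustain, with headroom for spikes
    such as the first-day login surge or the post-lunch return."""
    return int(expected_peak_users * (1 + safety_margin))

# 10,000 simultaneous evaluators with an assumed 30% spike allowance
print(required_capacity(10_000))  # 13000
```

The point of writing it down, even this simply, is that the capacity target becomes an explicit number that load tests can be run against, rather than a vague expectation.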
Load Testing
Run automated load tests that simulate the expected number of concurrent users before the first real evaluation. This is standard practice in software engineering but is often skipped in examination platform deployments because the urgency of the evaluation timeline takes priority over testing.
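A minimal sketch of the idea, using a stub login handler with an assumed 500-session capacity limit. A real load test would drive the actual portal with a dedicated tool such as Locust or JMeter; this toy version only demonstrates what happens when concurrent demand exceeds capacity:

```python
import asyncio
import random

SERVER_CAPACITY = 500  # illustrative concurrent-session limit
active_sessions = 0

async def stub_login() -> bool:
    """Stand-in for a real login request; fails once capacity is exceeded."""
    global active_sessions
    if active_sessions >= SERVER_CAPACITY:
        return False  # simulated "server overloaded" response
    active_sessions += 1
    await asyncio.sleep(random.uniform(0.01, 0.05))  # simulated session work
    active_sessions -= 1
    return True

async def load_test(concurrent_users: int) -> float:
    """Fire all logins at once and report the fraction that succeeded."""
    results = await asyncio.gather(*(stub_login() for _ in range(concurrent_users)))
    return sum(results) / len(results)

rate = asyncio.run(load_test(2_000))
print(f"success rate at 2,000 concurrent users: {rate:.0%}")
```

A success rate below 100% in a test like this is exactly the signal you want to see weeks before evaluation starts, not on day one.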
Infrastructure Redundancy
Server failures during evaluation are unacceptable — every minute of downtime is a minute that evaluators cannot mark, pushing back the result timeline. Production systems need redundant servers, database failover, and monitoring alerts that catch problems before users report them.
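The failover requirement can be illustrated schematically. This sketch assumes a simple primary/replica pair; production systems would use managed database failover, but the principle is the same — the evaluator's request is served even when the primary is down:

```python
from typing import Callable

def fetch_with_failover(primary: Callable[[], str], backup: Callable[[], str]) -> str:
    """Serve from the primary; fail over to the backup replica on error."""
    try:
        return primary()
    except ConnectionError:
        # A monitoring alert would fire here; the evaluator keeps working.
        return backup()

def down() -> str:
    raise ConnectionError("primary database unreachable")

print(fetch_with_failover(down, lambda: "page served from replica"))
```

The key design property is that failover is automatic: no evaluator should have to report an outage before the system reacts to it.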
Network Requirements
If evaluators are working from school computer labs, the network bandwidth at each school must be sufficient. A single school with 20 teachers evaluating simultaneously needs more bandwidth than a school with normal internet usage. Network testing at representative schools should happen during the mock evaluation phase.
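The bandwidth check is another piece of arithmetic worth making explicit. The per-evaluator figure of 2 Mbps and the 25% headroom below are illustrative assumptions; the real number depends on scan resolution and how the platform streams images:

```python
def required_bandwidth_mbps(evaluators: int, per_user_mbps: float = 2.0,
                            headroom: float = 0.25) -> float:
    """Minimum school uplink needed for simultaneous evaluation sessions."""
    return evaluators * per_user_mbps * (1 + headroom)

# A lab of 20 teachers at an assumed 2 Mbps each, with 25% headroom
print(required_bandwidth_mbps(20))  # 50.0
```

Comparing that number against the actual measured uplink at representative schools, during the mock evaluation phase, turns a vague worry into a pass/fail check.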
The principle: The first day of live evaluation should be boring from a technical perspective. Every technical issue should have been discovered and fixed during testing.
Lesson 4: Scanning Quality Controls Everything
What happened at CBSE: Teachers reported image quality issues — blurry text, incomplete captures, and improper segmentation of answer sections that could lead to misinterpretation of student responses.
What experienced institutions do differently:
Scanning is the foundation of digital evaluation. If the scanned image is poor, no amount of evaluation interface quality can compensate. Effective scanning workflows include:
Capture-Time Validation
The scanning system should validate image quality at the point of capture — checking for resolution, completeness (all pages present), readability, and barcode/QR code recognition. Rejecting poor captures immediately is far cheaper than discovering quality issues during evaluation.
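The checks above can be expressed as a simple accept/reject gate. The metadata fields and thresholds here are illustrative (a real scanner driver reports richer data), but the shape of the logic — reject at capture time, with explicit reasons — is the point:

```python
from dataclasses import dataclass

@dataclass
class ScanCapture:
    """Metadata a scanning station might report for one booklet (illustrative)."""
    dpi: int
    pages_captured: int
    pages_expected: int
    barcode_read: bool

def validate_capture(scan: ScanCapture, min_dpi: int = 200) -> list[str]:
    """Return reasons to reject the capture; an empty list means accept."""
    problems = []
    if scan.dpi < min_dpi:
        problems.append(f"resolution {scan.dpi} dpi below {min_dpi} dpi minimum")
    if scan.pages_captured != scan.pages_expected:
        problems.append(f"captured {scan.pages_captured} of {scan.pages_expected} pages")
    if not scan.barcode_read:
        problems.append("candidate barcode not recognised")
    return problems

# A blurry capture with a missing page is rejected for two reasons
print(validate_capture(ScanCapture(dpi=150, pages_captured=23,
                                   pages_expected=24, barcode_read=True)))
```

Because the reasons are returned explicitly, the scanning operator knows exactly what to fix before the physical booklet is set aside.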
Quality Control Review
After scanning, a dedicated quality control step involves human reviewers checking scanned images in grid views and flagging pages that need re-scanning or manual correction. This step catches issues that automated validation misses — handwriting that's legible but faint, pages that are complete but poorly aligned, etc.
Page-Level Correction
When a quality issue is found, the system should allow correction of individual pages without requiring the entire booklet to be re-scanned. This is critical for maintaining throughput at scale.
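One data structure that supports this is a per-page version list: a re-scan is appended for the affected page only, and earlier versions survive as an audit trail. This is a sketch of the idea, not any particular platform's schema:

```python
from collections import defaultdict

# page number -> list of scan image identifiers; the last entry is current,
# earlier entries remain as an audit trail (illustrative structure)
booklet = defaultdict(list)
for page in (1, 2, 3):
    booklet[page].append(f"scan_{page:03d}_v1")

def correct_page(booklet, page: int, new_scan: str) -> None:
    """Attach a re-scan of one page without touching the rest of the booklet."""
    booklet[page].append(new_scan)

def current_image(booklet, page: int) -> str:
    return booklet[page][-1]

correct_page(booklet, 2, "scan_002_v2")  # only page 2 is re-scanned
print(current_image(booklet, 2))  # scan_002_v2
print(current_image(booklet, 1))  # scan_001_v1
```

Pages 1 and 3 are never re-processed, which is what keeps throughput high when corrections are needed at scale.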
Storage Redundancy
Scanned images should be stored in at least two locations (local storage plus cloud backup) to prevent data loss. If a server failure destroys scanned images that haven't been evaluated yet, the only recovery option is physical re-scanning — an enormous setback at scale.
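The write path can be sketched as "copy to both locations, then mark as scanned". The demo below uses two throwaway directories standing in for local storage and a cloud bucket; in production the second copy would go to object storage with its own durability guarantees:

```python
import shutil
import tempfile
from pathlib import Path

def store_redundantly(image: Path, primary: Path, backup: Path) -> None:
    """Copy a scanned image to two independent locations. Only after both
    copies succeed should the physical booklet be marked as scanned."""
    for target in (primary, backup):
        target.mkdir(parents=True, exist_ok=True)
        shutil.copy2(image, target / image.name)

# Demo with temporary directories standing in for local and cloud storage
root = Path(tempfile.mkdtemp())
scan = root / "booklet_0042.tiff"
scan.write_bytes(b"fake image data")
store_redundantly(scan, root / "local", root / "cloud")
print((root / "cloud" / "booklet_0042.tiff").exists())  # True
```

The ordering matters: if the booklet is marked as scanned before the backup copy lands, a server failure in that window recreates exactly the data-loss scenario the redundancy was meant to prevent.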
The principle: Invest disproportionately in scanning quality. Every downstream issue — evaluator frustration, marking errors, disputed results — traces back to the scanned image.
Lesson 5: Change Management Is Not Optional
What happened at CBSE: Delhi government school teachers formally asked CBSE to suspend OSM for the current year. Teachers described the digital process as adding pressure rather than easing workload — the opposite of the intended effect.
What experienced institutions do differently:
Digital evaluation is as much a human change as a technical one. Evaluators who have marked papers with a red pen for 20 years are being asked to fundamentally change their workflow. This requires:
Communicate the "Why"
Teachers need to understand why the change is happening — not just "CBSE decided" but the specific problems that digital evaluation solves (totalling errors, faster results, RTI compliance). When evaluators understand the purpose, they're more likely to work through the inevitable friction of a new system.
Acknowledge the Learning Curve
The first cycle will be slower than paper evaluation. This is normal and expected. Evaluators need to hear this from administrators so they don't feel they're failing when they're actually going through a natural learning process.
Provide Ongoing Support
A training session before evaluation is not sufficient. Evaluators need access to support during evaluation — whether that's a helpline, on-site technical staff, or a senior evaluator who has used the system before and can answer questions in real-time.
Gather and Act on Feedback
After the first evaluation cycle, collect structured feedback from evaluators. What worked? What was frustrating? What would they change? Institutions that act on this feedback see dramatic improvements in evaluator satisfaction by the second or third cycle.
Demonstrate Results
After the first cycle, share the outcomes: how many totalling errors were prevented, how much faster results were published, how many fewer RTI disputes were filed. Concrete results build buy-in for subsequent cycles.
The principle: Technology adoption at scale is a change management project first and a technology project second.
The Path Forward
CBSE's challenges are not evidence that digital evaluation doesn't work — they are evidence that the transition requires careful planning, adequate preparation, and respect for the human side of change management.
Institutions that have been through multiple digital evaluation cycles consistently report that the second cycle is dramatically smoother than the first, and that by the third cycle evaluators prefer the digital workflow to paper. The benefits — zero totalling errors, faster results, complete audit trails, remote evaluation — are real and measurable.
CBSE's 2026 deployment, challenges and all, will prove this at a scale that removes any remaining doubt about whether digital evaluation works for Indian examinations. The question for every other institution is not whether to make the transition, but how to do it well.
Ready to digitize your evaluation process?
See how MAPLES OSM can transform exam evaluation at your institution.