Evaluator Conduct in the Digital Age: Why CBSE Warned Teachers During the 2026 Evaluation Season
In March 2026, CBSE formally warned teachers involved in evaluation against posting opinions and experiences on social media. The episode reveals why evaluator conduct rules are becoming more important as digital evaluation scales.

The Warning
On March 16, 2026, CBSE issued a formal notice to teachers participating in the evaluation of Class 10 and Class 12 board examination answer sheets. The notice warned that certain evaluators had been posting comments, opinions, and experiences from inside the evaluation process on social media platforms. CBSE stated this behaviour constituted a violation of professional conduct and threatened disciplinary action.
The warning was not directed at teachers sharing the kind of information that could compromise specific students' marks. It was broader: evaluators posting any observation — about the quality of student answers, the difficulty of marking certain responses, the functioning of the digital portal — were told to stop.
For an institution processing 46 lakh answer sheets in its first On-Screen Marking (OSM) cycle, the reasons are understandable. But the episode also illustrates something important about how evaluator conduct frameworks must evolve as examination evaluation moves from private, centralized evaluation camps to distributed, home-based digital environments.
What Changed With On-Screen Marking
Under the traditional paper evaluation model, CBSE evaluators gathered at designated evaluation centres — typically schools — for a structured evaluation camp lasting several days. The physical environment created natural boundaries. Evaluators worked in a shared space, under observation, with supervisors present. Conversations happened face-to-face. Social media was not part of the evaluation environment.
Under OSM, evaluators log in from their own school's computer lab, evaluate answer sheets on screen, and submit marks digitally. They are working, effectively, from a semi-private setting — without the ambient monitoring of an evaluation camp. The boundary between "at evaluation" and "at their desk, on their phone" is considerably blurred.
This structural change creates new conduct risks that the old evaluation camp model did not have to address:

- Answer script images appear on screens that can be photographed or captured, where paper never left the evaluation centre.
- Social media is one tap away during marking, with no supervisor present to reinforce professional boundaries.
- Observations about answer quality or portal problems can be shared publicly within seconds, long before the evaluation chain is complete.

None of these risks existed when evaluation was paper-based and camp-based.
Why Conduct Rules Matter More in Digital Evaluation
The CBSE warning reflects a broader truth: as evaluation becomes digital and distributed, the conduct framework surrounding it must become correspondingly more explicit and enforceable.
Confidentiality of Answer Script Content
Every answer script that an evaluator views represents a student's confidential examination submission. Under the traditional model, no evaluator carried an answer script out of the evaluation centre — the physical constraint enforced confidentiality. Under OSM, digital images of answer scripts exist on evaluators' screens and are accessible in ways that paper never was.
Posting images of student handwriting, even redacted or partially obscured, creates a confidentiality breach. The evaluator may intend no harm. The result is the same.
The Problem of Premature Commentary
When evaluators post observations about answer quality — "students clearly did not understand this concept," "the answers to Question 4 are very weak this year" — before results are declared, it creates several problems.
First, it can generate anxiety and speculation among students and parents who cannot yet verify the observation's accuracy or representativeness. Second, it may influence how other evaluators approach similar answers if the post circulates in evaluator networks. Third, it creates public statements about student performance before the official evaluation process is complete — before head examiners have reviewed outliers, before moderation has occurred, before any quality checks have validated the initial marking.
The evaluation process is not complete when a teacher finishes marking their assigned bundle. It is complete when the full chain of verification, moderation, and result processing has concluded. Evaluator commentary before that chain is finished is premature.
Reputational Risk to the Examination System
India's board examination results carry significant weight in student futures. The credibility of those results depends on public confidence in the integrity of the evaluation process. When evaluators post commentary about evaluation experiences — including technical problems, disagreements about marking, or complaints about the system — it can undermine that confidence, regardless of whether the individual observations are accurate.
CBSE's warning acknowledged this directly: posts that are "untrue and misleading" were specifically cited, but the instruction to cease posting extended beyond obviously incorrect content to any public sharing of evaluation experiences.
The Training Gap
The CBSE warning also reveals a gap that institutions adopting digital evaluation often underestimate: evaluators need conduct training, not just technical training.
Most evaluator training for OSM has focused on the mechanics of the digital marking interface — how to navigate the portal, enter marks, handle different question types, use annotation tools. This is necessary but insufficient.
A comprehensive evaluator orientation for digital evaluation should also cover:
Data confidentiality obligations. What information about the evaluation process may and may not be shared. Which channels (helplines, supervisors, official feedback mechanisms) are appropriate for raising concerns.
The scope of the marking mandate. Evaluators are performing a specific function within a larger process. Understanding that their individual marking is one stage in a multi-layer verification chain helps calibrate what observations are appropriate to share publicly versus what should go through official channels.
Professional conduct in digital environments. The same professional standards that applied in an evaluation camp apply when evaluating from a school computer lab. The change in physical environment does not change the professional obligations.
How to raise legitimate concerns. If an evaluator identifies a genuine problem — ambiguous marking scheme guidance, a potential error in the question paper, a technical issue with the portal — there are correct channels for raising this. Social media is not one of them.
What Good Conduct Infrastructure Looks Like
The best digital evaluation platforms and institutions build conduct compliance into the process rather than relying solely on post-hoc warnings:
| Mechanism | How It Works |
|---|---|
| Non-disclosure agreements | Formal documentation of confidentiality obligations before access is granted |
| Session-based access | Evaluators can only access the system during designated marking windows |
| Screen capture restrictions | Technical controls limiting the ability to screenshot answer script images |
| Audit trails | Every evaluator action is logged with timestamp and user ID |
| Official feedback channels | Structured mechanisms for evaluators to report concerns to supervisors |
| Post-evaluation debriefs | Formal channels for gathering evaluator feedback after the cycle completes |
The combination of technical controls and clear conduct frameworks reduces the risk of evaluators sharing confidential information — whether intentionally or inadvertently.
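Two of the mechanisms in the table above — session-based access and audit trails — are straightforward to express in code. The sketch below is illustrative only, assuming a Python backend; the names (`EVALUATION_WINDOW`, `can_access`, `record_action`) are hypothetical and do not describe any actual MAPLES OSM API.

```python
from datetime import datetime, time, timezone

# Hypothetical designated marking window (session-based access).
EVALUATION_WINDOW = (time(9, 0), time(17, 0))

# Append-only audit trail; a real system would use a tamper-evident store.
audit_log = []

def can_access(now: datetime) -> bool:
    """Allow portal access only inside the designated marking window."""
    start, end = EVALUATION_WINDOW
    return start <= now.time() <= end

def record_action(evaluator_id: str, action: str, now: datetime) -> None:
    """Log every evaluator action with a timestamp and user ID."""
    audit_log.append({
        "evaluator": evaluator_id,
        "action": action,
        "timestamp": now.isoformat(),
    })

# Example: an evaluator opens a script mid-morning during the window.
now = datetime(2026, 3, 16, 10, 30, tzinfo=timezone.utc)
if can_access(now):
    record_action("EV-1042", "opened_script", now)
```

The design point is that conduct compliance becomes a property of the system rather than of individual discipline: access outside the window is simply impossible, and every action leaves a verifiable record.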
The Broader Lesson
CBSE's March 2026 warning is a milestone in the evolution of India's examination evaluation culture. When evaluation was centralized and physical, professional conduct was partly enforced by the environment. Distributed digital evaluation requires that the same standards be internalized, communicated explicitly, and backed by clear consequences.
The move to On-Screen Marking is not just a technology deployment — it is a change in how professional evaluation work is organized. The institutions that will handle this transition most successfully are those that treat the human factors — training, conduct norms, feedback channels, accountability — with the same seriousness they bring to the technical infrastructure.
The students whose scripts are being evaluated deserve both: a technically sound platform and a professionally conducted evaluation process.
Ready to digitize your evaluation process?
See how MAPLES OSM can transform exam evaluation at your institution.