Industry · 2026-04-15 · 8 min read

CBSE 2026 PIL: When Exam Paper Sets Are Unequal, Who Protects the Student?

A PIL filed in 2026 alleges CBSE's Class 12 Physics and Class 10 Maths papers had wildly different difficulty levels across question sets — and demands that CBSE disclose how, or whether, scores are adjusted.


A Petition That Names a Long-Standing Problem

CBSE board examinations for 2026 were still under way when educator Prashant Kirad filed a Public Interest Litigation (PIL) alleging that the Central Board of Secondary Education had failed to maintain consistent difficulty levels across multiple question paper sets for Class 12 Physics and Class 10 Mathematics.

The PIL's core allegation was specific: students sitting the same examination on the same day, in the same subject, had received question papers of materially different difficulty. Some sets closely followed the pattern of CBSE's own sample papers. Other sets — particularly Physics Set 2 and Set 3 — reportedly contained questions requiring conceptual depth comparable to JEE Main and JEE Advanced preparation, well beyond what the board's prescribed syllabus calls for.

The petition demanded that CBSE disclose the methodology used to design multiple paper sets, including any moderation or statistical equating formulas applied to adjust for difficulty variation. It also asked the court to determine whether the current system adequately protects students who receive harder sets through no choice of their own.

The Mechanics of Multi-Set Examinations

To understand why this controversy arose, it helps to understand how large-scale Indian board examinations handle paper security.

CBSE administers examinations to roughly 44 lakh students across Class 10 and 12 combined. Distributing a single question paper to all students creates an obvious security risk: if one student photographs the paper and circulates it electronically, the leak reaches students at every other examination centre within minutes.

To manage this, CBSE — like most large examination bodies — prepares multiple sets of the same paper, labelled Set 1, Set 2, Set 3, and so on. Different examination centres receive different sets. Students sitting next to each other may have papers from different sets, making copying less useful.

The operational problem this creates is one of fairness. If Set 1 requires students to recall a formula and apply it to a standard problem, while Set 3 requires students to derive the formula from first principles and apply it under non-standard conditions, these are not the same examination. A student who receives Set 3 is being tested more stringently than a student who receives Set 1, on the same date, in the same subject, under the same examination conditions.

What Equating Is Supposed to Do

Examination bodies are aware of this problem. The standard technical response is statistical equating — a process that compares score distributions across sets after the examination and applies adjustments if one set shows systematically lower scores, suggesting it was more difficult.
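One common form of this adjustment is linear (mean-sigma) equating, which rescales a harder set's scores so their mean and spread match a reference set's. The sketch below is illustrative only — CBSE has not disclosed whether it equates board scores or which method it would use, and the score lists here are invented for demonstration:

```python
import statistics

def linear_equate(scores_x, scores_y):
    """Map scores from set X onto the scale of reference set Y using
    mean-sigma linear equating: y = mu_y + (sigma_y / sigma_x) * (x - mu_x)."""
    mu_x, sigma_x = statistics.mean(scores_x), statistics.pstdev(scores_x)
    mu_y, sigma_y = statistics.mean(scores_y), statistics.pstdev(scores_y)
    slope = sigma_y / sigma_x
    return [round(mu_y + slope * (x - mu_x), 1) for x in scores_x]

# Hypothetical data: Set 3 (harder) trails Set 1 by ~5 marks on average,
# even though the candidate pools are assumed to be of comparable ability.
set1_scores = [62, 70, 75, 81, 88]   # reference set
set3_scores = [55, 64, 70, 77, 84]   # harder set

print(linear_equate(set3_scores, set1_scores))
```

After equating, the Set 3 distribution has the same mean and standard deviation as Set 1, so a student's position in the cohort no longer depends on which set they happened to receive.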

CBSE has not publicly disclosed whether it applies equating to Class 10 and Class 12 board examinations, what statistical method it uses, or at what threshold difficulty differences trigger score adjustments. The PIL's demand for this disclosure is not frivolous — it is a request for basic transparency about a process that directly determines student results.

Why This Matters Now

The 2026 controversy is not the first time CBSE has faced questions about paper set difficulty variation. Similar concerns were raised in 2023 and 2024 for various subjects. What makes the 2026 case significant is its timing and its context.

CBSE is simultaneously rolling out On-Screen Marking (OSM) for Class 12 this year — a ₹32 crore initiative digitising the evaluation of over 1 crore answer sheets from 17 lakh+ students. The board is, by its own account, making the most substantial reform to its examination infrastructure in decades.

If evaluation is becoming more rigorous, more auditable, and more transparent, the question of whether the examination itself is fair to all students becomes more pressing. You can have perfect marking of an answer sheet and still produce an unfair result if different students received materially different examinations.

The Student Position

For a student who received a harder set, the immediate consequence is a lower score and, potentially, missed grade thresholds — for college admissions, scholarship eligibility, and competitive examination ranking. CBSE's move to abolish post-result mark verification for 2026 means students cannot request re-evaluation of their answer sheets after results. If an unfair paper is the source of underperformance, there is no individual remedy available.

The PIL puts this plainly: performance should depend on merit, not on which paper set happened to be sent to a student's examination centre.

What Evaluation Systems Can Do

The PIL focuses primarily on the question paper design side of the problem — how sets are created and how scores are adjusted. But the evaluation side of the pipeline also has a role in addressing fairness concerns.

Standardised Marking Rubrics

Onscreen marking systems require examiners to evaluate against defined rubrics — explicit descriptors of what earns each mark level. This is particularly important when multiple evaluators are marking the same question: a standardised rubric ensures that a student who derives a formula correctly in Set 3 receives the marks that derivation deserves, rather than being graded by an examiner who expected to see a recalled formula.

Well-designed rubrics also create a defence against the "my set was harder" problem. If a question in Set 3 requires more work than the equivalent in Set 1, a rubric should reflect that and reward the additional demonstrated skill.
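In code terms, a rubric reduces to an explicit mapping from observable criteria to marks. The criteria and mark values below are hypothetical — no real CBSE marking scheme is being reproduced — but they show how a derivation can be rewarded explicitly rather than depending on an individual examiner's expectations:

```python
# Hypothetical rubric for a derivation question; criteria and marks are illustrative.
RUBRIC = {
    "states_assumptions": 1,
    "derives_formula": 2,      # rewarded explicitly, even if another set only required recall
    "applies_formula": 1,
    "correct_final_answer": 1,
}

def score_answer(demonstrated: set) -> int:
    """Sum the marks for every rubric criterion the student demonstrated."""
    return sum(marks for criterion, marks in RUBRIC.items() if criterion in demonstrated)

# A Set 3 student who derived the formula earns the derivation marks explicitly.
print(score_answer({"derives_formula", "applies_formula", "correct_final_answer"}))  # 4
```

Because every evaluator scores against the same criteria, two examiners marking the same derivation should arrive at the same total.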

Double Valuation as a Check

Many university-level digital evaluation systems deploy double valuation as a standard safeguard: two independent evaluators mark each answer script, and results are compared. Where scores diverge beyond a threshold, a third moderator reviews. This system is designed primarily to catch evaluator inconsistency, but it also surfaces systematic marking anomalies — including patterns where one set of answers consistently receives lower marks than another.

At scale, this data can be used to identify and correct paper-level difficulty variation before results are finalised, rather than after.
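The double-valuation protocol is simple to state precisely. The sketch below assumes a divergence threshold of 5 marks and a hypothetical moderator decision; real systems choose their own thresholds and escalation rules:

```python
def needs_moderation(mark_a: float, mark_b: float, threshold: float = 5.0) -> bool:
    """Flag a script for third-party review when two independent
    evaluators' marks diverge beyond the threshold."""
    return abs(mark_a - mark_b) > threshold

def final_mark(mark_a: float, mark_b: float, moderator=None, threshold: float = 5.0):
    """Average the two marks if they agree; otherwise defer to the moderator."""
    if needs_moderation(mark_a, mark_b, threshold):
        if moderator is None:
            raise ValueError("divergent marks require a moderator review")
        return moderator
    return (mark_a + mark_b) / 2

print(final_mark(62, 64))                 # close agreement -> average, 63.0
print(final_mark(62, 75, moderator=68))   # divergence -> moderator's mark, 68
```

Logged at scale, the divergence flags themselves become data: a cluster of flags concentrated on one paper set is exactly the kind of anomaly that should trigger a difficulty review before results are finalised.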

The Audit Trail Argument

The PIL demands transparency — specifically, that CBSE publish its moderation methodology. This is ultimately a demand for audit trails: documented evidence that the board knows whether difficulty variation exists, and what it did about it.

Digital evaluation platforms build this kind of audit trail into their normal operation. Every evaluation action — mark awarded, rubric applied, moderation decision — is timestamped and stored. When a student or a court asks how a result was arrived at, the answer is retrievable.
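An audit record of the kind described needs very little machinery: each action becomes an append-only, timestamped entry. The field names and identifiers below are invented for illustration, not taken from any actual platform:

```python
import json
from datetime import datetime, timezone

def log_action(action: str, **details) -> str:
    """Serialise one evaluation action as a timestamped audit record
    (appended to an immutable log in a real system)."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,
        **details,
    }
    return json.dumps(record)

# Hypothetical entry: a mark awarded against a named rubric criterion.
entry = log_action("mark_awarded", question="Q4", set_id="Set 3",
                   rubric_criterion="derives_formula", marks=2, evaluator="EV-1042")
print(entry)
```

When a student, board, or court later asks how a result was arrived at, answering is a matter of querying these records rather than reconstructing events from memory.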

The absence of this infrastructure is not merely an inconvenience for students filing PIL petitions. It is a governance gap that makes boards legally and reputationally vulnerable whenever examination quality becomes a matter of public dispute.

The Transparency Standard Students Are Now Demanding

The CBSE 2026 PIL reflects a shift in how Indian students and their advocates think about examination governance. The demand is not just for good results — it is for demonstrably fair processes, with evidence available for scrutiny.

This shift is happening in parallel with the digitisation of examination infrastructure. CBSE's OSM project, WBCHSE's published OMR sheets, university platforms that publish answer sheet scans — all of these create the expectation that evaluation is a documented process, not a black box.

The legal challenge embedded in the PIL is not a crisis for any particular board. It is a prompt. Examination bodies that can demonstrate documented, equitable processes — from paper design through evaluation through result publication — are better positioned to respond when questions arise. Those that cannot will face more PILs, more public pressure, and more court scrutiny.

A transparent examination system is not a privilege. It is what students who sat a national board examination in 2026 are now explicitly asking for. That is a reasonable ask — and the infrastructure to answer it exists.

---

Related Reading

  • CBSE Paper Difficulty Disparity: How Moderation Fixes Unfair Grading
  • CBSE's On-Screen Marking for Class 12, 2026: What It Means for India's Boards
  • How Evaluator Anonymity Eliminates Bias in Exam Grading

Ready to digitize your evaluation process? See how MAPLES OSM can transform exam evaluation at your institution.