CBSE Open Book Exams for Class 9 from 2026-27: What It Means for Evaluation
CBSE has formally approved open-book assessments for Class 9 starting 2026-27. The shift from testing memory to testing mastery changes not just question design but the entire evaluation infrastructure schools and boards must maintain.

The Announcement
The Central Board of Secondary Education has approved the introduction of open-book assessments (OBA) for Class 9 students beginning with the 2026-27 academic session. The format will initially cover four core subject areas — Languages, Mathematics, Science, and Social Science — and will be incorporated into the three pen-and-paper tests conducted each term as internal examinations.
Students will be permitted to refer to textbooks, approved notes, and reference materials during these examinations. The change is directly aligned with the National Education Policy 2020 and the National Curriculum Framework for School Education 2023, both of which call for a fundamental shift away from rote memorisation toward competency-based assessment.
The move has significant implications — not just for students and teachers, but for the examination evaluation infrastructure that schools, boards, and universities will need to maintain as this philosophy moves through successive grade levels.
What Changes in the Examination Room
An open-book examination is, counterintuitively, harder to game than a closed-book one — if it is designed correctly.
CBSE's own pilot studies found that over 78% of students reported open-book assessments to be more challenging than traditional examinations. The reason is straightforward: a question that tests factual recall can be answered by reading the relevant page. A question that tests application, synthesis, or critical evaluation cannot be answered by copying — it requires the candidate to engage with the material, construct an argument, and demonstrate understanding through the process of reasoning.
The practical implication is that open-book questions are longer, more analytical, and more nuanced in their expected responses. They require evaluators to exercise more professional judgment. A model answer with a specific expected phrasing becomes less relevant; evaluators must assess whether the student has demonstrated the required competency, even if they have expressed it differently.
This is a significant change to how evaluation works.
The Infrastructure Challenge
CBSE's decision follows successful pilots conducted over the 2024-25 and 2025-26 academic sessions. Scaling the model to all affiliated schools across India will require overcoming several infrastructure gaps that the pilots exposed.
Digital access inequality. Rural school infrastructure remains uneven. Rajasthan alone reports 86,934 classrooms classified as structurally unsafe, and nationwide rural internet penetration stands at around 45%. Schools without reliable connectivity cannot leverage digital resources during examination sessions, and printed reference materials increase the logistics burden on examination administrators.
Teacher training. Writing open-book assessment questions requires a different skill set than writing knowledge-recall questions. CBSE's plan involves scaling teacher capacity through National Professional Standards for Teachers (NPST) and Centres of Excellence in Gurugram, Raipur, and Ranchi, with a target of training 5 lakh teachers by the end of 2026. The DIKSHA platform, with over 40 crore registered users, will provide digital preparation resources.
Evaluation load and complexity. Longer, analytical answer scripts require more evaluator time per script. This increases the overall burden on the evaluation cycle — which already operates under significant time pressure as boards aim to declare results within 45 to 60 days of examination completion. The workload increase makes the case for digital evaluation platforms even more compelling.
Standardisation of evaluator judgment. When answers are analytical rather than factual, inter-evaluator consistency becomes a more significant concern. Two experienced evaluators reading the same answer may reach different conclusions about whether a student has demonstrated the required competency. Systems that support calibration exercises, double valuation, and moderation workflows address this directly.
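A common double-valuation rule can be sketched in a few lines. The logic below is an illustration, not CBSE's actual procedure: two evaluators mark each script independently, and the script is escalated to a third moderator only when their totals diverge beyond a threshold. The 10% threshold and the averaging rule are assumptions chosen for the example.

```python
# Illustrative double-valuation rule (threshold and averaging policy
# are assumptions for this sketch, not CBSE's published procedure).

def needs_moderation(mark_a: float, mark_b: float,
                     max_marks: float, threshold_pct: float = 10.0) -> bool:
    """Flag a script for third valuation if the two awarded totals
    differ by more than threshold_pct of the maximum marks."""
    return abs(mark_a - mark_b) > (threshold_pct / 100.0) * max_marks

def final_mark(mark_a: float, mark_b: float) -> float:
    """When the two valuations agree closely, award their average."""
    return (mark_a + mark_b) / 2.0

# Example: out of 80 marks, the two evaluators award 62 and 55.
# The gap (7) is within 10% of 80 (8), so no third valuation is needed.
print(needs_moderation(62, 55, max_marks=80))  # False
print(final_mark(62, 55))                      # 58.5
```

The design point is that the threshold converts a vague worry ("evaluators may disagree") into a measurable trigger, which is exactly what calibration and moderation workflows need to operate at scale.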
The Trajectory Beyond Class 9
The Class 9 rollout is not the end point; it is the beginning of a broader transformation in how Indian education assesses student learning.
NEP 2020 envisions competency-based assessment at all levels of schooling and higher education. CBSE's concurrent decision to implement On-Screen Marking (OSM) for Class 12 board examinations in 2026 signals that the board is simultaneously reforming what is tested (moving toward application in Class 9) and how evaluation is conducted (moving to digital in Class 12).
The trajectory points toward a system where answer scripts — more analytical, more varied in structure, longer in length — are evaluated digitally, with tools that help evaluators maintain consistency across thousands of scripts, calculate partial marks automatically, and generate data that can inform future question design.
For the 2026-27 Class 9 cohort, this transition begins as an internal school assessment. But these students will appear for Class 10 board examinations in 2027-28, and Class 12 in 2029-30. The assessment habits, question formats, and evaluation standards established now will shape those higher-stakes examinations.
What Schools Should Prepare For
Question redesign is non-negotiable. Teachers cannot ask the same questions in an open-book format that they asked in a closed-book format and expect meaningful differentiation in performance. CBSE has indicated it will issue clear guidelines on permissible materials and question types. Schools should begin working with teachers to develop question banks that demand synthesis, application, and evaluation — Bloom's Taxonomy levels 4 through 6 — rather than recall and comprehension.
Evaluation rubrics will matter more. When correct answers are varied rather than singular, rubrics become the primary tool for ensuring consistency. Schools will need to develop rubrics that clearly define what constitutes a Level 3 versus Level 4 response, so that multiple evaluators can assess the same script similarly.
Volume planning for evaluation. If open-book assessments produce longer, more detailed answers, the pages-per-script count increases. Schools running internal evaluations and boards processing affiliated school results should model the increased evaluation time per script and plan resources accordingly.
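The modelling exercise above is simple arithmetic, and even a back-of-envelope version makes the resourcing gap concrete. All figures in this sketch — scripts per batch, minutes per script, evaluator hours per day — are illustrative assumptions, not CBSE data.

```python
# Back-of-envelope capacity model for evaluation planning.
# Every number here is an assumption chosen for illustration.

def evaluator_days(num_scripts: int, minutes_per_script: float,
                   hours_per_day: float = 6.0) -> float:
    """Total evaluator-days needed to mark a batch of scripts."""
    total_hours = num_scripts * minutes_per_script / 60.0
    return total_hours / hours_per_day

# A school evaluating 1,200 scripts: if open-book answers push marking
# time from 8 to 12 minutes per script, the workload grows by half.
closed_book = evaluator_days(1200, minutes_per_script=8)
open_book = evaluator_days(1200, minutes_per_script=12)
print(round(closed_book, 1), round(open_book, 1))  # 26.7 40.0
```

Running the same model with real local numbers — script counts, observed marking pace from the pilot terms — tells a school how many additional evaluators or days the transition actually costs.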
Parent and student communication. The shift is a genuine change in assessment philosophy, not just a format update. Students accustomed to high marks through strong memorisation may find their performance drops initially. Clear communication about the purpose of OBA — to test understanding, not encyclopaedic knowledge — will help manage expectations during the transition year.
The Broader Case for Digital Evaluation
Open-book assessments strengthen the argument for digital evaluation infrastructure in several ways. Longer answer scripts mean more pages to scan, but also more data to extract: digital platforms can support word-frequency analysis to flag copied responses, analytics on which questions consistently receive lower marks (indicating a teaching gap rather than a student gap), and longitudinal tracking of how a student's analytical performance improves across terms.
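The word-frequency idea mentioned above can be sketched with cosine similarity over word counts: an answer that is a near-verbatim copy of the textbook passage scores close to 1, while a genuinely rephrased answer scores much lower. The tokenisation, the sample texts, and any flagging threshold here are illustrative assumptions, not a description of a real product feature.

```python
# Minimal sketch of word-frequency copy flagging (illustrative only).
from collections import Counter
import math
import re

def word_counts(text: str) -> Counter:
    """Lowercase the text and count word occurrences."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors (0 to 1)."""
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

source = "Photosynthesis converts light energy into chemical energy in plants."
copied = "Photosynthesis converts light energy into chemical energy in plants."
rephrased = "Plants capture sunlight and store it as sugars they later use."

# A verbatim copy scores near 1.0; a rephrased answer scores far lower,
# so a threshold (say 0.8) can flag scripts for human review.
print(cosine_similarity(word_counts(source), word_counts(copied)))
print(cosine_similarity(word_counts(source), word_counts(rephrased)))
```

A flag like this would only ever be a prompt for human review, not an automatic penalty — which fits the broader point that open-book evaluation relies on evaluator judgment supported by tooling, not replaced by it.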
The Indian examination system has historically evaluated students through the lens of what they can remember. CBSE's Class 9 open-book decision is a formal, policy-level acknowledgment that what matters more is what students can do with what they know. Building the evaluation infrastructure to match that philosophy is the challenge — and the opportunity — that boards, universities, and institutions face over the next three to five years.
---
Ready to digitize your evaluation process?
See how MAPLES OSM can transform exam evaluation at your institution.