Industry · 2026-03-30 · 8 min read

NEP 2020's Assessment Overhaul: What 50% Competency-Based Questions Mean for Exam Evaluation

CBSE's 2026 board exams now require 50% competency-based questions. Here is what the NEP 2020 assessment shift means for how papers are set, evaluated, and what evaluators must look for.

The Shift Nobody Is Talking About

When CBSE announced On-Screen Marking for Class 12 in February 2026, the coverage was wall-to-wall. But buried inside the same reform package was a change that will prove equally significant: starting with the 2026 board examinations, 50% of every CBSE question paper must comprise competency-based questions — case studies, source-based questions, and application-oriented formats designed to test understanding rather than recall.

This is not a cosmetic change. It restructures what examiners are setting, what evaluators are marking, and what students are supposed to demonstrate. Together with the rollout of On-Screen Marking, it represents the most fundamental reform of India's board examination system in decades.

What NEP 2020 Actually Says About Assessment

The National Education Policy 2020 was explicit about its intentions for assessment: replace the culture of rote learning with competency-based evaluation that measures what students can do with knowledge, not just whether they can reproduce it.

NEP 2020 introduced three key institutions and frameworks to operationalize this shift:

PARAKH

The National Assessment Centre — Performance Assessment, Review, and Analysis of Knowledge for Holistic Development — was established under NEP 2020 to drive assessment reform across all school boards. PARAKH's mandate includes creating standardized, holistic assessment frameworks that measure not only subject knowledge but also critical thinking, applied skills, and socio-emotional competencies.

The 360-degree holistic report cards that PARAKH advocates move beyond percentages and grades toward a multidimensional view of student development. While full implementation is still in progress, PARAKH's influence is visible in how CBSE has restructured its question paper format for 2026.

SAFAL

The Structured Assessment for Analysing Learning framework is now operational for Classes 3, 5, and 8. SAFAL is a diagnostic evaluation tool focused on measuring conceptual clarity rather than marking student performance on a competitive scale. It tracks whether students genuinely understand foundational concepts — a departure from exams that reward memorization without comprehension.

SAFAL's approach will eventually influence how board-level assessments are designed as students move through the system.

The 50% Competency-Based Question Mandate

This is the reform with immediate, measurable impact. The 2026 CBSE board exam paper structure is:

Question Type | Share of Paper | What It Tests
Competency-Based Questions (CBQ) | 50% | Application, analysis, case-based reasoning
Objective-Type Questions | 20% | Conceptual recall, definitions
Short and Long Answer | 30% | Knowledge articulation, structured explanation

Competency-based questions include source-based integrated questions (passages, graphs, maps, data sets with multi-part questions), case studies requiring analysis and reasoning, and application-oriented problems that require students to use concepts in unfamiliar contexts.

This represents a departure from the traditional paper where 70-80% of marks could be obtained through well-memorized answers.

Why This Changes How Evaluators Work

The introduction of 50% competency-based questions does not just change what students write — it changes what evaluators must assess.

Marking Schemes Must Allow for Partial Credit

A case study answer cannot be evaluated with a binary right/wrong rubric. Students may demonstrate partial understanding, apply the correct reasoning but reach an incorrect conclusion, or approach a problem from an unexpected direction that still demonstrates competency. Evaluators need marking schemes that can award partial credit and distinguish between levels of understanding.

Traditional marking schemes for factual recall questions are relatively simple: either the student listed the correct points or they did not. Competency-based question marking schemes require more nuanced judgment, better rubrics, and more evaluator training to apply consistently.
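To make the contrast concrete, here is a minimal Python sketch of a level-based rubric that awards partial credit. The band names, descriptors, and mark values are purely illustrative assumptions, not CBSE's actual marking scheme.

```python
# Hypothetical sketch: a level-based rubric awarding partial credit
# for a competency-based answer instead of a binary right/wrong mark.
# Band names, descriptors, and mark values are illustrative only.

RUBRIC = {
    "basic":     {"marks": 1, "descriptor": "Identifies the relevant concept"},
    "adequate":  {"marks": 3, "descriptor": "Applies the concept to the scenario"},
    "excellent": {"marks": 5, "descriptor": "Applies, reasons, and justifies the conclusion"},
}

def score(level_awarded: str, max_marks: int = 5) -> float:
    """Map an evaluator's level judgment to a partial-credit mark."""
    if level_awarded not in RUBRIC:
        return 0.0
    # Scale rubric marks to the question's maximum, so the same rubric
    # can be reused across questions of different weight.
    return RUBRIC[level_awarded]["marks"] / 5 * max_marks

print(score("adequate", max_marks=4))  # prints 2.4
```

The point of the scaling step is that a single, well-calibrated rubric can serve many questions, which is exactly the kind of structured guidance evaluators need to apply partial credit consistently.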

Consistency Becomes Harder

When 50% of a paper requires interpretation and judgment — "does this student's case study analysis demonstrate sufficient understanding?" — inter-rater reliability becomes a challenge. Two evaluators marking the same answer may award different marks. This is a known problem in competency-based assessment, and it is why chief examiner calibration sessions, anchor scripts, and double-valuation mechanisms matter more in this environment.

Under traditional paper evaluation, a well-trained evaluator marking factual questions produces fairly consistent outcomes. Competency-based evaluation requires more structured calibration to ensure a student in Tamil Nadu and a student in Haryana are being evaluated against the same standard.
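Inter-rater reliability can also be measured, not just worried about. The sketch below computes Cohen's kappa, a standard chance-corrected agreement statistic, for two hypothetical evaluators placing the same ten answers into bands; the data is invented for illustration.

```python
# Illustrative sketch: quantifying inter-rater agreement between two
# evaluators with Cohen's kappa. The band judgments below are made up.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical judgments."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of answers placed in the same band.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal distribution.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two evaluators band the same ten case-study answers.
a = ["basic", "adequate", "adequate", "excellent", "basic",
     "adequate", "excellent", "basic", "adequate", "excellent"]
b = ["basic", "adequate", "basic", "excellent", "basic",
     "adequate", "excellent", "adequate", "adequate", "excellent"]
print(round(cohens_kappa(a, b), 2))  # prints 0.7
```

A kappa near 1.0 indicates strong agreement beyond chance; values in the 0.6 to 0.8 range are typically read as substantial but imperfect agreement, which is where calibration sessions and anchor scripts earn their keep.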

On-Screen Marking and Competency Questions

CBSE's simultaneous introduction of On-Screen Marking and competency-based questions creates an interesting intersection. OSM platforms have traditionally been optimized for structured, segmented marking of defined question-answer pairs. Evaluating a nuanced case study analysis on screen requires the same careful reading that paper evaluation does — but with the screen as the medium.

Evaluators must resist the temptation to mark more quickly in the digital environment. The efficiency gains from OSM (no travel, automated totalling, faster result processing) should not translate into rushed evaluation of complex analytical answers.

The Impact on Question Paper Setting

The 50% CBQ mandate also changes the work of question paper setters significantly. Setting a competency-based question is harder than setting a factual recall question. A well-constructed case study must:

  • Present a realistic, unambiguous scenario
  • Require the application of specific syllabus concepts
  • Have clearly distinguishable levels of response (basic, adequate, excellent)
  • Be fair across different regional contexts and student backgrounds
  • Have an unambiguous marking scheme that different evaluators will apply consistently

The CBSE question paper vetting process — which involves subject experts, chief examiners, and multiple review rounds — is becoming more critical as the questions themselves become more complex. A poorly constructed factual recall question is usually obvious. A poorly constructed case study question can be ambiguous in ways that only become apparent during evaluation, when evaluators disagree on what a correct answer looks like.

What This Means for State Boards

State boards watching CBSE's 2026 reforms are not just watching the OSM rollout — they are watching what happens when a large, complex examination system attempts to implement NEP 2020's assessment philosophy at scale.

The NEP 2020 expectation is that all boards align toward competency-based assessment over time. CBSE is the first to implement the 50% CBQ mandate at board examination level. State boards will face similar mandates as the NEP implementation timeline advances.

The implications for state boards:

Question bank development. State boards will need to build question banks with competency-based questions, which requires different skills than traditional question bank development.

Evaluator training upgrades. Marking schemes for CBQs require more detailed guidance. Evaluator training programs will need to include calibration sessions and worked examples of borderline responses.

Moderation procedures. With higher evaluator judgment variability in CBQs, moderation and double-valuation workflows become more important, not less. The institutional infrastructure for review and appeal must be stronger.

The Assessment Culture Shift

The 50% CBQ mandate is ultimately an attempt to change what India's examination culture rewards. When half the marks require demonstrating understanding and reasoning rather than recall, the incentive shifts — at least partially — away from rote memorization.

Whether this shift happens in classrooms depends on what teachers teach and how they teach it. Examination-driven instruction is a rational response to examination-focused assessment. If board exams genuinely reward applied thinking, teaching methods will eventually follow.

The 2026 CBSE board examinations are the first large-scale test of whether this shift is real or aspirational. The results — and the feedback from evaluators about what students actually wrote — will shape how the assessment reform is calibrated in subsequent years.

Related Reading

  • CBSE Introduces On-Screen Marking for Class 12 — The digital evaluation reform running in parallel with NEP assessment changes
  • How Evaluator Anonymity Eliminates Bias in Exam Grading — Why anonymized evaluation matters more in judgment-heavy assessment
  • Understanding Double Valuation in Exam Evaluation — The role of double-checking when evaluator judgment is variable

Ready to digitize your evaluation process?

See how MAPLES OSM can transform exam evaluation at your institution.