Guide · 2026-04-24 · 9 min read

Multiple Exam Attempts, NAAC Criterion 4, and Why Colleges Must Act Now

NEP 2020's multiple-attempt mandate is now live across CBSE, Karnataka, and MP Board. Colleges hosting evaluation centres have a narrow window to turn this infrastructure requirement into NAAC and NIRF accreditation evidence.

The Policy Is Now Operational

NEP 2020 was released as a vision document in July 2020. By 2026, its assessment reform clauses are no longer aspirational — they are running on live examination infrastructure across India's largest boards.

Within a single academic year:

  • CBSE introduced two Class 10 board exam attempts (February and May 2026)
  • Karnataka replaced supplementary exams with three annual attempts for Class 12, with Exam 2 running April 30 to May 13, 2026
  • MP Board launched an improvement exam from May 7, 2026

    These are not pilot programmes. They are structural changes to how India evaluates tens of millions of students, and they are anchored in NEP 2020's explicit guidance that "board examinations will be redesigned to eliminate the need for rote learning" and that students should be able to "appear for any board examination when they feel ready, and to improve their performance."

    For colleges and universities — particularly those that host examination or evaluation centres, run autonomous programmes, or are preparing for NAAC re-accreditation — this shift carries specific implications that go beyond the school board level.

    What Multiple Attempts Mean for Evaluation Volume

    The simplest arithmetic: shifting from one annual evaluation cycle to two or three multiplies the evaluation volume accordingly. For an affiliated college that evaluates 4,000-6,000 answer scripts per year as part of a university examination pool, moving to a two-cycle model doubles that figure.
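The arithmetic above can be sketched in a few lines. This is a back-of-envelope model only: the team size and per-evaluator throughput below are hypothetical illustrative numbers, not figures from any board.

```python
# Back-of-envelope model of evaluation volume per cycle.
# All parameter values are illustrative assumptions.
scripts_per_cycle = 5000        # mid-range of the 4,000-6,000 annual pool
evaluators = 40                 # hypothetical team size
scripts_per_evaluator_day = 30  # hypothetical on-screen marking throughput

def days_needed(total_scripts: int) -> float:
    """Working days to clear one cycle's scripts with the team above."""
    return total_scripts / (evaluators * scripts_per_evaluator_day)

for cycles in (1, 2, 3):
    # Each additional attempt re-runs roughly the full script pool.
    print(f"{cycles} cycle(s)/year: {scripts_per_cycle * cycles} scripts, "
          f"~{days_needed(scripts_per_cycle):.1f} marking days per cycle")
```

Even under these generous assumptions, the binding constraint is not total marking effort but how quickly each cycle can start and finish.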

    More practically, the turnaround window between cycles shrinks dramatically. Karnataka's Exam 1 results were declared in April; Exam 2 starts April 30. The evaluation team that handled Exam 1 needs to be available and operationally ready again within weeks.

    This compression eliminates the possibility of running evaluation on a purely physical, camp-based model:

    Traditional physical evaluation timeline            | Digital evaluation timeline
    Answer books physically transported: 3-5 days       | Scanned scripts uploaded: same day
    Evaluators travel to camp: coordination over days   | Evaluators log in remotely: immediate
    Manual marking and totalling: 15-25 days            | On-screen marking with auto-total: 7-12 days
    Internal spot check audit: 3-5 days                 | Digital audit trail: continuous
    Data entry and compilation: 3-5 days                | Automated result compilation: hours
    Total: 25-35 days minimum                           | Total: 10-15 days achievable

    A 10-15 day evaluation cycle is what a 3-exam system actually requires. Physical evaluation does not achieve it at scale. Digital evaluation can.

    The NAAC Criterion 4 Opportunity

    NAAC's revised binary accreditation framework under MBGL (Maturity-Based Graded Levels), announced in 2025, places significant weight on Criterion 4: Infrastructure and Learning Resources. Specifically:

    Key Metric 4.2 — Library, ICT and Physical Infrastructure: Institutions are assessed on the adequacy of their ICT infrastructure for academic activities, including computer-to-student ratios, internet bandwidth, and the deployment of technology in examination and assessment.

    Institutions that have invested in digital evaluation infrastructure — scanning stations, on-screen marking platforms, secure server access, UPS-backed labs — generate real, auditable evidence for Metric 4.2 that many institutions currently lack.

    Under the previous physical accreditation model, peer teams could make subjective judgements about infrastructure. Under NAAC's current digital-first, data-verified approach, institutions need to demonstrate technology deployment through logs, contracts, utilisation records, and system configurations.

    Every board examination evaluation cycle conducted digitally at your institution generates:

  • Evaluator login records with timestamps
  • Answer scripts scanned and processed (quantifiable throughput)
  • Network utilisation logs
  • UPS and hardware configuration documentation
  • Marks submission audit trail

    These records are directly usable as supporting evidence in your Self-Study Report (SSR) under Criterion 4.2 and in the AQAR (Annual Quality Assurance Report) that your IQAC files annually.

    Criterion 6.2 — E-governance

    NAAC also evaluates institutional governance under Criterion 6, specifically whether examination and assessment processes use digital platforms. An institution that uses on-screen marking for its autonomous semester examinations — or that participates in university or board evaluation using a digital platform — has concrete evidence of e-governance in assessment. This is distinct from merely having a student information system or a fee payment portal.

    Criterion 6.2 rewards demonstrably paperless administrative processes. Examination evaluation, historically one of the most paper-intensive institutional functions, is among the highest-impact areas in which to demonstrate that shift.

    The NIRF Graduation Outcomes Connection

    NIRF ranks institutions on five parameters. The Graduation Outcomes (GO) parameter carries a weight of 20 marks out of 100 in the overall ranking methodology, behind only Teaching, Learning & Resources and Research (30 each).

    Within GO, the key metrics are:

  • Ph.D. students graduated (for universities)
  • Students qualifying in national-level examinations
  • Placement and higher studies outcomes
  • Median salary of placed graduates
  • Students completing the programme on time

    Multiple-attempt exam policies — and the digital evaluation infrastructure that supports them — have a direct causal relationship with the "students completing programme on time" metric. When students fail a paper in Year 2 and must wait until the following year's supplementary to clear it, their programme duration extends. A student who can appear for a second attempt in the same academic year completes on schedule.

    For universities where supplementary exams have historically accumulated a backlog of students who never finish their programmes, the shift to multiple attempts with fast evaluation turnarounds is a structural intervention in the graduation rate metric.

    Faster results also improve the downstream metrics: students who have their results in April rather than June have more time to apply for placements, higher studies, or qualifying examinations in the same academic year.

    Building the Case Internally

    Many college principals and registrars are aware that digital evaluation is operationally superior but face internal resistance when proposing the investment. Common objections:

    "Our university hasn't mandated it yet." — University mandates are a lagging indicator. CBSE, Karnataka PUC, and MP Board have all moved without waiting for a single national mandate. Affiliated colleges that build digital evaluation readiness now can participate in university pilots, host evaluation centres, and generate NAAC evidence — regardless of the university timeline.

    "We don't have the budget." — CBSE's minimum infrastructure specification (Windows 8+, 4 GB RAM, 2 Mbps internet, UPS) describes equipment that most government and aided colleges already have or can acquire for under Rs 5-10 lakh if they do not. The marginal investment is smaller than typically assumed. An existing computer lab can often be repurposed.

    "We don't have trained evaluators." — Digital evaluation platforms are designed for teachers who have never used them before. CBSE and state boards provide training for their systems. Internally, the training curve for faculty is measured in hours, not weeks.

    "It's not secure." — Properly implemented digital evaluation has a more complete audit trail than physical evaluation. Answer scripts are accessed via credentialed login, every action is logged with a timestamp, and marks cannot be altered after submission without creating a detectable record. Physical evaluation centres have no equivalent audit trail for what happens inside a room.

    A Practical Three-Step Institutional Action Plan

    Step 1: Audit existing infrastructure

    Map what you have against CBSE's minimum specification: dedicated lab with Public Static IP, Windows 8+ machines with 4 GB RAM, 2 Mbps internet, UPS. Identify gaps. Most institutions will find the gap is narrower than expected.
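The audit in Step 1 can be scripted against an inventory sheet. A minimal sketch follows; the machine records and field names are hypothetical, and only the numeric thresholds (4 GB RAM, 2 Mbps, UPS) come from the CBSE specification cited above.

```python
# Check lab machines against CBSE's stated minimum spec.
# Inventory rows and field names are illustrative assumptions.
MIN_RAM_GB = 4
MIN_BANDWIDTH_MBPS = 2

def audit(machine: dict) -> list[str]:
    """Return the list of spec gaps for one machine (empty = compliant)."""
    gaps = []
    if machine.get("ram_gb", 0) < MIN_RAM_GB:
        gaps.append("RAM below 4 GB")
    if machine.get("bandwidth_mbps", 0) < MIN_BANDWIDTH_MBPS:
        gaps.append("internet below 2 Mbps")
    if not machine.get("ups", False):
        gaps.append("no UPS backup")
    return gaps

lab = [
    {"id": "PC-01", "ram_gb": 8, "bandwidth_mbps": 10, "ups": True},
    {"id": "PC-02", "ram_gb": 2, "bandwidth_mbps": 10, "ups": False},
]
for machine in lab:
    gaps = audit(machine)
    print(machine["id"], "OK" if not gaps else "; ".join(gaps))
```

Running this over a full inventory gives the gap list that Step 1 asks for, machine by machine, rather than a lab-level impression.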

    Step 2: Register as an evaluation centre

    For state board examinations, contact your state board or university examination branch to be listed as an evaluation centre for the next cycle. For CBSE-affiliated schools and colleges that provide teachers as evaluators, coordinate with the institution's CBSE liaison to request on-screen marking assignment for the next cycle.

    Step 3: Document everything for NAAC

    From the first digital evaluation cycle, begin maintaining:

  • Evaluator participation records (names, subjects, dates, scripts processed)
  • Infrastructure utilisation logs (internet usage, lab hours)
  • Hardware and software configuration documents
  • Any system audit reports or marks submission confirmation records

    Compile these into a folder labelled by NAAC criterion. By the time your next SSR is due, you will have 2-3 cycles of evidence — not a retrospective claim.
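One way to bootstrap the criterion-labelled folder structure described above is a short script. The folder names here are illustrative, not an official NAAC layout.

```python
# Create an evidence folder tree keyed by NAAC criterion.
# All folder names are illustrative assumptions.
from pathlib import Path

EVIDENCE = {
    "criterion-4.2-ict-infrastructure": [
        "evaluator-participation-records",
        "infrastructure-utilisation-logs",
        "hardware-software-configs",
    ],
    "criterion-6.2-e-governance": [
        "marks-submission-confirmations",
        "system-audit-reports",
    ],
}

root = Path("naac-evidence")
for criterion, folders in EVIDENCE.items():
    for folder in folders:
        # parents=True creates the criterion folder; exist_ok makes re-runs safe
        (root / criterion / folder).mkdir(parents=True, exist_ok=True)

print(sorted(p.relative_to(root).as_posix() for p in root.rglob("*")))
```

Dropping each cycle's logs and reports into the matching folder as they are generated is what turns the SSR exercise into collation rather than reconstruction.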

    The Window Is Narrow

    Accreditation cycles run on 5-year timelines. Institutions preparing for their next NAAC assessment in 2027 or 2028 have a 12-18 month window to start generating the digital evaluation evidence that will differentiate them from peers who are still describing technology adoption in aspirational terms.

    Multiple exam attempts are now policy, not experiment. The infrastructure to support them is now specification, not suggestion. The accreditation credit for building that infrastructure is available now, not later.

    Related Reading

  • How Digital Evaluation Improves Your NAAC Accreditation Scores
  • NAAC Binary Criterion 4: Building the ICT Infrastructure Evidence Portfolio
  • Digital Evaluation, NAAC, NIRF, and NBA — The Triple Accreditation ROI

    Ready to digitize your evaluation process?

    See how MAPLES OSM can transform exam evaluation at your institution.