UGC Fitness Rules 2024: Why NAAC, NBA, and NIRF Are Now Tied to Your Funding
The UGC's new Fitness of Colleges for Receiving Grants Rules 2024 make NAAC accreditation, NBA programme approval, and NIRF participation mandatory conditions for central government funding — and digital examination records are the foundation of the evidence institutions need.

The Rules That Change Everything
The University Grants Commission (Fitness of Colleges for Receiving Grants) Rules, 2024, represent the most significant overhaul of college funding eligibility criteria since 1975. Released for public comment in January 2024, the rules replace a 49-year-old framework with requirements that directly reflect the quality assurance priorities of NEP 2020.
The core requirement is straightforward: a college must be accredited and ranked to receive UGC funds.
Specifically, colleges must:
- hold a valid NAAC accreditation;
- secure NBA accreditation for at least 60% of eligible programmes, where the institution offers five or more technical programmes; and
- participate in the NIRF rankings.
These are not aspirational targets. They are threshold requirements for what the UGC calls "12B status" — the designation that makes a college eligible for grants covering infrastructure development, faculty salaries, research funding, and institutional support. Without 12B status, a college is effectively cut off from the primary channel of central government higher education funding.
Why Most Colleges Are Exposed
NAAC data suggests that fewer than 40% of India's approximately 40,000 colleges hold a valid NAAC accreditation. Of those accredited, a significant proportion hold B or lower grades, with accreditation cycles expiring and institutions failing to renew on time.
NBA accreditation coverage is even more limited. The NBA accredits individual programmes against outcome-based standards aligned with the Washington Accord (engineering) and Seoul Accord (computing and IT). For institutions offering five or more technical programmes, the requirement to accredit 60% of eligible programmes represents substantial additional compliance work — particularly for colleges that have not yet implemented the outcome-based education (OBE) documentation framework NBA requires.
NIRF participation has grown year over year, but many institutions that participate do not provide complete or accurate data across all parameters, which limits their ranking performance regardless of actual quality.
The result: a significant proportion of Indian colleges that currently receive UGC grants may need to substantially improve their quality assurance documentation to maintain eligibility under the new rules.
The Evidence Problem
NAAC, NBA, and NIRF assessments are all evidence-intensive. Each requires institutions to demonstrate performance against specific criteria using verifiable data. The challenge is that many Indian institutions generate the relevant evidence in fragmented, paper-based systems that make it difficult to compile, verify, and present at the time of assessment.
What NAAC Requires
NAAC's revised Binary and Maturity-Based Graded Levels (MBGL) framework assesses institutions across seven criteria. Of these, three have direct and substantial dependencies on examination and evaluation records:
Criterion 2 — Teaching-Learning and Evaluation carries significant weight in NAAC scoring. Evaluators examine evidence of student assessment processes, including whether assessment is continuous, transparent, and reformative. Evidence typically required includes examination schedules, question paper standards, evaluation guidelines, mark distribution data, and student performance analytics. Digital evaluation platforms generate this evidence automatically as a byproduct of normal operation; paper-based systems require labour-intensive manual compilation.
Criterion 5 — Student Support and Progression requires data on student results, pass rates, drop-out rates, and progression to higher education or employment. Accurate, complete student performance records — tied to specific examinations, specific evaluators, and specific timelines — underpin this criterion's evidence base.
Criterion 6 — Governance, Leadership and Management examines institutional processes for quality assurance, including examination governance. Audit trails, process documentation, and evidence of systematic improvement are valued.
A consistent finding across NAAC self-study reports is that institutions struggle most with Criterion 2 — not because their teaching-learning practices are poor, but because their documentation of those practices is incomplete. Digital evaluation platforms address this by creating structured, timestamped records of every evaluation activity.
What NBA Requires
NBA accreditation for engineering and applied science programmes is built on outcome-based education principles. Programmes must define Programme Educational Objectives (PEOs), Programme Outcomes (POs), and Course Outcomes (COs), and must demonstrate through direct and indirect assessment that students are attaining those outcomes.
Direct assessment typically relies on examination marks, mapped to specific COs and POs. For a programme with eight courses per semester and eight semesters, this means generating, storing, and analysing approximately 64 outcome-attainment datasets per batch of students. NBA assessors examine whether these attainment calculations are accurate, current, and traceable to underlying evaluation records.
Manual calculation of CO attainment is error-prone and time-consuming. Digital evaluation platforms that tag each question to a specific course outcome can generate CO attainment reports automatically, with the underlying evaluation records available for auditor verification.
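The tagging-and-aggregation step described above can be sketched in a few lines. The snippet below is a minimal illustration, not NBA's prescribed method: it assumes each question is tagged with exactly one CO, and defines attainment for a CO as the share of students scoring at least a threshold (60% here, a common but institution-specific choice) of the marks available on that CO's questions.

```python
from collections import defaultdict

def co_attainment(marks, question_co, max_marks, threshold=0.6):
    """Fraction of students attaining each course outcome (CO).

    marks:       {student_id: {question_id: marks_awarded}}
    question_co: {question_id: CO label}, e.g. {"Q1": "CO1"}
    max_marks:   {question_id: maximum marks for that question}

    A student attains a CO if their total on that CO's questions is
    at least `threshold` of the marks available for those questions.
    """
    co_questions = defaultdict(list)          # CO -> its questions
    for q, co in question_co.items():
        co_questions[co].append(q)

    attainment = {}
    for co, questions in co_questions.items():
        available = sum(max_marks[q] for q in questions)
        attained = sum(
            1 for scores in marks.values()
            if sum(scores.get(q, 0) for q in questions) >= threshold * available
        )
        attainment[co] = attained / len(marks)
    return attainment

# Two students, two questions, each question tagged to one CO.
marks = {"s1": {"Q1": 8, "Q2": 3}, "s2": {"Q1": 4, "Q2": 9}}
co_map = {"Q1": "CO1", "Q2": "CO2"}
maxima = {"Q1": 10, "Q2": 10}
print(co_attainment(marks, co_map, maxima))   # {'CO1': 0.5, 'CO2': 0.5}
```

Production platforms extend this with CO-to-PO mapping matrices, target attainment levels, and indirect-assessment inputs, but the property assessors verify is the same: every attainment figure reduces to tagged, stored evaluation records.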
What NIRF Measures
NIRF ranks institutions across five broad parameters, with the following approximate weightages for colleges:
| Parameter | Weightage |
|---|---|
| Teaching, Learning and Resources | 30% |
| Research and Professional Practice | 30% |
| Graduation Outcomes | 20% |
| Outreach and Inclusivity | 10% |
| Perception | 10% |
The Teaching, Learning and Resources parameter includes examination infrastructure as a component of institutional resources. Graduation Outcomes directly depends on pass rates, timely result declaration, and progression data — all of which improve when examination processes are digitised and efficient.
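To make the weightages concrete: at the top level, a composite ranking score is a weighted sum of per-parameter scores. The sketch below is illustrative only — NIRF normalises sub-metrics within each parameter before combining them, and the input scores here are invented for a hypothetical college.

```python
# College-category weightages from the table above.
WEIGHTS = {
    "TLR": 0.30,  # Teaching, Learning and Resources
    "RP":  0.30,  # Research and Professional Practice
    "GO":  0.20,  # Graduation Outcomes
    "OI":  0.10,  # Outreach and Inclusivity
    "PR":  0.10,  # Perception
}

def composite_score(parameter_scores):
    """Weighted sum of per-parameter scores (each on a 0-100 scale)."""
    return sum(WEIGHTS[p] * s for p, s in parameter_scores.items())

# Hypothetical college: strong Graduation Outcomes lift the composite.
print(composite_score({"TLR": 70, "RP": 55, "GO": 80, "OI": 60, "PR": 40}))
# -> 63.5
```

Because Graduation Outcomes alone carries 20%, clean pass-rate and result-declaration data has a direct, quantifiable effect on the composite.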
NIRF data submissions are self-reported but subject to verification. Institutions that inflate data or submit without supporting records face disqualification. Institutions that generate clean, auditable digital records from their examination processes can submit NIRF data with confidence.
Building the Evidence Architecture
The UGC Fitness Rules create a practical imperative: institutions must now maintain continuous, structured evidence of their examination and evaluation processes — not just at assessment time, but year-round.
The institutions that perform best in accreditation cycles are those that treat evidence collection as an ongoing operational process rather than a pre-assessment scramble. The Annual Quality Assurance Report (AQAR) that the Internal Quality Assurance Cell (IQAC) submits to NAAC every year is a proxy measure of whether this culture exists. Institutions with digital examination records can populate AQAR examination data fields in hours; those relying on manual records typically need weeks.
Practical Steps
Audit your current accreditation timeline. NAAC accreditation cycles are typically five years. If your current accreditation expires within 24 months, your evidence collection for the next cycle should already be underway. Map your examination records against Criteria 2, 5, and 6 evidence requirements now to identify gaps.
Calculate your NBA exposure. If your institution offers five or more eligible technical programmes, identify which 60% you will put forward for NBA accreditation and confirm that OBE documentation — CO/PO mapping, attainment calculations, assessment rubrics — is current for each.
Register for NIRF. Institutions not yet participating in NIRF have no path to meeting the UGC Fitness Rule participation requirement. Registration and data submission open annually; institutions that miss submission windows cannot retroactively satisfy the participation count requirement.
Centralise examination data. Institutions that run affiliated college examinations through university systems, and conduct internal assessments through departmental processes, often have examination data in multiple silos. A consolidated examination management platform that captures both creates the unified record set that NAAC, NBA, and NIRF all draw from.
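One way to picture the "unified record set" is a single evaluation record rich enough to serve all three frameworks at once. The schema below is a hypothetical sketch — the field names are invented for illustration, not taken from any particular platform.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EvaluationRecord:
    """One evaluated answer in a consolidated examination store.

    A record of this shape feeds all three evidence trails:
    marks, evaluator, and timestamp support NAAC Criterion 2's
    audit trail; co_tag supports NBA CO/PO attainment; per-student
    aggregates support NIRF Graduation Outcomes data.
    """
    student_id: str
    programme: str        # e.g. "B.E. Mechanical"
    course_code: str      # e.g. "ME301"
    exam_id: str          # links back to schedule and question paper
    question_id: str
    co_tag: str           # course outcome this question assesses
    max_marks: float
    marks_awarded: float
    evaluator_id: str     # who evaluated, for the audit trail
    evaluated_at: datetime

rec = EvaluationRecord(
    student_id="2021ME042", programme="B.E. Mechanical",
    course_code="ME301", exam_id="SEM5-2024", question_id="Q3b",
    co_tag="CO2", max_marks=10.0, marks_awarded=7.5,
    evaluator_id="EV-17", evaluated_at=datetime(2024, 11, 20, 14, 30),
)
```

The design point is that university examinations and internal departmental assessments write to the same record shape, so accreditation evidence is a query over one store rather than a reconciliation across silos.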
The Funding Cliff
The transition period for the new UGC Fitness Rules includes grace provisions for institutions in the process of seeking accreditation. But those provisions are time-limited. Colleges that have not begun serious accreditation preparation — securing NAAC recognition, initiating NBA programme assessments, submitting NIRF data — are approaching a funding cliff.
For institutions that depend on UGC grants to pay faculty salaries and maintain infrastructure, the operational risk is existential. The rules are not a surprise; they have been in the public domain since early 2024. The institutions that will maintain eligibility are those that treat the accreditation data trail as a core operational function, built into their examination and evaluation systems, not bolted on at assessment time.
---
Ready to digitize your evaluation process?
See how MAPLES OSM can transform exam evaluation at your institution.