Higher Education is a global service industry that generates huge revenues for national economies and employs millions of people as faculty and staff every year. Around 10 million students leave their own countries to pursue higher education in developed nations. Among developed nations, there is intense competition to get these students enrolled in their universities. Admission seekers generally look to rankings to judge the quality of the higher education universities offer.
Global university rankings such as the Shanghai Jiao Tong University (SJTU) Academic Ranking of World Universities (ARWU), the Times Higher Education Supplement (THES) World University Ranking and the QS World University Ranking are considered the most popular in the world.
- The THES Ranking: This ranking is considered a balanced ranking because it assigns equal weights to teaching and research – 30 per cent each. Citations for research papers carry another 30 per cent, while the international outlook of a university and industry income carry 7.5 per cent and 2.5 per cent respectively.
- SJTU or ARWU Ranking: The SJTU or ARWU ranking is the most popular in the case of research-focused universities. Under this ranking, several indicators are chosen across four broad categories – Quality of Education, Quality of Faculty, Research Output, and Per Capita Performance – and definite weights are assigned to each of them.
- QS World University Ranking: It publishes rankings for the top 400 universities in the world. This ranking is mainly reputation-based: academic reputation from a global survey has a 40 per cent weight and employer reputation 10 per cent. Half of the total weight is thus based on the reputation of the universities, and the rest is distributed among citations per faculty (20 per cent), faculty-student ratio (20 per cent), and the proportion of international students and staff (10 per cent).
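Mechanically, all three rankings combine their indicators the same way: each indicator is normalized to a 0–100 score and the overall score is a weighted sum. A minimal sketch using the QS weights quoted above; the indicator scores for the university are hypothetical:

```python
# QS-style composite score: a weighted sum of normalized (0-100) indicator scores.
# Weights are the ones stated in the text; the university's scores are invented.
QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "citations_per_faculty": 0.20,
    "faculty_student_ratio": 0.20,
    "international_students_and_staff": 0.10,
}

def overall_score(indicator_scores, weights):
    """Combine normalized 0-100 indicator scores into one overall score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * indicator_scores[k] for k in weights)

# Hypothetical university: strong reputation, weak internationalization.
scores = {
    "academic_reputation": 90,
    "employer_reputation": 80,
    "citations_per_faculty": 70,
    "faculty_student_ratio": 60,
    "international_students_and_staff": 30,
}
print(round(overall_score(scores, QS_WEIGHTS), 2))  # 73.0
```

Because reputation alone carries half the weight, a university with a strong survey reputation can score well even with a weak international profile, which is exactly the reputational tilt criticized later in this section.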
Table 1.2 presents a comparison of the parameters and weightages used by the three global rankings. These rankings provide information about the educational quality of HEIs across the world. The parameters, weights and ranking processes are made public on their respective websites. Global rankings indeed give us some idea about the performance of universities across the world, but one should not expect them to be without inbuilt lacunae or defects. In fact, the whole concept has a favourable tilt towards universities in North America and Europe.
TABLE 1.2 Basic Parameters and their Weights used in Global University Rankings (in per cent)
|Shanghai Jiao Tong||Times Ranking||QS Ranking|
|Alumni of an Institution Winning Nobel Prizes and Fields Medals: 10||Teaching: 30||Academic Reputation from Global Survey: 40|
|Staff of an Institution Winning Nobel Prizes and Fields Medals: 20||Research: 30||Employers’ Reputation from Global Survey: 10|
|Highly-Cited Researchers in 21 Broad Subject Categories: 20||Citation: 30||Citations per Faculty: 20|
|Papers Published in Nature and Science: 20||Industry Income: 2.5||Faculty-Student Ratio: 20|
|Papers Indexed in Science Citation Index-Expanded and Social Science Citation Index: 20||International Outlook: 7.5||Proportion of International Students: 5|
|Per Capita Academic Performance of an Institution: 10||||Proportion of International Faculty: 5|
Issues & Challenges in Global Rankings
- Focus on Foreign Students & Faculty:- While ranking universities and higher education institutions at the global level, global ranking agencies expect the proportion of foreign students and faculty to the total number of students and faculty of a university to be high. But they overlook the fact that many universities could have a vision and mission to serve the educational needs of a local or regional economy. Giving weightage to the presence of foreign students and faculty in higher numbers might not serve the needs of these kinds of universities and HEIs.
- Disfavouring Non-English Speaking Populations:- English may be the dominant language in higher education at the international level, but a large number of countries allow their teachers, students and research scholars to use languages other than English in teaching and research. Popular global rankings have an inherent bias in favour of research published in English. Research scholars working in languages like Chinese, French, Spanish, German, Hindi etc. normally face difficulties in finding reputed journals to publish their research papers. Thus global rankings are unable to correct this imbalance, which is tilted in favour of English-speaking nations, especially Western countries (Marginson, 2006).
- Delinking Universities from their Context:- Higher education institutions and universities are creations of their place of origin and its historical requirements. Hence, while ranking them, global rankings should preferably not use common parameters or weightages that ignore the fact that assessment of the quality of a university cannot be delinked from its local context and needs.
- Lack of Holistic Assessment:- A university or higher education institution serves society mostly by creating and disseminating new knowledge and by grooming talent for the future requirements of society. In this perspective, both teaching and research are equally important in higher education. Global rankings are sometimes criticized for not using a holistic framework for judging the quality of a university. For example, the SJTU Ranking lays inordinate emphasis on research, which ultimately disfavours a national university focused on teaching: it may not find a top slot in spite of producing thousands of talented graduates suited to the needs of the host country.
- Neglecting the Social Responsibility of a University:- Universities and higher education institutions are an integral part of the economy and act as a source of talented manpower for the future needs of a nation. Global rankings, by giving higher weightage to research and less to teaching, simply ignore the basic fact that a university has to serve society primarily by ensuring a steady supply of talented manpower.
- Perpetual Dominance of Ivy League Institutions:- In global rankings, there is also a tilt in favour of those Ivy League institutions which have built global reputations in the course of their evolution. Giving weightage to reputation and perception leads to a perpetual dominance of Ivy League institutions (Altbach, 2006).
- Favouring the Well-Established University:- Worldwide, universities are not of similar size, stature and strength. But the ranking models of different global rankings mostly favour well-established universities that have a long history, sizable endowments and big annual budgets. These universities, naturally, also have large numbers of students, faculty and staff.
- Misuse of Rankings by Admission Marketers:- The marketing of products and services usually brings the possibility of unethical methods being used by marketers. Innocent students and parents are many times led by admission marketers to take admission in inferior institutions through misinterpretation of rankings. The multiplicity of rankings aggravates this problem.
- Tendency to Inflate Data:- A good ranking opens opportunities for a university to get more funds, tie-ups, alliances and partnerships. To climb the ranking ladder, universities sometimes resort to inflating their data. Although it is always risky to make false claims, it is a shortcut adopted by some administrators in universities where the ethical code is not adhered to by the leadership.
- McDonaldization of Higher Education:- Similar to the models of big consumer brands like McDonald’s, KFC, Coca-Cola and Pepsi, global ranking agencies tend to apply the same set of ranking parameters or criteria to universities operating in different parts of the world. There is a growing concern that the desire among universities to find a place in the ‘Top 200 University List’ could actually ‘McDonaldize’ higher education, which would not help the sustainability or future development of those universities.
NIRF – India’s Initiative in Ranking
During the last decade, whenever global rankings of universities and higher education institutions were announced, there were uproars in the Indian media, in Parliament, and in academia. In all these global rankings, only a few Indian universities and IITs found a place in the ‘Top-200’ or ‘Top-500’ categories, whereas several universities from countries like China, Japan, South Korea, Hong Kong and Singapore were regularly securing top positions. This caused a lot of heartburn among policy makers and political leaders. India’s former President Pranab Mukherjee, who was also the ex officio Visitor of more than 40 central universities, was very concerned about this and spoke about it several times. To prepare Indian universities for the requirements of global ranking, a new idea emerged: why could India not start its own national ranking framework? Since 2014, the Ministry of Human Resource Development (MHRD) had been contemplating launching India’s own national ranking so as to create competition among Indian universities and institutions and, in the process, raise the overall standards of educational institutions.
In this backdrop, the National Institutional Ranking Framework (NIRF) was approved by the MHRD and launched by the Minister of Human Resource Development on September 29, 2015.
Evolution of the National Institutional Ranking Framework (NIRF)
A one-day workshop was organized by the MHRD on 21 August 2014 for developing methodologies for the ranking of institutions of higher education in India. The workshop resolved to constitute a Committee for evolving
a National Ranking Framework. A Core Committee of 16 members was constituted on 29 October 2014, with the Secretary (Higher Education), MHRD, as Chairperson and the Additional Secretary (Technical Education), MHRD, as Member-Secretary.
The terms of reference of the Committee were:
- Suggest a National Framework for performance measurement and ranking of Institutions and Programmes.
- Suggest the organizational structure, institutional mechanism and processes for implementation along with time-lines of the National Ranking Framework.
- Suggest a mechanism for financing the Scheme on the National Ranking Framework.
- Suggest linkages with the National Assessment and Accreditation Council (NAAC) and the National Board of Accreditation (NBA), if any.
An Expert Committee was constituted by the UGC on October 9, 2015 to develop a framework for the ranking of universities and colleges in India, and the framework it developed was incorporated into the National Institutional Ranking Framework (NIRF). The Core Committee also suggested a framework for ranking institutions offering management education. For ranking management, pharmacy and architecture institutions, the AICTE was designated to develop the parameters and metrics.
Recommendations of the Core Committee
The following recommendations were made by the Core Committee:
- The metrics for ranking of engineering institutions should be based on the parameters agreed upon by the Core Committee.
- The parameters were organized into five broad heads or groups, and each group was divided into suitable sub-groups.
- A suitable metric was proposed which computed a score under each sub-head. The sub-head scores were then added to obtain a score for each individual head. The overall score was computed based on the weights allotted to each head and could take a maximum value of 100.
- The Committee recommended the classification of institutions into two categories:
- Category A Institutions: These are institutions of national importance set up by Acts of Parliament, State Universities, Deemed-to-be Universities, Private Universities and other autonomous institutions.
- Category B Institutions: These are institutions affiliated to a University and do not enjoy full academic autonomy.
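The aggregation the Committee describes (sub-head scores summed into head scores, then a weighted combination with a maximum of 100) can be sketched as follows. The head weights follow the Category A engineering profile described in the framework (with perception, PR, assumed here at 0.10); the sub-head scores are entirely hypothetical:

```python
# NIRF-style aggregation: sub-head scores roll up into head scores,
# and the heads combine via fixed weights into an overall score out of 100.
# PR (perception) at 0.10 is an assumption; sub-head scores are made up.
HEAD_WEIGHTS = {"TLR": 0.30, "RPC": 0.30, "GO": 0.15, "OI": 0.15, "PR": 0.10}

# Each head's score (0-100) is the sum of its sub-head scores.
sub_head_scores = {
    "TLR": [30, 25, 20],  # hypothetical sub-heads, e.g. faculty, resources, ...
    "RPC": [40, 30],
    "GO":  [50, 30],
    "OI":  [35, 25],
    "PR":  [60],
}

def nirf_overall(sub_heads, weights):
    """Sum sub-heads into head scores, then take the weighted combination."""
    head_scores = {h: sum(subs) for h, subs in sub_heads.items()}
    return sum(weights[h] * head_scores[h] for h in weights)

print(round(nirf_overall(sub_head_scores, HEAD_WEIGHTS), 2))  # 70.5
```

Since the weights sum to 1.0 and each head score is capped at 100, the overall score cannot exceed 100, which matches the Committee's specification.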
Parameters and their weightages
Engineering, Management, Pharmacy and Architecture institutions
The approved set of parameter groups, and the weightages assigned to them for each category of institution offering programmes in engineering, management, pharmacy and architecture, are given in the following table.
|Parameters||Category A institutions||Category B institutions|
|Teaching, learning and resources (TLR)||0.30||0.30|
|Research, professional practice and collaborative performance (RPC)||0.30||0.20|
|Graduation outcome (GO)||0.15||0.25|
|Outreach and inclusivity (OI)||0.15||0.15|
|Perception (PR)||0.10||0.10|
Overall ranking and colleges
The approved set of parameter groups and the weightages assigned to them in respect of overall rating and for colleges are given in the following table, for 2018.
|Parameters||Overall||Colleges|
|Teaching, learning and resources (TLR)||0.30||0.40|
|Research, productivity, impact and IPR (RPII)||0.30||0.15|
|Graduation outcome (GO)||0.20||0.25|
|Outreach and inclusivity (OI)||0.10||0.10|
|Perception (PR)||0.10||0.10|
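Because the Overall and Colleges tracks weight the same heads differently, identical head scores can produce different relative standings. A small illustration using the four common heads (TLR, RPII, GO, OI) with the weights shown above; the two institutions and their head scores are invented:

```python
# Same head scores, different weight profiles: Overall vs Colleges tracks.
# Only four heads are used here; institution head scores are hypothetical.
WEIGHTS = {
    "overall":  {"TLR": 0.30, "RPII": 0.30, "GO": 0.20, "OI": 0.10},
    "colleges": {"TLR": 0.40, "RPII": 0.15, "GO": 0.25, "OI": 0.10},
}

institutions = {
    "research-heavy": {"TLR": 60, "RPII": 90, "GO": 70, "OI": 50},
    "teaching-heavy": {"TLR": 90, "RPII": 40, "GO": 85, "OI": 50},
}

def score(heads, weights):
    """Weighted combination of head scores under one track's weights."""
    return sum(weights[h] * heads[h] for h in weights)

for track, w in WEIGHTS.items():
    ranked = sorted(institutions,
                    key=lambda name: score(institutions[name], w),
                    reverse=True)
    print(track, ranked)
```

Under these numbers the research-heavy institution tops the Overall track while the teaching-heavy one tops the Colleges track, showing how much the choice of weights alone drives a rank.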
Criticism of NIRF
- High levels of variation: – The rankings released by NIRF show suspiciously large variations between 2016 and 2017. For example, Jamia Millia Islamia improved by 71 places, from #83 to #12, whereas Guru Gobind Singh Indraprastha University, Delhi, fell by 60 places, from #22 to #82.
- Lack of consistency:- Apart from this, the standout pattern is that a large fraction of the universities in the 2017 ranking are new entrants; as a corollary, the same number of universities from the 2016 rankings have dropped off the list. To be precise, 47 of the top 100 in NIRF 2017 are new entrants.
- Wrong Comparison: – As per the NIRF 2017 ranking, the fourth-best university in India was a highly specialized research institution in Bengaluru, the JNCASR, which has around 200-300 Ph.D. students and faculty at any given time. How does one compare it to the University of Hyderabad (UoH), which came fourth in 2016? UoH has about 4,500 students and 400 faculty members. JNCASR is not normally perceived as a university as the term is typically understood.
- Structural issue: – Many universities in India have an area of focus: engineering (like BITS-Pilani) or social sciences (like the Tata Institute of Social Sciences). Others offer courses across the arts and sciences, like JNU, UoH, Delhi University, etc. It is the latter that fits the general definition of a university. This means that comparing these two kinds of institutions is not very useful, both from the institutions’ and from the students’ points of view.
- Too Much Focus on Research: – The fundamental issue with a single weighted ranking (‘one rank to rule them all’) of the kind NIRF has adopted is the question of weights. Weights are an indication of priority – but whose priority? Currently, the NIRF has decided that research performance counts towards 30% of a university’s rank and 20% of a college’s rank. Does the average undergraduate student really value research to that extent?