EUA PUBLICATIONS 2014

RANKINGS IN INSTITUTIONAL STRATEGIES AND PROCESSES: IMPACT OR ILLUSION?

BY ELLEN HAZELKORN, TIA LOUKKOLA, THÉRÈSE ZHANG

With the support of the Lifelong Learning programme of the European Union

Copyright © by the European University Association 2014. All rights reserved. This information may be freely used and copied for non-commercial purposes, provided that the source is acknowledged (© European University Association).

European University Association asbl
Avenue de l'Yser, Brussels, Belgium
Tel:
Fax:

A free electronic version of this report is available through
ISBN:

CONTENTS

LIST OF TABLES AND MAPS
GLOSSARY AND ABBREVIATIONS
FOREWORD
ACKNOWLEDGEMENTS
EXECUTIVE SUMMARY
SETTING THE STAGE
  Rankings in Institutional Strategies and Processes (RISP)
  Objectives of the RISP project
  Methodology and framework of analysis
  Site visits
  The Roundtable
TRENDS IN RANKINGS
  Rising influence of rankings
  Trends in transparency, accountability and comparability
MAIN FINDINGS FROM THE RISP SURVEY
  Characteristics of RISP respondents
  Recognition and knowledge of rankings
  Monitoring rankings
  Internal dissemination
  Communication with, and influence on, external stakeholders
  The role of rankings in collaboration and engagement with stakeholders
  Benchmarking and relations with other institutions
  Institutional strategies incorporating rankings
  Impact on decision-making
  Choice of indicators
EXPLORING THE IMPACT OF RANKINGS
  One development among others
  Diversity of rankings
  Who is interested in ranking results
  How rankings are used
  Implications for institutional processes
CONCLUSION AND RECOMMENDATIONS
APPENDIX: KEY ACTORS IN THE RISP PROJECT
  Partner organisations
  Members of the RISP Steering Committee
  Researchers who conducted RISP site visits
  Project coordination at EUA
REFERENCES AND FURTHER READING

LIST OF TABLES AND MAPS

Figures

Figure 1   Distribution of RISP respondents per country
Figure 2   What is the type of your institution according to the national statutes?
Figure 3   How many full-time equivalent students do you have in total, including undergraduates and postgraduates?
Figure 4   Is your institution currently ranked in any ranking? (all respondents)
Figure 5   Is your institution currently ranked in any ranking? (HEIs counting over 30,000 students)
Figure 6   Is your institution currently ranked in any ranking? (per type of HEI)
Figure 7   Which ranking(s) do you find the most influential/have the greater impact on your institution?
Figure 8   Does your institution monitor its position in rankings? (all respondents)
Figure 9   Does your institution monitor its position in rankings? (ranked institutions)
Figure 10  When monitoring your rank, what is the highest level at which this review takes place? (respondents who monitor their rank)
Figure 11  Within your institution, is there any internal dissemination of your institution's position in rankings and/or other transparency tools? (respondents who monitor their rank)
Figure 12  Does your institution use its position in rankings for marketing or publicity purposes?
Figure 13  In your opinion, which of the following groups is/are influenced by rankings in their views, choices or decisions about your institution?
Figure 14  In your opinion, which group(s) of students are influenced by rankings when choosing their higher education institution? (respondents who believe that prospective students are influenced by rankings)
Figure 15  In what way do you think the results of rankings affect your institution's reputation?
Figure 16  Do you monitor the ranking of other/peer institutions?
Figure 17  Do rankings play a part in your institutional strategy?
Figure 18  Has your institution used the results of rankings or other transparency tools to take strategic, organisational, managerial or academic action?
Figure 19  Have the results of rankings or other transparency tools helped or hindered the following activities in your institution?
Figure 20  In the framework of your strategic planning and internal monitoring of activities, does your institution pay special attention to the following, either at institutional or at faculty level?

Tables

Table 1  To which level of studies does your institution educate students?
Table 2  How does your institution inform itself about ranking methodologies? (all respondents)
Table 3  How does your institution monitor its position in rankings? (respondents who monitor their rank)
Table 4  Do you advertise the positioning of your institution in rankings in your communication with the following external stakeholders or partners?
Table 5  In your opinion, which of the following group(s) is/are influenced by rankings in their views, choices or decisions about your institution?
Table 6  What are/would be the reason(s) for monitoring the ranking of other institutions? (respondents who monitor the ranking of other/peer institutions or are planning to do so)
Table 7  Do rankings play a part in your institutional strategy?
Table 8  How have rankings influenced the type of institutional decisions you have made? (all respondents)
Table 9  Have the rankings influenced the type of institutional decisions you have made in any of the following ways?

Boxes

Box 1  Major global rankings, 2014 (in order of year)
Box 2  Typology of transparency, accountability and comparability instruments
Box 3  Framework for guiding institutional responses to rankings

GLOSSARY AND ABBREVIATIONS

ARWU: refers to the Academic Ranking of World Universities, developed by Shanghai Jiao Tong University, China, in 2003 and published annually thereafter.

BRICS: an acronym for the five major emerging or newly industrialised economies of Brazil, Russia, India, China and South Africa.

Faculty: sub-unit within a higher education institution comprising one subject area, or a number of subject areas.

Higher Education Institution (HEI): refers to all post-secondary institutions undertaking research and awarding higher degrees (Bachelor, Master's and/or doctorate), irrespective of their name and status in national law.

Institution, institutional level: refers to the higher education institution as a whole, beyond and including all its constituent parts (faculties, departments, institutes, etc.).

Institutional research capacity: the institution's capacity to generate comprehensive, high-quality data and information to underpin strategic planning and decision-making.

Performance funding: funding based on how the institution has performed with regard to defined indicators or objectives.

Postgraduate students: students registered at second or third-cycle level who have already gained a first-cycle degree.

QS: refers to the QS Quacquarelli Symonds Top Universities Rankings.

Ranking: in the framework of the RISP project, rankings in the higher education sector are understood as lists of higher education institutions. They compare HEIs using a range of different indicators, which are weighted differently and then aggregated into a single composite score, with HEIs listed in descending order of that score (a schematic illustration follows this glossary). The term league table is often used to refer to rankings because of the way HEIs are listed in order; it is a metaphor taken from the world of sports.

Rector: refers here to the executive head of an institution, the top senior leadership position, equivalent to President or Vice-Chancellor.

RISP respondents and participants: RISP respondents refers to those who completed the online survey, while RISP participants refers to those who participated in the Roundtable.

THE: refers to the Times Higher Education World University Rankings.

Transparency tools: refers to a range of different mechanisms which facilitate greater knowledge, understanding and comparability of higher education performance, such as benchmarking, accreditation, quality assurance, classification and profiling. They all aim to enhance understanding and clarity about the different missions, activities and performances of higher education and research institutes.

Undergraduate students: students registered at first-cycle level.

Universities: the term universities is used throughout to describe those HEIs which award qualifications from Bachelor to doctoral level.

University of Applied Sciences (UAS): a collective term for higher education institutions designed with a focus on vocational degrees, especially in disciplines such as engineering, business, or health professions, and with a responsibility towards their region or the SME sector. Within their own countries, these HEIs have been called polytechnics (UK), Fachhochschulen (Germany), hogescholen (Netherlands and Belgium), institutes of technology (Ireland), etc. Depending on national legislation, they provide both undergraduate and postgraduate education.
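The glossary entry for Ranking above describes the basic mechanism shared by most rankings: a set of indicators is weighted, aggregated into a single composite score per institution, and institutions are then listed in descending order of that score. The short sketch below, in Python, illustrates that aggregation step only; the institutions, indicator names, values and weights are entirely hypothetical and do not reproduce the methodology of ARWU, THE, QS or any other ranking.

```python
# Schematic illustration of the composite-score aggregation described in the
# glossary: weighted indicators are combined into a single score per HEI,
# and HEIs are then listed in descending order ("league table" style).
# All institutions, indicators, values and weights are hypothetical.

# Hypothetical indicator values, assumed to be already normalised to a 0-100 scale.
indicators = {
    "HEI A": {"research output": 82, "citations": 74, "reputation survey": 65},
    "HEI B": {"research output": 68, "citations": 90, "reputation survey": 71},
    "HEI C": {"research output": 75, "citations": 60, "reputation survey": 88},
}

# Hypothetical weights; each real ranking publishes its own weighting scheme.
weights = {"research output": 0.4, "citations": 0.4, "reputation survey": 0.2}

# Aggregate each institution's weighted indicators into one composite score.
composite = {
    hei: sum(weights[name] * value for name, value in scores.items())
    for hei, scores in indicators.items()
}

# List institutions in descending order of composite score.
for rank, (hei, score) in enumerate(
    sorted(composite.items(), key=lambda item: item[1], reverse=True), start=1
):
    print(f"{rank}. {hei}: {score:.1f}")
```

Even in this toy example, modest changes to the weights can reorder the list, which is one reason why year-on-year movements in rankings do not necessarily reflect substantive changes in institutional performance.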
FOREWORD

Since they were launched over a decade ago, global rankings have shaken up the world of higher education. They have provoked numerous debates all over the world and led to discussions about the purpose of higher education and appropriate ways to measure its activities and consider its contribution to society and the economy.

The European University Association has previously contributed to the debate through two publications: the Global University Rankings and their Impact reports I and II, published in 2011 and 2013 respectively. These two reports offered an analysis of ranking methodologies, and this work has shown the need to go beyond studying how rankings are compiled and to focus on their impact on the higher education landscape. In recent years we have seen governments responding to rankings, and some systems being shaped so as to aim for world-class universities as opposed to world-class systems.

The Rankings in Institutional Strategies and Processes (RISP) project focuses on the institutional level. It is the first pan-European survey of higher education institutions seeking to understand how they use rankings, and the impact and influence that rankings are having on them. To what extent have institutional strategies or processes been affected or changed because of rankings? To what extent have rankings influenced institutional priorities or activities, or led to some areas being given more emphasis than others so as to improve an institution's ranking position? How have stakeholders been influenced? The survey was complemented by site visits to six universities and a Roundtable of university managers and stakeholders, both of which were used to support the analysis of the data and to form conclusions.

We hope that this publication, highlighting the key findings of the RISP project, will be of interest to a wide readership. The aim is to contribute to a broader discussion about the potential impact of rankings on institutional behaviour. Whether this impact is positive or perverse is, to a large extent, dependent upon each individual institution. In this regard, the concluding chapter offers some tips on how to make the best of rankings as one source of information amongst others, and thereby to take greater control of the impact that rankings can have.

Maria Helena Nazaré
EUA President
ACKNOWLEDGEMENTS

We would like to thank:
- all the higher education institutions which replied to the survey, those institutions which hosted the site visits, and all participants in the Roundtable in June 2014;
- the project Steering Committee and associated researchers for their continuous support and feedback throughout the process, in particular when developing the questionnaire and for their guidance on the direction that this publication took; and
- Joanne Byrne for making sure that the project stayed on track and on schedule, and for her valuable feedback on the draft versions of this publication.

Ellen Hazelkorn, Tia Loukkola and Thérèse Zhang

EXECUTIVE SUMMARY

Higher education is undergoing rapid change in response to developments occurring at national and international level. Today, universities' performance worldwide is increasingly being measured using rankings which have been developed by governmental and/or commercial agencies, at both national and international level. The Rankings in Institutional Strategies and Processes (RISP) project is the first pan-European study of the impact and influence of rankings on European higher education institutions. The project has sought to build understanding of how rankings impact and influence the development of institutional strategies and processes, and its results are presented in this publication. The study carried out in the context of the project consisted of three steps: an online survey among European universities and higher education institutions; a series of site visits; and a Roundtable with senior university managers and stakeholders.

The key findings of the project can be summarised as follows:

1. Identifying the precise role that rankings play in institutional strategies and processes is challenging due to the complexity of the context in which HEIs operate and the number of factors that HEIs need to take into account when developing their strategies. In addition to global rankings, national rankings have an influential role, although there is some confusion regarding what exactly constitutes a ranking.

2. The term ranking has come to be used as tantamount to any measurement of higher education performance. This interchange of concepts may arise because, regardless of the accountability or transparency instrument in question, the results are often displayed as a league table or ordinal ranking. Indeed, there is some confusion regarding what exactly constitutes a ranking.

3. HEIs pay attention to rankings as one source of information among others. The way in which HEIs study or reflect upon rankings is not systematic or coherent, and may occur at an informal as well as at a formal level. Institutions often use ad hoc monitoring patterns in response to strategic needs related to particular issues. Hence, there is no clear pattern as to how institutions respond: not all institutions, or all institutions with a similar profile, react in the same way.
4. The main user groups of rankings identified in the project were both external (governments or national higher education authorities in general, and international students) and internal (institutional leadership and the academic community as a whole). However, how these groups use the rankings varies, as do their attitudes towards rankings.

5. While HEIs can be highly critical of what is being measured and how, the evidence showed that they can still use rankings in a variety of ways: i) to fill an information gap; ii) for benchmarking; iii) to inform institutional decision-making; and, last but by no means least, iv) in their marketing efforts.

6. The institutional processes that are impacted by rankings fall into the following four categories: i) mechanisms to monitor rankings; ii) clarification of institutional profile and adapting core activities; iii) improvements to institutional data collection; and iv) investment in enhancing institutional image.

7. Thus, rankings have helped generate a greater awareness of the changing dynamics of the higher education environment, both nationally and internationally, especially in response to an increasing focus on quality and performance.

The report concludes that cross-national comparisons are an inevitable by-product of globalisation and will intensify in the future. It is therefore crucial that all institutions improve their institutional research capacity¹ so as to be able to provide meaningful, comparative information about institutional performance to the public. Finally, the report provides a Framework for Guiding Institutional Responses to Rankings (see Box 3).

¹ By institutional research capacity, the authors mean the institution's capacity to generate comprehensive, high-quality data and information to underpin strategic planning and decision-making.

1 SETTING THE STAGE

1.1 Rankings in Institutional Strategies and Processes (RISP)

Higher education is undergoing rapid change in response to developments nationally and internationally. Globalisation and the demand for a highly educated and skilled, knowledge-driven economy have combined to push higher education to the top of the policy agenda. Over the last decades, the number of students enrolled in higher education institutions around the world has grown dramatically; this number is forecast to more than double to 262 million by 2025, with international students expected to rise from current annual figures of 4.3 million to 7.2 million. The demand from society for more higher education is occurring at the same time as many public budgets and private incomes are constrained. This has heightened concerns about quality, especially in publicly funded systems. Questions are also being asked about the degree to which higher education is accountable for its actions to public stakeholders and students, and to the needs and demands of society and the economy. The focus on quality and excellence in a globally competitive world has led to calls for greater and better accountability and transparency, and for tools which can enable and facilitate international comparison.
Today, universities' performance worldwide is increasingly being measured using rankings which have been developed by governmental and/or commercial agencies, at both national and international level. Although criticism of ranking methodologies has been expressed by governments, institutional representatives, students, researchers and others throughout the years,² rankings have succeeded in changing the way universities are perceived by students and parents, the business sector, employers and other stakeholders, and how they are presented in the media. They have placed consideration of higher education performance within a wider comparative and international framework. In the process, the higher education world has become visibly more competitive and multi-polar. Many more countries are now investing in building up their higher education and research systems and competing for mobile talent and investment. Consequently, one may conclude that international comparisons and classifications of universities are here to stay.

"It is generally easy to fall for the temptation to report and try to analyse movements up and down ranking lists, in spite of the knowledge that it may often be just statistical noise, not necessarily reflecting any substantive underlying changes." (RISP respondent)

Because of the significance attached to being listed in the rankings, research³ indicates that rankings are having a growing and significant impact and influence on institutional decision-making and actions. With this in mind, the Rankings in Institutional Strategies and Processes (RISP) project was launched. This is the first pan-European study, using data drawn from a large sample of HEIs, of how, and to what extent, European higher education institutions (HEIs) are being influenced by, and responding to, rankings. This is important because, while many people have their own views on the matter, until now these have not been based upon information provided by HEIs themselves. The aim of this study is to better understand how complex and multifaceted rankings have become, and how HEIs use comparative information about institutional performance to inform their strategies. This report presents the key findings of the project about the impact of rankings on institutional strategies and processes. It also makes some recommendations as to how higher education institutions can thoughtfully, cautiously

² EUA has previously published two reviews of the methodologies of rankings, written by Andrejs Rauhvargers (see further details in References and further reading).
³ See References and further reading.