Governance and Performance Management in Public Universities: Current Research and Practice
ISBN: 3030856976, 9783030856977

This edited volume contributes to the ongoing research and practice on applying performance management to university governance.

English · Pages: 219 [220] · Year: 2022


Table of contents:
Foreword
Contents
Contributors
Performance Management and Governance in Public Universities: Challenges and Opportunities
1 The Relevance of Higher Education for Social Sustainable Development
2 Managing Sustainable Organizational Performance: The Role of Performance Management Systems
3 Main Challenges for Designing “Robust” and “Intelligent” Performance Management Systems in Public Universities
4 Towards Innovative Performance Management Regimes in Higher Education to Foster Collaborative Networking and Sustainable Community Outcomes
5 Structure of this Book
References
Transparency and Accountability in Higher Education as a Response to External Stakeholders and Rules: A Comparison Between Three Country-Case Studies
1 Introduction
2 Accountability, Transparency, and Quality Evaluation in Higher Education Institutions
3 Country-Case Studies
3.1 The Netherlands’ Country-Case
3.1.1 Introduction
3.1.2 Accountability for Whom?
3.1.3 Legal Framework
3.1.4 Accountability Issues
3.2 The Portuguese Country-Case
3.2.1 Introduction
3.2.2 Accountability for Whom?
3.2.3 Legal Framework
3.2.4 Accountability Issues
3.3 The Italian Case Study
3.3.1 Introduction
3.3.2 Accountability for Whom?
3.3.3 Legal Framework
3.3.4 Accountability Issues
4 Discussion and Conclusions
References
An Analysis of Methodologies, Incentives, and Effects of Performance Evaluation in Higher Education: The English Experience
1 Introduction
2 A Theoretical Framework for Performance Evaluation Systems in Universities
3 The Evaluation of Research in England
3.1 Main Features of the Research Excellence Framework
3.2 The Object of Assessment: Research Output, Impact, and Environment
3.3 Effect of Research Evaluation and Transition Toward REF 2021
4 The Evaluation of Teaching in the English HE Sector
4.1 Features and Operation of the Teaching Excellence Framework
5 Discussion and Conclusion
References
A Longitudinal Analysis of the Relationship Between Central Government and Universities in France: The Role of Performance Measurement Mechanisms
1 Introduction
2 A Heterogeneous System
3 Changes in Governance
4 Description of the 4 Policy Tools
4.1 Evaluation
4.2 Contracts
4.3 Mergers
4.4 Excellence-Driven Policies
5 Conclusion
References
Adopting a Dynamic Performance Governance Approach to Frame Interorganizational Value Generation Processes into a University Third Mission Setting
1 Introduction and Research Objectives
2 Reviewing the Concept of Third Mission in HEIs
3 A Dynamic Performance Governance Approach to Frame Third Mission Activities
4 Applying Dynamic Performance Governance to Third Mission Activities: An Illustrative Example
5 Advantages and Limitations of Using DPG to Frame Third Mission Activities
6 Concluding Remarks and Future Research Perspectives
References
The Third Mission Strategies Disclosure Through the Integrated Plan
1 Introduction
2 Literature Review
2.1 Moving Towards the Third Mission and the Entrepreneurial Character of Universities
2.2 Theoretical Background and Prior Research
3 The Italian Context: The Third Mission and the Integrated Plan
4 Methodology and Data Collection
5 Findings
6 Conclusion and Future Research
References
Transferring Knowledge to Improve University Competitiveness: The Performance of Technology Transfer Offices
1 Introduction
2 Literature Review and Research Question
3 Research Methodology
3.1 The Research Context
3.2 The Liaison Office at the University of Calabria
3.3 Data Collection
4 The Research Framework
5 Main Findings
6 Conclusions
Appendix: Guiding Interview Questions
References
Third Mission and Intellectual Capital External Dimension: The Implications in the European University Planning Process
1 Introduction
2 The Evolution of the Third Mission in Universities: State of the Art
3 Intellectual Capital and the Third Mission in Universities: A Literature Review
4 The Italian, The UK, and French University Sector: Context and Initiatives for Evaluation Systems
4.1 The Italian University Sector
4.2 The UK University Sector
4.3 The French University Sector
5 Research Design
6 Findings and Discussion
7 Conclusion
References
Institutional Logics to Unveil Entrepreneurial Universities’ Performances: A Cross-Country Comparative Study
1 Introduction
2 Theoretical Background
2.1 Entrepreneurial Universities and the Third Mission
2.2 Logic Multiplicity in Universities
3 Methodology
3.1 Cases Selection
3.2 Data Collection and Analysis
4 Results
4.1 Teaching Institutional Logics
4.2 Researching Institutional Logics
4.3 Third Mission Institutional Logics
5 Discussion and Conclusion
5.1 Discussion
5.2 Implications
5.3 Limitations and Further Steps
References
Third Mission in Universities from a Performance Management Perspective: A Comparison Between Germany and Italy
1 Introduction
2 From the Ivory Tower to Services and Engagement: The Evolution of Universities’ Third Mission
3 The Case Studies
3.1 Italy
3.1.1 The TM at the National (System) Level in Italy
3.1.2 The TM at the University of Cagliari
3.1.3 The TM at the University of Siena
3.2 Germany
3.2.1 The TM at the National (System) Level in Germany
3.2.1.1 Federal and Provincial Initiatives
3.2.1.2 Start-Up Support in the Brandenburg Universities
3.2.1.3 Start-Up Support in North Rhine-Westphalian Universities
3.2.2 The TM at the University of Potsdam
3.2.3 The Evaluation of the Third Mission at the University of Wuppertal
4 Some Final Remarks
References

SIDREA Series in Accounting and Business Administration

Eugenio Caperchione Carmine Bianchi Editors

Governance and Performance Management in Public Universities: Current Research and Practice

SIDREA Series in Accounting and Business Administration

Series Editors
Stefano Marasca, Università Politecnica delle Marche, Ancona, Italy
Anna Maria Fellegara, Università Cattolica del Sacro Cuore, Piacenza, Italy
Riccardo Mussari, Università di Siena, Siena, Italy

This is the official book series of SIDREA - the Italian Society of Accounting and Business Administration. This book series is provided with a wide Scientific Committee composed of Academics by SIDREA. It publishes contributions (monographs, edited volumes and proceedings) as a result of the double blind review process by the SIDREA’s thematic research groups, operating at the national and international levels. Particularly, the series aims to disseminate specialized findings on several topics – classical and cutting-edge alike – that are currently being discussed by the accounting and business administration communities. The series authors are respected researchers and professors in the fields of business valuation; governance and internal control; financial accounting; public accounting; management control; gender; turnaround predictive models; non-financial disclosure; intellectual capital, smart technologies, and digitalization; and university governance and performance measurement.

More information about this series at https://link.springer.com/bookseries/16571

Eugenio Caperchione • Carmine Bianchi Editors

Governance and Performance Management in Public Universities: Current Research and Practice

Editors Eugenio Caperchione Department of Economics Marco Biagi Modena and Reggio Emilia University Modena, Italy

Carmine Bianchi Department of Political Sciences and International Relations University of Palermo Palermo, Italy

ISSN 2662-9879 ISSN 2662-9887 (electronic)
SIDREA Series in Accounting and Business Administration
ISBN 978-3-030-85697-7 ISBN 978-3-030-85698-4 (eBook)
https://doi.org/10.1007/978-3-030-85698-4

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2022

This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland

Foreword

After almost a century of experience in performance management in public universities, it is time for some reflections on the results achieved, the challenges still open, and the consequences on governance. Triggered by the growing attention on sustainable development and the role of education in achieving the United Nations’ goals, public sector scholars have intensified the discussion around the crucial role of performance management systems. Well-designed performance management systems can support governing bodies in defining strategies and plans, identifying coherent objectives and targets, and providing measures for ensuring an effective use of public resources.

Public universities are responsible not only for educating future generations but even more for research that can enhance knowledge and abilities, and for cooperation with society at large. Their multifaceted mission needs the support of a performance management system to encounter opportunities and face risks and tricks, leading governing bodies to set clear plans but also to design flexible solutions. Moreover, the fluid and complex social and economic situation in which public universities operate worldwide asks governing bodies to foresee the outcomes of the planned actions (and not only the outputs), embracing a perspective well beyond the narrow border of a single institution.

In this complex scenario, comprehensive information on reforms, practices, and effects of performance management on the governance of public universities, as well as reflections on the frontiers of competitiveness of universities in cooperating with industries, public administrations, and society, is not only timely but even more necessary for the future of higher education institutions. This book provides an overview of the current trends and challenges. Its ten chapters explore the performance management and governance in public universities, also looking at the experiences of some European countries. It is worth noting that almost half of the chapters focus on the third mission: a concept with a broad meaning, which encompasses a wide range of activities involving the generation, use, application, and exploitation of knowledge and other university capabilities outside academic environments. The specific attention that this volume places on the third mission is an unequivocal signal of the need for universities to play a stronger, more visible, measurable, and evaluable role in the design of modern knowledge societies by providing socially, culturally, and economically usable knowledge.

We are proud to have encouraged the preparation of this book, which is part of a more comprehensive project under the auspices of the Italian Society of Accounting and Business Administration—SIDREA. The editors have to be congratulated on their initiative to coordinate the efforts of a large group of international scholars, editing a book that can be a signpost for further research and reflection for regulators and governing bodies in public universities.

Francesca Manes-Rossi
University of Naples - Federico II, Napoli, Italy
email: [email protected]

Riccardo Mussari
University of Siena, Siena, Italy

Denita Cepiku
University of Tor Vergata - Rome, Rome, Italy

Contents

Performance Management and Governance in Public Universities: Challenges and Opportunities . . . . . 1
Carmine Bianchi and Eugenio Caperchione

Transparency and Accountability in Higher Education as a Response to External Stakeholders and Rules: A Comparison Between Three Country-Case Studies . . . . . 15
Anna Francesca Pattaro, Patrícia Moura e Sá, and Johan A. M. de Kruijf

An Analysis of Methodologies, Incentives, and Effects of Performance Evaluation in Higher Education: The English Experience . . . . . 49
Giovanni Barbato and Matteo Turri

A Longitudinal Analysis of the Relationship Between Central Government and Universities in France: The Role of Performance Measurement Mechanisms . . . . . 69
Giovanni Barbato, Clément Pin, and Matteo Turri

Adopting a Dynamic Performance Governance Approach to Frame Interorganizational Value Generation Processes into a University Third Mission Setting . . . . . 87
Federico Cosenz

The Third Mission Strategies Disclosure Through the Integrated Plan . . . . . 109
Natalia Aversano, Giuseppe Nicolò, Giuseppe Sannino, and Paolo Tartaglia Polcini

Transferring Knowledge to Improve University Competitiveness: The Performance of Technology Transfer Offices . . . . . 129
Pina Puntillo, Franco Rubino, and Stefania Veltri

Third Mission and Intellectual Capital External Dimension: The Implications in the European University Planning Process . . . . . 149
Elisa Bonollo, Simone Lazzini, and Zeila Occhipinti

Institutional Logics to Unveil Entrepreneurial Universities’ Performances: A Cross-Country Comparative Study . . . . . 179
Canio Forliano, Paola De Bernardi, Alberto Bertello, and Francesca Ricciardi

Third Mission in Universities from a Performance Management Perspective: A Comparison Between Germany and Italy . . . . . 197
Pasquale Ruggiero, Patrizio Monfardini, Dieter Wagner, and Dominik Bartsch

Contributors

Natalia Aversano Department of Management & Innovation Systems, University of Salerno, Fisciano, Italy
Giovanni Barbato Dipartimento di Economia, Management e Metodi Quantitativi (DEMM), Università degli Studi di Milano, Milan, Italy
Dominik Bartsch Schumpeter School of Business and Economics, University of Wuppertal, Wuppertal, Germany
Alberto Bertello Department of Management, University of Turin, Torino, Italy
Carmine Bianchi Department of Political Sciences and International Relations, University of Palermo, Palermo, Italy
Elisa Bonollo Department of Economics and Business Studies, University of Genoa, Genoa, Italy
Eugenio Caperchione Department of Economics Marco Biagi, Modena and Reggio Emilia University, Modena, Italy
Federico Cosenz Department of Political Sciences and International Relations, University of Palermo, Palermo, Italy
Paola De Bernardi Department of Management, University of Turin, Torino, Italy
Johan A. M. de Kruijf Nijmegen School of Management, Radboud University, Nijmegen, The Netherlands
Canio Forliano Department of Political Science and International Relations, University of Palermo, Palermo, Italy; Department of Management, University of Turin, Turin, Italy
Simone Lazzini Department of Economics and Management, University of Pisa, Pisa, Italy
Patrizio Monfardini Dipartimento di Scienze economiche ed aziendali, University of Cagliari, Cagliari, Italy
Patrícia Moura e Sá Faculty of Economics & Research Centre in Political Science (CICP), University of Coimbra, Coimbra, Portugal
Giuseppe Nicolò Department of Management & Innovation Systems, University of Salerno, Fisciano, Italy
Zeila Occhipinti Department of Economics and Management, University of Pisa, Pisa, Italy
Anna Francesca Pattaro Department of Communication and Economics, University of Modena and Reggio Emilia, Reggio Emilia, Italy
Clément Pin Laboratory for interdisciplinary evaluation of public policies, Sciences Po, Paris, France
Pina Puntillo Department of Business Administration and Law, University of Calabria, Arcavacata di Rende, CS, Italy
Francesca Ricciardi Department of Management, University of Turin, Torino, Italy
Franco Rubino Department of Business Administration and Law, University of Calabria, Arcavacata di Rende, CS, Italy
Pasquale Ruggiero Department of Business and Law, University of Siena, Siena, Italy; School of Business and Law, University of Brighton, Brighton, UK
Giuseppe Sannino Economics Department, University of Campania “Luigi Vanvitelli”, Capua, Italy
Paolo Tartaglia Polcini Department of Management & Innovation Systems, University of Salerno, Fisciano, Italy
Matteo Turri Dipartimento di Economia, Management e Metodi Quantitativi (DEMM), Università degli Studi di Milano, Milan, Italy
Stefania Veltri Department of Business Administration and Law, University of Calabria, Arcavacata di Rende, CS, Italy
Dieter Wagner Potsdam Centrum für Politik und Management, University of Potsdam, Potsdam, Germany

Performance Management and Governance in Public Universities: Challenges and Opportunities

Carmine Bianchi and Eugenio Caperchione

1 The Relevance of Higher Education for Social Sustainable Development

This book is about performance management and governance in higher education institutions (HEIs), with a specific focus on public Universities. The relevance of higher education for social sustainable development has been remarked since old times in history. In the IV century BC, Aristotle claimed that education must be a matter of public concern.

. . . it is manifest that education should be one and the same for all, and that it should be public, and not private—not as at present, when every one looks after his own children separately, and gives them separate instruction of the sort which he thinks best; the training in things which are of common interest should be the same for all. Neither must we suppose that any one of the citizens belongs to himself, for they all belong to the state, and are each of them a part of the state, and the care of each part is inseparable from the care of the whole (Jowett, 1885, p. 244).

This view prioritizes the need of a public education aimed at training for citizenship. The holistic perspective of education adopted by Aristotle combines learning moral and intellectual virtues, which depend on each other. In fact:

C. Bianchi (*)
Department of Political Sciences and International Relations, University of Palermo, Palermo, Italy
e-mail: [email protected]

E. Caperchione
Department of Economics Marco Biagi, Modena and Reggio Emilia University, Modena, Italy
e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2022
E. Caperchione, C. Bianchi (eds.), Governance and Performance Management in Public Universities, SIDREA Series in Accounting and Business Administration, https://doi.org/10.1007/978-3-030-85698-4_1

“habit” cannot be thoughtless, unguided repetition . . . This requires supervision to ensure that the learner does the right thing, and coaching that leads her through progressive mastery of various nuances of what she is doing, calling her attention to aspects of it she will not have perceived nor had any language to describe. [On the other hand,] the moral virtues are both a necessary step towards, and only completed by, the acquisition of the intellectual virtue of practical wisdom or good judgment . . . No moral virtue is a true virtue unless it is guided by good judgment, and no one can develop good judgment without first possessing natural or habituated forms of the moral virtues (Curren, 2010, p. 547).

The described view of education has a primary function: fitting the learner “for intelligent participation in civic activities. Only in this manner can the institutions of the state be safeguarded and the plurality which is the state be made into a unity” (Robb, 1943, p. 207). The same principles have been inherited today by different countries in the world. Not only in Scandinavian and mid-southern European countries (traditionally inspired to social welfare values), or in socialist states (traditionally oriented to public intervention in different societal fields), but even in the so-called capitalistic systems the argument that education should take a crucial role in public policy is undisputed. The former chancellor of the City Colleges of Chicago (USA), Dr. Sal Rotella, in his public statement about the “Pell Grant Program” addressed to the Committee on Education and Labor of the Congress in June 1985, commented the following: Community colleges are unique. Each one of them tends to be shaped by the locale and the clientele that it is trying to serve. . . Because American society has progressed and the issues we face have become increasingly complex, higher education has become as important to our individual and collective lives today as elementary education was in Jefferson’s days. The community and junior colleges have a unique role in American higher education, the role of providing access to higher education for many who would be shut out of this process. The new . . . college student we have created together differs in fundamental ways from the traditional college student. He tends to be older; she tends to work and attend college part time; they are commuters. He is often from a minority group or is a new immigrant; she is often the first member of her family to attend college; they are often required to interrupt their education at some point because of their obligations. . . They are likely to need financial assistance to pay even the minimal costs of attending a community college. For this new type of student, the Pell grants provide a basic foundation of assistance, especially because they are the neediest students in the country. . . With the help of the Federal student financial aid, students, regardless of their age, sex, race, or income, should be able to live at home and commute to their community college. . . They need a strong Pell program sensitive to their needs if their educational needs and the Nation’s needs for skills competitive in the global marketplace are to be realized. . . We are talking not only about education; we are talking about the economic development of our future. . . The issue is both simple and fundamental. We are either prepared to provide all of our people with the knowledge and skills they need to be contributing members of our society or we are not. We are either prepared to provide all of our people with the change to share the quality of life that higher education has helped to give you and me or we are not. We will either invest in America’s future now or we pay the bills for not doing so later on in alienation, discontent, unemployment, and diminished capacity (Reauthorization of the Higher education Act, 1985, pp. 32–33).

The previous passages provide interesting insights on the reasons behind this book and, in particular, on why this book addresses the topic under a performance
management and governance perspective. Designing and delivering high-quality public higher education is a fundamental requirement for pursuing good societal outcomes. Since each society has its own features and challenges, public higher education needs to fit with the context where the institutions delivering it are located. So, a tailored and collaborative approach is needed. Also, a consistent and purposeful approach is required, since public universities are expected to generate a value that may transcend the individual user dimension. Furthermore, since such organizations are economic institutions, they need proper methods and skills to carry out their administration not only according to effectiveness, but also to efficiency principles. The goal of this introductory chapter is to outline a set of problematic issues, challenges, and opportunities behind this topic, so as to provide a key to the subsequent chapters. How can we frame sustainable performance, if referred to an organization (such as a public HEI) located in a given context? What binds governance with performance management in the field of HEIs? What challenges are associated with different governance perspectives in HEIs? Why is linking performance management to governance particularly crucial in the investigated field?

2 Managing Sustainable Organizational Performance: The Role of Performance Management Systems

Successful organizational performance underlies a “healthy” administrative condition displaying an economic institution’s steady aptitude to establish a mutually synergic dynamic equilibrium with its environment. Such condition implies that an organization is able to attain balanced and sustainable results, under multiple fields (e.g.: financial, competitive, and social), that might foster its development, endurance, and lifelong existence, for achieving its purpose or raison d’être, i.e.: satisfying human needs (Zappa, 1957, p. 37; Coda, 2010, pp. 91–113). Assessing performance sustainability implies a more pervasive effort than assessing organizational results in terms of their impact on only the ecosystem’s natural resources (Bianchi, 2016, 2021), which is a traditional domain of sustainability studies. Sustainable organizational performance is, rather, a condition that—if holistically framed—plays a central role for pursuing organizational resilience and longevity. A holistic view in pursuing sustainable organizational results requires that performance management systems (PMS) are able to enhance decision makers’ capability to frame trade-offs in both time and space (Bianchi, 2020). Trade-offs in time concern the effects of policies in the short vs. long term; trade-offs in space relate to the effects of policies on a subsystem vs. another subsystem. Enhancing policy design and implementation by framing results under both kinds of trade-offs requires that PMS are able to foster a systemic view to deal with the dynamic complexity of today’s times. This implies relevant, prompt, and selective
information support, to gauge and manage performance under multiple viewpoints. In this regard, managing sustainable performance implies balancing different policy fields and dimensions, in the short and long run. For instance, financial equilibrium should be compatible with public service value generation, on both a competitive and social system dimension (Osborne, 2021). This also requires that sustainable performance is able to enhance coordination and collaboration, both inside and outside of organizational boundaries. The above features provide an essential framework for designing robust and intelligent performance management systems whose key-performance-indicators may enable decision makers to deal with dynamic complexity (Bianchi & Salazar-Rua, 2020). Pursuing sustainable performance in public HEIs requires using PMS as a lever to foster decision makers’ learning processes in dealing with the specific complexity factors that challenge the success and survival of such organizations in the context where they are located. Though the performance management application domains and levels in such field are characterized by a variety of dimensions, a crucial issue is the need of outcome-based approaches in assessing the impact of adopted strategies in HEIs. This perspective is particularly significant in public Universities, due to the role that such organizations are expected to play in the society, i.e.: developing human capital, and impacting on socio-economic context development. This important outcome requires that performance management efforts may strive towards performance governance (Bouckaert & Halligan, 2007) and stakeholder collaboration.

3 Main Challenges for Designing “Robust” and “Intelligent” Performance Management Systems in Public Universities

What are the main challenges that HEIs should face today in adopting PMS to foster sustainable performance? A first important challenge can be related to culture: is performance management mainly adopted since it is conceived by decision makers as a fundamental support for decision making, coordination, learning, and collaboration, or simply because it is made compulsory by law? In this regard, though in many countries legislative and administrative prescriptions have mandated the use of formal PMS and of benchmarking techniques in public universities, only a major cultural shift from a passive (i.e.: compliance-oriented) to an entrepreneurial style may enable HEIs to generate the full potential that performance management can provide. To this end, a second challenge is the need of skilled performance analysts to both design “robust” PMS, and to implement them by properly supporting information users (Bianchi & Rivenbark, 2012). Related to this priority, a third challenge is fostering a shift of mind from data gathering to data use, to enhance the capability of reported measures to provide
decision makers with relevant, selective, and timely information that would foster coordination, learning, and collaboration (Bianchi & Rivenbark, 2014; Bianchi & Xavier 2017; de Lancer & Holzer 2001). The last two challenges emphasize the cruciality of a fourth challenge, i.e.: using a holistic approach in designing PMS, rather than narrowing such practice to the adoption of a set of techniques, such as: budgets and standards, variances, and benchmarks. Each of these tools can, in principle, contribute to better performance management only if framed consistently with the other organizational, institutional, socio-political, and environmental factors providing the setting where they will be used (Amigoni, 1978; Borgonovi, 1996; Brunetti, 1985). Therefore, a focus on PMS design (Otley, 2012) is needed. This underlies a tailored made effort that analysts should foster from inside an organization to fit PMS attributes with the relevant context. This also implies a fifth challenge, i.e.: adopting a balanced and holistic view in managing the performance management cycle. In this regard, an excess of focus on specific and isolated steps in carrying out performance management (e.g.: evaluation, planning, reporting) may jeopardize the overall functioning of the system. In particular, the need of stringent individual performance evaluation metrics should not be conceived as an end per se, but instead as an effect of group and organizational performance evaluation. This would be in turn conceived as a component of the wider performance management cycle. Related to this challenge, focusing on only evaluating administrative performance without connecting it to political performance would imply identifying performance management with just measuring those results pertaining to management control. This narrow view would undermine the capability of PMS to frame how policy design and implementation may affect (or may have affected) outcomes. Provided the dynamic and complex settings characterizing the context where HEIs operate today, this myopic and mechanistic view would turn this practice into an illusion of control (Dermer & Lucas, 1986; Otley, 2012, p. 258). In such scenario, the production of only formal reporting for performance measurement may give rise to defensive routines by evaluated people, who might game the system. The previous challenges prelude to a sixth important challenge in designing and implementing robust and intelligent PMS in HEIs, i.e.: fostering outcome-based performance management. Indeed, measuring and managing outcomes entails facing the “attribution problem,” implying an objective difficulty to directly attribute certain outcomes to specific policies (Pollitt & Bouckaert, 2004; Bovaird, 2012). Most practices in measuring and managing outcomes in public sector organizations—and more specifically in HEIs—are still rooted on estimating the impact of service delivery on individual stakeholders or users, such as: students, research commissioners and funders, ministries, etc. (Aversano et al., 2018; Bonollo & Zuccardi-Merli, 2018; Martin-Sadersai & Guthrie 2021). Such impact is mostly assessed based on external benchmarks or standards, which provide surrogate and partial measures of the overall benefits generated by the delivered services on their direct audiences. For this reason, they may lead to controversial insights if the impact of such services was related to a more holistic set of experiences gained in the long

run by the individual users from exploiting their potential value in the society (e.g.: in the labor market). Osborne (2021, pp. 74–77) describes this concept by distinguishing four main dimensions of public value: (1) value-in-exchange, (2) value-in-production, (3) value-in-use, and (4) value-in-context. While the first two dimensions can be referred to a one-to-one relation between the public service provider and the user, the other two provide an extension of the analyzed boundaries for estimating such value. In fact, value-in-use refers to the “value-added derived from the experience of using a public service, either in terms of its short or medium term effect upon well-being or of its impact upon the holistic lived experience of a service user.” On the other hand, value-in-context refers to the “value-added derived from how a service impacts upon the needs of a service user, in the context of their lived experience.” Therefore, though measuring service outcomes related to direct users based on individual performance standards is a signal of generated value, this may not be an expression of the ultimate overall exploited value in the society. In this sense, the direct outcomes generated by service provision on individual users could be at best defined as intermediate, rather than final, outcomes. The difficulties in measuring and assessing research outcomes are possibly even more significant than those associated with teaching or with the so-called third mission. A vivid illustration of the uncertainties behind research outcomes assessment is provided by Otley (2012, pp. 257–258), who observes the following: “Looking to the future, expert judgement will likely be substantially replaced by more objective metrics such as citations or by impact factors of journals in which work is published. At this point a considerable amount of noise enters the system. To what extent can the ‘quality’ of an article be assessed by the quality of the journal in which it is published? Those who have been involved with the publication process will also recognize that there is a considerable amount of luck as to whether one gets published in a particular journal, depending upon the preferences of the referees selected to review your work, and also those of the editor. So a 3* article may easily be categorized as 2*, or occasionally as 4*.” Major issues in causation analysis when outcome-based performance management is adopted by single organizations are due to an atomistic view of the system. In fact, individual institutions strive to match their policies to their supposed impact— after a time lag—on society, without involving in this effort other stakeholders (e.g., different governance levels in public administration, different departments in a same organization, businesses, non-profit organizations, households) who also play a role in affecting such outcomes. Involving other stakeholders in outcome-based performance management implies using a participatory approach in designing, implementing, and assessing policies, so to extend the investigated system boundaries for performance management from an institutional to an interinstitutional dimension. To this end, PMS may play an important function in enhancing collaborative strategies inside stakeholder networks in local areas, where a leading role is often played by a public sector institution (Ansell & Gash, 2007; Crosby & Bryson, 2010). 
This is the domain of performance governance, which implies using performance management as a lever to implement governance in a collaborative network

setting. Therefore, striving towards managing policy outcomes, in terms of major impact on the society, requires adopting an interinstitutional view of policy design and implementation, which in turn entails a major shift in the way public administration is conceived. This challenge is particularly significant for public education, due to the dynamic complexity and wickedness of such policy domain. In the next section a more detailed analysis of this specific topic will be carried out, with the goal to illustrate further challenges and opportunities for the design of PMS in HEIs.

4 Towards Innovative Performance Management Regimes in Higher Education to Foster Collaborative Networking and Sustainable Community Outcomes

The dynamic complexity shaping organizations and society today implies that the “lenses” needed for framing organizational performance significantly differ from those which were successfully used until two decades ago. This requires the adoption of new performance management regimes, particularly in public sector organizations. In fact, the aptitude of such organizations to pursue balanced and sustainable results cannot be bounded to only an “input” or “process” control, that was consistent with an “Old Public Administration” mode. Also, a focus on only measuring performance efficiency and effectiveness, based on output control, is consistent with a “New Public Management” mode, whose challenges—across the ’80s and ’90s—were mostly rooted on an institutional dimension. A local area dimension for performance management and governance entails a number of social “wicked” problems (e.g.: unemployment, poverty, crime, homelessness, urban blight, and low cohesion) providing today a major source of fragmentation in public policy design and implementation, that often leads to poor community outcomes. In this setting, PMS in the public sector are expected to boost governance by fostering an outcome-based and learning-oriented approach, aimed at combining organizational efficiency and effectiveness with an ability to enhance stakeholder collaboration, and to affect community outcomes (Bianchi, 2020, 2021; Borgonovi et al., 2019). Related to the previously commented challenges, the new challenges that such a mutated scenario implies today for PMS designers, particularly in the public sector, are manifold and portray blurred contours. Specific challenges are: (1) dealing with inertial change to frame complexity and uncertainty, (2) exploring system boundaries associated with performance outcomes, (3) framing the impact of systems structures and processes (including PMS) on human behavior, and the (often unintended) effects on performance outcomes, (4) understanding how intangibles and delays affect performance, and (5) enhancing performance dialogue and a learning-oriented approach in performance management and governance. Though such new challenges can be referred to relatively recent phenomena, the need of innovating performance management regimes to face problems and
inconsistencies that may originate unintended outcomes was emphasized by scholars starting from the late ’70s. Hofstede (1978, 1981) remarked that a condition under which a system is under control on paper (the so-called pseudo control) often occurs when behavioral factors may diverge action with respect to the standards set by cybernetic control mechanisms. Likewise, Ouchi (1979) recommended the need of also taking into account organizational control mechanisms, as part of the design process of a PMS that could go beyond the use of bureaucratic and market mechanisms. Such extended perspective in PMS design may contribute to overcome the risk of an illusion of control (Dermer & Lucas, 1986; Hall, 2017; Otley, 2012) and of inconsistent policy implementation (Argyris, 1990). These studies provided the field for “behavioral public administration” (Bhanot & Linos, 2020). Particularly in the last decade, by drawing from behavioral science (Simon, 1947) and social psychology, scholars have been analyzing the effects of bounded rationality and cognitive bias on a wide variety of public policy and management issues. Among them: citizen assessment of government policies (Battaglio et al., 2019), political decision making, public administrators’ behavior, relationships between elected officials and public administrators, government transparency, accountability, citizens trust and governance legitimacy (Grimmelikhuijsen et al., 2017), active citizenship, and public agencies’ responses to performance feedback (Hong, 2018). Also, behavioral issues associated with the use of performance data (at interinstitutional, organizational, group, and individual level) have become more significant than in the past, particularly in those sectors characterized by a substantial role of “street-level” bureaucracy in policy implementation (Brodkin, 2008, 2011, 2012; Eterno & Silverman, 2010; Henman & Gable, 2015; Honig, 2006; Hursh, 2005; Lipsky, 1980; Wiggins & Tymms, 2002). Another innovative research area featuring interesting connections with behavioral public administration is collaborative performance management (Choi & Moynihan, 2019). This is an emerging field of research and practice, which combines different viewpoints (embracing performance management, collaborative governance, and systems theory) aimed at fostering a learning-oriented perspective in performance data use. Even concerning this last strand of performance management studies, the maturation process of research dates back to older times than the last decade. In fact, more than 25 years ago, Otley (1994) remarked how management control systems should enhance learning processes that could lead to an organizational evolution by design, with a focus no longer confined within only the institutional boundaries. In this regard, he recommended a proactive feedforward performance management logic (Otley, 1999, p. 369), implying that the emerging problems or opportunities from policy implementation at departmental level may suggest possible changes in the designed policies, at both an institutional and community level. This is the core of a strategic dialogue that would enhance decision makers’ aptitude to promptly and selectively perceive weak signals of change and to properly respond to them, for enhancing resilience and long-term sustainability.

As previously remarked, the rising complexity associated with New Public Governance (Osborne, 2010, p. 9) requires extending the boundaries of performance evaluation from an organizational to an interorganizational dimension, and to involve different stakeholders in the planning process (Rajala et al., 2018). Performance dialogue is expected to enhance the aptitude of key-actors to frame and manage the complexity of contemporary governance embeddedness (Moynihan et al., 2011; Rajala & Laihonen, 2018). In doing that, it may enable a paradigm shift from performance measurement to performance management and governance (Bianchi, 2021; Moynihan, 2005). This endeavor is particularly important since failure in using performance data is a primary factor of poor organizational learning (Moynihan & Landuyt, 2009, p. 1097) and of unsustainable performance outcomes (Bianchi, 2016; Bianchi & Rivenbark, 2014; Moynihan, 2005). To cope with such risk, learning forums may provide a venue for implementing performance dialogue (Laihonen & Mäntylä, 2017, p. 215), which enhances a use of performance information based on social interaction within and across organizations. This research area can be considered as an evolution of outcome-based performance management (Bianchi, 2021; Wichowsky & Moynihan, 2008), and an innovative component of performance governance (Bouckaert & Halligan, 2007) which may significantly matter for HEIs. The use of learning forums in an interorganizational setting (Rajala et al., 2019) has also been defined as an example of “hybrid” performance regimes (Douglas & Ansell, 2020). In such settings, boundary-crossing performance dialogues involve representatives of both public and private sector organizations, aimed at improving local area performance. The described patterns of change in performance management regimes point out the crucial role of HEIs in the society for developing shared strategic resources (e.g.: human capital, culture, active citizenship, image), whose building and deployment processes require stakeholder collaboration, trust, and consensus building. They also point out the perils of mechanistic approaches in designing PMS that would ignore the behavioral implications associated with the specific complexity of HEIs. If observed under this profile, framing performance management through a governance perspective in higher education may provide fundamental insights to assess the functional and dysfunctional profiles of the relations between different layers of governance and administration. For example: State (e.g.: Ministry of Research) vs. Central Academic Administration (e.g.: Academic Senate, and Board; Central offices) vs. Departmental Administration.

5 Structure of this Book

Based on the illustrated scientific background and thematic issues, this book aims at fostering a reflection upon conceptual and pragmatic implications related to governance and performance management in public universities. In this realm, a special attention is devoted to the third mission, currently an important area of investment for public universities.

Though a specific focus of the book is on the Italian system, where the oldest University (Bologna) in the Western world is located, an extended international research perspective is provided throughout the nine subsequent chapters, written by authors based in five European countries. The papers can be divided in three groups, each focusing on a different aspect. A first batch of papers frames performance management and evaluation, links them to accountability, and examines the current situation and trends in some European countries. In Chap. 2, “Transparency and Accountability in Higher Education as a Response to External Stakeholders and Rules: A Comparison Between Three Country-Case Studies”, Anna Francesca Pattaro, Patrícia Moura e Sá, and Johan A.M. de Kruijf analyze the relationships between transparency, accountability, and performance in higher education institutions, emerging from the formal disclosure documents addressed to external stakeholders in the Netherlands, Portugal, and Italy. In Chap. 3, “An Analysis of Methodologies, Incentives and Effects of Performance Evaluation in Higher Education: The English Experience”, Giovanni Barbato and Matteo Turri focus on the “Research Excellence Framework” and the “Teaching Excellence and Student Outcomes Framework” as qualifying examples of university performance evaluation systems adopted in England. France is at the core of Chap. 4, “A Longitudinal Analysis of the Relationship Between Central Government and Universities in France: The Role of Performance Measurement Mechanisms,” in which Giovanni Barbato, Clément Pin, and Matteo Turri investigate the relationships between central government bodies and French universities, with a specific focus on the role played by performance measurement mechanisms. A second set of papers introduce the theme of the so-called third mission, a strategic area of investment and development for universities, and offer the reader an insight into the Italian situation. Federico Cosenz, in Chap. 5, “Adopting a Dynamic Performance Governance Approach to Frame Inter-Organizational Value Generation Processes into a University Third Mission Setting,” adopts a dynamic performance governance approach to frame academic network value generation processes underlying third mission activities. In Chap. 6, “The Third Mission Strategies Disclosure Through the Integrated Plan”, Natalia Aversano, Giuseppe Nicolò, Giuseppe Sannino, and Paolo Tartaglia Polcini illustrate how different Italian public universities incorporate third mission strategies within their integrated plans. In Chap. 7, “Transferring Knowledge to Improve University Competitiveness: The Performance of Technology Transfer Offices”, Pina Puntillo, Franco Rubino, and Stefania Veltri analyze the role played by technology transfer offices in affecting university and regional competitiveness, as outcomes of university third mission strategies. To this end, the case of the University of Calabria is discussed. The final batch of papers puts the third mission into an international perspective, and provides the reader with fresh comparative material. In Chap. 8, “Third Mission and Intellectual Capital External Dimension: The Implications in the European University Planning Process”, Elisa Bonollo, Simone

Lazzini, and Zeila Occhipinti present a survey on the planning documents of 25 higher education institutions based in Italy, the UK, and France. The authors offer such analysis from an intellectual capital perspective on “third mission” performance. In Chap. 9, “Institutional Logics to Unveil Entrepreneurial Universities’ Performances: A Cross-Country Comparative Study”, Canio Forliano, Paola De Bernardi, Alberto Bertello, and Francesca Ricciardi adopt an “institutional logics” perspective to investigate the drivers of teaching, research, and third mission performance related to the cases of the University of Birmingham (UK), the Hong Kong University, and Milan Polytechnic (Italy). In Chap. 10, “Third Mission in Universities from a Performance Management Perspective: A Comparison Between Germany and Italy”, Pasquale Ruggiero, Patrizio Monfardini, Dieter Wagner, and Dominik Bartsch adopt a comparative case-study approach involving universities located in Italy and Germany to illustrate how third mission performance is affected by governance and control factors in different contexts. We are confident the chapters will be very interesting for the readers, both academics and professionals, as they thoroughly deal with very relevant issues. However, knowledge is never complete. Therefore, further research and practical experimentation is needed—and welcome—to learn more about good performance and governance in public universities. In particular, the link between governance and performance, the adequacy of governance schemes, and the factors affecting the actual use of performance information will benefit from further scientific debate and fieldwork. Consequently, a set of open issues remain in the working agenda. Among them: What are the real features of new governance schemes and structures in public universities? What is their impact? How is power actually distributed? Is it functional to sustainable performance? Are HEIs able to tackle the challenges they face? Are they more accountable than in the past? Are they building a brighter future? Are they more respected, trusted? How can PMS enhance trust and consensus building? Are PMS potentially useful for University Boards? To what extent are Boards actually using performance data? What are decisions based on? Do governance rules shape performance measures, or can other drivers be identified? What is the proper balance of power between Government and Universities? Are the latter really independent from central Government? Can each of them decide upon future? We believe that these open issues need more attention by both researchers and practitioners to better understand how to support the capability of public universities to generate sustainable community value.

References

Amigoni, F. (1978). Planning management control systems. Journal of Business Finance & Accounting, 3, 279–291.

Ansell, C., & Gash, A. (2007). Collaborative governance in theory and practice. Journal of Public Administration Research and Theory, 18(4), 543–571. Argyris, C. (1990). The dilemma of implementing controls. The case of managerial accounting, accounting. Organizations and Society, 15(6), 503–511. Aversano, N., Manes Rossi, F., & Tartaglia Polcini, P. (2018). Performance measurement systems in universities: A critical review of the Italian system. In E. Borgonovi, E. Anessi-Pessina, & C. Bianchi (Eds.), Outcome-based performance management in the public sector (pp. 269–287). Springer. Battaglio, P., Belardinelli, P., Bellé, N., & Cantarelli, P. (2019). Behavioral public administration ad fontes: A synthesis of research on bounded rationality, cognitive biases, and nudging in public organizations. Public Administration Review, 79(3), 304–320. Bhanot, S. P., & Linos, E. (2020). Behavioral public administration: Past, present, and future. Public Administration Review, 80(1), 168–171. Bianchi, C. (2016). Dynamic performance management. Springer. Bianchi, C. (2020). Fostering sustainable community outcomes through policy networks: A dynamic performance governance approach. In J. Meek (Ed.), Handbook of collaborative public management (pp. 333–356). Elgar. Bianchi, C. (2021). Enhancing policy design and sustainable community outcomes through collaborative platforms based on a dynamic performance management and governance approach. In G. Fontaine & B. G. Peters (Eds.), Handbook of policy design. Edward Elgar. Bianchi, C., & Rivenbark, W. (2012). A comparative analysis of performance management systems: The cases of Sicily and North Carolina. Public Performance & Management Review, 35(3), 509–526. Bianchi, C., & Rivenbark, W. (2014). Performance management in local government: The application of system dynamics to promote data use. International Journal of Public Administration, 37(13), 945–954. Bianchi, C., & Salazar-Rua, R. (2020). A feedback view of behavioral distortions from perceived public service gaps at ‘street-level’ policy implementation: The case of unintended outcomes in public schools. Systems Research & Behavioral Science. https://doi.org/10.1002/sres.2771 Bianchi, C., & Xavier, J. (2017). The design and execution of performance management systems at state level: A comparative analysis of Italy and Malaysia. International Journal of Public Administration, 40(9), 744–755. Bonollo, E., & Zuccardi-Merli, M. (2018). Performance reporting in Italian public universities: Activities in support of research, teaching and the ‘third Mission’. In E. Borgonovi, E. AnessiPessina, & C. Bianchi (Eds.), Outcome-based performance management in the public sector (pp. 307–329). Springer. Borgonovi, E. (1996). Principi e sistemi aziendali per le Amministrazioni Pubbliche (Management principles and systems for public administrations). Egea. Borgonovi, E., Bianchi, C., & Rivenbark, W. (2019). Pursuing community resilience through outcome-based public policies: Challenges and opportunities for the design of performance management systems. Public Organization Review, 19(4), 153–158. Bouckaert, G., & Halligan, J. (2007). Managing performance: International comparisons. Routledge. Bovaird, T. (2012). Attributing outcomes to social policy interventions – ‘Gold standard’ or ‘fool’s gold’ in public policy and management? Social Policy & Administration, 48(1), 1–23. Brodkin, E. (2008). Accountability in street-level organizations. International Journal of Public Administration, 31(3), 317–336. 
https://doi.org/10.1080/01900690701590587 Brodkin, E. (2011). Policy work: Street-level organizations under new managerialism. Journal of Public Administration Research and Theory, 21, i253–i277. Brodkin, E. (2012). Reflections on street-level bureaucracy: Past, present, and future. Public Administration Review, 72(6), 940–949. Brunetti, G. (1985). Il controllo di gestione in condizioni ambientali perturbate (Management control under unstable environmental conditions). Franco Angeli.

Performance Management and Governance in Public Universities: Challenges. . .

13

Choi, I., & Moynihan, D. (2019). How to foster collaborative performance management? Key factors in the US federal agencies. Public Management Review, 2(1), 1–22. Coda, V. (2010). Entrepreneurial values and strategic management. Palgrave Macmillan. Crosby, B., & Bryson, J. (2010). Integrative leadership and the creation and maintenance of crosssector collaborations. The Leadership Quarterly, 21(2), 211–230. Curren, R. (2010). Aristotle’s educational politics and the Aristotelian renaissance in philosophy of education. Oxford Review of Education, 36(5), 543–559. de Lancer, J. P., & Holzer, M. (2001). Promoting the utilization of performance measures in public organizations: An empirical study of factors affecting adoption and implementation. Public Administration Review, 61, 693–708. Dermer, J., & Lucas, R. (1986). The illusion of managerial control. Accounting Organizations and Society, 11(6), 471–482. Douglas, S., & Ansell, C. (2020). Getting a grip on the performance of collaborations: Examining collaborative performance regimes and collaborative performance summits. Public Administration Review. https://doi.org/10.1111/puar.13341 Eterno, J. A., & Silverman, E. B. (2010). The NYPD’s CompStat: Compare statistics or compose statistics? International Journal of Police Science & Management, 12(3), 426–449. Grimmelikhuijsen, S., Jilke, S., Olsen, A. L., & Tummers, L. (2017). Behavioral public administration: Combining insights from public administration and psychology. Public Administration Review, 77(1), 45–56. Hall, J. L. (2017). Performance management: Confronting the challenges for local government. Public Administration Quarterly, 41(1), 43–66. Henman, P., & Gable, A. (2015). “Schooling” performance measurement: The politics of governing teacher conduct in Australia. Policy and Society, 34(1), 63–74. Hofstede, G. (1978). The poverty of management control philosophy. The Academy of Management Review, 3(3), 450–461. Hofstede, G. (1981). Management control of public and not for profit activities. Accounting, Organizations and Society., 6(3), 193–211. Hong, S. (2018). A behavioral model of public organizations: Bounded rationality, performance feedback, and negativity bias. Journal of Public Administration Research and Theory, 29(1), 1–17. Honig, M. (2006). Street-level bureaucracy revisited: Frontline district central-office administrators as boundary spanners in education policy implementation. Educational Evaluation and Policy Analysis, 28(4), 357–383. Hursh, D. (2005). The growth of high-stakes testing in the USA: Accountability, markets and the decline in educational equality. British Educational Research Journal, 31(5), 605–622. Jowett, B. (1885). The politics of Aristotle, VIII, I, 2–11. Retrieved from https://www.stmarys-ca. edu/sites/default/files/attachments/files/Politics_1.pdf Laihonen, H., & Mäntylä, S. (2017). Principles of performance dialogue in public administration. International Journal of Public Sector Management, 30(5), 414–428. Lipsky, M. (1980). Street level bureaucracy: Dilemmas of the individual in public services. Russell Sage Foundation. Martin-Sadersai, A., & Guthrie, J. (2021). Outcomes-based metrics and research measurement in Australian higher education. In Z. Hoque (Ed.), Public sector reform and performance management in developed economies, outcomes-based approaches in practice (pp. 34–48). Routledge. Moynihan, D. (2005). Goal-based learning and the future of performance management. Public Administration Review, 65(2), 203–216. 
Moynihan, D., Fernandez, S., Kim, S., LeRoux, K., Piotrowski, S., Wright, B., & Yang, K. (2011). Performance regimes amidst governance complexity. Journal of Public Administration Research and Theory, 21(1), 141–155. Moynihan, D., & Landuyt, N. (2009). How do public organizations learn? Bridging Structural and Cultural Perspectives, 69(6), 1097–1105.

14

C. Bianchi and E. Caperchione

Osborne, S. P. (2010). The (new) public governance: A suitable case for treatment? In S. P. Osborne (Ed.), The new public governance? Emerging perspectives on the theory and practice of public governance (pp. 1–16). Routledge. Osborne, S. (2021). Public service logic: Creating value for public service users, citizens, and society through public service delivery. Routledge. Otley, D. (1994). Management control in contemporary organizations: Towards a wider framework. Management Accounting Research, 5(3–4), 289–299. Otley, W. (1999). Performance management: A framework for management control systems research. Management Accounting Research, 10(4), 363–382. Otley, D. (2012). Performance management under conditions of uncertainty: Some valedictory reflections. Pacific Accounting Review, 24(3), 247–261. Ouchi, W. (1979). A conceptual framework for the design of organizational control mechanisms. Management Science, 25(9), 833–848. Pollitt, C., & Bouckaert, G. (2004). Public management reform: A comparative analysis. Oxford University Press. Rajala, T., & Laihonen, H. (2018). Managerial choices in orchestrating dialogic performance management. Baltic Journal of Management, 14(1), 141–157. Rajala, T., Laihonen, H., & Haapala, P. (2018). Why is dialogue on performance challenging in the public sector? Measuring Business Excellence, 22(2), 117–129. Rajala, T., Laihonen, H., & Vakkuri, J. (2019). Exploring challenges of boundary-crossing performance dialogues in hybrids. Journal of Management and Governance, 1–22. https://doi.org/10. 1007/s10997-019-09485-x Reauthorization of the Higher Education Act. Pell Grant and Campus Based Programs (1985). Vol. 2, Hearings before the subcommittee on postsecondary education of the Committee on Education and Labor House of Representatives. Washington, DC, June 25, 27, and July 16, Serial No. 99, 32–37. Retrieved from https://files.eric.ed.gov/fulltext/ED271034.pdf Robb, F. C. (1943). Aristotle and education. Peabody Journal of Education, 20(4), 202–213. Simon, H. A. (1947). Administrative behavior: A study of decision-making processes in administrative organization. Macmillan. Wichowsky, A., & Moynihan, D. (2008). Measuring how administration shapes citizenship: A policy feedback perspective on performance management. Public Administration Review, 63(5), 908–920. Wiggins, A., & Tymms, P. (2002). Dysfunctional effects of league tables: A comparison between English and Scottish primary schools. Public Money and Management, 22, 43–48. Zappa, G. (1957). Le produzioni nell’economia delle imprese (Production in the economy of firms). Giuffrè.

Transparency and Accountability in Higher Education as a Response to External Stakeholders and Rules: A Comparison Between Three Country-Case Studies

Anna Francesca Pattaro, Patrícia Moura e Sá, and Johan A. M. de Kruijf

1 Introduction

Higher Education Institutions (HEIs) play a fundamental role for society and particularly for the socio-economic competitiveness of the territorial context in which they are settled. It is therefore relevant to pay attention to, and analyze, their transparency and accountability towards internal and external stakeholders. In this contribution we focus on some resolutions and instruments adopted in three European countries (the Netherlands, Portugal, and Italy) in order to disclose information to external stakeholders about HEIs’ performance in terms of teaching/education, research, and other activities, often labeled “third mission”, aimed, for instance, at disseminating research results to society and validating them. Given the general purpose of the contribution, we focus our attention on resolutions and instruments aimed at external stakeholders (e.g., the national ministry and national agencies, other funding institutions, professionals, students and their families, and the socio-economic context), both rule-driven and voluntary, where existent. Regarding our choice to focus on accountability towards external stakeholders, Freeman (1984) introduced stakeholder theory, identifying several groups of people having an interest in the accountability of an organization.
Later, Bovens (2007) developed a scheme in which he identified the forum as the audience for a particular accountability process. Each forum can have its own information needs. More recently, van Helden and Reichard (2019) identified fora based on the level of knowledge of their members. They claim that professionals can understand highly detailed technical elaborations in financial reports, whereas laymen may get lost in detailed information due to its lack of conciseness and lucidity. This issue is likely to emerge in the accountability reports of Higher Education Institutions as well. Therefore, we chose to take a position biased towards laymen accountability, which means that we looked at the kind of information that may be relevant for students and their parents, as well as for politicians. The latter group is included based on the notion in van Helden and Reichard (2019) that most politicians qualify as laymen when it comes to reporting and disclosure issues. During the last decades, both literature and practice have devoted much attention to the pursuit of transparency and accountability in public sector activities. These values are considered fundamental for sustainable and effective internal management, for a sound relationship with citizens and firms—also in terms of democratic representation—and as a possible safeguard against corruption. The literature has investigated, among other things, the nature of accountability and of the related support systems (e.g., Romzek & Dubnick, 1987); the roles assumed by the actors involved (e.g., Bovens, 2005); the effects of and links between accountability and transparency (e.g., Armstrong, 2005; Hood, 2010), especially with the use of ICTs (e.g., Pina et al., 2007); the relationships between accountability, democracy, and citizen participation (e.g., Blair, 2000; Behn, 1998); the relation between managerial reforms, performance, and accountability (e.g., Peters, 2007; Moynihan & Ingraham, 2003; Barberis, 1998); as well as the paradoxes inherent in the concept itself (e.g., Jos & Tompkins, 2004; Ezzamel et al., 2007). In addition, many national governments have imposed regulatory measures on public sector institutions requiring the disclosure of both financial and non-financial performance through the adoption of several instruments (reports, statements, accounting practices, data gathering, etc.), especially after the diffusion of the New Public Management (NPM) perspective in public management and public service delivery (e.g., Newman, 2011). However, performance management is becoming increasingly important and critical in the public sector since, as Pollitt (2018) claimed, it is nowadays implemented more broadly than in an NPM context only. Currently, National Higher Education Systems are also increasingly influenced by supra-national institutions, such as the OECD and the EU, which not only support the adoption of transparency and accountability schemes, but also endorse a strongly evaluative culture, with an emphasis on measuring, assessing, and benchmarking performance (e.g., Argento & van Helden, 2021). Accordingly, HEIs are at present expected to demonstrate high levels of accountability and transparency about the way resources are applied and the return on their use. As EU guidelines have become more effective after some years of enforcement, differences between EU countries in their implementation appear to be partly smoothing out, even though distinctive institutional, political, and social traditions still have a great impact on their accomplishment.
The purpose of this contribution is to offer a general overview of the present accountability requirements and of the instruments or resolutions, rule-driven but also voluntary, adopted in three European countries, in order to propose a general assessment of the degree of transparency and accountability of publicly funded HEIs. The HEI concept in this study includes traditional universities as well as polytechnics that comply with the Bologna agreements. The focus is on documents, initiatives, and instruments used to inform the external stakeholders of public universities about the performance achieved (e.g., resources committed, results, efficiency, effectiveness, and possible outcome/impact measures) in teaching, research, and other activities, often labeled “third mission,” such as the validation of research in society. International comparison can be a relevant element of interest and distinctiveness, but also an important way to collect ideas and suggestions for future research avenues. In addition, this chapter aims at contributing to the literature on transparency and accountability, as well as on performance information disclosure, in the specific environment of publicly funded universities and other Higher Education Institutions. We also aim to have an impact on practice, offering managers and policymakers an assessment of the present situation through a comparison of some European countries, as well as some suggestions about possible future amendments and improvements. From a methodological point of view, this exploratory qualitative research is not intended to be exhaustive. It has been conducted through an analysis of the literature, of national legislation, and of the information disclosed on different institutional websites in each country. It is meant to offer a general overview and assessment of the degree of transparency and accountability of publicly funded HEIs in three EU countries the authors are familiar with. In accordance with Yin’s plea (Yin, 2018, p. 61), a replication rather than a sampling logic has been adopted in selecting and presenting the country-case studies, which nevertheless share some common elements. The remainder of the chapter is organized as follows. First, we offer some references to the literature about the main challenges HEIs have recently been facing in terms of accountability, transparency, and quality evaluation and accreditation. Afterwards, in Sect. 3 we present three country-case studies offering a general overview of the instruments or resolutions, especially rule-driven ones, adopted in the Netherlands, Portugal, and Italy with regard to the transparency and accountability of publicly funded HEIs towards external stakeholders. Finally, Sect. 4 provides some analytical and critical reflections on the case study findings.

2 Accountability, Transparency, and Quality Evaluation in Higher Education Institutions

As Hazelkorn (2018, p. 428) stresses with reference to HEIs, “higher education has traditionally relied on peer-review and self-reporting and has asked the public to trust this form of accountability”, and it has thus been accused of being “too self-serving and insufficiently interested in student learning or outcomes” (Hazelkorn & Gibson, 2017). Yet, in the last decades of the twentieth century it became obvious that such a system was no longer adequate to face the challenges of massification and the increasing internationalization of Higher Education Systems worldwide. Thus, HEIs needed to be more accountable and responsible to the public for quality. As Eaton (2016) states, “it is about meeting the needs of students, society and government. It is about the effectiveness and performance of colleges and universities, as well as their transparency of their efforts. Accountability is about higher education serving the public interest and about higher education as a public trust”. According to Jongbloed et al. (2018), the increasing demand for transparency is mainly linked to the fact that in most countries the financial contributions made by students and/or taxpayers are rising and that the number of HE providers (and the programs they offer) is increasing, making the regulation of the system more complex. Consequently, several mechanisms—like, for instance, quality evaluation and accreditation, but also performance evaluation in education and research—have emerged that rely on independent or external verification, also at the international level. Van Vught and Ziegele (2011, p. 25), cited in Gunn (2018, p. 505), define transparency tools as “instruments that aim to provide information to stakeholders about the efforts and performance of higher education and research institutions”. According to them, transparency tools are expected to be based on the users’ information needs and to take into account these stakeholders’ capacities to process information. In this context, as Gunn (2018) states, the main stakeholders identified in the literature—those to whom HEIs are accountable—are: Students and their Families, Academics, Institutional Leaders, Funders, Public Authorities and Policy Makers, Industry and Businesses, and Society in general. The variety of transparency tools, aiming to cover the main missions of HEIs, namely teaching, research, knowledge transfer, and community engagement, has grown significantly over the last 60 years (Morris, 2018). These instruments tend to focus on: the ways learning outcomes are set up and achieved, graduate attributes, life-sustaining skills, and research impacts. In particular, as a consequence of the Bologna Process, learning outcomes have assumed a central role in HEIs. They are regarded as part of accountability regimes providing information that can be compared (Lennon, 2018). Establishing learning outcomes clarifies expectations of student knowledge, skills, and abilities that can be used as indicators of educational quality, together with the increased use of student satisfaction surveys. As Jongbloed et al. (2018) claim, over the years transparency instruments regarding teaching and education have moved from a focus on inputs (numbers and qualifications of teaching staff, staff-student ratios) to a focus on outputs (e.g., students’ satisfaction, degree completion rates), outcomes (e.g., graduate employment), and impacts (which are even harder to measure).
According to Hazelkorn (2018, p. 433), for instance, people essentially want to know “how effectively students are learning, what they are achieving, and how personnel, institutions and the systems overall help students to succeed”. In research, on the other hand, one can observe international competition and the diffusion of a mentality increasingly inspired by the “publish or perish” attitude in recruitment, funding, and research evaluation. Recently, concerns about the over-emphasis of most of these instruments on research productivity have become more evident, and these instruments have also influenced the teaching, knowledge transfer, and community engagement missions (e.g., Argento & van Helden, 2021). In effect, the recent reforms of Higher Education Institutions across Europe, partly inspired by NPM principles, have produced not only some positive effects, but also some negative ones. On the positive side, for instance, the diffusion of managerial values in HEIs often brought efficiency improvements which, according to Parker (2011), helped universities recruit increasing numbers of students without a proportionate rise in teaching budgets. At the same time, the general internationalization of education and research activities contributed to improving the overall level of quality and the volume of research outputs, and it also helped make teaching programs more up to date and attractive. On the other hand, the increased use of metrics, sometimes incorrectly associated with quality (e.g., Söderlind & Geschwind, 2019; Kallio et al., 2017), and of rankings could create needless pressure on academics and, at the same time, undermine academic freedom and self-determination, particularly for those with less academic experience (Cleaver, 2021). In addition, it is worth considering the possible side effects of competitive pressures in both teaching and research, such as damaged motivation and a worsening of collegial work relationships (van Helden & Argento, 2020b), or even the dismissal of some less “profitable” or fashionable disciplines (Parker, 2011). Over the last decades, European countries have adopted quite similar strategies aimed at enhancing institutional autonomy and greater managerial steering (Capano & Pritoni, 2020), with the expectation of contributing to increased HEI performance. According to Kuhlmann and Wollmann (2014), five different institutional models can be identified in Europe: the Anglo-Saxon, Nordic, Continental European, Continental Napoleonic, and Eastern European traditions. These institutional and political models can be crossed with, and resonate with, the different (ideal-typical) modes of regulating higher education, and therefore of guaranteeing accountability and transparency, presented by Lodge (2015), which build on Hood and colleagues’ comparative studies of regulatory regimes (Hood et al., 1999, 2004). According to Lodge, the different ways of organizing regulation in higher education can be: “contrived randomness”, based on the anonymity of the reviewing process, the circulation of staff, and changing evaluation/assessment criteria; “oversight”, focused on reporting to ministries/agencies, growing inspection and evaluation systems, curriculum setting, and appointment by ministries; “rivalry”, based on league tables and competition over grant funding and student recruitment; and, finally, “mutuality”, focused on academic peer-review, collegiate decision-making, and an emphasis on decision-making by committees.
Currently, the literature points to the existence of three groups of regularly used transparency instruments in Higher Education Institutions: Quality Assurance and Accreditation schemes—policy tools put in place by public authorities; Rankings and Bibliometric Systems—mainly originated by private initiatives (e.g., rankings produced by media organizations) and reflecting the growing marketization of Higher Education; and Performance Agreements/Contracts.

A. Quality Assurance and Accreditation schemes—policy tools put in place by public authorities

In Jongbloed et al.’s (2018, p. 445) words, “Accreditation is the simplest and, therefore, prima facie most transparent form that quality assurance can take”. It leads to a pass/fail or graded judgment that is typically a condition for recognition of the institution or the program assessed and for their public funding. Since the last part of the twentieth century, student assessment has been regarded as an important part of accreditation processes. In Europe, this is clearly assumed in the ESG standards and guidelines for quality assurance in the European Higher Education Area, prepared by the European Association for Quality Assurance in Higher Education (ENQA) in cooperation with the European Students’ Union (ESU), the European Association of Institutions in Higher Education (EURASHE), and the European University Association (EUA), and afterwards revised by the E4 Group (ENQA, ESU, EUA, EURASHE) in cooperation with Education International (EI), BUSINESSEUROPE, and the European Quality Assurance Register for Higher Education (EQAR) (https://enqa.eu/index.php/home/esg/). This view is stressed by Jongbloed et al. (2018, p. 446) when they state that “the degree to which study programmes succeed in making students learn what the curriculum intends to teach is assumed to present a more transparent, more pertinent, and more locally-differentiated picture of quality.” Nevertheless, the idea of quality assurance as a transparency tool is not consensual. Gunn (2018), for instance, argues that quality assurance “comes from within the higher education community” (p. 505) and is mainly directed towards enhancement, whereas transparency tools “tend to be imposed from outside” (p. 505) and mainly “serve agendas and stakeholders outside the academic community” (p. 506). Based on our experience, however, we tend to consider quality assurance and accreditation schemes as transparency tools, since they often include further requirements in terms of disclosing information and data about teaching and education activities.

B. Rankings and Bibliometric Systems—mainly originated by private initiatives (e.g., rankings produced by media organizations) and reflecting the growing marketization of Higher Education

Rankings (such as the Times Higher Education, the QS, or the Shanghai league tables) offer snapshot pictures of the performance of HEIs in different areas of their activity. Such easy-to-understand tools are attractive for external (laymen) stakeholders (Jongbloed et al., 2018, p. 447).
However, they have been extensively criticized since:

• they provide a single, fixed ranking for all their stakeholders, disregarding that stakeholders might well have different needs;
• they ignore intra-institutional diversity;
• they use available information on a narrow set of dimensions only, overemphasizing research;
• in addition, the various indicators used are weighted by the ranking producers and lumped into a single composite value for each university (a brief illustrative sketch of this weighting issue is provided at the end of this section).

Recently, a new type of ranking has been created: the U-Multirank (https://www.umultirank.org/). As Jongbloed et al. (2018) explain, U-Multirank takes a multi-dimensional view of higher education performance: when comparing HEIs, it gives information about each of their activities (teaching and learning, research, knowledge transfer, international orientation, and regional engagement). In addition, U-Multirank invites its users to compare institutions with similar profiles. Moreover, it allows users to choose from a menu of performance indicators, without combining indicators into a weighted score or a numbered league table position. It also gives information on academic disciplines or groups of programs, which is especially relevant for large institutions with a broad scope of programs.

C. Performance Agreements/Contracts between individual HEIs and their government(s) or funding authorities

Performance contracts are agreements between individual Higher Education Institutions and their government(s) or funding authorities that tie (part of) the institution’s public funding to its ambitions in terms of performance (Jongbloed et al., 2018, p. 449). Thus, this instrument encourages HEIs to think strategically and to select and negotiate their goals according to their strengths, the individual contexts they face, and the level of autonomy attributed to individual universities and HEIs in their national governance system. Furthermore, performance contracts lead HEIs to publish information about their efforts and results in the specified areas. Having considered the extant literature on accountability instruments in HEIs, in the following sections we examine the solutions specifically implemented in three EU countries: the Netherlands, Portugal, and Italy.
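Before turning to the country cases, the following minimal sketch makes the last criticism of rankings above concrete. It is a purely hypothetical illustration: the institutions, indicators, scores, and weights are invented and do not reflect the methodology of any actual ranking; it simply shows how the weights chosen by a ranking producer determine which institution tops a composite league table.

# Hypothetical illustration: two invented universities scored (0-100) on three dimensions.
scores = {
    "University A": {"research": 90, "teaching": 60, "regional_engagement": 50},
    "University B": {"research": 65, "teaching": 85, "regional_engagement": 80},
}

def composite(weights):
    # Weighted sum of indicator scores, as a ranking producer might compute it.
    return {name: round(sum(weights[d] * value for d, value in dims.items()), 1)
            for name, dims in scores.items()}

research_heavy = {"research": 0.6, "teaching": 0.3, "regional_engagement": 0.1}
equal_weights = {"research": 1 / 3, "teaching": 1 / 3, "regional_engagement": 1 / 3}

print(composite(research_heavy))  # {'University A': 77.0, 'University B': 72.5}
print(composite(equal_weights))   # {'University A': 66.7, 'University B': 76.7}

With the research-heavy weighting, University A leads the table; with equal weights, University B does, even though the underlying scores are unchanged. This is precisely why composite rankings are contested as transparency tools.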

3 Country-Case Studies

In this contribution we compare the experiences of accountability and transparency in publicly funded universities and HEIs in EU countries endowed with different, but not too distant, political, institutional, and social characteristics and traditions.

The choice of the three countries reflects the decision to compare EU countries the authors are familiar with; in accordance with Yin’s plea (2018, p. 61), a replication rather than a sampling logic is adopted in selecting and presenting the country-case studies, which nevertheless share some common elements. From an institutional and regulatory point of view, some similarities between the three countries concern the tendency, in the last decades, towards the decentralization of responsibilities and decisional power from central government and ministries to individual HEIs. This could also have an impact on accountability, since it may imply a shift in focus from vertical accountability (towards government, ministries, and national and international agencies) to a focus also on horizontal accountability (towards students, families, and local socio-economic stakeholders) (e.g., Bovens, 2005). Nevertheless, as Carvalho and Diogo (2018) state, the autonomy of HEIs risks being only formal, given the significant control exercised by the national government layer over compliance with regulations on performance results in education, research, and the third mission. In addition, it is worth remembering that the Portuguese and Dutch higher education systems are both binary (with universities and polytechnics), while in the Italian system this distinction is not relevant, as further explained in the case study. Nevertheless, the three countries share many similarities in terms of academic positions and their hierarchical career ladders. As mentioned above, over the last decades European countries have adopted quite similar strategies aimed at enhancing institutional autonomy and greater managerial steering (Capano & Pritoni, 2020), with the expectation of contributing to increased HEI performance. Yet, Italy and Portugal are still rather influenced by the so-called Continental governance model (Clark, 1983; Capano & Pritoni, 2020), characterized by hierarchical coordination through state-centered policies and relatively smaller institutional autonomy compared to that typical of Anglo-Saxon countries, a model which has a stronger influence on the Netherlands (see Kuhlmann & Wollmann, 2014). Therefore, having Italy, the Netherlands, and Portugal as case studies allows us to compare how three countries facing identical EU regulatory models—based on national standards, procedures for monitoring and evaluation, criteria for financial rewards, and diverse internal institutional governance arrangements—might react differently to accountability and transparency demands. For instance, in the Netherlands the law recognizes HEIs as eligible for public funding according to student numbers, graduations, and quality-related issues, whereas in Italy, even though the principle is the same, the application of these rules tends to be softer and the percentage of public funds depending on these outputs is less significant (Capano & Pritoni, 2020). In effect, HEIs in Italy adopt various options to reward academics who obtain specific performance results in education, research, and the so-called third mission. The country-case studies are organized as follows: first, an overview of the HEI system in the country considered is offered. Next, the legislative framework for transparency, accountability, and financial performance disclosure is presented. Then, we describe the main instruments of transparency, accountability, and performance evaluation.
Finally, some relevant voluntary initiatives are highlighted.

In order to assess the level of transparency of HEIs towards external stakeholders, especially laymen—such as present and potential students, their families, and, in general, local socio-economic stakeholders—the description and analysis of the case studies also pay attention to the following issues:

• the existence and content of specific requirements at the national level concerning transparency and accountability (what must be disclosed, towards whom, and how...);
• the availability and ease of access of information about the universities’ performance in each country, also considering where those data can be found (e.g., specific databases of relevant national ministries, the National Statistics Office database, or each HEI’s website...);
• the usability and understandability of the information (e.g., whether data are open or not; whether they are provided singly or in official reports, plans, budgets, or financial statements, for instance in PDF format; and whether they are clearly understandable for the public).

3.1 The Netherlands’ Country-Case

3.1.1 Introduction

The organization of Dutch education has a long and complex history. Key in the debate was who is allowed to establish a school and who is entitled to government funding. The problem was solved in 1917 with a political deal that allowed equal funding for public and private (mainly church-related) schools (Lijphart, 1982, pp. 100–106). This basic rule still holds and is applicable to higher education as well. It means that all institutions that comply with higher education legislation are eligible for public funding. A few genuinely privately funded HEIs exist, but they are small both in number and in student enrollment. Higher education is split into 13 universities and 36 polytechnics (including those for the arts) with some 750,000 students, about 50% more than the number of students in secondary vocational education (Statistics Netherlands, 2020). In 2020, higher education funding amounts to some €8.5 bn out of a total education budget of some €42 bn. Funding is a mixture of fixed contributions and contributions based on student numbers and performance. In general, polytechnics focus on education, whereas universities provide both education and research. This is reflected in funding: over 90% of polytechnics’ funding is education-related, whereas for universities this is some 50% (Parliament, 2019, pp. 64–65). On top of regular funding, a separate government budget of some €1 bn is available for research only. This budget is distributed based on competitive grant tenders organized by a separate, more or less independent, agency (KNAW). The organizational setting of higher education is completed by two other agencies: an accreditation institution (NVAO) and an institution that advises government on research and manages a number of research institutions (NWO). From an accountability perspective, the Inspectorate General for Education, a unit within the Ministry of Education, operates as a meta-evaluation entity on quality issues and assesses the compliance and financial resilience of Higher Education Institutions.
Next to that, an independent spin-off from early quality assurance efforts started by the joint universities (QANU) still operates on the quality assurance market. A more detailed history of quality assurance in the Netherlands is given by Chu and Westerheijden (2018). They claim that the system has generated a new governance structure between government and universities, with more autonomy and degrees of freedom in management for universities. At the same time, Chu and Westerheijden (2018) note that, due to the impact of a negative quality assessment on education, higher education institutions prefer to be on the safe side of accreditation standards, possibly at the expense of innovation.

3.1.2 Accountability for Whom?

As the legislator provides resources for education and research, accountability is firstly oriented towards the legislator and main sponsor, the Ministry of Education. Second, other sponsors providing EU grants, KNAW grants, or private (third mission) funding require accountability as well, depending on the specific arrangements made when the sponsor provides resources. This means that accountability can be both financial and non-financial in the case of EU grants, but is basically non-financial in third mission commissioned research unless otherwise agreed upon. Essentially, this type of accountability can be labeled “vertical” accountability (Bovens, 2005, p. 196), be it for reasons of general accountability towards the political system or in a more commissioner-supplier oriented relation in the case of earmarked grants or third mission activities. Horizontal accountability (Bovens, 2005, p. 198) is found internally at the different levels of the university, faculties, and departments. The legislator requires that staff and student councils be involved in both financial and non-financial accountability. External horizontal accountability aims at (future) students and their parents, (future) third mission commissioning organizations, and other stakeholders such as trade unions, recruiters, the general public, or journalists. A non-institutionalized, but in practice regularly used, form of horizontal accountability is an advisory board to faculties in which employers, alumni, and staff discuss the alignment between academic programs in research and education and demand in society.

3.1.3 Legal Framework

This section addresses the key regulations on accountability and financial reporting for HEIs in the Netherlands. The process of quality assurance is not taken into account, although the results of quality assurance processes can be included in the accountability processes. The following section will discuss non-financial accountability themes, with an emphasis on horizontal accountability.

In terms of vertical accountability, the Ministry of Education, Culture, and Science is responsible for all levels of education, including its primary funding. Furthermore, the Ministry subsidizes science programs, but has outsourced decision-making on science funding to KNAW. From an accountability perspective, there is one uniform regulation on reporting (RJO 2008) that holds for all levels of education. This regulation has been effective since 2008 and replaced a number of previous education-level-specific arrangements. By means of appendices, the level of detail in reporting can be extended, but this mainly (44 out of 47 appendices in 2020) affects primary and secondary education. From a purely financial accounting perspective, the regulation refers to a separate education section in Dutch GAAP regulations (RJ460). The core of RJO 2008 consists of financial reporting elements supplemented with some specific details. The financial data are based on a consolidated (i.e., covering all faculties and research entities) financial statement of the individual Higher Education Institution. This includes a balance sheet, a profit and loss statement, a cash-flow statement, and an external auditor’s statement on the true and fair representation of the financial position as well as on compliance with legislation. Separate disclosure at faculty level is not required by RJO 2008, but legislation requires that financial and performance data be discussed at HEI level and within faculties with both staff and student representatives. There is no regulation requiring a consolidated financial statement of all HEIs. Having said that, the Inspectorate General for Education annually prepares a report on the education system covering both financial and non-financial issues. In its most recent report, comments are made on gaps in systematic reporting on student supervision and student performance (Inspectie van het Onderwijs, 2021, p. 175). Furthermore, financial data of all HEIs are published on a website (DUO, n.d.), disclosing in detail, e.g., revenues from contract research and its funders, contract education, tuition fees, and much more. In the notes to the annual report, the following specific issues should be addressed. First, the remuneration of executive and non-executive board members must be disclosed at the individual level. This is a requirement that holds for any publicly funded institution in the Netherlands. The idea behind this disclosure is that nobody in the public sector is allowed to receive remuneration above the level of remuneration of the Prime Minister. Second, for executive board members, a full specification of expenses must be disclosed (art. 4.3 RJO 2008). Both these disclosures are regarded as highly politically salient issues and are covered by annual reporting and a special website. Three other financial issues must be disclosed as well. First, expenses with respect to the costs of older staff with specific arrangements. Second, grants to students who participate in (student) boards, participate in high-level national and international sports, or need specific arrangements, for example due to illness or disability. Expenses for these groups must be disclosed both as total expenses for each of the three groups and as the average grant issued within the fiscal year. Third, on an annual basis, the Ministry of Education may issue a letter in which it requests disclosure of information that is regarded as “politically salient or relevant for society” (art. 4.6 RJO 2008).

Next to the details in the notes mentioned above, two other elements of the annual reporting should be mentioned. First, the non-executive board of the institution should report on its activities during the fiscal year, including issues such as the items discussed and the impact of these debates on the operations of the institution (art. 2f RJO 2008). Second, a separate section of the annual report, introduced in 2013, is forward-looking for at least 3 years, or 5 years when substantial investments are planned. Whereas in a normal accountability process continuity of operations is assumed and is part of the auditor’s assessment, Parliament (2013, p. 16) wanted more information on the stability and continuity of education institutions in general. The so-called continuity section in the annual report is the operationalization of that. It includes information on staff (split into management, education staff, and other staff), a multi-annual balance sheet, and profit and loss estimates. Announced but not yet implemented is additional information on quality assurance and quality programs within the institution. These requirements will be implemented when the funding of institutions is changed in the near future. The new funding arrangement puts somewhat more focus on the quality of education than the current funding system does. A last issue to be mentioned is that, as of fiscal year 2021, a reflection on results by the employee and student council of the institution is required.

3.1.4 Accountability Issues

In the previous section, the general legal framework on (financial) accountability issues was discussed. In this section the emphasis is on non-financial accountability, particularly on publicly available research output and student information that can be found in national databases. The quality and completeness of research publication datasets depend on how the uploading of information is organized. University staff are generally assessed on both research and educational performance and, therefore, incentives exist to upload publication data into university repositories. Open Access publications are available, although this is generally limited to publications of a more recent date. The number of Open Access publications available varies among universities, independently of a university’s size. The data in the university repositories are aggregated by the science agency KNAW into a single database (NARCIS) on publications and datasets originating from universities. The database now includes over two million publications, including one million articles and more than 250,000 books and book sections. Furthermore, some 300,000 datasets are included in the database. Some 40% of publications qualify as open access publications and some 97% of all datasets are open access. However, finding the NARCIS database is not easy for non-professionals. There is no direct link on KNAW’s homepage, which means that, even assuming a non-professional knows of the existence of KNAW, perseverance is needed to find research information. Separately from the KNAW database, the polytechnics have developed their own database (HBO-Kennisbank.nl), disclosing some 60,000 documents, mainly students’ theses (over 50%) and some 10,000 articles.

General data on student characteristics are also available, mainly based on a national student registration system. The data available to stakeholders encompass information on Higher Education Institutions, educational programs, graduation, freshmen, gender, and level (BA/MA). This system is operated by an agency of the Ministry of Education and presented as open data, allowing for further analysis by individual users. Although the information is available, the relatively technical content seems to be aimed at professionals rather than at students looking for qualitative information on educational programs.

Individual Institutions

At the level of individual institutions, there are no prescriptions on how information should be provided to stakeholders. In this assessment, a search for annual reports on the institutions’ websites was carried out and, next to that, the information on research and education available on the websites was assessed. The population studied covers 14 universities and 27 polytechnics. First, annual reports were found in 40 cases. In two cases, next to the regular annual report, reporting extending accountability to the UN Sustainable Development Goals was found on the same webpage. In general, performance on research and education and the information provided on websites is likely to be more relevant for society and stakeholders than an annual report, if only because an annual report is backward-looking and rather aggregated. In this chapter, more general accountability issues are discussed that are relevant for horizontal accountability. That means that, e.g., internally relevant issues with respect to HR are not included; often that type of information is covered in the annual reports. When concentrating on research and education accountability on websites, a clear distinction between universities and polytechnics can be observed. Some highlights are discussed below. The domains of expertise in both research and education can be found at all institutions. In general, universities provide more detailed information on research and publications than polytechnics do. Two main explanations for these differences seem to emerge. First, since polytechnics only started research programs some 20 years ago and their core publicly funded business is education, it is not really surprising that information on research is relatively lagging compared to what universities provide. This is particularly visible when it comes to statements or documents on research quality (visitation/research assessment documents) and on research integrity. Second, the relatively low number of lectorates at polytechnics (about 700 out of about 50,000 staff; werkenbijhogescholen.nl, n.d.) may be a reason for less attention to research-related information at polytechnics. Another possible explanation can be found in funding. The core of research at polytechnics focuses on immediately applicable research sponsored by companies rather than publicly funded research. That may lead to restrictions on publishing results, as is the case with some privately sponsored research at universities. A last, more hypothetical issue could be that publication outlets for polytechnics are more limited in terms of journals in which research can be published.
That hypothesis has not been operationalized in this contribution. From an education perspective, differences emerge as well. From a student perspective, information on the contents and quality of programs contributes to decisions on selecting a particular program. Information on education programs and their contents, including issues such as ECTS credits, is generally available. Universities provide more information on student as well as peer rankings than polytechnics do, and the same holds for issues related to education quality. It should be noted, however, that some polytechnics have a separate webpage addressing issues such as job prospects and performance data, including student assessments. Having said that, some comments on possible gaps in accountability can be made. Higher Education Institutions may differ in their didactical approaches to education, but that type of information is hardly available on websites. The same seems to hold with respect to (education) quality or vision and mission statements by individual institutions. Finally, both the association of universities and the association of polytechnics operate a separate website on which “facts and figures” are disclosed at the aggregated level of Higher Education Institutions. This includes, for example, funding, number of students, staff and HR, quality issues, and the like. Although that information is relevant, the question is whether laymen would think of consulting websites like these for information on higher education. In sum, vertical accountability and quality assurance processes are relatively well organized in the Dutch setting, based on standard reporting frameworks and a relatively long history of accreditation processes linked to the funding of education. In the case of horizontal accountability, differences exist in the research domain. Access to research information from universities is provided through a different database than research information from polytechnics, and it has a much longer history. At the level of individual institutions, research information is available. From a student perspective, one can find quality assessments of programs. It seems that structured information on the didactical approach or vision on education at individual institutions is not systematically provided. So, although accountability information on research and education is publicly available, improvements are still possible, particularly on the education side.

3.2 The Portuguese Country-Case

3.2.1 Introduction

The higher education system in Portugal is binary, comprising university institutions and polytechnic institutions, which can be public or private. The first university in Portugal was created back in the thirteenth century (1290) in Coimbra. Until the beginning of the 1970s, there were only four universities (all of them public). It was only after the Democratic Revolution of 1974, with the expansion of higher education systems, that many institutions were created.
There are currently 13 public universities and an Open University (Universidade Aberta), 15 public polytechnics, and 32 polytechnic colleges. In the private sector, there are 7 universities (including the Catholic University, which has a special status), 4 polytechnic institutes, and 72 colleges for university and polytechnic education. As argued by Amaral and Teixeira (2000, p. 246), “until the mid-1970s the Portuguese higher education system was clearly an elite system,” with low enrollment levels. In the 1980s and the first part of the 1990s, a significant number of private players emerged. The growth of the private HE sector stopped in the last years of the twentieth century. In turn, the polytechnic subsystem was intended to have a strong applied and technical emphasis and a marked vocational orientation adapted to regional needs (Winckler et al., 2018). While universities are mainly located in cities, a significant number of polytechnic institutions are situated in low-density regions, far from coastal areas. Public institutions (both polytechnic schools and universities) are influenced by some nationwide governmental policies regarding the number of applications allowed and the tuition fees for first degrees, and there are also common standards for ranking students’ degree preferences and their grades in secondary school and on national exams. Private institutions, on the other hand, are free to determine the number of available applications and the tuition fees they charge. However, there are some important differences between universities and polytechnic institutions as to the possibility of offering third-cycle programs, with, up to now, only universities being able to award PhD degrees. Moreover, although both polytechnics and universities engage in research activities and projects, the importance given to research is substantially higher among universities. By contrast, polytechnic institutions capture the majority of mature students (aged 23 and over) and offer more vocational study programs. It is interesting to note that, as highlighted by Sin et al. (2017), no statistically significant differences exist between universities and polytechnics in the results of accreditation processes. Private Higher Education Institutions are subject to the Ministry of Education and are governed by the Statute of Private and Cooperative Education. In order to be part of the system, a private HEI has to obtain prior recognition from the Ministry responsible for higher education. All the courses delivered by public and private institutions that grant an academic degree must also obtain accreditation from the Portuguese Agency for Assessment and Accreditation of Higher Education (A3ES). In 2019, there were around 385,000 students enrolled in Higher Education, 316,000 in public institutions and 69,000 in private institutions (source: https://www.pordata.pt/Portugal). Around 248,000 were studying in universities and 137,000 in polytechnics (https://www.pordata.pt/Portugal). All Portuguese HEIs (public and private, universities and polytechnics) charge tuition fees to students, with the government establishing for public institutions a maximum of €697 per year for bachelor’s degrees (academic year 2020/2021), an amount that has been decreasing over recent years.

3.2.2 Accountability for Whom?

When analyzing the main stakeholders of HEIs in Portugal, to whom they must be accountable, central government clearly emerges as a major reference. Following a period of high centralization, the self-regulatory model consolidated over the last decades after the release of the Lei de Autonomia Universitária (University Autonomy Law, or Law 108/88), which was an important landmark in the change towards a new type of relationship between government and HEIs. Institutions were awarded “statutory, scientific, pedagogic, administrative, financial and disciplinary autonomy.” Given their importance, scientific and pedagogical autonomy deserve to be stressed. Scientific autonomy consists of the ability to define, program, and execute research and other scientific activities. Pedagogical autonomy, on the other hand, encompasses the ability to draw up curricula, define curricular unit objectives, define teaching methods, allocate resources, and choose knowledge assessment processes (https://www.dges.gov.pt/en/pagina/portuguese-higher-education-system). Moreover, as stated on the Ministry’s web page, these autonomies cover subjects such as the specific conditions of entry into study cycles, the conditions of study cycles, study plans, precedence and evaluation schemes, the prescription regime, curricular transitional norms, deadlines for issuing academic documents, changes in schedules and operating regimes, or deadlines for responding to requirements. As usually happens, increased levels of autonomy have been accompanied by new mechanisms of control, especially with regard to the pedagogical activities carried out by academics, and by the replacement of ex ante with ex post forms of supervision (Carvalho & Diogo, 2018). Likewise, the steps taken to increase institutional autonomy were followed by changes in HEIs’ governance models. In this regard, Law 62/2007 established the legal framework of higher education institutions (RJIES), which introduced important changes in the governance of Portuguese HEIs. Until then, universities had been almost exclusively run by academics, with a high level of professional self-regulation, and had relied mainly on collegial decision-making. With the new law, some governance bodies became extinct, e.g., the University Senate, and others were created, like the General Council. The participation of lay members in university governance became obligatory. The possibility for universities to become foundations was also introduced. In fact, the law establishes the possibility for HEIs to “adopt an institutional model of organization and management deemed most appropriate for the performance of their mission, as well as the specificity of the context in which they operate”, subject to compliance with the law. An HEI adopting the foundational form enjoys a greater degree of organizational autonomy (i.e., more freedom to decide on its own internal governance, institutional leadership, and accountability structures). Before the RJIES, Portuguese universities had four governance bodies: the Rector, the Academic Senate, the Assembly, and the Administrative Board. After the new legislation, these bodies were reduced to three: the Rector, the General Council, and the Management Board. Universities that opted for a foundational model have a Board of Trustees.

Transparency and Accountability in Higher Education as a Response to. . .

31

Universities that opted for a foundational model have a Board of Trustees. With these changes in governance models, the importance of some external stakeholders was enhanced. Evidence of this move towards greater openness of HEIs to external stakeholders' participation is the fact that the General Council (the board that elects the Rector, based on an international application process, and monitors his or her action) has representatives not only from internal actors (academics, students, and non-teaching staff) but also from external stakeholders. Thus, the General Council (composed of 15–35 members, depending on the size of each institution) has representatives of academics and researchers (55%), student representatives (15%), and publicly recognized external representatives (30%). All these changes show that, even if central government remains a major stakeholder, HEIs are increasingly aware of the need to disclose information relevant to a wide range of parties, including students, parents, and politicians. This marks a path towards increased horizontal accountability. In terms of financial accountability, one can say that Portuguese HEIs remain mainly accountable to the central government, although other important sources of funding have been gaining importance. As Teixeira (2009) highlights, HEIs have obtained increased financial autonomy but, at the same time, face a more demanding attitude by the state authority in terms of public funding, through, for example, the emergence of conditional funding. Once again, HEIs choosing the foundational regime have more freedom to decide on the diversification of sources of income and on the internal allocation of public and private funds, and to borrow funds on the capital market. As an example, those universities that became public institutions ruled under private law (Foundations) were allowed to derive two-thirds of their budget from other sources. In general, however, despite these changes, Portuguese HEIs are still highly dependent on government income, which is mainly associated with student numbers. At the same time, reporting requirements have substantially increased (Carvalho & Diogo, 2018). Teaching, research, and financial performance all need to be reported to external agencies. As regards teaching activities and study programs, the A3ES is a very relevant stakeholder. Research centers, on the other hand, are evaluated by the Portuguese Foundation for Science and Technology, in a process that involves international and national experts. Public funding is assigned according to the research centers' classification. Finally, new audit and accountability bodies were also created in Portugal, like the Auditor and the Management Council (which controls the Rector's actions).

3.2.3 Legal Framework

The changes implemented in the Portuguese HE system described above, promoting greater autonomy and more open governance models, have also been accompanied by increased control of public expenditure, with institutions being held more responsible towards the Government and society in general (Fonseca et al., 2020). In this regard, as happens in other areas of public management, auditing in HE can assume a preponderant role,
not only in validating financial information, by assessing compliance with standards and rules and thereby verifying the legality and regularity of financial execution, but also in controlling non-financial information (Fonseca et al., 2020). Auditing is thus regarded as a response to the obligation of rendering accounts in a transparent, accurate, and accessible way. Financial accountability to the State is regulated by Law no. 37/2003, which concerns the financing of higher education and establishes a calculation formula. Within this legal diploma, the legislator establishes that accountability should be based on the following documents: (a) balance sheet; (b) profit and loss statement; (c) budgetary execution statement (one for revenue and another one for expenditure); (d) cash-flow statement (displaying the budgetary performance in cash terms); (e) financial situation statement; (f) notes to the financial and budgetary statements; (g) management report; and (h) auditors' opinion. Part of these statements is prepared on an accrual basis: the financial statements are accrual-based, whereas the budgetary statements are cash-based. Law no. 37/2003 also clarifies that the financial data are based on the consolidated (i.e., covering all faculties and research entities) financial reporting of the higher education institution. In line with what happens in Public Administration as a whole, individual performance evaluation is compulsory. University academic staff are typically evaluated on a 3-year cycle based on research, educational/teaching activities, and third mission criteria. Given the shortcomings of financial reports, other forms of reporting have been developed. Sustainability reports are voluntary, but they communicate relevant information to stakeholders concerning the efforts developed by HEIs in the economic, social, and environmental dimensions (Alonso-Almeida et al., 2015). In a study conducted by Aleixo et al. (2016), in which the websites of public Portuguese HEIs were analyzed, it was concluded that the majority of institutions are at an early stage as far as the measurement and communication of sustainability results are concerned. Silva (2017) also reports that only 3 HEIs published stand-alone sustainability reports, whereas 8 others incorporated sustainability performance indicators in other documents (management reports, activity reports, strategic plans, or consolidated financial reports). According to the RJIES, all HEIs must have an external auditor that controls their financial and asset management, and external audits must take place twice a year. Moreover, since 2008, as established in Law 54/2008, all public administration entities, including HEIs, are obliged to have plans for managing the risk of corruption and related infractions. As part of this priority, HEIs are expected to improve internal control systems and regularly carry out audits. According to the control framework in the Manual of Procedures of Control of the Portuguese Court of Auditors (Tribunal de Contas, 1999), internal auditors are the main actors in internal auditing in HEIs in Portugal. A recent study conducted in the Portuguese HE setting (Fonseca et al., 2020) stresses that internal auditing tends to become increasingly important as its role expands to embrace performance management assessment as well as risk assessment. The study also shows that financial statement auditing is still the prevailing type of auditing,
carried out by almost all responding institutions, followed by the auditing of projects or programs. According to the authors, external audits by the Court of Auditors took place in the last 5 years in about half of the HEIs.

3.2.4 Accountability Issues

In terms of accountability and transparency instruments, Portugal has clearly been influenced by supra-national institutions, such as the OECD and the EU, as is visible in the emergence of a strong evaluative culture, with an emphasis on measuring, assessing, and benchmarking performance. Accordingly, HEIs are expected to exhibit high levels of accountability and transparency about the way resources, many of them public, are applied and about the return on their use. As in other contexts, Portugal has introduced several "soft governance mechanisms". Such mechanisms made it possible for the State to replace direct control over HEIs with indirect supervision. Quality assurance mechanisms have been regarded as essential tools in this regard. The website of the Directorate General of Education and Science Statistics (DGEEC) provides a wide range of statistics concerning the number of vacancies, occupation rates, the number of first-year enrolled students, the profile of new graduates, and the number of teaching staff/professors, among others. General information about all the degrees offered by the various HEIs is also publicly available on the same website, including unemployment data for the graduates of each course. Some databases can be downloaded as Excel files. Every year, a few nationwide surveys are conducted. This is the case of the RAIDES (a survey of the students enrolled and graduating in the HE system) and of the ICPTN (a survey administered to measure the scientific and technological potential of all R&D units, for which response to the questionnaire is mandatory). The research performance of HEIs in terms of publications in indexed journals and average number of citations is equally available to the general public. There has been open access to the national register of theses and dissertations produced in Portuguese HEIs since 2013, with the vast majority of the documents available for download. Each university also develops its own repository, with a growing number of open access documents. The idea of open access to data and publications has recently been extended in line with Open Science principles (Morais et al., 2021): universities are expected to work collaboratively with society and companies, sharing knowledge and the potential social and economic impact of their activities by leveraging the development of new products, services, businesses, and companies. The concept of scientific social responsibility has been reinforced. The movement has been driven by EU initiatives that have been changing the rules associated with funding. In Portugal, the Foundation for Science and Technology (FCT) has published its Open Access Policy, and the FCT policies on Open Access came into force in 2014. The policy applies to papers in scientific journals, conference proceedings, posters, books and book chapters, monographs, and Master's and PhD theses,
which, according to the policies adopted, should be made available and deposited in an Open Access repository. Thus, the RCAAP (Repositório Científico do Acesso Aberto de Portugal) is only one result, albeit probably the best known, of the Open Science movement in Portugal. It plays a crucial role in increasing the visibility of the scientific production of Portuguese universities. Within this collaborative approach to science, many Portuguese universities are also involved in Citizen Science projects. Citizen science is emerging as a new form of interaction between scientists and citizens, allowing for social participation and involvement in scientific activities. It may assume many forms and cover a wide range of topics. Many projects deal with great societal challenges and threats, such as climate change, biodiversity loss, pollution, habitat destruction, or health quality. In 2019, a portal was launched with the aim of aggregating the various projects and initiatives, thus enhancing accountability towards society as a whole; however, the portal is still under development. On the topic of quality assurance and accreditation, HEIs in Portugal follow the guidelines issued by the ENQA (European Association for Quality Assurance in Higher Education) and the European Quality Assurance Register for Higher Education (EQAR). The European Standards and Guidelines (ESG) play a major role. The "Agência de Avaliação e Acreditação do Ensino Superior" (Agency for the Evaluation and Accreditation of Higher Education) is the competent authority to evaluate and accredit HEIs and their study cycles. This Agency is an independent body vis-à-vis the state and institutions and aims to promote and ensure quality in higher education. In addition to the prior accreditation of study cycles to be created, the Agency also carries out regular accreditation of the study cycles already in operation. Moreover, quality assurance, and accreditation processes in particular, have been responsible for producing a large amount of publicly available information on HEIs' performance and functioning. The Agency website gives access to the self-assessment reports submitted by each institution/study cycle, as well as to the Agency's decision documents. The influence of international rankings, such as the Times Higher Education, QS, or Shanghai league tables, is also visible in Portugal. Despite many criticisms (e.g., Jongbloed et al., 2018), they are used as a market communication tool by the best institutions to attract students and receive considerable attention from the media. More recently, a new type of ranking has been attracting some attention too: the U-Multirank. Contrary to other countries, there are no domestic rankings in Portugal, at least none with significant impact. Universities often publish their ranking positions on their websites as a marketing tool.

3.3 The Italian Case Study

3.3.1 Introduction

In Italy, higher education is organized into universities and institutions for higher education dedicated to the Arts (e.g., Accademia delle Belle Arti), Music (e.g., Conservatori), Dance (e.g., Accademia Nazionale di Danza), and Acting (e.g., Accademia Nazionale di Arte Drammatica). There are 68 public universities (Università Statali, literally "State universities"), i.e., mainly funded by the public sector, and 38 private universities (Università non Statali), i.e., mainly funded by private sector actors and institutions, distributed across the 20 regions of the country. Every Italian region has at least one public university, apart from the Valle d'Aosta region, which hosts only a private university. Among public universities there are also four polytechnic universities (in Torino, Milano, Ancona, and Bari) and six universities with a special status attesting to the extremely high quality of their teaching and research activities, especially in scientific areas of research (namely in Pavia, Trieste, Lucca, L'Aquila-Gran Sasso, and two in Pisa). Unlike in the Netherlands and Portugal, polytechnic universities in Italy are simply publicly funded universities specialized in scientific and technological disciplines, especially engineering and architecture, and some of them boast excellent results in research activity and in publications in the scientific-technological domain. In the academic year 2017–2018 there were around 1.5 million students in public universities (around 920,000 enrolled in bachelor's degree programs and 600,000 in master's degrees and similar programs) and 195,000 students in private universities (bachelor's and master's degrees), while postgraduate students (enrolled in PhD programs, graduate schools, and postgraduate masters) numbered, respectively, around 73,700 in public universities and 14,500 in private ones. As for the number of people working in universities, public universities employ around 84,000 people in academic roles (faculty) and around 51,000 in administrative roles (in private universities, respectively, around 12,000 and around 5,000) (MIUR, Ufficio Statistica e Studi, 2020).

3.3.2 Accountability for Whom?

Since Dlgs 150/2009, public universities, like every Italian public sector entity, have been expected to disclose on their institutional website almost every official document concerning the institution itself: organization and human resources, contracts with external workers and organizations, official acts, tendering, agencies, grants/contributions/subsidies, real estate and asset management, plans and reports, and both financial and non-financial performance. Therefore, since 2009, budgets, financial reports, related reports, performance plans, and performance reports must be published online
in the dedicated website section called "Amministrazione Trasparente" (literally, Transparent Administration) of each public sector entity, and therefore of each individual HEI. Even though the list of documents and information to be disclosed has been extensive since the enactment of these rules, implementation has been gradual and has taken some time. Information and data about each university are thus usually available and easy to access, but their usability and understandability for the general public are not very high, since they are usually provided in official reports, plans, budgets, or financial statements in PDF format, and seldom in open formats. Therefore, on the one hand, considering vertical accountability, the main external stakeholder for publicly funded universities is certainly the Italian Ministry of Education, University and Research (MIUR—Ministero dell'Istruzione, dell'Università e della Ricerca, https://www.miur.gov.it/), which is the main external funder. Other relevant external funders and stakeholders are often banking foundations and other institutions promoting research and education programs. On the other hand, considering horizontal accountability, present and future students, their families, and in general the socio-economic stakeholders and their representative and societal associations are lay actors playing a stakeholder role for each individual university and HEI. In addition, the Italian Ministry of Education, University and Research first collects, and afterwards provides, many figures, statistics, indicators, and data about HEIs and services to students, also as Open Data (Italian Open Data License V2.0—IODL 2.0), on the Ustat website (Portale dei Dati dell'Istruzione Superiore, http://ustat.miur.it/). The Ministry also provides information about the financial and economic situation of universities on the Bilanci Atenei website (Reports of Universities, https://ba.miur.it/), which offers financial statement data; indicators of personnel expenses, debt, and economic and financial sustainability; a collection of the relevant legislative rules; and information on the composition of the boards of auditors. The website is organized into three large sections: university consolidated accounts (based on the homogeneous drafting of final accounts), legislative references, and a dashboard that allows both professionals and laymen to search data on universities' income and expenses over the years. Data about the indirect impact of university performance are instead provided by AlmaLaurea (https://www.almalaurea.it/), an interuniversity consortium founded in 1994 which represents 76 universities and approximately 90% of the graduates coming out of the Italian university system every year. The Consortium is supported by the member universities, by a contribution from MIUR, and by companies and entities that use the services offered. AlmaLaurea is recognized as a research institution, and its Statistical Office has been a member of Sistan, the National Statistical System, since 2015. It carries out two census surveys each year on the profile and employment status of graduates 1, 3, and 5 years after graduation, providing the participating universities, the MIUR, and the National Agency for the Evaluation of the University System and Research (ANVUR) with reliable documentary bases to facilitate the planning, monitoring, and evaluation of the decisions taken by universities. Among other activities, it also monitors students' study paths and analyzes the characteristics and performance of graduates on the academic and
employment front, allowing for comparison between different courses and places of study.

3.3.3 Legal Framework

After the reform brought by L. 240/2010, universities adopted accrual-based accounting, with an economic budget and an investment budget, and with financial statements containing a balance sheet, a profit and loss statement, a cash-flow statement, notes to the annual report, and an external auditor's statement on the true and fair representation of the financial position as well as on compliance with legislation (Dlgs 19/2012). In addition, universities publish performance plans and reports (Dlgs 150/2009), and consolidated reporting has been compulsory since 2016. Finally, the participation of lay members in university governance became obligatory, since some of the members of the university's Board of Directors must be external, i.e., neither faculty members nor students. Universities and HEIs are traditionally endowed with autonomy from a scientific and pedagogical/educational point of view; recent reforms in the public sector, and in HEIs in particular, introduced several ex post controls aimed at evaluating the performance of management, education/teaching, research, and third mission results and outcomes, while also requiring compliance with several rules and requirements. These reforms offered opportunities for accountability and performance evaluation, even though it is sometimes difficult to define what performance is and how to measure it (Mauro et al., 2020). The funding allocation system for universities has changed over time and is now partially results-driven, thus strengthening the need to disclose financial and non-financial information and emphasizing the relevance of the results achieved by institutions. Concerns about financial sustainability have been increasingly integrated with concerns about different dimensions of performance. In addition to steering and coordinating the system, MIUR performs a central function for the functioning of the university system by allocating annual funding to public universities and legally recognized private universities. Public universities are annually assigned the Ordinary Financing Fund (FFO), intended to cover institutional expenses, including personnel and operating costs. For the functioning of legally recognized private universities, the State annually allocates the contribution provided for by Law 243 of 1991. The regulatory evolution of recent years has radically changed the way in which public resources are attributed to the university system, by introducing criteria that gradually reduce the weight of funding on a historical basis in favor of parameters such as the standard cost per student, the reward share related to the results of teaching and research, and equalization measures to safeguard particularly critical situations. Still within the annual funding allocated to the university system, there is also a series of specific interventions which, although part of the FFO, have restricted destinations. The main ones are the fund for the support of young people
and for encouraging student mobility (article 1, paragraph 1, of Law Decree 105 of 2003, converted into Law 170 of 2003); the fund for postgraduate scholarships for PhD students; and resources for the extraordinary recruitment plans for professors and researchers. For 2019, the allocation in the State budget for the financial year 2019 and for the three-year period 2019–2021 is equal to €7,450 million. In 2018, the FFO reward share was 23%, of which 80% was allocated based on the 2011–14 results of the assessment of the quality of research (VQR). Law 98/2013 established that the FFO's reward quota increases year by year up to a maximum of 30%.
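To make the composition of these percentages concrete, the following is a purely illustrative calculation; the total FFO figure used here is a hypothetical round number, not an official allocation.

\[
\text{FFO} = 7{,}000\ \text{€m}, \qquad
\text{reward quota} = 0.23 \times 7{,}000 = 1{,}610\ \text{€m}, \qquad
\text{VQR-based portion} = 0.80 \times 1{,}610 = 1{,}288\ \text{€m}.
\]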

3.3.4 Accountability Issues

As regards the disclosure of information about research and its evaluation, the National Agency for the Evaluation of the University System and Research, ANVUR (Agenzia Nazionale di Valutazione del Sistema Universitario e della Ricerca, https://www.anvur.it/), was established in 2006 by the Ministry of University. It is meant to oversee the national public quality assessment system of universities and research bodies. It takes care of the external evaluation of the quality of the activities of the universities and research bodies receiving public funding and directs the activities of the Evaluation Units. Finally, it assesses the effectiveness and efficiency of public funding programs and incentives for research and innovation activities. ANVUR is required to evaluate the results of research mainly through peer review. Academics are required to update their profile in the IRIS repository (Institutional Research Information System) of each HEI, uploading their research products and also providing information about their third mission activities; IRIS repositories are based on the international DSpace platform, integrated with the most important publishing metadata providers and international bibliometric sources (Web of Science, Scopus, CrossRef, PubMed). Information about research and publications from IRIS is usually disclosed on the website of the department/institution the academic belongs to, and usually on the academic's official personal page on the university website, together with information about his/her teaching activities. The IRIS websites are used by the Ministry and its agencies to collect information about the research and third mission activities of the academic and of the department/institution he/she belongs to, in order to evaluate them periodically (every 5 years) through the VQR (Valutazione Qualità della Ricerca, https://www.anvur.it/attivita/vqr/) system. Aggregated data about research performance in the different universities and departments are publicly disclosed by ANVUR after the end of the evaluation process. As mentioned in the previous paragraph, the VQR results are also used for the allocation of the reward share of the Ordinary Financing Fund (FFO—Fondo di Finanziamento Ordinario) of public universities. Considering the evaluation and disclosure of information about performance in teaching activities, since 2013 Italy has been operating a management, (self-)assessment,
and evaluation system called the AVA system (Autovalutazione—Valutazione periodica—Accreditamento, https://www.anvur.it/attivita/ava/), which stands for Self-Assessment, Periodic Evaluation, and Accreditation. Its objective is to improve the quality of the teaching and research carried out in universities, through the application of a Quality Assurance (QA) and Accreditation model which follows the Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG), issued by the ENQA (European Association for Quality Assurance in Higher Education) and revised with the European Quality Assurance Register for Higher Education (EQAR). The Italian Quality Assurance model is based on internal design, management procedures, self-evaluation, and improvement of training and scientific activities and, in addition, on an external audit carried out in a clear and transparent way. The audit then translates into an accreditation judgment, the result of a process through which a university (and its study programs) is recognized as possessing (initial accreditation) or maintaining (periodic accreditation) the quality requirements that make it suitable for carrying out its institutional functions. Thus, Italian regulation not only requires the implementation of information systems to assess the quality of research, teaching, and third mission performance, but also prescribes in detail what system should be used and how. Teaching activities and the organization of single courses and of bachelor's/master's degree programs are also assessed by the students themselves (OPIS—OPInioni Studenti). Aggregated results are usually disclosed on the website of departments/institutions and discussed with students when possible. In addition, every department is endowed with a commission, called Commissione Paritetica Docenti Studenti, composed of both teachers and students (max 10 people: 5 teachers and 5 students), which meets periodically to make judgments on teaching activities and quality assurance documents. Teachers are proposed by the Department Director and appointed by the Department Council; they must belong to different study programs (and must not hold incompatible positions), while students are enrolled in different study programs and designated by the Student Council. University academic staff are evaluated on research and educational/teaching activities, but also on third mission initiatives/activities. This evaluation usually works at the national level, and it involves the department/institution, and consequently the university itself, not individual academics. Some financial incentives may depend on the positive evaluation of research and teaching performance, locally or at the national level. In fact, universities and departments can establish their own methods of additional evaluation and incentive provision connected, for instance, to the results of the VQR or to the evaluations given by students. In Italy, universities sometimes appear in international rankings, like the Times Higher Education, QS, or Shanghai league tables, but they can of course be evaluated and compared internationally through U-Multirank. At the national level, on the other hand, there are at least two rankings that are considered by the public when assessing a university, especially for marketing purposes: the Sole 24 Ore Universities
Ranking—provided by the national economic newspaper Il Sole 24 Ore—and the Censis1 Universities Ranking. In addition, it is relevant to mention the European Researchers' Nights,2 an initiative funded by the European Commission's Research and Innovation Framework Programme H2020 (2014–2020) through the Marie Skłodowska-Curie actions. These public events are dedicated to bringing researchers closer to the public. They showcase the diversity of research and highlight the impact of research on our daily lives. The aim is also to motivate young people to embark on research careers. The events promote how researchers contribute to our society by displaying their work in an interactive and engaging forum. This is also a relevant instrument for disclosing information about performance results in teaching, third mission (and research) activities towards students, families, and the local socio-economic system. Such events are very common in Italian universities. It is also possible to identify some interesting innovative projects related to the adoption of voluntary instruments of social and/or environmental reporting, such as Social Reports, Environmental Reports, Sustainability Reports, or Integrated Reports (e.g., Mauro et al., 2020). Some universities have also been able to obtain environmental certifications: Ca' Foscari University of Venice, for example, obtained the LEED Certificate (Leadership in Energy and Environmental Design) for the historical buildings hosting the Rectorate.

4 Discussion and Conclusions

In this contribution we offered a general overview of the present accountability requirements and compared the instruments and solutions, rule-driven or voluntary, adopted in three European countries, in order to propose a general assessment of the degree of transparency and accountability of publicly funded HEIs. Both the similarities and the differences in accountability and transparency, but also in the availability and usability of information, are possibly related to the specific regulation of transparency and accountability of quality in HEIs in each country. The differences we identified could therefore be related to institutional differences and normative backgrounds, but also to the different national practices and degrees of familiarity with transparency and accountability in public sector institutions. Nevertheless, some analytical and critical reflections on the case studies' findings are possible. Considering the three main groups of regularly used transparency instruments identified in the literature (Quality Assurance and Accreditation Schemes, Rankings and Bibliometric Systems, and Performance Agreements/Contracts between an HEI and government or funding authorities), it is worth highlighting some reflections inspired by the three country case studies.

1 Censis, Centro Studi Investimenti Sociali (i.e., Center for Social Investment Studies), is a socio-economic research institute founded in 1964. https://www.censis.it/
2 https://ec.europa.eu/research/mariecurieactions/actions/european-researchers-night_en

As regards Quality Assurance and Accreditation Schemes, at both national and international level, it is evident that they can be considered both as policy tools put in place by public authorities and as conditions to be fulfilled in order to obtain not only the recognition of educational programs but also funding from national ministries and agencies. In effect, the accreditation of educational programs at the national level is compulsory in almost all European countries. Accreditation at the international level, on the other hand, is entirely voluntary, and it offers the opportunity to enter the élite of teaching programs, which brings significant visibility and prestige, also internationally. This highlights the relevance, also at the operational level, of the agencies in charge of HEIs' assessment, evaluation, and accreditation. They are composed of both national and foreign auditors, and they carry out a transversal analysis of HEIs embracing different activities, objectives, results, and expected impacts, involving mainly external stakeholders. Rankings and Bibliometric Systems, as mentioned above, mainly originate from private initiatives (e.g., rankings produced by media organizations). The case studies presented in this contribution show that, despite some differences in the experiences portrayed in the Netherlands, Portugal, and Italy, they tend in any case to reflect the growing marketization of Higher Education Systems across Europe, which deeply influences academics' attitude towards research and, indirectly, also towards teaching and third mission activities (see, for instance, van Helden & Argento, 2020a). This could lead to the emergence of a new kind of accountability: competitive accountability, i.e., a form of public accountability enacted through a research governance technology and its performance-based demand that academic researchers act as producers of economic and societal impacts (Watermeyer, 2019). Internal rankings, linked to quality assurance and only seldom to accreditation schemes (since they are mainly focused on research activity), are also observable within different HEIs, and they are functional to the distribution of public funds. Thus, they also have effects on HEIs' recruitment and incentive policies and, indirectly, on their governance. This relationship needs to be further investigated. Finally, with reference to Agreements/Contracts between an HEI and a government or funding authority to guarantee certain performance targets in research, teaching, and third mission, we identified some differences concerning the level of autonomy accorded to HEIs in the three countries' experiences and the inter-institutional relationships. In effect, while in Italy the Ministry and its agencies not only require the establishment of HEI performance evaluation systems but also prescribe in detail what system should be used and how, in the Netherlands and Portugal the "how" of implementation is left to universities rather than based on detailed prescriptions from higher authorities. These centralized and decentralized approaches to the implementation of performance information systems are related to the kind of accountability—vertical or horizontal—that is privileged in each country, in accordance with the existing national regulation (Table 1).
In addition, it is worth debating the following critical issues emerging from the case studies, concerning, respectively, the identification of the stakeholders towards whom HEIs seem to be effectively transparent and accountable, the transparency of HEIs' activities, and their effective autonomy.


Table 1 The implementation of the three main groups of regularly employed transparency instruments in HEIs in the Netherlands, Portugal, and Italy

| | Quality Assurance and Accreditation Schemes | Rankings (and Bibliometric Systems) | Agreements/Contracts on HEIs' Performance Targets |
|---|---|---|---|
| The Netherlands | Internal: compulsory (also for obtaining public funds); International: voluntary (élite and promotion) | Competitive accountability for public funds, recruitment, and incentive policies, and indirectly for HEI governance | Quite decentralized approach to the implementation of performance information systems → more horizontal accountability |
| Portugal | Internal: compulsory (also for obtaining public funds); International: voluntary (élite and promotion) | Competitive accountability for public funds, recruitment, and incentive policies, and indirectly for HEI governance | Quite decentralized approach to the implementation of performance information systems → balance of horizontal and vertical accountability |
| Italy | Internal: compulsory (also for obtaining public funds); International: voluntary (élite and promotion) | Competitive accountability for public funds, recruitment, and incentive policies, and indirectly for HEI governance | Quite centralized approach to the implementation of performance information systems → prevalence of vertical accountability |

Source: Authors' analysis

First, what categories of external stakeholders are effectively informed in the three countries we considered, i.e., towards whom do HEIs seem to be effectively transparent and accountable? In general, we observed quite a strong emphasis on accountability towards public or private funders/sponsors, towards future students—both domestic and from abroad—and their families, and towards current students and their families. On the other hand, information intended for society, politicians, and the socio-economic setting, despite the inclusion of people not belonging to the HEI in their governance bodies, looks much more limited. In particular, we identified relatively low attention to research disclosure for laymen, notwithstanding the science/society initiatives we mentioned in the case studies. This does not mean that research information is not available, but it seems more professionally oriented. As mentioned above, some collaborative approaches have been trying to foster the development of new and more mature forms of communication between HEIs and non-academic stakeholders, for instance, through Citizen Science projects, which are increasingly emerging in EU countries. HEIs in the three countries we considered give some visibility to such projects on their webpages. Focusing on the funding of research and education activities, private funding of HEI research does not appear very significant compared to public funding in either Portugal or Italy. In Italy, some private foundations, especially banking foundations, offer funds for research and teaching activities, which could steer research towards areas that are more profitable or simply more popular in the local or national context. In the Netherlands, the differences between universities and polytechnics are also related to funding, as mentioned above. Particularly for non-public commissioned research, disclosure of information may be limited, since the commissioner
could be interested in research in strategic areas (e.g., drugs, chemicals, or military applications) and ask researchers to keep a low profile and limit the disclosure of research results for security or competition reasons. Second, it is important to focus attention on the transparency issue with special regard to the societal results of higher education and research, to various stakeholders, and to a (highly competitive) international audience. From the analysis of the three case studies, we can recognize, as stated above, that the disclosure of the results of higher education and research is mainly directed at experts, while as regards laymen HEIs seem to privilege some categories of stakeholders (funders and current/future students, also at the international level). Overall, in the Portuguese and Italian cases, it is possible to argue that HE institutions have been accountable to external stakeholders, especially in terms of what they deliver with the public resources assigned to their research activities. However, the weakest area in terms of transparency of HEIs' performance is certainly that of knowledge transfer and consulting activities. In general, the absence of nationwide portals aggregating information on research and teaching activities for all HEIs also makes performance comparison exercises difficult. Nevertheless, it is worth highlighting how information is effectively disclosed to external stakeholders. For teaching activities and programs, all countries provide information (course units, ECTS, competencies developed, employability, program accreditation, etc.) on the institutions' webpages, usually at the Faculty/Department level. However, performance indicators, such as cost per student, average time to degree completion, and teacher/student ratio, are not always easily accessible. When they are available to the public, they are usually part of management reports or program reports, which makes their access and usability relatively low. On the other hand, programs that rank high in international rankings make such information highly visible to attract students' applications and improve the institution's reputation. In addition, with reference to the disclosure of information about research results, knowledge transfer, and community engagement, apart from Citizen Science projects, such disclosure is not only usually minimal but, when available, highly scattered, and it is necessary to visit the official websites of different universities and departments in the three countries compared in the present contribution. Management reports prepared centrally for the whole institution often ignore many activities undertaken at the Faculty/School level. In Portugal, for instance, the national initiative of launching a portal that could act as a single point of access to these initiatives has not yet produced the desired results. Therefore, unless citizens know exactly what they are looking for, information on these initiatives is not easy to find. This is a shortcoming given that, by their nature, Citizen Science projects are expected to have a high social and economic impact on the communities. Furthermore, Open Access policies have been influencing the three countries analyzed over the last decade. In the Portuguese case, these policies have led to a high level of disclosure of research results and publications. For publications, the information is easier to access for any external stakeholder, since a national repository exists. For publicly funded research projects, namely those funded by the Foundation
for Science and Technology, besides the obligation to disclose purposes and results through websites specifically dedicated to each project, relevant information is also centralized on the FCT institutional webpage, in terms of the research team, the institutions involved, and the funding amount received. In Italy, as mentioned above in the country case, there are many centralized websites with Open Data about the financial and some non-financial performance of HEIs, but a centralized website with all research products has not yet been completed, and information is still scattered and not always Open Access. In the Netherlands, Open Access policies have been in place for longer, so, as highlighted in the case study, data and information are a little easier for laymen to retrieve, but the issue of effective transparency remains open. According to Argento and van Helden (2021), performance information for HEIs in particular needs reconsideration, leaving more space for professionals to focus on core research and teaching activities rather than on ambitions to be in top rankings. Thus, HEIs seem to be transparent, but especially towards expert stakeholders who know exactly what to look for and where, since performance information is often scattered and spread across lengthy documents, making its access and usability relatively low. Moreover, the use of Open Data is increasing and improving in the three countries but is still not fully satisfactory in terms of transparency. Third, it is worth focusing on the role of rules and, again, on the level of autonomy accorded to HEIs. As stated above, the information systems for the evaluation of research and teaching performance implemented in the three countries seem to be based on two different approaches: more centralized in Italy, and much more decentralized and autonomous in the Netherlands and Portugal. These centralized and decentralized approaches to the implementation of performance information systems resonate with the reflections we expressed about a formal autonomy that is contextually and politically defined (e.g., Carvalho & Diogo, 2018), as in Italy, and a more effective autonomy accorded to HEIs, emerging in Portugal and especially in the Netherlands. As referred to above, even though in recent decades HEIs have been nominally awarded more responsibilities and autonomy, these can be declined in different ways and are sometimes matched with an increase in vertical controls, including ex ante ones, and with significant restrictions and rules, which de facto reduce everything to a formal or limited autonomy, as in the oversight regulation mode of higher education (Lodge, 2015). At the operational level, these approaches have disadvantages as well as some possible advantages, and they should be further investigated, especially considering that both Italy and Portugal are countries endowed with institutional Napoleonic traditions, yet their national governments adopted rather different attitudes towards this issue. The analytical and critical reflections presented above on the current implementation of instruments and systems for the disclosure of the performance results and impacts of HEIs' activities in the Netherlands, Portugal, and Italy to improve transparency and accountability towards external stakeholders need to be further investigated both in theory and in practice.
In fact, they suggest several evocative open questions concerning institutions that are fundamental for society and for the creation of shared public value, but whose mission, activities, and management are quite peculiar and intricate to perform and fulfill.


References

Aleixo, A. M., Azeiteiro, U. M., & Leal, S. (2016). Toward sustainability through higher education: Sustainable development incorporation into Portuguese higher education institutions. In W. L. Filho & J. P. Davim (Eds.), Challenges in higher education for sustainability (pp. 159–187). Springer.
Alonso-Almeida, M., Marimon, F., Casani, F., & Rodriguez-Pomeda, J. (2015). Diffusion of sustainability reporting in universities: Current situation and future perspectives. Journal of Cleaner Production, 106, 144–154.
Amaral, A., & Teixeira, P. (2000). The rise and fall of the private sector in Portuguese higher education. Higher Education Policy, 13(3), 245–266.
Argento, D., & van Helden, J. (2021). New development: University managers balancing between sense and sensibility. Public Money & Management, 41(6). https://doi.org/10.1080/09540962.2021.1890923
Armstrong, E. (2005). Integrity, transparency and accountability in public administration: Recent trends, regional and international developments and emerging issues. United Nations.
Barberis, P. (1998). The new public management and a new accountability. Public Administration, 76(3), 451–470.
Behn, R. D. (1998). The new public management paradigm and the search for democratic accountability. International Public Management Journal, 1, 131–164.
Blair, H. (2000). Participation and accountability at the periphery: Democratic local governance in six countries. World Development, 28(1), 21–39.
Bovens, M. (2005). Public accountability. In E. Ferlie, L. E. Lynn Jr., & C. Pollitt (Eds.), The Oxford handbook of public management. Oxford University Press.
Bovens, M. (2007). Analysing and assessing accountability: A conceptual framework. European Law Journal, 13(4), 447–468. https://doi.org/10.1111/j.1468-0386.2007.00378.x
Capano, G., & Pritoni, A. (2020). Exploring the determinants of higher education performance in Western Europe: A qualitative comparative analysis. Regulation & Governance, 14, 764–786. https://doi.org/10.1111/rego.12244
Carvalho, T., & Diogo, S. (2018). Exploring the relationship between institutional and professional autonomy: A comparative study between Portugal and Finland. Journal of Higher Education Policy and Management, 41, 18–33.
Chu, A., & Westerheijden, D. F. (2018). Between quality and control: What can we learn from higher education quality assurance policy in the Netherlands. Quality in Higher Education, 24(3), 260–270. https://doi.org/10.1080/13538322.2018.1559513
Clark, B. (1983). The higher education system. Academic organization in cross national perspective. University of California Press.
Cleaver, J. (2021). Debate: We need to change the culture of reliance on inappropriate uses of journal metrics—A publisher's viewpoint. Public Money & Management, 41(2), 148–150.
DUO. (n.d.). Financiële verantwoording xbrl. Retrieved from https://www.duo.nl/open_onderwijsdata/databestanden/onderwijs-algemeen/financiele-cijfers/verantwoordingxbrl.jsp
Eaton, J. (2016). The question for quality and the role, impact and influence of supra-national organisations. In E. Hazelkorn (Ed.), Global rankings and the geopolitics of higher education. Understanding the influence and impact of rankings on higher education, policy and society (pp. 324–338). Routledge.
Ezzamel, M., Robson, K., Stapleton, P., & Mclean, C. (2007). Discourse and institutional change: 'Giving accounts' and accountability. Management Accounting Research, 18(2), 150–171.
Fonseca, A., Jorge, S., & Nascimento, C. F. F. (2020). O papel da auditoria interna na promoção da accountability nas Instituições de Ensino Superior. Revista de Administração Pública, 54, 243–265. https://doi.org/10.1590/0034-761220190267
Freeman, R. E. (1984). Strategic management: A stakeholder approach. Pitman.
Gunn, A. (2018). The UK teaching excellence framework (TEF): The development of a new transparency tool. In A. Curaj, L. Deca, & R. Pricopie (Eds.), European higher education area: The impact of past and future policies (pp. 505–526). Springer.
Hazelkorn, E. (2018). The accountability and transparency agenda: Emerging issues in the global era. In A. Curaj, L. Deca, & R. Pricopie (Eds.), European higher education area: The impact of past and future policies (pp. 423–440). Springer.
Hazelkorn, E., & Gibson, A. (2017). Public goods and public policy: What is public good, and who and what decides? CGHE Working Papers No. 18. Centre for Global Higher Education. Retrieved from http://www.researchcghe.org/perch/resources/publications/wp18.pdf
Hood, C. (2010). Accountability and transparency: Siamese twins, matching parts, awkward couple? West European Politics, 33(5), 989–1009.
Hood, C., Scott, C., James, O., Jones, G., & Travers, T. (1999). Regulation inside government. Oxford University Press.
Hood, C., James, O., Peters, B. G., & Scott, C. (2004). Controlling modern government. Edward Elgar.
Inspectie van het Onderwijs. (2021). De staat van het onderwijs 2021. Retrieved from https://www.onderwijsinspectie.nl/binaries/onderwijsinspectie/documenten/rapporten/2021/04/14/de-staat-van-het-onderwijs-2021/Staat-van-het-Onderwijs-2021.pdf
Jongbloed, B., Vossensteyn, H., van Vught, F., & Westerheijden, F. (2018). Transparency in higher education: The emergence of a new perspective on higher education governance. In A. Curaj, L. Deca, & R. Pricopie (Eds.), European higher education area: The impact of past and future policies (pp. 441–454). Springer.
Jos, P. H., & Tompkins, M. E. (2004). The accountability paradox in an age of reinvention. The perennial problem of preserving character and judgment. Administration & Society, 3, 255–281.
Kallio, K. M., Kallio, T. J., & Grossi, G. (2017). Performance measurement in universities: Ambiguities in the use of quality versus quantity in performance indicators. Public Money & Management, 37(4), 293–300.
Kuhlmann, S., & Wollmann, H. (2014). Introduction to comparative public administration: Administrative systems and reforms in Europe. Edward Elgar.
Lennon, M. (2018). Learning outcomes policies for transparency: Impacts and promising practices in European higher education regulation. https://doi.org/10.1007/978-3-319-77407-7_32
Lijphart, A. (1982). Verzuiling, pacificatie en kentering in de Nederlandse politiek [The politics of accommodation]. J.H. de Bussy.
Lodge, M. (2015). Regulating higher education. A comparative perspective. Discussion paper 77. CARR Centre for Analysis of Risk and Regulation at the London School of Economics and Political Science.
Mauro, S. G., Cinquini, L., Simonini, E., & Tenucci, A. (2020). Moving from social and sustainability reporting to integrated reporting: Exploring the potential of Italian public-funded universities reports. Sustainability, 12(8), 1–19. https://doi.org/10.3390/su12083172
Morais, R., Saenen, B., Garbuglia, F., Berghmans, S., & Gaillard, V. (2021). From principles to practices: Open science at Europe's Universities – 2020–2021 EUA open science results. European University Association.
Morris, H. (2018). Transparency tools in Wales: Bringing higher education performances into focus? In A. Curaj, L. Deca, & R. Pricopie (Eds.), European higher education area: The impact of past and future policies (pp. 487–504). Springer.
Moynihan, D. P., & Ingraham, P. W. (2003). Look for the silver lining: When performance-based accountability systems work. Journal of Public Administration Research and Theory, 4, 469–490.
Newman, J. (2011). Serving the public? Users, consumers and the limits of NPM. In T. Christensen & P. Laegreid (Eds.), The Ashgate research companion to new public management (pp. 349–360). Ashgate.
Parker, L. (2011). University corporatization: Driving redefinition. Critical Perspectives on Accounting, 22, 434–450.
Parliament. (2013). Financiële positie van publiek bekostigde onderwijsinstellingen [Financial assessment of publicly funded education institutions]. TK 2012/13 33495 nr. 10. Retrieved from https://zoek.officielebekendmakingen.nl/kst-33495-10.html
Parliament. (2019). Budget for the Ministry of Education 2020 (TK 2019/20 35300 VIII nr. 2). Retrieved from https://www.rijksbegroting.nl/2020/voorbereiding/begroting,kst264847_14.html and https://www.rijksbegroting.nl/2020/voorbereiding/begroting,kst264846.html
Peters, B. G. (2007). Performance-based accountability. In A. Shah (Ed.), Performance accountability and combatting corruption. World Bank.
Pina, V., Torres, L., & Royo, S. (2007). Are ICTs improving transparency and accountability in the EU regional and local governments? An empirical study. Public Administration, 85(2), 449–472.
Pollitt, C. (2018). Performance management 40 years on: A review. Some key decisions and consequences. Public Money & Management, 38(3), 167–174. https://doi.org/10.1080/09540962.2017.1407129
Romzek, B. S., & Dubnick, M. J. (1987). Accountability in the public sector: Lessons from the Challenger tragedy. Public Administration Review, 47(3), 227–238.
Silva, P. (2017). Caracterização do relato nas instituições de ensino superior portuguesas. Unpublished Master Dissertation, University of Porto.
Sin, C., Tavares, O., & Amaral, A. (2017). The impact of programme accreditation on Portuguese higher education provision. Assessment & Evaluation in Higher Education, 42(6), 860–871. https://doi.org/10.1080/02602938.2016.1203860
Söderlind, J., & Geschwind, L. (2019). Making sense of academic work: The influence of performance measurement in Swedish universities. Policy Reviews in Higher Education, 3(1), 75–93.
Statistics Netherlands. (2020). School size by type of education and ideological basis. Retrieved from https://opendata.cbs.nl/statline/#/CBS/en/dataset/03753eng/table?ts=1591597914934
Teixeira, P. (2009). Mass higher education and private institutions, chapter 8. In Higher education to 2030, Vol. 2, Globalisation. OECD.
Tribunal de Contas. (1999). Manual de auditoria e procedimentos – Volume I. TC.
van Helden, J., & Argento, D. (2020a). New development: Our hate–love relationship with publication metrics. Public Money & Management, 40(2), 174–177.
van Helden, J., & Argento, D. (2020b). Neo-liberal ideology threatens scholarship and collegiality of young researchers—A discussion starter. CIGAR Newsletter, 11–12, April 2020.
van Helden, J., & Reichard, C. (2019). Making sense of the users of public sector accounting information and their needs. Journal of Public Budgeting, Accounting & Financial Management, 31(4), 478–495. https://doi.org/10.1108/JPBAFM-10-2018-0124
Van Vught, F. A., & Ziegele, F. (2011). Design and testing the feasibility of a multidimensional global university ranking: Final report. Commissioned by the Directorate General for Education and Culture of the European Commission. CHERPA Network. Retrieved from http://www.ireg-observatory.org/pdf/u_multirank_final_report.pdf
Watermeyer, R. (2019). Competitive accountability in academic life: The struggle for social impact and public legitimacy. Edward Elgar.
Werkenbijhogescholen.nl. (n.d.). Feiten en cijfers. Retrieved.
Winckler, G., Nilelsen, L., Lindqvist, O., Abécassis, A., Noorda, S. J., Assunçâo, M., & Teixeira, P. (2018). National reform processes: Examples of six European countries. In K. Kruger, M. Parellada, D. Samoilovich, & A. Sursock (Eds.), Governance reforms in European university systems: The case of Austria, Denmark, Finland, France, the Netherlands and Portugal (pp. 11–158). Springer.
Yin, R. K. (2018). Case study research and applications: Design and methods (6th ed.). Sage.

An Analysis of Methodologies, Incentives, and Effects of Performance Evaluation in Higher Education: The English Experience

Giovanni Barbato and Matteo Turri
Dipartimento di Economia, Management e Metodi Quantitativi (DEMM), Università degli Studi di Milano, Milan, Italy
e-mail: [email protected]; [email protected]

This study is part of the PRIN project "Comparing Governance Regime Changes in Higher Education: systemic performances, national policy dynamics, and institutional responses. A multidisciplinary and mixed methods analysis" (2015RJARX7).

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2022. E. Caperchione, C. Bianchi (eds.), Governance and Performance Management in Public Universities, SIDREA Series in Accounting and Business Administration, https://doi.org/10.1007/978-3-030-85698-4_3

1 Introduction

The spread of performance evaluation (PE) systems in the higher education (HE) sector has been mainly driven by the New Public Management (NPM) narrative, as has occurred in other public sector domains (Ferlie et al., 2008). The NPM narrative assumes that the very introduction of business-like processes, values, and practices improves the efficiency, effectiveness, and value for money of public sector activities. Regarding the HE sector, NPM-inspired reforms have been concentrated in five main areas: (a) the introduction of competitive mechanisms for resources (students, funds) and performance-based funding; (b) the hardening of budgetary constraints; (c) a stress on performance evaluation (PE) and rankings; (d) the steering of HE systems through agencies and targets; and (e) the strengthening of internal executive bodies and the introduction of new managerial roles.
PE is usually characterized by two main elements (Diefenbach, 2009): first, PE relies on a narrow conceptualization of performance, favoring quantifiable and short-term outputs measured through quantitative indicators; second, PE is often linked to incentives that aim to orient and influence the behaviors of those who are evaluated in order to promote only specific behaviors and performances.
In this sense, national PE systems for research and teaching can also be interpreted and analyzed as management tools used for promoting research and teaching performance in the way desired by central government bodies (Rebora & Turri, 2013).
Regarding the influence of NPM in the HE sector, the English case is often seen as a clear example of a situation in which market-based mechanisms and PE mechanisms radically shape the operation of the HE system. The first national evaluation exercise for research can be traced back to the late 1980s, whereas an annual assessment exercise for teaching was introduced in 2016, with earlier attempts during the 1980s. The purpose of this chapter is twofold. First, it aims to describe the rationales, features, and effects of the evaluation of research and teaching in England. Secondary sources such as technical reports, official statistics, and other empirical evidence from the literature have been used to pursue this goal. Second, it analyzes both research and teaching evaluation through the theoretical lens of Management Control Theory (MCT), which specifically focuses on the relationship between the type of evaluation to be used and the features of the object to be evaluated (Ouchi, 1979; Frey et al., 2013). This perspective can be particularly useful to understand why NPM-inspired PE systems often generate unintended consequences or dysfunctionalities (Bevan & Hood, 2006; Diefenbach, 2009). While this theoretical approach has been fruitfully and widely applied to other public sectors (Frey et al., 2013; Barbato & Turri, 2017), it has been less applied to HE activities (Rebora & Turri, 2013). The chapter is organized as follows: the next section describes the MCT approach toward evaluation systems. Sections 3 and 4 illustrate the PE systems for research and teaching in England, whereas the last section discusses them through the MCT lens.

2 A Theoretical Framework for Performance Evaluation Systems in Universities

Management Control Theory (MCT) claims that the primary goal of performance measurement is to align individuals' efforts with organizations' objectives in order to improve performance. However, to do that effectively, the control system (here intended as the evaluation system) must be designed according to the nature of the activity/task under evaluation. In particular, two criteria must be considered (Ouchi, 1979; Frey et al., 2013), namely, the degree of measurability and attributability of the output (high vs low) and the knowledge of the cause–effect relation, or transformation process, producing the output (high vs low). The matching of these two criteria leads to three different types of evaluation/control, namely output control, input control, and process control. The first typology focuses predominantly on the assessment of outputs, given that the cause–effect relation producing the output cannot be known adequately and deeply. This type of evaluation is feasible and effective only when outputs can be easily measured, in other words, when they can be observed and quantified. Moreover, outputs
cannot be characterized by an intense interdependency between the actors who are involved in the generation of the output. In other words, output must be clearly attributable to individuals. In contrast, when both output measurability and knowledge of the transformation process are low, the evaluation should instead be conducted on the inputs of the activity/tasks to be evaluated. In other words, it should verify that individuals in charge of that activity/task present specific knowledge and competences crucial for the generation of high-quality outputs (Ouchi, 1979). As described by Frey et al. (2013, p. 958), the input or “clan” control assesses whether individuals “have internalized norms and professional standards, i.e., are dedicated intrinsically to their task.” Key mechanisms that can be used to evaluate are thus the selection and career procedures and all the processes that socialize and train individuals toward the set of rules and code of practices required to carry out that specific task. In contrast to an ex post output evaluation, input control is a long-term and more demanding process and is subject to a risk of low accountability toward those who do not share the same specific knowledge and rules. The last typology provided by the MCT is process control, which assesses the processes carried out to achieve certain outputs. This can indeed be used when their measurability is not high, but evaluators display sufficient knowledge and shared understanding of the transformation process to generate them. Process control is characterized by a high transparency and equity of treatment. In contrast, it is argued that process control might fall into a bureaucratic and rigid procedure. While MCT sheds light on the relevance of the relationship between the characteristics of the activity to be evaluated and the type of evaluation mechanisms that can be used, several scholars have widely underlined how NPM-based PE systems are predominantly skewed toward the assessment of measurable outputs. However, public services (e.g., education, health, and security) are predominantly complex and unstable activities, with tasks highly interdependent so that outputs cannot be precisely attributed to individuals (Frey et al., 2013). The tendency of measuring only what can be quantified through indicators (named “tunnel vision” by Smith, 1995) entails that qualitative aspects are mainly neglected even though they might be crucial in defining whether individuals/organizations have actually achieved the desired output. This situation might even lead to what Van Thiel and Leeuw (2002) describe as a “performance paradox,” whereby the evaluative metrics somehow misrepresent the actual level of performance achieved, since they lose their capacity to discriminate between satisfactory or poor performance. Moreover, the provision of incentives may generate further unintended consequences if the type of evaluation adopted is not aligned with the features of the object to be evaluated. As claimed by Speklé and Verbeeten (2014), if output measurability is low and the knowledge of the cause–effect is ambiguous, the incentives will probably induce individuals/organizations to concentrate their efforts only towards the activities that are considered in the final evaluation or even on those more easily achievable despite their actual relevance (the so-called “effort substitution”). 
Similarly, the concept of "gaming" represents the voluntary manipulation/alteration of the evaluation process, which improves future evaluative judgment without
substantially affecting performance. Several examples of “gaming practices” have been documented across different public sectors (Van Thiel & Leeuw, 2002; Bevan & Hood, 2006; Diefenbach, 2009; Speklé & Verbeeten, 2014). Therefore, thanks to its specific focus on the relationship between the type of evaluation and the characteristics of the activities/tasks to be assessed, the MCT represents a fruitful theoretical lens through which to analyze the PE system in the HE sector, especially in contexts where the NPM narrative played such an important role, such as in the English context.
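Read as a decision rule, the framework can be summarized in a few lines of code. The sketch below is only an illustration of the mapping described above (Ouchi, 1979; Frey et al., 2013): the function and type names are invented for the example, and the two criteria are collapsed into simple booleans rather than the high/low scales discussed in the literature.

```python
from enum import Enum


class ControlType(Enum):
    OUTPUT = "output control"    # ex post assessment of measurable, attributable outputs
    PROCESS = "process control"  # assessment of how the activity is carried out
    INPUT = "input/clan control" # selection, training, and socialization of individuals


def suggested_control(output_measurable: bool, process_knowledge: bool) -> ControlType:
    """Map the two MCT criteria to a control type (illustrative encoding).

    output_measurable: the output can be observed, quantified, and attributed to individuals.
    process_knowledge: the cause-effect (transformation) process is sufficiently well understood.
    """
    if output_measurable:
        # Measurable and attributable outputs can be controlled ex post.
        return ControlType.OUTPUT
    if process_knowledge:
        # Outputs hard to measure, but the transformation process is well understood.
        return ControlType.PROCESS
    # Neither condition holds: rely on selection and socialization ("clan" control).
    return ControlType.INPUT


if __name__ == "__main__":
    # Hypothetical readings of research and teaching, anticipating the discussion below.
    print(suggested_control(output_measurable=False, process_knowledge=True))   # process control
    print(suggested_control(output_measurable=False, process_knowledge=False))  # input/clan control
```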

3 The Evaluation of Research in England

Research at English universities is evaluated through a cyclical exercise, currently known as the Research Excellence Framework¹ (REF). The first evaluation exercise was introduced in 1986, followed by exercises in 1989, 1992, 1996, 2001, 2008, and 2014, with universities currently involved in REF 2021. The evaluation exercise is managed by Research England,² the current funding and evaluation agency for the English HE sector, on behalf of the other British nations.
The introduction of the REF can be explained through a mix of financial and accountability rationales, with the predominance of the former (Martin, 2011; Hicks, 2012). Research activities were previously funded according to historical allocations and student numbers. It was assumed that all academics were engaged equally in research activities, and thus universities could be funded almost identically regardless of their actual level of research orientation (Shattock, 2012). This situation changed significantly with the public budgetary cuts undertaken by Thatcher's governments (1979–1990) as a result of the 1970s economic crisis. The HE sector was also heavily affected. Consequently, the funding body of UK universities at that time, the University Grants Committee (UGC), decided to introduce an element of selectivity into its research funding formula by launching an evaluation exercise in 1986 to inform the distribution of increasingly shrinking resources (Technopolis, 2018). In this way, resources would be allocated on the basis of research quality, meeting the efficiency principle as well. Finally, the introduction of an evaluation exercise also followed a rationale of public accountability; in other words, it was a way to verify that public expenditure was allocated effectively, making universities more accountable (Geuna & Martin, 2003; Bence & Oppenheim, 2005).

¹ The evaluation exercise has taken different denominations over time. Between 1986 and 1989 it was the Research Selectivity Exercise (RSE), while from 1992 to 2008 it was the Research Assessment Exercise (RAE). The exercise has been named the Research Excellence Framework (REF) since 2014. We use only the term REF in this chapter to indicate the research evaluation exercise in England.
² Research England replaced the Higher Education Funding Council for England (HEFCE) through the 2017 Higher Education and Research Act. The funding agency was previously called the Universities Funding Council (UFC) between 1989 and 1992 and the University Grants Committee (UGC) before 1989.

3.1 Main Features of the Research Excellence Framework

The research evaluation system has grown over time in terms of the research outputs evaluated and the academics and universities involved. REF 2014 assessed 4 research outputs per academic (more than 191,000 outputs from 52,061 academics) across 154 universities, compared with 2 outputs per department from approximately 50 institutions in 1986. However, its features have remained largely constant. This section describes them primarily on the basis of the last completed exercise, REF 2014.
I. The Research Excellence Framework (REF) is a peer-review evaluation system, which means that the assessment of the quality of research is carried out by experts, in this case other academics. The REF is one of the few research evaluation systems in the world that relies (almost) exclusively on a peer-review methodology, whereas many others employ a combination of bibliometric indexes and other output indicators (Hicks, 2012). REF evaluators are organized into disciplinary panels that are in charge of assessing the quality of the outputs submitted in the corresponding Unit of Assessment (UoA). REF 2014 involved more than 1000 experts, 77% of whom were academics (890) appointed by the funding council. Reviewers are organized into 36 subpanels coordinated by 4 main panels. As the highest quality ratings awarded during the evaluation process refer to international excellence, each main panel has also involved some non-UK experts since 2001 to provide an exogenous viewpoint in calibrating the assessment criteria globally (Bence & Oppenheim, 2005). Moreover, panels also include research users (23%), namely, individuals coming from the business, culture, and nonprofit sectors who either put universities' research into practice or benefit from it.
II. The REF is organized in discipline-based units of assessment. Each subpanel of evaluators corresponds to a UoA, which represents the unit of analysis of the assessment procedure. A UoA roughly corresponds to a university department (Geuna & Martin, 2003). This discipline-based structure allows subpanels to specify evaluation criteria according to each discipline. However, to avoid the loss of consistency across disciplines that emerged during the first exercises, a limited number of "coordinating panels" has been introduced since 2001 (Technopolis, 2018). These panels consist of the chairs of the subpanels to which they refer, and their main function is to improve the consistency of judgment criteria across them (Bence & Oppenheim, 2005). Moreover, these panels have also contributed to addressing the (still) overlooked problem of interdisciplinarity.
III. REF assessment is expressed in the form of ratings, without aiming to build any ranking.
Ratings are attributed not at the level of the entire university but to each UoA. However, the way ratings are awarded to the UoAs significantly changed with the 2008 exercise. A single-point rating was granted for each UoA until 2001. In contrast, the 2008 exercise presented the results as quality profiles rather than single-point ratings. A quality profile displays the percentage of the outputs submitted in each of the five judgment grades. Consequently, it sheds light on how quality is spread within each UoA, previously hidden by an overall single rating. This change had the effect of enlarging the number of UoAs that received funds, since small pockets of research quality within low-performing UoAs were also going to be funded (Technopolis, 2018).
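To make the notion of a quality profile concrete, the following minimal sketch computes one from a list of graded outputs and derives an illustrative quality-weighted score. The five star grades follow the REF convention, while the funding weights are hypothetical, since the chapter does not report the actual quality-related funding formula.

```python
from collections import Counter

GRADES = ["4*", "3*", "2*", "1*", "Unclassified"]  # REF star grades, best to worst

# Hypothetical funding weights: quality-related funds are skewed toward the top grades,
# so "small pockets" of excellent work attract money even in weak UoAs.
# These numbers are illustrative only; the chapter does not give the real formula.
FUNDING_WEIGHTS = {"4*": 4.0, "3*": 1.0, "2*": 0.0, "1*": 0.0, "Unclassified": 0.0}


def quality_profile(output_grades: list[str]) -> dict[str, float]:
    """Percentage of submitted outputs in each grade (a REF-style quality profile)."""
    counts = Counter(output_grades)
    total = len(output_grades)
    return {g: round(100 * counts.get(g, 0) / total, 1) for g in GRADES}


def funding_score(profile: dict[str, float]) -> float:
    """Illustrative quality-weighted score that could drive a funding allocation."""
    return sum(FUNDING_WEIGHTS[g] * share / 100 for g, share in profile.items())


if __name__ == "__main__":
    # A mostly low-performing UoA with a small pocket of excellent research.
    grades = ["4*"] * 3 + ["2*"] * 15 + ["1*"] * 10 + ["Unclassified"] * 2
    profile = quality_profile(grades)
    print(profile)                 # {'4*': 10.0, '3*': 0.0, '2*': 50.0, '1*': 33.3, 'Unclassified': 6.7}
    print(funding_score(profile))  # 0.4: the 4* pocket still attracts some funds
```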

3.2 The Object of Assessment: Research Output, Impact, and Environment

The REF has historically evaluated research by looking at its outputs. The definition of research output has traditionally been broad, including articles, books, book chapters, conference proceedings, and patents (Geuna & Martin, 2003; HEFCE, 2015). The focus of the evaluation system was enriched with REF 2014, with the introduction of the evaluation of the nonacademic impact of research (Rebora & Turri, 2013). This choice was driven by financial and accountability reasons as well. Its introduction protected research funds from the Cameron government's (2010–2016) plan of budget cuts and was also a way to demonstrate the value of public investment (Martin, 2011; Penfield et al., 2014). As a result of several initiatives, including an impact pilot exercise (Technopolis, 2010), the funding agency ultimately included the research impact assessment in REF 2014. Its weight in the final judgment is equal to 20%, with the quality of research outputs and the research environment accounting for 65% and 15% of the overall score, respectively. Impact was defined as "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia" (HEFCE, 2011, p. 48), and it was to be evaluated in terms of "reach" (the spread or breadth of influence or effect on the relevant constituencies) and "significance" (the intensity of the influence or effect).
Regarding the impact of research, universities must submit impact case studies, namely, documents that descriptively illustrate the impact, together with a template describing the piece of research (from the last 20 years) from which the impact originated and the connection between the two (HEFCE, 2015). As shown by the impact pilot exercise, the case-study methodology makes it possible to adopt a broader view of academic impact, especially beyond mere quantitative economic effects. Nevertheless, it also implies bureaucratic and time-consuming processes to develop the case studies (Technopolis, 2010). Moreover, both Martin (2011) and Penfield et al. (2014) noted three main methodological challenges that could affect the overall goodness of the impact evaluation.
First, the conceptualization and magnitude of impact might be very different across disciplines, in terms of both the time lag between research and impact and the degree of feasibility in gathering evidence. Second, not all the effects of academic research can be considered positive. Third, the impact might also be indirect and fuzzy and can evolve over time. This contrasts, at least partially, with the linear model that underpins the funding agency's evaluation approach to the impact assessment. Last, Samuel and Derrick (2015) show that evaluators can hold heterogeneous perceptions about how to characterize impact, pointing to a potential issue of subjectivity and inconsistency of judgment as well.
"Research environment" is the third dimension of research evaluated by the REF, and a template describing the research environment is submitted for each UoA. This is assessed in terms of its contribution to the "vitality" and "sustainability" of the unit under evaluation. The template consists of four sections: research strategy, academic and nonacademic staff, income and infrastructures/facilities, and contribution to the discipline.
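Given the weights reported above (outputs 65%, impact 20%, environment 15%), a minimal sketch can show how the three sub-assessments combine into an overall result; the 0–4 scale and the example sub-scores are illustrative placeholders, not the funding body's published algorithm.

```python
# REF 2014 weights reported in the text: research outputs 65%, impact 20%,
# research environment 15%. The 0-4 "average grade" scale used below is illustrative.
REF_WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}


def overall_score(sub_scores: dict[str, float]) -> float:
    """Weighted combination of the three REF elements (illustrative sketch)."""
    assert set(sub_scores) == set(REF_WEIGHTS), "all three elements must be scored"
    return sum(REF_WEIGHTS[k] * v for k, v in sub_scores.items())


if __name__ == "__main__":
    # A UoA with strong outputs, weaker impact case studies, and an average environment.
    print(round(overall_score({"outputs": 3.4, "impact": 2.6, "environment": 3.0}), 2))  # 3.18
```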

3.3 Effect of Research Evaluation and Transition Toward REF 2021

This section highlights the main intended and unintended consequences that the evaluation exercise has generated at the level of the HE sector, the university and academics (Rijcke et al., 2016). Some of these have also been clearly identified by the last independent review of the REF chaired by Lord Nicholas Stern (Stern, 2016). I. At the level of the HE sector, the REF has certainly contributed to enhancing the research orientation of British universities in terms of both productivity and awareness, especially for the more “teaching-oriented” institutions (Bence & Oppenheim, 2005; Moed, 2008). However, the concentration of incentives on research has undeniably decreased the value and importance being given to teaching from both academics and universities (Henkel, 1999; Geuna & Martin, 2003; UCU, 2013), especially within the most research-intensive institutions. Second, the REF-based funding system has increasingly concentrated resources within a limited set of institutions that receive the predominant share of quality-related funds, leading to a mechanism in which the “rich get richer.” The 1992 exercise allocated approximately 90% of funds to pre-1992 universities, whereas approximately 75% of REF 2014 funding was concentrated in Russell group universities, with the top 10 universities receiving just over 50% of the funds. Furthermore, this concentration implies that it would be difficult for less research-oriented universities to invest more in research activities and improve their performances much further. Ultimately, this concentration is almost entirely self-perpetuating (Geuna & Martin, 2003).


II. At the level of universities, the REF provides information as well as benchmarking possibilities that can be used for the internal strategic management of the university (Technopolis, 2015). Furthermore, REF results can also be employed externally to improve the university's image. Nevertheless, the evaluation exercise has also generated increasingly significant costs for universities (Geuna & Piolatto, 2016). The main share of REF 2014 costs is indeed borne by universities, and it is estimated at £232 million out of a total of £246 million (Technopolis, 2015). This amount is much higher than that of the 2008 (£66 million) and 1989 (£4.1 million) exercises, partly due to the introduction of the impact evaluation. The largest percentage of the costs is represented by submission preparation (£212 million), of which £55 million concerns the impact alone (£7500 per impact case study). Moreover, the "Accountability review" has also depicted an increasing bureaucratization of the evaluation system within universities, with the introduction of procedures and mechanisms aimed at supporting the REF submission process (Technopolis, 2015). These range from appointing a REF manager/committee to running a mock REF in order to choose the staff and outputs worth submitting. Most of these procedures employ bibliometrics to predict which outputs are going to be evaluated more positively. Several universities have also hired external consultants to develop impact case studies.
III. At the level of academics, three main effects can be identified. First, the REF affects the nature and character of the research carried out by academics (Butler, 2007; Technopolis, 2018). Adams and Gurney's analysis (2014) shows an increasing tendency among academics to skew the selection of the outputs they submit toward articles published in journals with a high impact factor, even when these articles are not well cited or are not even research papers. This is particularly evident in some fields of study in which articles are not the primary form of publication, such as the social sciences and engineering. Being published in such journals is seen by itself as an objective and immediate proxy of research quality. Harley (2002) similarly sheds light on this process of compliance with the perceived demands of the evaluation exercise. Moreover, Joynson and Leyser (2015) and Technopolis (2018) illustrated a widespread perception among academics that interdisciplinary works could not be perceived as positively by university management as monodisciplinary works in terms of future REF evaluation (UCU, 2013). However, this issue stems more from how universities prepare submissions than from the mere presence of the REF (Technopolis, 2018).
Second, academics are increasingly subjected to psychological pressure to publish enough papers to be considered for the submission process. The "publish or perish" narrative is indeed argued to lead to a commodification of academic labor (Henkel, 1999; Bence & Oppenheim, 2005). Almost 60% of the respondents to the University and College Union's survey (2013) recognize that the pressure to meet REF expectations had increased their stress levels, and approximately 35% also experienced negative health effects. This can further
affect their research integrity, encouraging the manipulation of data or the reporting of only positive results, as reported by Joynson and Leyser (2015). Third, the REF is claimed to affect academics’ careers directly and indirectly (Technopolis, 2018). Some studies have revealed that academics are often sanctioned if their research performances were not considered satisfying, with some cases of early forced retirement (Bence & Oppenheim, 2005). In terms of recruiting, many universities have started to use bibliometrics to determine who is going to contribute most to improving future REF submissions (UCU, 2013). Three main changes in the next REF 2021 (Research England, 2019) can be relevant to address some of these issues. First, universities could previously choose which academics can submit, causing potential negative consequences on academics. REF 2021 establishes that universities will now have to return all staff with “significant responsibility for research.” Universities still decide upon who displays this feature, but this has to be justified through the development of an internal code of practice concerning the selection of outputs. A second change addresses the number of outputs that can be submitted. REF 2021 shifts the emphasis from individuals to the submitting unit in computing the number of outputs to be submitted, partly reducing pressure to produce publications. Third, research outputs could no longer be “portable” from one university to another as a result of the academics’ transfer if the outputs were already made publicly available. Finally, REF 2021 also broadens the interpretation of impact, including that on public engagement and teaching; furthermore, it introduces an institutional level assessment for the research environment and provides new instruments to address interdisciplinary research.

4 The Evaluation of Teaching in the English HE Sector

Similar to research activity, the quality of university teaching is mainly³ assessed through a national evaluation exercise known as the Teaching Excellence and Student Outcomes Framework (TEF). The potential introduction of the TEF was initially mentioned in the Conservative party's manifesto for the 2015 general election. This identified its rationale in ensuring that universities deliver the best possible value for money to students, who contribute significantly to the funding of the HE sector through high tuition fees and the connected loan system. A second purpose of the TEF is to inform students' decisions about where to study, based on the conception of students as "consumers," which stems from the increasing marketization of the English HE sector (Gunn, 2018; Deem & Baird, 2020). However, the influence of the TEF in this regard still seems limited. A survey carried out by the Universities and Colleges Admissions Service (UCAS) (2018) highlights that only 17.1% of students approaching HE know what the TEF is, and 58% of this group stated that it was important in deciding which university to choose.
The "market" narrative attached to the TEF has drawn several criticisms toward it (Deem & Baird, 2020), which have been further fostered by its rapid and top-down implementation by the Department for Education (DfE) and the funding agency (Gunn, 2018). A first introductory year of the TEF was carried out in 2016 after an open consultation about its metrics, and it was legally implemented through the 2017 Higher Education and Research Act.
Finally, the TEF was also intended to counterbalance the excessive emphasis on research brought by the REF by providing both reputational and financial incentives for teaching, even though these incentives seem modest compared with those provided by the REF. Reputational incentives are expressed in the gold, silver, or bronze medals awarded during the assessment procedure, while the financial incentives included the possibility (not introduced to date) of charging tuition fees of up to £9250 per year instead of the current maximum cap of £9000. Nevertheless, some of the first pieces of empirical evidence (Cui et al., 2019; UUK, 2019) suggest that, although the TEF certainly contributes to rebalancing attention toward teaching, the majority of academics (and universities) still attribute to it a very small effect, or none at all, on teaching quality.

³ The set of quality assurance processes implemented by the Quality Assurance Agency (QAA) operates alongside the TEF. Yet, this chapter will only focus on the TEF, since this is the main evaluative system that counterbalances the REF.

4.1 Features and Operation of the Teaching Excellence Framework

The Teaching Excellence Framework (TEF) is a metrics-based evaluation system and thus evaluates teaching quality through a set of indicators covering different areas of the teaching mission, namely, "teaching quality," "learning environment," and "student outcomes and learning gain" (DfE, 2017). The Office for Students⁴ (OfS) is the body in charge of coordinating and managing the TEF. In contrast to the REF, it assesses teaching quality every academic year and provides a single rating at the institutional level. However, the government is aiming to implement the TEF at a subject level in the next several years, following two subject-level pilot exercises conducted with a sample of universities. The results of these pilot exercises will also inform Shirley Pearce's independent review of the TEF, which is expected to provide recommendations in 2021, leading to a potentially significant revision of the TEF evaluative framework.
The TEF evaluation procedure is carried out by an independent panel of experts and assessors, of which two-thirds are academics and one-third are students. Some experts on widening participation and employment are involved too. Moreover, TEF officers provide training and assistance to panel members/assessors during the entire assessment process.
The evaluation process is based on two main sources of evidence: the quantitative indicators reported in Table 1 and a qualitative and narrative document known as the "provider submission." The quantitative indicators are divided into six core metrics and three supplementary metrics. The metrics related to "teaching quality" (1a and 1b) and one of "learning environment" (2a) are measures of students' satisfaction calculated on a set of questions from the National Student Survey⁵ (NSS). The other indicators regard the status (3a) and type (3b) of employment of graduates as well as the regularity of students' careers (2b). As can be noted from Table 1, almost all the metrics are measures of outputs or outcomes (career progression; employability) of the learning and teaching activity, while processes and inputs are less considered and only through the perspectives of students (e.g., feedback from teachers/tutors). The metrics are computed for the three most recent years, cover only the undergraduate provision, and are presented separately for full- and part-time students (OfS, 2018). Furthermore, each core metric is computed for a series of subgroups based on some characteristics of the student body, such as age, gender, ethnicity, disability, and domicile. These are known as split metrics.
All the metrics are calculated directly by the OfS through national databases on HE and managed through a TEF metric workbook. The workbook provides, for each core and split metric, a benchmark value and the difference between the university's metric value and the benchmark, along with a z-score that reports whether the difference is statistically significant (this is underlined with a flag). The benchmark is a measure of "expected performance," a weighted sector average for that specific metric, that takes into account variables that are outside the control of the provider, such as the entry qualification of students and the subject of study, to mention a few. Both the benchmark and the difference thus inform assessors on how to interpret metric values and are indeed unique for each university.
The second source of evidence used during the assessment process is the provider submission. This is a qualitative document, no longer than 15 pages, through which universities contextualize their own performances and illustrate their institutional approach toward teaching excellence and how this affects students (OfS, 2018). In this regard, as shown by Beech (2017), several additional qualitative and quantitative data are reported, such as citations from the external quality assurance review and student union statements, other internal learning analytics, UCAS data, and other national league tables and rewards.
The assessment process is structured in three consecutive steps (DfE, 2017) and carried out by the aforementioned independent panel of experts and assessors. During the first step, panel members look only at the core metrics, paying attention to their distance from the benchmarks (the flags) and using split metrics and contextual data when necessary.

⁴ As a result of the 2017 Higher Education and Research Act, the Office for Students has replaced HEFCE as the regulatory body for teaching.
⁵ The NSS records students' opinions on several aspects of their degree programs in the final year of their academic career.


Table 1 TEF core and supplementary metrics, description, and source of the data

Teaching quality (1)
– Student satisfaction with teaching on their courses (1a). National Student Survey (NSS), questions 1 to 4 (at 2018): Q1 "Staff are good at explaining things"; Q2 "Staff have made the subject interesting"; Q3 "Staff are enthusiastic about what they are teaching"; Q4 "My course has challenged me to achieve my best work."
– Student satisfaction with assessment and feedback (1b). NSS, questions 8 to 11 (at 2018): Q8 "The marking criteria have been clear in advance"; Q9 "Marking and assessment has been fair"; Q10 "Feedback on my work has been timely"; Q11 "I have received helpful comments on my work."
– Grade inflation (1c). HESA and ILR student records: this metric provides information on the type of degrees awarded in order to give evidence on the grading policy. It measures the number/proportion of the different types of degrees awarded (1sts, 2:1s, other degree classifications and unclassified degree awards) 10 years ago and in the most recent 3 years.

Learning environment (2)
– Student satisfaction with academic support (2a). NSS, questions 12 to 14 (at 2018): Q12 "I was able to contact a member of the staff when I needed to"; Q13 "I received sufficient advice and guidance in relation to my course"; Q14 "Good advice was available when I needed to make study choices."
– Student retention on courses (2b). HESA and ILR student records: this metric tracks students from the year they enter a HE provider to the next academic year. To be counted as continuing, a student must either have qualified or be recorded as actively studying on a HE program.

Student outcomes (3)
– Employment or further study (3a). DLHE survey: percentage of UK-domiciled leavers who say they are working or in further study 6 months after graduation.
– Highly skilled employment or further study (3b). DLHE survey: percentage of UK-domiciled leavers who say they are in highly skilled employment or studying 6 months after graduation.
– Above median earnings (3c). LEO dataset: proportion of qualifiers in sustained employment who are earning over the median salary for 25- to 29-year-olds.
– Sustained employment or further study (3d). LEO dataset: proportion of qualifiers in sustained employment or further study 3 years after graduation.

Notes: The NSS records students' opinions on their degree programs and on the overall academic support during the entire academic career; it is completed by final-year students on a voluntary basis. HESA: Higher Education Statistical Agency. ILR: Individualized Learner Record. DLHE: Destinations of Leavers in Higher Education. LEO: Longitudinal Education Outcomes dataset. The UK Standard Occupational Classification (SOC) classifies jobs within groups 1, 2 and 3 as "highly skilled."
Source: Office for Students (2018): TEF Framework. Year Four Procedural Guidance.

The three metrics built on NSS data have a weight of 0.5, while the others have a weight of 1. Based on this process, an initial hypothesis on the rating is generated: so, for example, if an HE provider presents a total value of 2.5 based on its core metrics, it should be assessed, at the end of the first step, with the "Gold" rating. Secondly, the provider submission and the supplementary metrics are then considered to decide whether the initial hypothesis can be confirmed or has to be modified (the second step). Both the first and the second assessment steps are operatively carried out within small groups of panel members that consist of (at least) two academics and (at least) one student. Each group looks at a set of HE providers. Finally (third step), a meeting of the full TEF panel collectively determines the final rating (a "Gold," "Silver," or "Bronze" medal) based on the recommendations advanced by each group of panel members. A statement of findings is also provided to each HEI, in which the reasons behind the award are explained. Although the metrics are undeniably more important in the overall assessment process, Beech (2017) has highlighted that approximately 25% of the initial hypotheses have been changed after the analysis of the provider submission.
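A rough sketch of the first step may help fix ideas. The weights (0.5 for the NSS-based core metrics, 1 for the others) and the "total of 2.5 → Gold" example come from the text above; the z-score cut-off, the decision to count only positive flags, and the lower thresholds are illustrative assumptions rather than the official TEF procedure.

```python
# Weight 0.5 for the three NSS-based core metrics (1a, 1b, 2a), 1.0 for the others,
# as stated in the text. Thresholds other than "total of 2.5 -> Gold" are assumptions.
CORE_METRIC_WEIGHTS = {
    "1a_teaching": 0.5, "1b_assessment": 0.5, "2a_support": 0.5,
    "2b_continuation": 1.0, "3a_employment": 1.0, "3b_highly_skilled": 1.0,
}
Z_CUTOFF = 1.96          # illustrative significance cut-off for flagging a difference
GOLD_THRESHOLD = 2.5     # from the example in the text
SILVER_THRESHOLD = 1.5   # hypothetical


def flag(value: float, benchmark: float, std_error: float) -> int:
    """+1 / -1 if the provider sits significantly above / below its benchmark, else 0."""
    z = (value - benchmark) / std_error
    if abs(z) < Z_CUTOFF:
        return 0
    return 1 if z > 0 else -1


def initial_hypothesis(metrics: dict[str, tuple[float, float, float]]) -> str:
    """Step-1 rating hypothesis from weighted positive flags (illustrative simplification).

    metrics maps each core metric to (provider value, benchmark, standard error).
    """
    total = sum(
        CORE_METRIC_WEIGHTS[name] * max(flag(*obs), 0)  # count positive flags only
        for name, obs in metrics.items()
    )
    if total >= GOLD_THRESHOLD:
        return "Gold"
    return "Silver" if total >= SILVER_THRESHOLD else "Bronze"


if __name__ == "__main__":
    example = {
        "1a_teaching": (88.0, 84.0, 1.2),        # significantly above benchmark
        "1b_assessment": (74.0, 73.5, 1.5),      # not significant (z ~ 0.33)
        "2a_support": (82.0, 78.0, 1.1),         # significantly above
        "2b_continuation": (95.0, 91.0, 0.9),    # significantly above
        "3a_employment": (96.0, 94.5, 0.8),      # just below the cut-off (z ~ 1.88)
        "3b_highly_skilled": (79.0, 74.0, 1.5),  # significantly above
    }
    print(initial_hypothesis(example))  # total = 0.5 + 0.5 + 1 + 1 = 3.0 -> "Gold"
```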

5 Discussion and Conclusion

The previous sections have descriptively illustrated the main features of the PE of research and teaching activities in England, giving particular attention to their patterns of implementation (policy rationales), methodologies of assessment, and intended (and unintended) effects on academics and universities. This section aims to analyze the two aforementioned experiences by using the theoretical lens of Management Control Theory (MCT) and the two analytical dimensions introduced
in Sect. 2, namely (I) the degree of measurability and attributability of the output and (II) the knowledge of the cause–effect relation producing the output (or transformation process). The evaluation of research is discussed first, followed by that of teaching.
In terms of evaluation methods, the assessment of research carried out in the REF is based on a peer-review process conducted by experts, which assesses the quality of a research output (a publication, a patent) according to the consistency of the theoretical arguments and research methodologies employed to achieve that specific research finding, in other words, according to knowledge of the transformation process that lies behind the development of that research output (Frey et al., 2013). The strength of the peer-review method is thus the reference by the evaluators (usually other academics) to the same code of practices, competences, and knowledge that is embodied in the object of assessment (research products) (Turri, 2005). In this sense, the peer-review method balances elements coming from both the process and output control types described by MCT in Sect. 2. An evaluation process focusing uniquely on outputs might indeed be problematic, since the attributability and measurability of research outputs can vary, especially for team-based disciplines (e.g., natural sciences and medicine), in which the specific contribution of each researcher to the output might not be completely transparent and thus identifiable. Furthermore, the capacity of the few quantitative metrics available on research outputs (e.g., number of citations, journal impact factor) to exclusively and comprehensively assess the quality of research is controversial and the object of a still-open debate (Weingart, 2005; Butler, 2007; Rijcke et al., 2016). In contrast, knowledge of the cause–effect process leading to a research output should always be high, due to the replicability and transparency requirements of science, favoring a process-oriented evaluation like peer review. Moreover, research outputs are "singularities," in other words, unique and original, and cannot be assessed through standardized and mechanical methods (Rebora & Turri, 2013). Therefore, at least in theoretical terms, the REF methodology seems to be consistent with the nature of research and the values of the academic community.
However, the literature has noted some dysfunctionalities related to the REF (Harley, 2002; Geuna & Martin, 2003; Rijcke et al., 2016). These unintended consequences are the unsurprising result of a "silent" drift in the conceptualization of research evaluation toward a merely quantitative evaluation of outputs (the output control typology of MCT). An important driver of the increasing importance of metrics can primarily be found in how scholars perceive research quality, which has increasingly become skewed toward specific quantitatively measurable properties of research outputs (e.g., articles vs books; the journal's impact factor). These are increasingly seen by academics (and university management) as objective metrics of good performance and thus used to select papers for the future REF submission. As underlined by Adams and Gurney (2014, p. 8), academics tend to "favour journals with high average citation impact and among those journals they are persuaded that a high Impact Factor beats a convincing individual article."
Moreover, universities' top management started to internally implement their own set of evaluation practices, often relying on quantitative metrics as well as on targets, incentives, and sanctions that usually affect career progression (UCU, 2013). Therefore, an excessive reliance on measurable aspects of research outputs, by both academics and university top management, can hamper the fragile balance between the specific features of the activity under assessment and the appropriate evaluative instruments, as underlined by MCT. This process can also generate unintended consequences for the quality and authenticity of individual (and organizational) research performances, such as the manipulation of data and low-quality papers produced to meet deadlines, as highlighted by Joynson and Leyser (2015).
Second, as mentioned in the theoretical section, MCT emphasizes the scope for gaming behaviors that can arise from incentives when the type of evaluation adopted does not fit well with the activity to be assessed. As underlined by Speklé and Verbeeten (2014, p. 132), if the quantitative metrics provide only a partial representation of a task/activity, the presence of strong incentives might "induce organizational actors to focus on target achievement rather than on organizational goals." Regarding the REF, two main gaming practices can be identified from the literature. A first example is the practice of universities hiring top researchers shortly before the REF census date in order to improve future REF ratings by increasing their research performance (Bence & Oppenheim, 2005; Stern, 2016; Technopolis, 2018). Besides highlighting a case of "gaming," this example clearly shows how the evaluation system will also misrepresent the actual level of research quality, leading to a case of performance paradox (Van Thiel & Leeuw, 2002), whereby the evaluative mechanism loses its capacity to discriminate between satisfactory and poor performance due to the blurring effect generated by this "gaming" behavior. A second example of the performance paradox can instead be identified in the way universities develop the "research environment template" (see Sect. 3.2). Despite the standardized structure of this template, it has been empirically shown that universities tend to misrepresent the actual research environment by dwelling on past achievements rather than current efforts and by using several writing "ornaments," with an uncommon ratio of adjectives to verbs (Thorpe et al., 2018). As underlined by Wilsdon et al. (2015, p. 129), "the narrative elements were hard to assess, with difficulties in separating quality in research environment from quality in writing about it."
Third, while peer review seems to fit naturally with the features of research activity, it also entails relevant and increasing costs. Geuna and Martin's (2003) analysis suggests that while initial benefits may outweigh the costs, a burdensome system such as the REF will probably generate decreasing returns over time, thus raising questions about its future sustainability and the efficiency of the peer-review methodology itself (Geuna & Piolatto, 2016). These unintended consequences have been partly softened and addressed by the evidence-based reviews conducted after each evaluation exercise, most notably the last one (the Stern review), illustrated at the end of Sect. 3.3.
Regarding the evaluation of teaching activities carried out through the TEF, the analytical lens of MCT highlights a complex task whose assessment is anything but linear and effortless. Teaching activities are indeed characterized by a low and unclear attributability and measurability of outputs as well as by an unstable and unpredictable knowledge of the cause–effect relation producing the output. First, teaching is characterized by the intensive interdependency between teacher and student efforts, which are both crucial for the success of this activity (Turri, 2005). The success of teaching is certainly shaped by the teaching/pedagogical competences displayed by the teacher, but it equally depends on students' attitudes and efforts. These are also the result of prior educational paths and personal and intellectual capacities that are not necessarily generated within the teacher–student interaction. This feature implies not only a problem of attributability of outputs but also that knowledge of the cause–effect relation is intrinsically low, due to the instability, unpredictability, and complexity of this activity. Second, while research activity naturally tends to be synthesized in a product whose assessment gives a good approximation of the research process behind it, teaching does not display a similarly representative output, and several activities might be recognized as such (Turri, 2005). Some can be measured through indicators (e.g., graduates' employability or students' dropout), while others are less quantifiable (e.g., students' learning gain) (Leiber, 2019).
Based on what has been illustrated by MCT, the features of teaching activity described so far would suggest that the mere evaluation of some quantitative outputs is not enough to capture the complex nature of teaching. Nevertheless, the TEF is strongly oriented toward the assessment of a narrow set of outputs (Gunn, 2018), covering only those that are more easily measurable through quantitative measures, namely, student satisfaction and graduates' employability. This entails that other more qualitative but still crucial aspects of teaching, such as learning gains, teachers' competences/attitudes, learning strategies, and curriculum design (Leiber, 2019; Cui et al., 2019), are omitted from the assessment process, resulting in the abovementioned issue of "tunnel vision" (Smith, 1995). Furthermore, the teaching outputs that are quantified and assessed through the TEF (student satisfaction and graduates' employability) are claimed to be only partially shaped by the teaching process, since other exogenous factors intervene in their generation. Concerning the student satisfaction metrics (Table 1, nos. 1a, 1b, 2a), it is claimed that student satisfaction and teaching quality are different constructs, since satisfaction is influenced by factors that are beyond the teaching process itself (Spooren et al., 2013). The metrics regarding the employment of graduates (metrics 3a and 3b) are instead claimed to be partially affected by factors that do not depend on universities' efforts in teaching activities but are related, for example, to the health of the local labor market and economy (Deem & Baird, 2020; UUK, 2019). The ambiguity about which factors really and ultimately affect the teaching outputs under assessment (such as satisfaction and employability) might thus weaken the ability of the evaluative metrics to discriminate between good and poor performance, resulting in a potential "performance paradox" problem.
Last, TEF metrics cover only undergraduate provision and mostly UK-domiciled students, since international students are not included in employment metrics, although they represent approximately 20% of HE students in England. In summary, it is often claimed that the TEF
measures teaching quality only indirectly, with questionable metrics and with a narrow perspective, resulting in a partial representation of teaching quality within universities (O'Leary & Wood, 2019; UUK, 2019).
Concerning the emergence of unintended consequences related to the evaluation of teaching activities, even though evidence on the effects of the TEF on academics and universities is still limited, some potential risks might already be envisaged. First, since TEF metrics do not fully capture all relevant teaching dimensions while the evaluation exercise provides reputational incentives to universities (in the form of the "gold," "silver," or "bronze" medal), university management might implement strategies and invest resources with the narrow goal of improving and prioritizing the activities measured through the metrics (e.g., employability), thus risking the loss of a more holistic vision of teaching. This unintended consequence is known as "effort substitution" (Speklé & Verbeeten, 2014). The first empirical evidence on the TEF illustrates similar tendencies. Cui et al. (2019) indeed illustrate how the TEF has increased the internal centralization and standardization of teaching activities as well as the accountability of academics, with activities mainly directed at satisfying TEF metrics rather than improving the overall learning and teaching experience. Second, the high relevance of student satisfaction metrics could provide negative incentives for universities to actually lower teaching quality, since "innovative forms of teaching [. . .] often score low student satisfaction ratings, despite these methods often being highly effective in enhancing student learning" (RSS, 2016, p. 1). These "gaming behaviors" would hardly be spotted by the metric-based nature of TEF assessment, potentially undermining its effectiveness. Finally, MCT underlines that when the measurability/attributability of outputs and knowledge of the cause–effect process are low, input control, based on the selection and socialization of individuals toward a specific task/activity, can be more effective (Frey et al., 2013). Concerning the evaluation of teaching, this insight might be interpreted as a call to shift attention from exclusive output evaluation toward the potential value of training young academics in teaching activities (instead of focusing only on research training and production), as well as faculty development.
In conclusion, the interpretation of these two experiences of PE through the lens of MCT certainly contributes to shedding light on the importance of the relationship between the nature/features of the activity/task under evaluation and the most appropriate instrument/methodology to be used for its assessment. This relationship can indeed be crucial to a deeper understanding of the potential and actual unintended consequences that PE systems can generate, as illustrated in this chapter, and should thus be properly considered by both national and local policymakers during the design of PE systems.
Acknowledgments We acknowledge the support of the Italian Ministry of Education, University, and Research through the PRIN 2015 project: "Comparing Governance Regime Changes in Higher Education: systemic performances, national policy dynamics, and institutional responses. A multidisciplinary and mixed methods analysis" (2015RJARX7).


References

Adams, J., & Gurney, K. A. (2014). Evidence for excellence: Has the signal overtaken the substance? An analysis of journal articles submitted to RAE2008. Digital Science.
Barbato, G., & Turri, M. (2017). Understanding public performance measurement through theoretical pluralism. International Journal of Public Sector Management, 30(1), 15–30.
Beech, D. (2017). Going for gold: Lessons from the TEF provider submissions. Higher Education Policy Institute (HEPI) Report 99.
Bence, V., & Oppenheim, C. (2005). The evolution of the UK's research assessment exercise: Publications, performance and perceptions. Journal of Educational Administration and History, 37(2), 137–155.
Bevan, G., & Hood, C. (2006). What's measured is what matters: Targets and gaming in the English public health care system. Public Administration, 84(3), 517–538.
Butler, L. (2007). Assessing university research: A plea for a balanced approach. Science and Public Policy, 34(8), 565–574.
Cui, V., French, A., & O'Leary, M. (2019). A missed opportunity? How the UK's teaching excellence framework fails to capture the voice of university staff. Studies in Higher Education. https://doi.org/10.1080/03075079.2019.1704721
Deem, R., & Baird, J. (2020). The English teaching excellence (and student outcomes) framework: Intelligent accountability in higher education? Journal of Educational Change, 21, 215–243.
Department for Education (DfE). (2017, October). Teaching excellence and student outcomes framework specification. DfE.
Diefenbach, T. (2009). New public management in public sector organizations: The dark sides of managerialistic enlightenment. Public Administration, 87(4), 892–909.
Ferlie, E., Musselin, C., & Andresani, G. (2008). The steering of higher education systems: A public management perspective. Higher Education, 56(3), 325–348. https://doi.org/10.1007/s10734-008-9125-5
Frey, B. S., Homberg, F., & Osterloh, M. (2013). Organizational control systems and pay-for-performance in the public sector. Organization Studies, 34(7), 949–972.
Geuna, A., & Martin, B. R. (2003). University research evaluation and funding: An international comparison. Minerva, 41(4), 277–304.
Geuna, A., & Piolatto, M. (2016). Research assessment in the UK and Italy: Costly and difficult, but probably worth it (at least for a while). Research Policy, 45(1), 260–271.
Gunn, A. S. (2018). Metrics and methodologies for measuring teaching quality in higher education: Developing the teaching excellence framework (TEF). Educational Review, 70(2), 129–148.
Harley, S. (2002). The impact of research selectivity on academic work and identity in UK universities. Studies in Higher Education, 27(2), 187–205.
HEFCE. (2011). Research excellence framework 2014: Assessment framework and guidance on submissions, REF 02/2011.
HEFCE. (2015). Research excellence framework 2014: Manager's report.
Henkel, M. (1999). The modernisation of research evaluation: The case of the UK. Higher Education, 38(1), 105–122.
Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2), 251–261.
Joynson, C., & Leyser, O. (2015). The culture of scientific research. F1000Research, 4(66), 1–11.
Leiber, T. (2019). A general theory of learning and teaching and a related comprehensive set of performance indicators for higher education institutions. Quality in Higher Education, 25(1), 76–97.
Martin, B. R. (2011). The research excellence framework and the 'impact agenda': Are we creating a Frankenstein monster? Research Evaluation, 20(3), 247–254.
Moed, H. F.
(2008). UK research assessment exercises: Informed judgments on research quality or quantity? Scientometrics, 74(1), 153–161.

An Analysis of Methodologies, Incentives, and Effects of Performance. . .

67

O’Leary, M., & Wood, P. (2019). Reimagining teaching excellence: Why collaboration, rather than competition, holds the key to improving teaching and learning in higher education. Educational Review, 71(1), 122–139. Office for Students. (2018). Teaching excellence and student outcomes framework (TEF) framework. Year four procedural guidance. OfS 45/2018. Ouchi, W. (1979). A conceptual framework for design of organizational control mechanism. Management Science, 25(9), 833–848. Penfield, T., Baker, M. J., Scoble, R., & Wykes, M. C. (2014). Assessment, evaluations, and definitions of research impact: A review. Research Evaluation, 23(1), 21–32. Rebora, G., & Turri, M. (2013). The UK and Italian research assessment exercises face to face. Research Policy, 42(9), 1657–1666. Research England. (2019). Guidance on submissions – REF 2021 01/2019. Rijcke, S. D., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2016). Evaluation practices and effects of indicator use: A literature review. Research Evaluation, 25 (2), 161–169. Royal Statistical Society (RSS). (2016). Response to the Department for business innovation and skills’ technical consultation (year 2) on the teaching excellence framework. Retrieved from http://www.rss.org.uk/Images/PDF/influencing-change/2016/RSS-response-to-BIS-TechnicalConsultation-on-Teaching-Excellence-Framework-year-2.pdf Samuel, G. N., & Derrick, G. E. (2015). Societal impact evaluation: Exploring evaluator perceptions of the characterization of impact under the REF2014. Research Evaluation, 24(3), 229–241. Shattock, M. L. (2012). Making policy in British higher education 1945–2011. McGraw-Hill/Open University Press. Smith, P. (1995). On the unintended consequences of publishing performance data in the public sector. International Journal of Public Administration, 18, 277–310. Speklé, R. F., & Verbeeten, F. H. M. (2014). The use of performance measurement systems in the public sector: Effects on performance. Management Accounting Research, 25(2), 131–146. Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the validity of student evaluation of teaching. Review of Educational Research, 83(4), 598–642. Stern, N. (2016). Building on success and learning from experience: An independent review of the research excellence framework. Technopolis. (2010). REF research impact pilot exercise: Lessons learned project: Feedback on pilot submission. Technopolis. (2015). REF accountability review: Costs, benefits and burden. Technopolis Group. Technopolis (2018). Review of the research excellence framework: Evidence report. Technopolis Group. Thorpe, A., Craig, R., Hadikin, G., & Batistic, S. (2018). Semantic tone of research ‘environment’ submissions in the UK’s research evaluation framework 2014. Research Evaluation, 27(2), 53–62. Turri, M. (2005). La valutazione dell’Università. Un’analisi dell’impatto istituzionale e organizzativo. Guerini e Associati. UCAS. (2018). The Teaching Excellence and Student Outcomes Framework (TEF) and demand for full-time undergraduate higher education. Retrieved from https://www.ucas.com/file/173266/ download?token¼OVbDbdKZ Universities UK. (2019). The future of the TEF: Report to the independent reviewer. Retrieved from https://www.universitiesuk.ac.uk/policy-and-analysis/reports/Documents/2019/future-of-thetef-independent-reviewer.pdf

68

G. Barbato and M. Turri

University and College Union. (2013). The research excellence framework (REF) - UCU survey report. Van Thiel, S., & Leeuw, F. (2002). The performance paradox in the public sector. Public Performance and Management Review, 25(3), 267–281. Weingart, P. (2005). Impact of Bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117–131. Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., et al. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. HEFCE.

A Longitudinal Analysis of the Relationship Between Central Government and Universities in France: The Role of Performance Measurement Mechanisms Giovanni Barbato, Clément Pin, and Matteo Turri

1 Introduction
The goal of this chapter is to illustrate and discuss the evolution of the relationships between central government bodies and French universities over the last 30 years, with a particular focus on the role played by performance measurement mechanisms within them. First of all, the chapter introduces the peculiarities of the French higher education system. It then illustrates the significant changes that have occurred in recent decades at both the systemic level (i.e. the entire higher education system) and the corporate level (i.e. the single university) of governance. The chapter then describes four policy tools and the functions they have assumed in the abovementioned evolution:
• Strengthening and centralizing the evaluation of academic activities.
• Contracts between the university (or aggregation of universities) and the Ministry.
• Interventions to promote mergers between universities and aggregations of universities.
• Policies to promote excellence.
This study is part of the PRIN project ‘Comparing Governance Regime Changes in Higher Education: systemic performances, national policy dynamics, and institutional responses. A multidisciplinary and mixed methods analysis’.


For each instrument, particular care will be taken to highlight the performance dimensions involved and the effects induced. The changes that have taken place in France and their effects are examined through documents and reports. The analysis of the documents was enhanced and completed by interviews conducted in Paris in 2019 with privileged observers, selected either because their positions allowed them to observe the processes involved directly or because they have conducted in-depth, extended studies on the topics covered in this chapter.

2 A Heterogeneous System
The structure of higher education and research in France is marked by specific traits associated with the coexistence of institutions with different characteristics (horizontal differentiation). This is the effect of the historical legacy of a double partition: universities and grandes écoles in higher education; organismes de recherche (public research organizations, PROs) and university laboratories in research. This heterogeneity is also reflected in the system governance mechanisms, which are particularly articulated.1
The 75 French universities are directly funded by the government, are spread throughout the country, and are attended by 62% of tertiary education students (HCERES, 2016). Following the implementation of the 1999 Bologna Declaration, they offer programmes similar to those of other European countries, based on the ECTS (European Credit Transfer and Accumulation System). Besides the universities there are the historical grandes écoles, relatively small and highly selective schools providing higher education for the nation's future elites—tomorrow's hauts fonctionnaires (senior civil servants), industry leaders, senior military officers, politicians, engineers and physicians. In fact, more than 500 public and private institutions are grouped under the name grandes écoles, including engineering schools, écoles normales supérieures (ENS), business schools and veterinary schools. Hence there are not only the grandes écoles in the strict sense of the term (public and highly selective), but more generally the écoles (which may be both public and private and are distinct from universities because they are allowed to select their students).
Alongside the universities, the French state has gradually created its own organismes de recherche in order to promote research. Nowadays there are about 40 of them, specialised in various disciplines and subjects. Amongst the best known are the Centre national de la recherche scientifique (CNRS—quantitatively the most important), the Institut national de la santé et de la recherche médicale (INSERM), the Institut national de la recherche agronomique (INRA) and the Institut national de recherche en informatique et en automatique (INRIA).
1 The following paragraphs on the proportions of the French university system are taken, with adaptations and updates, from a study carried out on behalf of UNIRES by Antonietta Ciclista and Matteo Turri and published by Fondazione Crui: "How governance changes. Italian and European universities in comparison", eds. Capano and Regini (2015).


The universities also have their own research units, called équipes d'accueil (EA) by the Ministry and organised along disciplinary lines. This duality of research units has been partially mitigated since the 1990s by the creation of the Unités mixtes de recherche (UMR), joint units that locally link university teams with the CNRS and other PROs. There are currently 830 UMRs (about 90% of CNRS units and about 95% of INSERM units are involved in a UMR). This chapter focuses on the changes that have affected the university sector; however, as we shall see, the double partition is a central element influencing the reform processes.

3 Changes in Governance
The governance of French universities has undergone significant changes at both the systemic and the institutional (corporate) levels in recent decades. At the systemic level, the main actor has traditionally been the Ministère de l'enseignement supérieur, de la recherche et de l'innovation (MESRI).2 The French university system has historically been characterised by strong centralisation at the Ministry and by extensive prerogatives granted to academic staff organised in faculties, whereas the role of the institutions was so weak that universities were re-established only in 1968, after having been abolished during the French Revolution (Musselin, 2001). The 1968 Faure Law redefined the role of universities around three principles: autonomy, participation and a multidisciplinary approach, giving universities the title of public scientific, cultural or professional establishments (EPSC). In fact, the law represents a first attempt to strengthen the institutional identity of universities. About 20 years later, in 1989, a contractual relationship was established between the Ministry and universities. The aim was to give real value to the autonomy of universities, and to allow the government to fully exercise its responsibilities for supporting and rationalising the university system. The contractual mechanism is designed to revise relations between the government and universities by making them less top-down and more equal, based on four-year negotiations that result in an agreement (the contract) between the government and every HEI, with the subsequent definition of the operating budget allocated (Chevaillier, 1998; Musselin, 2001).

2 Some institutions are not monitored by MESRI: ‘Another 6 ministries are involved in supervising specific groups of higher education institutions: Ministry of Defence (supervises Ecole Polytechnique and other advanced technology institutions); Ministry of Health (supervises medical schools and other institutions offering health and social services programmes); Ministry of Agriculture (supervises Veterinary colleges and institutions offering Landscape, Agricultural engineering and Agrarian studies); Ministry of Environment (supervises Schools of civil engineering); Ministry of Culture and Communication (supervises Art schools and institutions teaching heritage and architecture); Ministry of Trade and Industry (Mining engineering schools)’. (ENQA, 2017).


A further element of change is represented by the loi d'orientation et de programmation pour la recherche (loi n° 2006-450 du 18 avril 2006, the so-called LOPR), which distinguishes the functions of strategic orientation from those of research programming. Consistent with this design is the establishment of the Agence nationale de la recherche (ANR), a public body under the authority of the MESRI in charge of funding research; the beneficiaries can be research institutions, universities or even private institutes. The law also unified the various existing evaluation bodies into a single agency called Agence d'évaluation de la recherche et de l'enseignement supérieur (AERES), which was reformed in 2013 and renamed Haut conseil de l'évaluation de la recherche et de l'enseignement supérieur (HCERES). Finally, in 2009, an important new player emerged in the governance of the French higher education and research system: the Commissariat général à l'investissement (CGI), which has since become the Secrétariat général pour l'investissement (SGPI), created to manage the Programmes d'Investissements d'Avenir (PIA), adopted in France in response to the 2008 financial and economic crisis. Unlike the institutional changes mentioned above, the role of this new player has not been formalised in a law dealing with universities and how they operate. The PIAs have not introduced statutory changes for universities, at least not initially and directly; they only introduced new funding instruments.
At the university level, after the aforementioned 1968 Faure Law, a key milestone in the change of governance was the 1984 Savary Law (Law 84-52), which introduced the current governance structure. The law specifies the structure of the universities, which consist of different components: schools, institutes, unités de formation et de recherche (UFR), departments, laboratories and research centres. In the spirit of the Faure and Savary laws, both aimed at developing universities' autonomy, in 1989 the Ministry began, as mentioned above, to sign contracts with universities to strengthen the coordination between the components of the same university around the institution-wide strategies carried out by the presidential teams.
In line with the principles of New Public Management (NPM), the 2007 loi relative aux libertés et responsabilités des universités (n° 2007-1199 du 10 août 2007, known as the LRU law) relaunched the promotion of institutional autonomy by establishing universities' budgetary responsibility. In a context characterised by the spread of international rankings, the general objective of this law was to encourage universities and academics to enhance their performance in terms of relevant publications and thus to increase the international competitiveness of the French higher education and research system (Hoareau, 2011). The LRU law has decisively strengthened the corporate governance of HEIs. With regard to the management bodies, the law introduced the following changes:


– The Board of Directors is reduced in size and its powers are strengthened.
– The Scientific Council and the Board of Education and University Life are redefined, thereby losing relevance relative to the Board of Directors.
– The legitimacy of university presidents is increased by the new electoral method and by the functions assigned to them. Their term of office is renewable and they can be evaluated on the basis of the results achieved.
The LRU law also requires universities to take on new responsibilities and competencies: they are allocated a global university budget, which includes the payroll (previously a ministerial responsibility); they may draw on diverse sources of funding; and they gain more responsive management of the recruitment of professors, researchers and lecturers. In addition, the President can assign bonuses and tailor the service obligations of professors and researchers. Finally, the government may transfer full ownership of movable and immovable property to universities.
In 2013, the loi relative à l'enseignement supérieur et à la recherche (Law No. 2013-660, known as the ESR Law) reconsidered and partly mitigated some of the indications contained in the LRU. The new law increased the number of members of the Board of Directors (previously up to 30 members), leading to an increase in the representation of all groups with the exception of external members, whose number remains similar to that of 2007. Among the changes introduced by the law, all board members can now participate in the election of the chairperson. From an organisational point of view, the ESR Law has changed the division of responsibilities between the governing bodies of the universities. The Board of Directors, which retains responsibility for strategy, management and human resources, has been flanked by two consultative bodies, the conseil scientifique (CS) and the conseil des études et de la vie universitaire (CEVU). The ESR law has in fact introduced a change in competences by focusing the board's activities on strategic issues and transforming the other two bodies into two committees (one for research and one for teaching), which together form the conseil académique. The Academic Board includes representatives of academic and administrative staff, as well as students and a minority of external members; in total the conseil académique consists of 40 to 80 members. Compared with the architecture of the LRU, the conseil académique brings participation and academic representation bodies back into corporate governance, thus limiting the weight of the Board of Directors (Conseil d'Administration). However, it is significant that, on the basis of the strategy it establishes, the Board of Directors can veto the recruitment decisions of the conseil académique. The trajectory of development of corporate governance in recent decades, albeit with an uneven trend and some resistance, shows the progressive strengthening of internal governing bodies, at least partially in harmony with the greater autonomy granted by the Ministry to universities.


4 Description of the 4 Policy Tools The process of change outlined above can be better understood and examined with specific reference to some coordination mechanisms between the government and universities specifically introduced and inspired by the issue of performance both in terms of measurement and enhancement: • Strengthening and centralizing the evaluation of academic activities. • Contracts between the university (or aggregation of universities) and the ministry. • Interventions to promote mergers between universities and aggregation of universities. • Policies to promote excellence. The combined action of these mechanisms, as we shall see, has influenced the relationships between central government bodies and French universities.

4.1 Evaluation

In the 1980s, France launched a series of evaluation initiatives promoted by a variety of bodies (Larédo, 1997; Chevaillier, 1998, 2004):
• The Comité national d'évaluation (CNE), created in 1984 with financial and administrative autonomy, carried out evaluations of the academic activities and management of universities through peer review and site visits until 2006. The evaluation procedure resulted in the publication of a final evaluation report.
• The Comité national d'évaluation de la recherche (CNER), established in 1989, had as its main focus the evaluation of research activities at supra-institutional level.
• At the Ministry, another evaluation body, the Mission scientifique, technique et pédagogique (MSTP), evaluated curricula, also for accreditation purposes, and managed the evaluation procedure for awarding individual academics prizes and awards.
• Finally, the Conseil national des universités (CNU), a consultative body set up in 1945, was reorganised in 1987 so as to deal with questions relating to the recruitment and promotion of academic staff in universities, in collaboration with the Ministry.
The 2006 LOPR law unified the various evaluation bodies into a single agency called Agence d'évaluation de la recherche et de l'enseignement supérieur (AERES). The adoption of the LOPR fits into the context of the application of the Loi organique relative aux lois de finances (LOLF), which translates the rationale of NPM into the French budgetary framework. From this point of view, the creation of AERES is linked to the intention to refocus the government on strategic functions, leaving to the universities the tasks of management and the adoption of performance measurement tools to support the relationship between the Ministry and universities.


The creation of AERES is also consistent with the requirements of the Bologna process and in particular with the Standards and Guidelines for Quality Assurance in the European Higher Education Area adopted in 2005, which recommend the creation of an independent national agency to promote and coordinate national quality assurance systems.
There are three main lines of assessment carried out by AERES:
– The evaluation of institutions (60–80 evaluations per year), aimed at assessing the consistency of the strategic direction taken, managerial skills (also in view of the recent financial autonomy), international openness and self-assessment capacity. This evaluation takes into account the diversity between institutions, is based on a self-evaluation report followed by an external visit, and results in the publication of a final report. The evaluation report is published together with a letter from the university in response to AERES' comments and recommendations.
– The evaluation of teaching takes into consideration the scientific and professional relevance of the educational offer. It includes a self-evaluation document drawn up by the university, an evaluation by a committee of experts appointed by AERES, and finally a validation phase that ends with the publication of the report. No site visit is required. The procedure is fully consistent with the guidelines issued by ENQA, and AERES is listed in the European Quality Assurance Register (EQAR).
– Lastly, the evaluation of research units (about 700 research groups per year) consists of a self-evaluation phase followed by a visit by a panel of experts appointed by the agency. The evaluation results in a report that takes into account the quality of the scientific production and other aspects such as the contribution to industrial research, the connection with the local socio-economic context and the involvement in the international academic community.
The creation of AERES introduced a major change in the French higher education and research landscape (Capano & Turri, 2016). Firstly, the evaluations of AERES highlighted the level of performance produced by universities, especially with regard to research. This formalisation of performance is accompanied, promoted and enhanced by the process of institutional consolidation of the universities, favouring a loosening of individual academic autonomy in favour of stronger institutions that, being accountable for the performance produced, implement and support internal governance mechanisms. Secondly, a debate was opened on the use of evaluation in funding mechanisms (Musselin & Paradeise, 2009). Since 2009, the Ministry has developed a new method of calculating the dotation globale de fonctionnement of universities3 called the Système de répartition des moyens à la performance et à l'activité (SYMPA).

3 The 'dotation globale de fonctionnement' corresponds to the main part of university funding from the government. In addition, there is a contractual allocation negotiated every 4 years, which represents about 20% of the allocations, and other specific allocations (contracts between the government and regions, Plan Campus, PIA, etc.).

The direct connection between performance and funding is in fact minimal and merely transitory; however, the evaluation results published on the AERES website affect the ability of universities to attract external resources, as non-governmental bodies link their funding to the achievement of an excellent rating in the AERES evaluation procedure. The link between performance and funding has thus sparked a strong debate that has led to the agency itself being called into question. Criticism focused in particular on the equalisation effects created between universities and geographical areas, leading to the 2013 ESR law, which replaced AERES with the Haut conseil de l'évaluation de la recherche et de l'enseignement supérieur (HCERES); the suppression of the SYMPA mechanism was also significant. HCERES is an independent agency whose chairman is appointed by the President of the Republic after consultation with the parliamentary committees. HCERES reports directly to the Parliament, from which it receives its operational funds (€17M in 2015). Evaluation procedures are carried out by external experts (around 4500 academics are involved annually) and 115 scientific delegates (researchers and part-time professors in charge of organising evaluations from a scientific point of view). It is organised around four activities: evaluation of institutions, evaluation of teaching (including doctoral teaching), evaluation of research units, and territorial coordination. The new evaluation body retains in principle the competencies and operating procedures of its predecessor, with some variants associated with stronger guarantees of the competence of the experts involved in the evaluation procedures and the possibility for research units (often run in collaboration with public research institutes) to request evaluation from other evaluation bodies. A further novelty is that HCERES is required to evaluate not only universities but also sites, i.e. geographical groupings of research, innovation and education institutions, companies and other stakeholder organisations. In essence, however, the main change is precisely the loosening of the link between evaluation and funding, which have become independent mechanisms once again (Capano & Turri, 2016).
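To make the logic of an activity- and performance-based allocation model such as SYMPA more concrete, the following minimal Python sketch distributes a notional funding envelope across universities according to weighted indicators. It is purely illustrative: the universities, indicator names, weights and figures are assumptions made for the example and do not reproduce the actual SYMPA formula, whose parameters are not described in this chapter.

# Illustrative sketch of an activity/performance-weighted allocation model.
# All figures, indicators and weights are hypothetical; this is NOT the actual SYMPA formula.

ENVELOPE = 100_000_000  # notional annual envelope to distribute (euros)

# Hypothetical normalised indicators per university (0-1 scale).
universities = {
    "Univ A": {"student_load": 0.45, "phd_awards": 0.30, "research_rating": 0.60},
    "Univ B": {"student_load": 0.35, "phd_awards": 0.50, "research_rating": 0.80},
    "Univ C": {"student_load": 0.20, "phd_awards": 0.20, "research_rating": 0.40},
}

# Hypothetical split between "activity" (student load) and "performance" indicators.
weights = {"student_load": 0.6, "phd_awards": 0.2, "research_rating": 0.2}

def composite_score(indicators):
    # Weighted sum of the normalised indicators.
    return sum(weights[name] * value for name, value in indicators.items())

scores = {name: composite_score(ind) for name, ind in universities.items()}
total_score = sum(scores.values())

# Each university receives a share of the envelope proportional to its composite score.
for name, score in scores.items():
    allocation = ENVELOPE * score / total_score
    print(f"{name}: {allocation:,.0f} EUR")

In such a model the policy lever lies in the choice of indicators and weights: shifting weight from activity towards performance indicators redistributes the same envelope towards the better-rated institutions, which is precisely the kind of effect that fuelled the French debate described above.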

4.2 Contracts

Contracts are defined as programme agreements between the Ministry and universities following negotiation. Contracts are in fact a policy instrument commonly used throughout the French public administration and are not specific to universities. Initially limited to research, the contractual policy was extended in 1989 to all aspects of university activity, including the management of funding and human resources (Circulaire n° 89-079 du 24 mars 1989). The central phase of the contract is its preliminary negotiation, whereas its ex post evaluation has traditionally had little relevance.


In this sense, therefore, although the policy instrument of the contract is consistent with NPM narratives, it is not fully aligned with this movement. Initially linked to a specific allocation of ministerial resources, the instrument has lost its financial nature over time and has increased its symbolic value, providing support for all institutional, ordinary and recurring relations between the Ministry and the universities. In particular, as the contractual mechanism has been refined, since 2014 contracts have also incorporated the accreditation of universities, which determines their entitlement to award degrees. Although it was created as an instrument to regulate relations between the Ministry and universities, its most significant effects concern above all the institutional identity of the universities. In this regard, Paradeise wrote: 'the groundwork of writing reports and preparing negotiations fostered conversation between co-located faculties, which had previously ignored each other. It favoured cohesiveness of universities as organisations. It became increasingly difficult for scholars to bypass their president by using their own scientific and social relationships with Parisian ministerial departments' (Paradeise, 2017, p. 6). Contracts have triggered a process that has gradually transformed universities into 'full organisations' (Krücken & Meier, 2006). This tool has in fact made it necessary for universities to analyse their situation and to develop greater self-awareness: in order to develop a strategy, universities need to know their strengths and weaknesses, decide what they intend to become in the future in line with their vision, and determine how to organise their resources accordingly. The preparation and monitoring of contracts worked as actual learning mechanisms for the university autonomy established by the LRU in 2007. With the progression of the negotiation cycles, policymakers have realised that the value of contracts does not lie in the economic sphere (as only a limited amount of financial resources is actually transferred), but in the strengthening of the institutional identity of universities. Through contracts, the top management of universities have improved and reinforced their governance capabilities.
Currently, contracts last five years (the initial length was four years) and are designed to promote a strategic dialogue between the Ministry and the universities. The contract negotiation is led by the Direction générale de l'enseignement supérieur et de l'insertion professionnelle (DGESIP), which coordinates the action of the associated central administrative directorates. Since 2013, contracts have moved from individual universities to 'sites'; in practice, contracts are signed by the Communautés d'universités (COMUE—cf. infra). Every contract consists of two separate sections (a schematic representation of this two-part structure is sketched below):
– A section common to all the universities of the site, describing the development policies for education, research and transfer shared at site level. This section includes the identification of some performance indicators with corresponding targets and, finally, a summary plan of the public resources that the COMUE will receive over the five-year period.
– A specific component for each individual university, which describes in dedicated chapters the contribution of that institution to the policy of the site. Each chapter includes indicators and targets, a financial annex and a list of the accredited educational offer.
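Read as a simple data structure, the two-part site contract just described might look like the following Python sketch. The field names, the site and university names, and the figures are illustrative assumptions and do not reproduce the Ministry's actual contract template.

# Hypothetical, simplified representation of a site contract (not the official DGESIP template).
site_contract = {
    "site": "Example COMUE",                       # hypothetical site name
    "period": (2019, 2023),                        # five-year contract period
    "common_section": {
        "shared_policies": ["education", "research", "transfer"],
        "indicators": [
            {"name": "graduate employment rate", "target": 0.75},   # illustrative indicator
            {"name": "joint international programmes", "target": 12},
        ],
        "public_resources_plan_eur": 250_000_000,  # summary resource plan over the period (assumed)
    },
    "institution_chapters": [
        {
            "university": "Univ A",
            "contribution_to_site_policy": "doctoral training and technology transfer",
            "indicators": [{"name": "PhD completion rate", "target": 0.65}],
            "financial_annex_eur": 40_000_000,
            "accredited_programmes": ["Licence in X", "Master in Y"],
        },
    ],
}

# A monitoring routine would compare observed indicator values against these targets
# at the ex post evaluation stage, which, as noted above, has traditionally been weak.
print(len(site_contract["institution_chapters"]), "institution chapter(s) in the contract")

Such a representation also makes visible why ex post verification is demanding: indicators and targets are spread across a common section and several institution-specific chapters, each negotiated separately.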


The transition from university contracts to COMUE contracts was aimed at facilitating the joint management of curricula and research centres through the creation of new territorial umbrella organisations (Paradeise, 2017, p. 6). However, whilst the transition to site contracts has facilitated the work of the Ministry, it has been perceived as a new obligation for universities, especially because, as we will see, universities have had to operate with a double level of governance, i.e. the university itself and the COMUE.

4.3 Mergers

The origins of aggregation policies lie in the idea of building university poles, exploiting geographical proximity to solve the problem of double partition. This idea was incorporated in the 2006 LOPR with the aim of improving the attractiveness of France. The law created the Pôles de recherche et d'enseignement supérieur (PRES) to give universities, grandes écoles and research institutions the possibility to coordinate and pool their activities and resources. PRES was merely an opportunity: no institution was required to create or join a PRES. With this instrument, the government's intention was to encourage aggregations that would give France national champions in international rankings, overcoming the 'double partition' and moving towards multidisciplinary universities (Aust & Crespy, 2009). The launch of the PRES symbolises the collective awareness that the peculiarities of the structure of higher education and research in France make it difficult to gain visibility abroad, to the detriment of the country's competitive capabilities. However, the PRES experience was considered disappointing: it showed the reluctance of universities to really give up their competencies and independence (IGEN-IGAENR, 2015, pp. 59–60). In fact, the Programme d'investissements d'avenir (PIA), which will be examined in detail in the following section, represented the financial instrument whereby the government intervened to support and incentivise the policy of territorial aggregation, by linking access to some PIA schemes to the aggregations.
The 2013 law, also as a result of the existing difficulties, transformed the PRES. By providing for the definition of a Stratégie nationale pour l'enseignement supérieur (StraNES), it established, among other things: (a) the principle of territorial coordination of training provision and research strategy, based on a project shared by all public institutions that depend on the MESRI; and (b) the implementation of the Schéma régional de l'enseignement supérieur, de la recherche et de l'innovation (SRESI), whereby the regions are involved in the preparation of site contracts. As a result of the law, there are three possibilities for cooperation or aggregation:
– Merger of two or more universities.
– Aggregation in the form of participation in a community of universities and institutes (COMUE).


– Aggregation in the form of an association of public or private institutions or bodies contributing to the missions of the public service of education and research.
The form of aggregation most widely adopted has been the COMUE: in the academic year 2015–2016 there were 5 associations and 20 COMUEs. As we will see below, mergers have instead taken place under the impulse of, and in connection with, specific PIA programmes. COMUEs differ from PRES because the voluntary dimension is missing: every university and public school is obliged to be a member of an aggregation in order to access PIA funds. COMUEs have more formalised governance structures, which in fact replicate, but do not replace, those already present at university level. Moreover, the establishment of a COMUE requires identifying the competences transferred from the pre-existing institutions to the new entity, and in the contracts (mentioned in the previous section) universities are considered within their COMUE. In this sense, the drive towards COMUEs effectively limits institutional autonomy, introducing a contradiction with pre-existing policies (Paradeise, 2017). However, as with the PRES, the overall evaluation of the COMUE is negative: most COMUE statutes provide for only a limited transfer of competencies, rather than attributing strong competencies to the COMUE, thus reproducing the criticism levelled against the PRES (IGEN-IGAENR, 2015). The requirements for participation in the PIA have led to the creation of weak, poorly integrated aggregations with limited competences; in fact, some COMUEs have broken up or changed their boundaries. In response to these limitations, the aggregation ordinance of 2018 (Ordonnance n° 2018-1131 du 12 décembre 2018 relative à l'expérimentation de nouvelles formes de rapprochement, de regroupement ou de fusion des établissements d'enseignement supérieur et de recherche) establishes a less restrictive framework for aggregations, which gives universities the power to customise COMUE statutes and internal organisation (without necessarily duplicating that of the member universities).

4.4 Excellence-Driven Policies

This section deals with a different dimension from the previous three: whereas the first three concern legislative or regulatory changes to the institutional architecture, this section deals with financial instruments, in particular the Initiatives d'excellence (IDEX) call launched in 2010 by the government within the frame of the PIA.
The PIA was launched with the amending budget law of 9 March 2010, based on the recommendations of the committee chaired by Juppé and Rocard (two former prime ministers), with the aim of improving the long-term growth potential of the French economy.


Its aim is to increase investment in four priority areas: higher education and research, industry and SMEs, sustainable development and the digital economy. It is a very important measure that conditions the development of the existing instruments, including those examined in the previous pages. The PIA is based on competitive calls to finance research and innovation projects, which are judged by a panel of international academics. The PIA has had three rounds, with a total budget of 57 billion euros:
– Round 1 in 2010 (PIA 1), with a budget of 35 billion euros
– Round 2 in 2013 (PIA 2), with a budget of 12 billion euros
– Round 3 in 2017 (PIA 3), with a budget of 10 billion euros
Funding measures for universities and research, typically competitive financing of significant scale (approximately 1 billion euros for an average IDEX project, and from 5 to 25 million euros for a LABEX or EQUIPEX), amounted to a total of approximately 25 billion euros. The funds actually received by the universities are not the total budget: the headline allocation constitutes the provisioning of a fund, and the annual interest earned on this fund constitutes the amount actually allocated to the universities on a year-by-year basis (an illustrative calculation of this mechanism is given at the end of this section).
Among the various initiatives of the PIA, one of the most representative is the IDEX, which aims (Ravinet, 2012) to establish 5 to 10 multidisciplinary centres of excellence in university education and world-class research in France. In practice, the measure promotes the territorial aggregation of universities, colleges, research institutes and partnerships with businesses. The aim is to achieve a significant scale that can be internationally recognised and attractive on the basis of international parameters and criteria. The keywords of this call, one of the most generously funded in the PIA, are excellence, multidisciplinarity and internationalisation. The initiative follows in the footsteps of NPM: it focuses on a limited number of institutions, concentrating resources on them and drawing inspiration from successful models abroad in a benchmarking perspective (Aust et al., 2018). The measure boosts competition within the French university system and overcomes the principles of equivalence and uniformity prevailing until then (Musselin, 2013), in order to concentrate resources on a limited number of selected candidates. The aim is to promote and enhance the performance of the French university system.
It should be stressed that in France the largest part of university funding is based on recurrent budgetary funding (the dotation globale de fonctionnement). Over time, in addition to these recurrent resources, new non-recurring sources of state funding have gradually been added, based on competitive calls (initially on research projects, with the ANR, then on real estate investments, with the Plan Campus launched in 2007). The PIA is a continuation of this dynamic, focusing on investments of various kinds, such as research and institutional enhancement. At the level of the university system, it is necessary to underline some discontinuities that the PIA introduces:


– The initiator is the government and not the Ministry in charge of higher education. In particular, management is entrusted to the SGPI (Secrétariat général pour l'investissement), under the authority of the Prime Minister.
– The allocation of resources takes place through competitive procedures involving international experts deliberately chosen from outside French academia to promote high levels of competition.
– The IDEX call is addressed to PRES, and later COMUE, and not to individual institutions (nor to individual academics or faculties/departments), thereby acknowledging and consolidating the institutional identity of the bodies receiving funding.
These measures show the desire to introduce a marked discontinuity by overcoming the pre-existing structure based on the relationship between academia and the Ministry. This discontinuity, based on international criteria, goes beyond the management logic of the Ministry (replaced by the Prime Minister), which is somehow considered too close to the status quo of the existing university system. This approach helps to interpret the autonomous decision of some universities to merge fully (e.g. Strasbourg, Lorraine and Aix-Marseille) or partially (e.g. Bordeaux and Montpellier). This phenomenon started in 2007 in the wake of the reforms on territorial aggregation but subsequently revolved around the resources provided by the PIA programme. These mergers (strongly supported by the PIA) have created new, larger players. However, with the exception of the University of Lorraine, the mergers have in fact only concerned universities, leaving aside the grandes écoles, thus not fully meeting the objective of bringing grandes écoles and universities closer together.
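As a back-of-the-envelope illustration of the endowment mechanism described earlier in this section, where beneficiaries receive only the annual interest on a non-consumable fund rather than the headline allocation, consider the following Python sketch. The endowment size and interest rate are assumed for the example and are not taken from the PIA documentation.

# Back-of-the-envelope illustration of a non-consumable, PIA-style endowment.
# The figures are hypothetical assumptions, not actual PIA parameters.

endowment = 700_000_000   # headline capital notionally "allocated" to a project (euros)
interest_rate = 0.034     # assumed annual interest rate paid on the endowment

annual_disbursement = endowment * interest_rate
print(f"Headline allocation:            {endowment:>15,.0f} EUR")
print(f"Funds actually received / year: {annual_disbursement:>15,.0f} EUR")
# Over a 10-year horizon the project receives only a fraction of the headline figure.
print(f"Cumulative over 10 years:       {10 * annual_disbursement:>15,.0f} EUR")

Under these assumed figures, a project with a headline endowment of 700 million euros would actually receive roughly 24 million euros per year, which helps explain why the 57 billion euro total of the three PIA rounds overstates the resources reaching universities in any given year.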

5 Conclusion
There is no doubt that the relationship between central government bodies and French universities has changed significantly over the last 30 to 40 years. Other studies have examined this evolution, highlighting the changes in university governance (Chatelain-Ponroy et al., 2013; Musselin, 2014) and the transformation of the role of academics (Chevaillier, 2001; Pezzoni et al., 2012; Musselin, 2013). In this chapter, we have attempted to highlight the function of some operational tools related to management and performance measurement (see Table 1), which differ from those of the previous administrative tradition, in the management of inter-institutional relations between the Ministry and universities.
The first operating system is the evaluation of the activities of universities, which took a qualitative leap forward with the creation of AERES. A third party is responsible for producing evaluations in all fields, in order to provide reliable performance measurements through which to regulate the relationship between the Ministry and universities and to make universities more accountable.
The contracts between the university (or aggregation of universities) and the Ministry have two features.


Table 1 Coordination mechanisms between the government and universities

Evaluation
– National institution in charge: AERES, a formally independent agency (later named HCERES)
– Objective: receive coordinated evaluations of university activities at the university level; have an independent evaluation agency in line with the requirements of the Bologna process
– Main measures: evaluation of institutions (universities and écoles), of teaching, and of research units (of universities and organismes de recherche)
– Consequences on the university system: dissemination of evaluation-related information and greater emphasis on research results; no direct connection with ministerial funding

Contracts
– National institution in charge: Ministry of University and Research (MESRI)
– Objective: regulate relations between the Ministry and universities
– Main measures: five-year contracts describing the development policies for education, research and transfer shared at site level
– Consequences on the university system: gradually strengthened the institutional identity of universities

Mergers
– National institution in charge: Ministry of University and Research (MESRI)
– Objective: build university poles using geographical proximity to solve the problem of double partition
– Main measures: PRES—Pôles de recherche et d'enseignement supérieur (2006, on a voluntary basis); COMUE—communities of universities and institutes (2013, mandatory)
– Consequences on the university system: limits institutional autonomy by introducing a contradiction with pre-existing policies, while aiming to strengthen universities' corporate governance

Excellence-driven policies
– National institution in charge: SGPI (agency under the authority of the Prime Minister)
– Objective: establish 5 to 10 multidisciplinary centres of excellence in university education and world-class research in France
– Main measures: PIA, especially IDEX; promotion of aggregations on a territorial basis through the allocation of substantial conditional funding assigned through competitive procedures involving international experts
– Consequences on the university system: institutionalised competition between universities to access financial resources

First, they introduce contractual mechanisms into the relationship between the government and the universities, going beyond the use of legal and regulatory provisions. Secondly, many observers acknowledge that they represent a remarkable tool for legitimising and empowering the top management of universities and their institutional leadership (Paradeise, 2017).


For the first time, universities are called upon to view themselves as a unit in terms of performance and to approach the external world as a unified entity following an institutional logic, no longer in terms of disciplines and individual academic groups (Krücken & Meier, 2006).
The initiatives to stimulate the creation of PRES and COMUE are explicitly inspired by the desire to overcome the double partition of the French university and research system (Aust & Crespy, 2009), in an effort that is above all a benchmarking exercise in relation to the outside world. This policy gained momentum under the Sarkozy presidency, with the desire to improve the positioning of French universities in international rankings.
The overview presented in this chapter places France in line with other countries in the commitment to reshape its systemic governance by adopting a managerial approach whose keywords are performance measurement, evaluation, and the empowerment of institutions and their decision-making capacity, and whose resonance with the principles of New Public Management (NPM) is certainly marked (Pollitt, 2007). There is thus a convergence towards 'steering at a distance', whereby the government is oriented towards regulatory functions, overcoming the pre-existing model based on direct intervention in university activities (van Vught, 1989; Capano, 2011; Shattock, 2014; Capano & Turri, 2016; Capano et al., 2016). This approach, although expressed in the legislation and related declarations, shows more than one sign of weakness in its application. In reality, the steering-at-a-distance template is applied only partially and with evident discontinuity. In fact, evaluation, contracts and policies to promote aggregations have proved to be ambivalent tools, used more as a means of putting pressure on universities than as instruments of systemic steering. The lack of relation with funding allocation, along with the vicissitudes of the national evaluation agency, is the most evident expression of this tendency. Since its inception, AERES, while adopting an ambitious programme, autonomously limited its competences by excluding the assessment of the individual performance of academics. AERES always operated in the absence of direct links between the outcome of the assessment and government funding; the link with external funders, based upon the initiative of external bodies, was nonetheless among the main reasons why AERES was phased out and replaced by HCERES. The new agency has had as a pivotal element the absence of any connection between the results of the evaluation and funding (Capano & Turri, 2016). Similarly, as illustrated, contracts with the Ministry lost their economic value over time, while reinforcing their important role in the process of empowering the top management of universities and their institutional identity, and more recently in the establishment of the COMUE. Moreover, the emphasis has always been on the initial moment of negotiation of the contract and much less on the ex post verification of its achievement. Even the path that characterised the creation of the PRES and then the COMUE is not linear, and the legislator's intention, which was to favour consolidated unions and aggregations, is vulnerable to the opportunistic behaviour of the universities, which form these unions more with the aim of attracting resources than of creating permanent aggregations that reduce the double partition. Moreover, the drive for aggregations on a territorial basis in fact contradicts the drive towards greater institutional autonomy that has inspired the governance reforms mentioned above (Paradeise, 2017).


The fourth instrument under consideration introduces, at least potentially, a discontinuity in this disconnect between legislative intention and actual implementation. The marked link with economic resources and the replacement of the Ministry by the Prime Minister's office are certainly a novelty, although a judgement on the effectiveness of the instrument is probably premature.
An overall, concluding assessment identifies, rather than the radical replacement of a pre-existing model with a new one inspired by the logic of NPM, a stratification of several models and logics in which pre-existing and new actors operate to favour or oppose the reforms. At least in part, the use of tools and mechanisms inspired by managerial logic has contributed to a progressive focus on the performance of universities, in a way that is partially connected to their greater institutional autonomy. However, the EUA report (2017), which places French universities among the most backward in terms of institutional autonomy, confirms that this process is still in progress and has had a bumpy ride. As Paradeise (2017) effectively sums up: 'Yet, conflict after conflict, adjustment after adjustment, the perception of the missions of universities and academics is transformed, a culture of performance is slowly taking hold.'

References Aust, J., & Crespy, C. (2009). Napoléon renversé? Institutionnalisation des Pôles de recherche et d’enseignement supérieur et réforme du système académique français. Revue française de science politique, 59(5), 107. Aust, J., Mazoyer, H., & Musselin, C. (2018). Se mettre à l’IDEX ou être mis à l’index: Conformations, appropriations et résistances aux instruments d’action publique dans trois sites d’enseignement supérieur. Gouvernement et action publique, 7(4), 9–37. Capano, G. (2011). Government continues to do its job. A comparative study of governance shifts in the higher education sector. Public Administration, 89(4), 1622–1642. Capano, G., & Regini, M. (Eds.). (2015). Come Cambia la Governance: Università Italiane ed Europee a Confronto, Fondazione CRUI. Accessed Sep 26, 2021, from https://www.crui.it (home page). Capano, G., Regini, M., & Turri, M. (2016). Changing governance in universities: Italian higher education in comparative perspective. Palgrave-MacMillan. Capano, G., & Turri, M. (2016). Same governance template but different agencies. Types of evaluation agencies in higher education. Comparing England, France and Italy, Higher Education Policy. Chatelain-Ponroy S., Mignot-Gérard S., Musselin C., & Sponem S. (2013). Reforms in French Public Universities. How does commitment to performance match with commitment to public values? Chevaillier, T. (1998). Moving away from central planning: using contracts to steer higher education in France. European Journal of Education, 33(1), 65–76. Chevaillier, T. (2001). French academics: Between the professions and the civil service. Higher Education, 41, 49–75.


Chevaillier, T. (2004). The changing role of the State in French higher education: From curriculum control to programme accreditation. Accreditation and Evaluation in the European Higher Education Area, 5, 159–174.
ENQA. (2017). ENQA agency review: High Council for the Evaluation of Research and Higher Education (HCERES). Accessed Sep 26, 2021, from https://www.enqa.eu/wp-content/uploads/HCERES-review-report_FINAL.pdf
HCERES. (2016). Self-evaluation report. French High Council for Evaluation of Research and Higher Education (HCERES).
Hoareau, C. (2011). Globalization and dual modes of higher education policymaking in France: Je t'aime moi non plus. Center for Studies in Higher Education Research & Occasional Paper Series: CSHE.2.11. University of California, Berkeley.
IGEN-IGAENR. (2015). Rapport annuel des inspections générales IGEN-IGAENR.
Krücken, G., & Meier, F. (2006). Turning the university into an organizational actor. In Globalization and organization: World society and organizational change (pp. 241–257). Oxford University Press.
Larédo, P. (1997, January). L'impact des programmes communautaires sur le tissu scientifique et technique français. Ministère de la Recherche et de la Technologie.
Musselin, C. (2001). The long march of French universities. Routledge.
Musselin, C. (2013). How peer-review simultaneously empowers the academic profession and university managers: Evolution of the relationships between the state, the universities and the professoriate. Research Policy, 42(2), 1165–1173.
Musselin, C. (2014). Empowerment of French universities by funding and evaluation agencies. In Organizational transformation and scientific change: The impact of institutional restructuring on universities and intellectual innovation (Research in the Sociology of Organizations, Vol. 42, pp. 51–76). Emerald Group Publishing Limited. https://doi.org/10.1108/S0733-558X20140000042002
Musselin, C., & Paradeise, C. (2009). France: From incremental transitions to institutional change. In C. Paradeise, E. Reale, I. Bleiklie, & E. Ferlie (Eds.), University governance: Western European comparative perspectives (pp. 21–49). Springer.
Paradeise, C. (2017). The French HER system and the issue of evaluation. Italian Political Science, Special Issue on Assessment in Europe.
Pezzoni, M., Sterzi, V., & Lissoni, F. (2012). Career progress in centralized academic systems: Social capital and institutions in France and Italy. Research Policy, 41, 704–719.
Pollitt, C. (2007). The new public management: An overview of its current status. Administrație și management public, 8, 110–115.
Ravinet, P. (2012). Chapitre 16. La politique d'enseignement supérieur: Réformes par amplification et rupture dans la méthode. In J. de Maillard (Ed.), Politiques publiques 3: Les politiques publiques sous Sarkozy (pp. 361–380). Presses de Sciences Po.
Shattock, M. (2014). International trends in university governance. Routledge.
van Vught, F. (1989). Governmental strategies and innovation in higher education. Jessica Kingsley.

Adopting a Dynamic Performance Governance Approach to Frame Interorganizational Value Generation Processes into a University Third Mission Setting Federico Cosenz

1 Introduction and Research Objectives
Managing Higher Education Institutions (HEIs) in the contemporary environment is a complex task that requires the use of effective performance management mechanisms to support the fulfilment of the core missions—i.e., Education and Research—that such organizations pursue daily. University management research has traditionally adopted an internally focused perspective aimed at exploring and testing how performance management tools can be adapted to the organizational attributes of HEIs. Such an adaptation allows academic decision makers to frame and assess value generation processes related to Education and Research activities—including their underlying administrative operations—thus pursuing strategies oriented primarily to improving the organizational performance of Universities (Broadbent, 2007; Guthrie & Neumann, 2007; Miller, 2007; Bianchi & Cosenz, 2013; Cosenz, 2013, 2014; Angiola et al., 2018). In recent years, drawing on the “Entrepreneurial University” model (Etzkowitz et al., 2000; Gulbrandsen & Slipersaeter, 2007; Cosenz, 2011; Thorp & Goldstein, 2013; Guerrero et al., 2016), the literature on Higher Education policy and management has recognized and discussed a third critical role—complementary to those of knowledge transfer and development (i.e., Education and Research)—played by Universities in the current socioeconomic environment, which further stresses the entrepreneurial aptitude of these institutions (Ricci et al., 2019).

F. Cosenz (*) Department of Political Sciences and International Relations, University of Palermo, Palermo, Italy e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2022 E. Caperchione, C. Bianchi (eds.), Governance and Performance Management in Public Universities, SIDREA Series in Accounting and Business Administration, https://doi.org/10.1007/978-3-030-85698-4_5


a given local area where the so-called “knowledge economy”1 characterizes the developmental conditions of its economic sectors (Thorp & Goldstein, 2013). In this context, by collaborating with other local/regional players in policy networks (Rhodes, 1990, 2017), HEIs are gradually taking a proactive role in long-term value generation processes through the commercialization of knowledge and engagement in entrepreneurial activities (Urbano & Guerrero, 2013). Such a role—nowadays widely known as the “Third Mission” (or Third Stream/Function/Role/Leg)—has also found institutional endorsement from both the European Commission and the OECD (2012, p. 1), whose guiding framework declares: “higher education is facing unprecedented challenges in the definition of its purpose, role, organization, and scope in society and the economy. The information and communication technology revolution, the emergence of the knowledge economy, the turbulence of the economy and consequent funding conditions have all thrown new light and new demands on higher education systems across the world.” As such, fulfilling the Third Mission poses greater challenges, opportunities, and threats for HEIs, which are now called to foster partnerships, networking, and collaborations with other local/regional players, as well as sustainability, economy, and social engagement (Gulbrandsen & Slipersaeter, 2007; Laredo, 2007; Perkmann et al., 2013; Trencher et al., 2014; Rippa & Secundo, 2019). In terms of performance management system design, the Third Mission of Universities implies the adoption of a broader perspective for assessing the long-term results generated to enhance the socioeconomic development of the surrounding environment, thus including the collaborative governance settings and the associated performance emerging from the aggregated contribution of an HEI and its nonacademic partners (e.g., public, private and nonprofit organizations, utilities, agencies, local authorities, and civil participants) involved in specific value creation processes (Bianchi et al., 2019; Bianchi & Vignieri, 2020). However, the design of performance management systems according to this broader perspective is hampered by the greater complexity of a fragmented value creation chain in which multiple actors—entailing additional performance variables—intervene at the political/strategic and operational levels (Cepiku et al., 2012). Third Mission activities are carried out under interorganizational coordination, which includes the strategic and operational interdependences between the University and its nonacademic partners to generate long-term results oriented to foster the socioeconomic progress of the local area and its community living standards (Bianchi & Tomaselli, 2015; Bianchi et al., 2019; Xavier & Bianchi, 2019; Bianchi & Vignieri, 2020). Particularly, managing those value creation processes operationalized through interorganizational coordination requires the exploration of the collaborative governance mechanisms—formal or informal—established by the University with its network partners

1 Powell and Snellman (2004, p. 199) defined the “knowledge economy” as “production and services based on knowledge-intensive activities that contribute to an accelerated pace of technical and scientific advance, as well as rapid obsolescence. The key component of a knowledge economy is a greater reliance on intellectual capabilities than on physical inputs or natural resources.”


(Aversano et al., 2018). As argued by Bianchi et al. (2019), unlike the evaluation of organizational performance, which focuses on outputs intended as the short-term results produced by a single HEI (e.g., graduations, enrolments, and publications), the performance of collaborative governance settings is measured through outcome indicators, conceived as long-term results—or impacts—generated by the aggregated contribution of multiple organizations including the University (e.g., academic partnerships, spin-offs, and a lower unemployment rate). With the intent to fuel the scientific debate on the University Third Mission and the methodological approaches used to evaluate its emerging outcomes, this chapter aims to explore and discuss how a Dynamic Performance Governance (DPG) approach may support collaborative governance structures in managing interorganizational coordination (e.g., University–Industry–Government partnerships) and the associated performance from the HEI perspective (Bianchi, 2016; Bianchi et al., 2017, 2019; Xavier & Bianchi, 2019). The DPG approach—borrowed from the work of Bianchi (2016) and Bianchi et al. (2017, 2019)—is oriented to broaden the scope of traditional performance management systems applied to Universities by focusing on the sustainable community outcomes generated by Third Mission activities, thereby exploring how consistently HEIs and their local/regional partners collaborate and contribute to implementing network policies (Bianchi, 2016; Bianchi et al., 2017, 2019; Xavier & Bianchi, 2019; Bianchi & Vignieri, 2020). To this end, the chapter initially provides an overview of Third Mission definitions and of the methods proposed by scholars for evaluating the associated results. Then, drawing on the insights and shortcomings emerging from this literature review, the DPG approach is described and applied—through an illustrative example—to Third Mission activities characterizing academic network performance, policy design settings, and interorganizational value generation processes. Finally, the chapter summarizes the main findings related to the use of DPG in evaluating Third Mission activities and concludes with future research perspectives.
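To make the distinction between output and outcome indicators more tangible, the following short Python sketch contrasts the two kinds of measure. It is purely illustrative: all names and figures are hypothetical assumptions introduced for demonstration and are not drawn from this chapter or from any real dataset.

from dataclasses import dataclass

@dataclass
class PartnerContribution:
    """Hypothetical yearly contribution of one network partner to a shared result."""
    partner: str
    new_jobs_created: int

# Output indicator: a short-term result attributable to a single HEI
graduates_per_year = 2_300  # hypothetical figure for one university

# Outcome indicator: a long-term result emerging from the whole academic network
contributions = [
    PartnerContribution("University", 40),         # e.g., jobs created in academic spin-offs
    PartnerContribution("Local businesses", 310),
    PartnerContribution("Regional government", 55),
]
labour_force = 120_000  # hypothetical local labour force
jobs_added = sum(c.new_jobs_created for c in contributions)
unemployment_rate_change = -jobs_added / labour_force  # negative value = improvement

print(f"Output (single HEI): {graduates_per_year} graduates")
print(f"Outcome (network): unemployment rate change of {unemployment_rate_change:.2%}")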

2 Reviewing the Concept of Third Mission in HEIs

In recent years, the definition of the Third Mission has attracted the attention of many scholars, who have proposed multiple notions of the multifaceted and complex activities carried out by HEIs under this label. In Higher Education policy and management research, every attempt at defining these activities converges toward the ultimate impact they should generate on the surrounding socioeconomic context. However, the Third Mission remains an ambiguous concept, as it depends on the structure of academic activities, on the University’s role in its geographical area (and the associated socioeconomic conditions), as well as on the country’s institutional framework (Laredo, 2007). While in some countries Third Mission activities are widely recognized and form an integral part of the ministerial evaluation mechanism used to distribute public funding to HEIs (e.g., Finland, Sweden, the UK, Spain, the USA, Canada, and Australia), in others these


Fig. 1 The “Triple Helix” framework (Leydesdorff & Etzkowitz, 1996)

activities are partially included—or even not yet identified—in the wider institutional mission of Universities. Taking a broader perspective, Pinto et al. (2016, p. 315) argue that the Third Mission “refers to an additional function of the universities in the context of knowledge society. The university is not only responsible for qualifying the human capital (Education—the first mission) and for producing new knowledge (Research—the second mission). Universities must engage with societal needs and market demands by linking the university’s activity with its own socio-economic context. Today universities develop their strategies around these three missions. Academics debate negative effects and the effective integration of these missions in a coherent institutional framework. Governments develop third mission policies allocating funding to this role while policy-makers and experts are implementing specific indicators.” In line with the above statement, two main approaches to defining the University’s Third Mission have been proposed. The first is the “Triple Helix” model, which revolves around University–Industry–Government interdependencies, these being the key actors affecting the level of knowledge within a given socioeconomic context (Etzkowitz, 2008). Figure 1 portrays the Triple Helix framework. The second approach is suggested by Molas-Gallart et al. (2002, p. 2), who define the Third Mission as “all activities concerned with the generation, use, application and exploitation of knowledge and other university capabilities outside academic environments.” Building on the Triple Helix model and the rise of the knowledge-based society paradigm, since approximately 2000 the Third Mission has received growing attention from Intellectual Capital (IC) scholars, who traced an evolutionary process made up of consecutive stage perspectives in the advancement of IC theory, related to the role of Universities in today’s society (Dumay & Garanina, 2013; Esposito et al., 2013; Elena-Perez et al., 2014; Bisogno et al., 2018; Bonollo & Zuccardi Merli, 2018; Secundo et al., 2019; Nicolò et al., 2020). At the end of this process, the fourth-stage IC perspective incorporates the development of knowledge and its sharing into an ecosystem (national, regional, or local), thus encouraging HEIs to


create and strengthen relationships with their local communities with the intent to contribute to the enhancement of their socioeconomic development through a paradigm based on knowledge sharing (Sanchez & Elena, 2006; Dumay & Garanina, 2013; Secundo et al., 2015, 2017, 2018; Frondizi et al., 2019). Since its conceptualization (Leydesdorff & Etzkowitz, 1996), the Triple Helix model has also undergone significant innovations that enlarged the set of stakeholders intervening—at different levels—in the generation of social and economic value (Amaral & Magalhães, 2002; Miller et al., 2014). Namely, alongside the original helixes (i.e., University–Industry–Government), a Quadruple Helix model added the “media-based and culture-based public” and “civil society” (Carayannis & Campbell, 2012; Miller et al., 2014; McAdam & Debackere, 2018), thereby acknowledging the relevance of value co-production (or co-creation) processes in the knowledge society (Bovaird, 2007; Osborne et al., 2016; Sicilia et al., 2016; Bianchi et al., 2017; McAdam et al., 2017). Aiming to embrace a focus on sustainability issues, Carayannis et al. (2012) also added a fifth helix corresponding to the “natural environments of society.” These helixes are expected to support a systemic view in the design of policies oriented to enhance the effectiveness of regional innovation processes conducted by the University with its nonacademic partners, thus operationalizing the Third Mission approach at all institutional levels (Frondizi et al., 2019). With the intent to shed light on the new role played by Universities in the knowledge society, Frondizi et al. (2019) have recently conducted a literature review exploring the multiple definitions of this novel academic mission. Table 1 summarizes the main results that emerged from this review. Based on the above background, examples of Third Mission activities include University–Industry–Government relations, technology transfer, academic entrepreneurship, knowledge commercialization, collaborative research, and academic consulting. Each of these examples implies collaboration and coordination between the University and other nonacademic stakeholders for generating value (Manes Rossi et al., 2018; Guerrero et al., 2019). In recent years, alongside the institutionalization of Third Mission activities strengthening the entrepreneurial orientation of Universities within the knowledge-based society, the need to measure the value generated by carrying out these activities has strongly emerged in many countries (Barnabè & Riccaboni, 2007; Olcay & Bulu, 2017). Spanning the boundaries of the University organizational setting toward the analysis of a broader value creation ecosystem—wherein multiple institutions interact and collaborate for generating outcomes at local and regional levels—implies additional challenges and complexities in terms of performance management, policy design, and implementation. For this reason, the next section introduces the DPG approach (Bianchi, 2016; Bianchi et al., 2019; Bianchi & Vignieri, 2020) as a method for exploring and managing the Third Mission activities of HEIs within such a broader value creation ecosystem, thus providing an interorganizational perspective in setting policy coordination and collaborative governance aimed at fostering sustainable socioeconomic development and better community outcomes (Bovaird, 2007; Torfing & Ansell, 2017; Bianchi & Vignieri, 2020).


Table 1 Main definitions of the new role played by HEIs in local/regional areas (Frondizi et al., 2019, p. 7)

Molas-Gallart et al. (2002). Source: Science and Technology Policy Research. Concept: Third stream; Third leg. Definition: Third stream/leg activities “are concerned with the generation, use, application and exploitation of knowledge and other university capabilities outside academic environments” (p. iv).

Gunasekara (2006). Source: Journal of Technology Transfer. Concept: Third role; University engagement. Definition: “Third role [is] performed by universities in animating regional economic and social development” (p. 102). “The universities engagement approach points to a developmental role performed by universities in regional economic and social development that centres on the intersection of learning economies and the regionalisation of production and regulation” (p. 103).

Pilbeam (2006). Source: Journal of Higher Education Policy and Management. Concept: Third stream income. Definition: “Revenues from the commercial exploitation of university intellectual assets (third stream income)” (p. 297).

Business/Higher Education Round Table (2006). Concept: Community engagement; Third mission. Definition: “Communities engagement has a broad vista that extends beyond business and economic aspects. Universities have a wider view of engagement which includes social, economic, environmental and cultural dimensions of capacity building” (p. 3). “Third Mission activities of universities seek to generate, apply and use knowledge and other university capabilities outside academic environments” (p. 4).

HEFCE (2008). Concept: Third stream. Definition: “Third stream refers to work to increase the impact of higher education on economic development and the strength and vitality of society as a third stream of activity alongside, and complementary to, teaching and research” (p. 26).

Webber and Jones (2011). Source: Journal of Higher Education Policy and Management. Concept: Third constituent of higher education. Definition: “Third constituent of higher education can be described as consisting of universities’ relations with and contributions to other sectors of society” (p. 17).

Bornmann (2013). Source: Journal of the American Society for Information and Science Technology. Concept: Societal impact of research. Definition: “Societal impact of research is concerned with the assessment of social, cultural, environmental, and economic returns (impact and effects) from results (research output) or products (research outcome) of publicly funded research” (p. 217).

Sánchez-Barrioluengo (2014). Source: Research Policy. Concept: Social and Business Engagement. Definition: “Social and Business Engagement is seen as reflecting the changing nature of scientific knowledge and the natural tendency for academia to adapt in response to social changes” (p. 2).

Watson and Hall (2015). Source: International Journal of Academic Research in Management. Concept: Third stream. Definition: “Third Stream agenda is a critical strategy in the pursuit of enriched learning, enhancing student employability and much needed revenues” (p. 48).

Guerrero et al. (2015). Source: Research Policy. Concept: Third Mission. Definition: “The entrepreneurial university serves as a conduit of spillovers contributing to economic and social development through its multiple missions of teaching, research, and entrepreneurial activities” (p. 748).

3 A Dynamic Performance Governance Approach to Frame Third Mission Activities

Dynamic Performance Governance (DPG) is an approach “able to support policy networks to pursue sustainable community outcomes” (Bianchi et al., 2019, p. 2). This approach uses the methodological perspective of Dynamic Performance Management (Bianchi, 2010, 2016) to span the organizational boundaries of a single institution with the intent to foster both consistency and learning in policy design, implementation, and interorganizational coordination at the policy network level (Bianchi & Tomaselli, 2015; Bianchi et al., 2017, 2019). For this reason, the method can be particularly valuable for exploring the value generation mechanisms related to Third Mission activities, as these are characterized by complex operational interactions between the HEI and other community stakeholders, forming a collaborative ecosystem aimed at pursuing goals of sustainable development (Cosenz et al., 2020). According to DPG principles (Bianchi et al., 2019), the pursuit of sustainable development implies focusing on a multidimensional view of Third Mission


Fig. 2 Complementary dimensions of the academic network performance (adapted from Bianchi et al., 2019)

performance, which entails a balance among the “academic network success,” “time,” and “space” perspectives. As shown in Fig. 2, the “academic network success” depends on three main dimensions: a competitive, a financial, and a social and environmental dimension (Coda, 2010). Both the competitive and the social/environmental dimensions should be oriented to sustain satisfactory financial results in the long term by better fulfilling those community needs associated with the role of HEIs in the knowledge society (Coda, 2010; Guthrie & Neumann, 2007; Bianchi et al., 2019). In addition, perspectives related to “time” and “space” must be considered when framing and exploring the Third Mission performance of HEIs. Regarding the “time” perspective, an improvement in short-term performance should not be obtained at the expense of long-term results. Too often, unbalanced time-horizon policies result in an evanescent short-term improvement that hides worse undesired effects in the long run. Thus, balancing short-term with long-term performance in policy design and implementation requires the adoption of a consistent methodological approach to performance management and measurement. Such an approach is related not only to balancing short- and long-term goals, but also to measuring the outcomes of current and often inertial policies on the change in both organizational structures and external contextual conditions affecting the local/regional ecosystem where the University and its nonacademic partners (co-)operate (Mastilak et al., 2012; Bianchi, 2016). In this context, while the organizational performance of a single University can be gauged through the use of output measures, e.g., graduations, enrolments, and publications, the academic network performance focuses on outcome indicators, intended as long-term impacts generated by the aggregated contribution of multiple partner organizations including the University, e.g., academic spin-offs, University–Industry–Government partnerships, and technology


transfer (Bianchi et al., 2019). In this regard, as remarked by Bianchi et al. (2019), while the outputs produced by a single HEI affect the endowment of its strategic resources (e.g., enrolled students, research outputs, academic spin-offs, and citations), the outcomes generated by a plurality of local/regional actors—including the HEI—are likely to influence those shared strategic resources of the community (e.g., employment rate, human capital, local businesses, social capital, and pollution). Concerning the “space” perspective, a sustainable development based on Third Mission activities emerges from the search for consistency and mutual dependencies not only between the multiple outputs offered by the different organizational units, departments, and faculties of a University, but also between the University’s network and its local area performance (Bianchi, 2016; Bianchi & Williams, 2015). As Bianchi et al. (2019, p. 4) remarked, this perspective is oriented to frame the strategic dialogue between the multiple stakeholders collaborating in the same regional area for jointly producing public value in terms of products and services according to a sustainable development view. The DPG approach applies the Dynamic Performance Management perspective to enhance performance governance, thus supporting policymaking through the adoption of interpretive lenses for exploring how and why performance measures change over time, as an effect of undertaken policies, stakeholders’ actions, and external conditions (Bianchi, 2016; Bianchi et al., 2019). As for Third Mission activities in HEIs, the use of DPG supports a systemic and interorganizational perspective to frame academic network value creation processes (see Fig. 3) according to a collaborative policy design mechanism that primarily focuses on the sustainable development of the local area and, subsequently, of the single University (Bianchi et al., 2019). This mechanism allows collaborative governance participants to build consensus and accountability around the policies and associated actions for pursuing goals of sustainable development, as well as to nurture the implementation of value co-creation programs and citizens’ participation in policy design settings (Bovaird, 2007; Osborne et al., 2016; Bianchi et al., 2017). The DPG approach applied to Third Mission activities is oriented to explore how an academic network achieves its results by analyzing and measuring those critical drivers leading to performance. As such, it contributes to designing performance indicators that link the consumption of shared and unshared strategic resources to the associated community outcomes. While unshared resources are owned and managed by a single network organization (e.g., the HEI), shared resources include common goods and assets for the use of the entire community (e.g., employment rate, human capital, and local businesses). Although they can only be influenced by a plurality of actions by multiple stakeholders and, hence, are complex to manage, the latter are quite relevant for building and supporting the performance of the local area according to a sustainable development perspective (Bianchi et al., 2019). As shown in Fig. 4, the DPG approach aims to make the relation between resource accumulation/depletion processes and corresponding outcomes explicit through the identification of performance drivers—linked to the critical success factors for better achieving the outcomes of the local area under control—on


Fig. 3 A systemic perspective to frame academic network value creation processes

Fig. 4 A dynamic performance governance perspective (Bianchi, 2010, 2016; Bianchi et al., 2019)

which academic network decision makers may act to affect end results (Bianchi, 2010, 2016). In other words, it investigates how results are achieved in terms of resource allocation and consumption, as well as how these results, in turn, create value fueling the corresponding resources over time (Bianchi et al., 2017). Measuring performance drivers provides a deeper understanding of how this value is generated along the academic network value chain, since it focuses on the


coordination mechanism between the various stakeholders acting and collaborating within the network. These drivers represent the main factors driving performance governance, and their assessment also provides an understanding of how a University—as well as its nonacademic partners—is generating value with respect to those critical success factors able to affect community outcomes (Bianchi, 2016). They are measured as ratios between the current and the desired strategic resource levels (Bianchi, 2010). Assessing these value drivers enables academic network decision makers to better identify the causal determinants and related interplays producing a major effect on end results. Consequently, desired corrective actions applied to such drivers can be promptly undertaken in the short term. The emerging DPG models are developed by constructing feedback structures—used in System Dynamics modelling (Morecroft, 2015; Sterman, 2000)—able to frame the causal interdependences among the relevant variables (i.e., strategic resources, performance drivers, and end results) of the governance structure under observation (Bianchi, 2016). These feedback structures explain the rationale underlying the behavior of the variables forming the loops by highlighting both drivers and policy levers to influence the current state of the system (Bianchi, 2016; Sterman, 2000). Consequently, similarly to other modelling approaches supporting decision-making, DPG models serve as cognitive tools to explore the value creation processes affecting community outcomes, as well as to better understand how the stakeholder network reacts to implemented policies in terms of performance governance (Bianchi, 2016). In particular, the DPG approach uses insight (or policy-based) System Dynamics models, which have proven to be effective in supporting a descriptive perspective in policy analysis and performance management, thereby communicating and sharing an understanding of the causes and implications underlying the observed governance system among network participants (Bianchi et al., 2019). In the next section, an illustrative example applying DPG to Third Mission activities—namely, University–Industry–Government partnerships—is shown and discussed to provide explanatory evidence on the use of this approach in this specific policy network context.
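Before turning to that example, the driver logic described above can be made concrete with a minimal sketch. It assumes, following Bianchi (2010), that a performance driver is operationalized as the ratio between the current and the desired level of a strategic resource; the resource names and figures below are hypothetical and serve only as an illustration.

def performance_driver(current: float, desired: float) -> float:
    # Driver = current level of a strategic resource over its desired level.
    # Values below 1 signal a gap with respect to the desired state.
    if desired == 0:
        raise ValueError("The desired resource level must be non-zero")
    return current / desired

# Hypothetical strategic resources of an academic network: (current, desired) levels
resources = {
    "human_capital": (8_500, 10_000),
    "uig_partnerships": (12, 20),
    "academic_spin_offs": (6, 10),
}

drivers = {name: performance_driver(cur, des) for name, (cur, des) in resources.items()}

# Rank the gaps so decision makers can see where corrective action is most needed
for name, value in sorted(drivers.items(), key=lambda kv: kv[1]):
    print(f"{name}: driver = {value:.2f}")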

4 Applying Dynamic Performance Governance to Third Mission Activities: An Illustrative Example

This section aims to provide an illustrative example of how to apply the DPG approach to Third Mission activities. Drawing on the Triple Helix model (Leydesdorff & Etzkowitz, 1996), this example frames the role played by HEIs in establishing University–Industry–Government (UIG) partnerships with the intent to strengthen the socioeconomic ecosystem of a given local area according to a sustainable development perspective (Etzkowitz, 2003, 2008).


In particular, UIG partnerships are oriented to support the entrepreneurial vocation of the University, thus fostering its capability to generate new academic spin-offs and, as a result, increasing the number of local businesses operating in the same area (Van Looy et al., 2003; Hayter, 2013; Graham, 2014; Guerrero et al., 2016; McAdam et al., 2016; Fuster et al., 2019). In this case, the academic network focuses on the implementation of entrepreneurial activities by graduates and specific professional profiles trained and supported by the University, who collaborate with local businesses, R&D centers, science and technology parks, governmental bodies, and other public organizations. As such, UIG partnerships aim to generate a supportive ecosystem for the academic community and its surroundings, in order to develop, share, grasp, and use new knowledge that can give rise to academic spin-offs. As argued by Fuster et al. (2019, p. 219), academic spin-offs “are an important vehicle of knowledge transfer from Universities that take advantage of innovations and creating new high-quality employment and accelerating the productivity of regional economies. Policymakers are increasingly investing in universities to foster the creation of innovative start-ups in the hope of producing areas of economic growth and the resulting initiatives are predicated on the idea, using successful well-known examples such as Silicon Valley, that a well-structured entrepreneurial university ecosystem automatically leads to the emergence of successful business ecosystems.” In the vein depicted by Lubik et al. (2013) and Guerrero et al. (2019), the proposed example assumes that, besides the University, the academic network is formed by its graduates, local businesses, local governmental bodies, and residents. These community actors are called to implement interorganizational coordination at a policy network level, thus forming a collaborative governance setting aimed at jointly affecting the value creation processes leading to the growth of their socioeconomic ecosystem (Jongbloed et al., 2008). While economic development focuses on starting academic spin-offs resulting in new local businesses, social growth depends on the capability of these new firms to hire and retain resident talents, thus fostering a lower unemployment rate in the local area (Jongbloed, 2008; Hayter, 2013). The emerging DPG model is portrayed in Fig. 5. Remarkably, it identifies the main factors framing how UIG partnerships affect the socioeconomic performance of the local area, and divides them into the corresponding layers (i.e., strategic resources, performance drivers, and end results) according to a systemic perspective. Such a perspective entails the identification of the causal interdependences among these factors, thus providing academic network decision makers with proper interpretive lenses to understand how the system works and reacts to implemented policies (Bianchi, 2016). The feedback connections between end results and strategic resources—underlying this specific interorganizational value creation process—are emphasized through gray-colored variables. Implementing UIG partnerships calls for a more explicit request of specific professional profiles to be educated and trained by the University. These professional profiles may depend on the search for particular skills and abilities required by nonacademic partners (i.e., local businesses and governmental bodies, or involved public institutions) in order to meet the new requirements of the current local labor market.

Fig. 5 Applying DPG to university–industry–government partnerships into a third mission setting


Then, the University may make efforts oriented to introduce and implement innovative curricula of study aimed at fulfilling these requirements, thereby increasing the number of graduates holding these professional skills. In this context, a performance driver—i.e., the graduates’ ratio—helps to measure the fraction of graduates over the resident population, which, in turn, influences the change in the human capital employable in the local area. With the joint support of both local businesses and public spending policies, an increase in human capital may give rise to new UIG opportunities, evaluated through another performance driver, i.e., the UIG opportunity ratio. Namely, this driver aims to gauge the match between the human capital educated by the University and the professional requirements of its nonacademic partners, thus resulting in new opportunities for starting UIG partnerships. As such, it affects the change in new UIG partnerships, whose corresponding stock is evaluated through the UIG partnership ratio. The latter is a performance driver evaluating the aptitude of the academic network to establish new UIG agreements oriented to start academic spin-offs, as well as to support their development in the local area. In the medium/long term, the rise in academic spin-offs is likely to foster the economic development of the area by increasing the stock of local businesses operating there and producing value for the benefit of the community. In particular, this value may firstly imply new job opportunities for the residents of the local area, whose assessment is carried out by matching the local businesses’ job offerings with the resident population living in the area. Other conditions being equal, an improvement in this performance driver leads to a decrease in the unemployment rate of the local area which, in turn, implicitly enhances its attractiveness toward nonresidents in terms of working standards, business opportunities, and career prospects. Finally, an increase in both local businesses and resident population may strengthen the local taxpayers’ capacity to contribute to public finance, thereby feeding back into public funding to keep improving the UIG partnership system and, as a result, the socioeconomic development of the local area. The application of DPG to frame the collaborative governance structure which affects UIG partnerships for pursuing sustainable public value generation enables academic network decision makers to share and nurture a common view of the factors and associated mechanisms leading to local area performance. This view facilitates the identification of those policy levers on which—depending on the specific role played within the network—the HEI and its nonacademic partners may act to generate an influence on community outcomes. Such policy levers provide for more effective allocation and consumption of the shared and unshared strategic resources—identified in the DPG model—under the control of the academic network actors. For instance, involved local businesses may address their efforts, on the one hand, toward a greater engagement of governmental bodies and public institutions in establishing UIG projects aimed at fostering the local economy and competitiveness and, on the other hand, toward a more explicit request of specific professional profiles to the University.
By acknowledging these requests, governmental bodies and public institutions are called to proactively engage in UIG partnerships and allocate adequate public funding for the development of new academic spin-offs. However, this


policy direction should not conflict with the pursuit of other community outcomes, thus highlighting the need to manage a critical trade-off between pursuing an excessively rapid and unbounded increase in new businesses in the short term and the associated rise in pollution harmful to the local natural environment. In the same way, the HEI may innovate and adapt its educational programs to the specific requirements of UIG partners, thereby training professional profiles who can be easily employed in the local labor market. Once this common view has been acknowledged and shared among network members, the possibility of assessing both performance drivers and outcomes further allows academic network decision makers to overcome potential conflicts and enhance interorganizational coordination, as well as to promote a shared learning process in policy design and implementation (Bianchi et al., 2019). Drawing on the illustrative example herein described, the next section highlights the main findings emerging from the adoption of DPG in assessing Third Mission activities, and proposes possible avenues for contributing to the development of this research stream.
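The causal chain narrated above can also be recorded in a lightweight, machine-readable form while a collaborative design group builds its DPG chart. The sketch below is a simplified rendering of the example, not a reproduction of the model in Fig. 5; all variable names are hypothetical and the set of links is deliberately reduced.

# Three DPG layers of the illustrative UIG example (simplified, hypothetical)
strategic_resources = ["graduates", "human_capital", "uig_partnerships",
                       "academic_spin_offs", "local_businesses", "resident_population"]
performance_drivers = ["graduates_ratio", "uig_opportunity_ratio",
                       "uig_partnership_ratio", "job_offering_ratio"]
end_results = ["change_in_human_capital", "change_in_uig_partnerships",
               "change_in_local_businesses", "change_in_unemployment_rate"]

# Causal links (cause, effect), including the feedback from end results to resources
causal_links = [
    ("graduates", "graduates_ratio"),
    ("graduates_ratio", "change_in_human_capital"),
    ("change_in_human_capital", "human_capital"),
    ("human_capital", "uig_opportunity_ratio"),
    ("uig_opportunity_ratio", "change_in_uig_partnerships"),
    ("change_in_uig_partnerships", "uig_partnerships"),
    ("uig_partnerships", "uig_partnership_ratio"),
    ("uig_partnership_ratio", "academic_spin_offs"),
    ("academic_spin_offs", "change_in_local_businesses"),
    ("change_in_local_businesses", "local_businesses"),
    ("local_businesses", "job_offering_ratio"),
    ("job_offering_ratio", "change_in_unemployment_rate"),
    ("change_in_unemployment_rate", "resident_population"),
]

def effects_of(variable):
    # Return the variables directly influenced by `variable` in the chart.
    return [effect for cause, effect in causal_links if cause == variable]

print(effects_of("human_capital"))  # -> ['uig_opportunity_ratio']

Keeping the layers and links explicit in this form makes it easy to check, during collaborative design sessions, that every performance driver is fed by at least one strategic resource and feeds at least one end result.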

5 Advantages and Limitations of Using DPG to Frame Third Mission Activities

The previous section described an illustrative example of how to apply the DPG approach to manage and affect the academic network value generation processes underlying Third Mission activities in Universities. Such an approach supports the academic network—formed by the HEI and other local/regional stakeholders—in measuring and assessing the outcomes produced by collaborative policies oriented to foster the socioeconomic development of a local area according to a sustainability-based perspective. These outcomes are intended as the end results originating from Third Mission activities. Focusing on the described example related to UIG partnerships, the DPG application offers an insightful and effective approach to support academic network stakeholders in framing Third Mission activities in terms of policy design and interorganizational coordination. The adoption of this method may facilitate a deeper understanding of the complexity characterizing Third Mission initiatives through a qualitative exploration of the systemic interdependencies between the involved regional actors. The possibility of jointly exploring the multiple causal interdependences affecting Third Mission outcomes also fosters a collaborative strategic learning process, thus supporting the settlement of emerging conflicts, e.g., in terms of shared strategic resource negotiation or consumption by network participants. In addition, the DPG approach emphasizes the identification of academic network actors and their specific roles in developing Third Mission activities. In this perspective, the purpose of DPG is to support network participants in framing the systemic structure of the regional context—where Third Mission activities generate their


impacts—that influences its performance over time (Bianchi et al., 2019). The collaborative process of designing the DPG chart and learning about the observed regional system could even be of more benefit than the chart itself, since the collaborative designing process promotes greater learning about the causes and effects of a given system than the chart on its own would. As for the limitations of DPG applied to the University Third Mission, it is worth underlining the so-called “ossification” process affecting the implementation of cultural evolutions in managing HEIs when external changes show up and a plurality of managerial perspectives is required. “Ossification” processes are recurrently detected in public sector systems traditionally characterized by loosely coupled interplays giving rise to severe resistance to change and an idiosyncratic approach to addressing common interorganizational complexities. Such a fragmented setting may limit the use of DPG as a method encouraging collaboration, shared understanding of complex policy issues, and participatory decision-making. In such policy settings, it is not easy to involve the different academic network partners—operating and interacting at different levels along the value creation chain—in the collaborative governance process. When partners adopt too constrained a view of the common goals to pursue and disagree in negotiating resources, DPG may prove insufficient to generate the cohesion required to foster strategic coordination between the policy network participants. Consequently, this may limit a shared enhancement of their strategic learning processes, and the empirical evidence about the learning outcomes of DPG and its effectiveness may remain insufficient. Another potential disadvantage related to the use of DPG refers to the inclination to create unnecessarily large frameworks to explore complex policy network settings. In this perspective, large DPG models can be nearly impossible to understand and use as supportive tools. In addition, as argued by Hayden (2006), holistic approaches may prove inappropriate for modelling social systems—such as academic networks—which are subject to outer influences and asymmetrical actions of external operators, as well as exposed to wicked contextual issues. Policy networks are quite complex to understand, and governance participants frequently encounter difficulties in detecting their interacting elements (e.g., resources, drivers, and outcomes) due to factors such as temporal and spatial separation between cause and effect, and incorrect or limited information.

6 Concluding Remarks and Future Research Perspectives

The chapter has initially provided a review of the literature reporting a plurality of definitions applied to the “Third Mission” concept. The review emphasized that, although there is no convergence toward a fully accepted definition (Frondizi et al., 2019), this concept embraces all the activities in which an HEI plays a proactive role, in collaboration with other local/regional partners (e.g., public and private organizations, governmental bodies, agencies, and utilities), aimed at generating any sort of long-term impact on the local/regional area, its community, economy, and


environment. As such, the core attributes identifying Third Mission activities include: (i) a proactive role of the University within the local stakeholder network; (ii) a collaborative governance structure formed by the academic network partners involved in policy design and implementation processes; and (iii) the generation of long-term impacts (i.e., outcomes) as a result of implemented network policies affecting the social, economic, and environmental conditions of the area. Building on these attributes characterizing the development of Third Mission activities, the chapter then focused on the scientific debate related to the methods and approaches for managing and measuring the outcomes emerging from these specific activities. To this end, the adoption of the DPG approach (Bianchi et al., 2017, 2019) has been proposed, as it may provide a valuable methodological contribution to the design of outcome-based performance management systems applied to complex governance structures. The proposed approach uses a systemic perspective—focused on the identification of causal interdependencies between critical factors affecting results over time—to support policy networks in the pursuit of sustainable community outcomes (Bianchi et al., 2019). Such a perspective—which inspires the design of DPG models—is oriented to enhance the interorganizational coordination among network participants by offering a shared view of the different roles, and corresponding responsibilities, played by each stakeholder in the generation of sustainable community outcomes (Bianchi et al., 2017). In addition, DPG provides the academic network decision makers with a cognitive framework highlighting shared and unshared strategic resources, output and outcome measures, as well as those performance drivers affecting them (Bianchi et al., 2019). Notably, these remarks have been drawn by applying the DPG approach to a well-recognized Third Mission example, i.e., UIG partnerships (Etzkowitz, 2008). Such an application adopted a qualitative modelling perspective to depict the performance governance setting related to UIG partnerships, which involved the identification not only of strategic resources, performance drivers, and end results, but also of the causal interplays among them (Fig. 5). Building on the insights emerging from the described illustrative example on UIG partnerships, more research can be developed to further assess the effectiveness of DPG in supporting academic networks dealing with Third Mission activities. In particular, future research perspectives may deepen the analysis related to the use of DPG for assessing Third Mission activities by designing, exploring, and testing quantitative System Dynamics models (Sterman, 2000; Morecroft, 2015; Bianchi, 2016; Cosenz & Noto, 2016; Cosenz, 2018). These models focus on the quantification of the causal interplays among the variables of the observed system to simulate their behavior over time (Sterman, 2000). In this way, these simulation models—and the associated simulation scenarios—may support a deeper and more rigorous analysis in terms of performance management and policymaking (Wolstenholme, 1999; Bianchi et al., 2019). Since simulation modelling requires detailed data, this research avenue also entails the development of field analyses and case studies to be conducted in real-world contexts, thereby leading to more empirical evidence of the effectiveness of this approach to support academic networks in evaluating Third


Mission activities. Moreover, the DPG approach may be applied to other Third Mission examples (e.g., knowledge commercialization, collaborative research projects, and academic consulting) which, unlike UIG partnerships, pose different challenges and requirements in terms of performance governance and policy network structure. These future research perspectives may ultimately contribute to the current debate on exploring and assessing Third Mission activities in HEIs.
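By way of illustration of what such a quantitative extension might look like, the toy stock-and-flow sketch below simulates a drastically simplified version of the UIG chain with a yearly time step and compares a baseline with a stronger curricula-innovation policy. Every equation, parameter, and value is a hypothetical assumption introduced for demonstration only; none is an estimate taken from this chapter or from real data.

# Toy System Dynamics-style simulation of a simplified UIG chain (annual time step).
# All parameters are hypothetical assumptions for illustration only.
def simulate(years=10, curricula_innovation=1.0):
    human_capital, partnerships, local_businesses = 5_000.0, 5.0, 400.0
    history = {"human_capital": [], "uig_partnerships": [], "local_businesses": []}
    for _ in range(years):
        graduates = 1_500 * curricula_innovation   # inflow to human capital
        new_partnerships = 0.0004 * human_capital  # UIG opportunities actually taken up
        new_spin_offs = 0.3 * partnerships         # spin-offs started per year
        new_businesses = 0.5 * new_spin_offs       # spin-offs consolidating into local firms

        human_capital += graduates - 0.05 * human_capital             # attrition/out-migration
        partnerships += new_partnerships - 0.1 * partnerships         # partnerships ending
        local_businesses += new_businesses - 0.02 * local_businesses  # firm closures

        history["human_capital"].append(round(human_capital, 1))
        history["uig_partnerships"].append(round(partnerships, 1))
        history["local_businesses"].append(round(local_businesses, 1))
    return history

baseline = simulate()
policy = simulate(curricula_innovation=1.3)  # scenario: stronger curricula innovation
print(baseline["local_businesses"][-1], policy["local_businesses"][-1])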

References Amaral, A., & Magalhães, A. (2002). The emergent role of external stakeholders in European higher education governance. In A. Amaral, V. L. Meek, & I. M. Larsen (Eds.), Governing higher education: National perspectives on institutional governance (pp. 1–21). Kluwer Academic Publishers. Angiola, N., Bianchi, P., & Damato, L. (2018). Performance management in public universities: Overcoming bureaucracy. International Journal of Productivity and Performance Management, 67(4), 736–753. Aversano, N., Manes Rossi, F., & Tartaglia Polcini, P. (2018). performance measurement systems in universities: A critical review of the Italian system. In E. Borgonovi, E. Anessi-Pessina, & C. Bianchi (Eds.), Outcome-based performance management in the public sector (pp. 269–288). Springer. Barnabè, F., & Riccaboni, A. (2007). Which role for performance measurement systems in higher education? Focus on quality assurance in Italy. Studies in Educational Evaluation, 33(3–4), 302–319. Bianchi, C. (2010). Improving performance and fostering accountability in the public sector through system dynamics modelling: From an ‘external’ to an ‘internal’ perspective. Systems Research and Behavioral Science, 27(4), 361–384. Bianchi, C. (2016). Dynamic performance management. Springer. Bianchi, C., Bereciartua, P., Vignieri, V., & Cohen, A. (2019). Enhancing urban brownfield regeneration to pursue sustainable community outcomes through dynamic performance governance. International Journal of Public Administration. https://doi.org/10.1080/01900692.2019. 1669180 Bianchi, C., Bovaird, T., & Loeffler, E. (2017). Applying a dynamic performance management framework to wicked issues: How coproduction helps to transform young people’s services in Surrey County Council, UK. International Journal of Public Administration, 40(10), 833–846. Bianchi, C., & Cosenz, F. (2013). Designing performance management systems in academic institutions: A dynamic performance management view. In Proceedings of ASPA 2013 Annual Conference, New Orleans, LA, 19–23 September. ASPA. Bianchi, C., & Tomaselli, S. (2015). A dynamic performance management approach to support local strategic planning. International Review of Public Administration, 20(4), 370–385. Bianchi, C., & Vignieri, V. (2020). Dealing with “abnormal” business growth by leveraging local area common goods: An outside-in stakeholder collaboration perspective. International Journal of Productivity and Performance Management. https://doi.org/10.1108/IJPPM-07-2019-0318 Bianchi, C., & Williams, D. W. (2015). Applying system dynamics modeling to foster a cause-andeffect perspective in dealing with behavioral distortions associated with a city’s performance measurement programs. Public Performance & Management Review, 38(3), 395–425. Bisogno, M., Dumay, J., Manes Rossi, F., & Tartaglia Polcini, P. (2018). Identifying future directions for IC research in education: A literature review. Journal of Intellectual Capital, 19, 10–33.


Bonollo, E., & Zuccardi Merli, M. (2018). Performance reporting in Italian Public University: Activities in support of research, teaching and the “third mission”. In E. Borgonovi, E. AnessiPessina, & C. Bianchi (Eds.), Outcome-based performance management in the public sector (pp. 307–330). Springer. Bornmann, L. (2013). What is societal impact of research and how can it be assessed? A literature survey. Journal of the American Society for Information and Science Technology, 64(2), 217– 233. Bovaird, T. (2007). Beyond engagement and participation: User and community coproduction of public services. Public Administration Review, 67(5), 846–860. Broadbent, J. (2007). If you can’t measure it, how can you manage it? Management and governance in higher educational institutions. Public Money and Management, 27(3), 193–198. Business/Higher Education Round Table. (2006). Universities’ third mission: Communities engagement, B-HERT. Position Paper; B-HERT. Carayannis, E., & Campbell, D. (2012). Mode 3 knowledge production in quadruple helix innovation systems. Springer Briefs in Business. Carayannis, E., Barth, T., & Campbell, D. (2012). The quintuple helix innovation model: Global warming as a challenge and driver for innovation. Journal of Innovation and Entrepreneurship, 1, 1–12. Cepiku, D., Mussari, R., Poggesi, S., & Reichard, C. (2012). Special issue on governance of networks: Challenges and future issues from a public management perspective editorial. Journal of Management and Governance, 18, 1–7. Coda, V. (2010). Entrepreneurial values and strategic management. Essays in management theory. Palgrave Macmillan. Cosenz, F. (2011). Sistemi di governo e di valutazione della performance per l’azienda “Università”. Giuffrè. Cosenz, F. (2013). The “Entrepreneurial University”: A preliminary analysis of the main managerial and organisational features towards the design of planning & control systems in European Academic Institutions. Management Research and Practice, 5(4), 19–36. Cosenz, F. (2014). A dynamic viewpoint to design performance management systems in academic institutions: Theory and practice. International Journal of Public Administration, 37(13), 955–969. Cosenz, F. (2018). Supporting public sector management through simulation-based methods: A dynamic performance management approach. International Review of Public Administration, 23(1), 20–36. Cosenz, F., & Noto, G. (2016). Applying system dynamics modelling to strategic management: A literature review. Systems Research and Behavioral Science, 33(6), 703–741. Cosenz, F., Rodrigues, V. P., & Rosati, F. (2020). Dynamic business modeling for sustainability: Exploring a system dynamics perspective to develop sustainable business models. Business Strategy and the Environment, 29(2), 651–664. Dumay, J., & Garanina, T. (2013). Intellectual capital research: A critical examination of the third stage. Journal of Intellectual Capital, 14(1), 10–25. Elena-Perez, S., Leitner, K. H., Secundo, G., & Martinaitis, Ž. (2014). Shaping new managerial models in European universities: the impact of reporting and managing IC. In P. Ordonez De Pablos & L. Edvinsson (Eds.), Intellectual capital in organizations: Non-financial reports and accounts (pp. 150–165). Routledge. Esposito, V., De Nito, E., Pezzillo Iacono, M., & Silvestri, L. (2013). Dealing with knowledge in the Italian public universities: The role of performance management systems. Journal of Intellectual Capital, 14(3), 431–450. Etzkowitz, H. (2003). 
Innovation in innovation: The Triple Helix of university-industry-government relations. Social Science Information, 42(3), 293–337. Etzkowitz, H. (2008). The Triple Helix: University-industry-government innovation in action. Routledge. Etzkowitz, H., Webster, A., Gebhardt, C., & Terra, B. R. C. (2000). The future of the university and the university of the future: Evolution of ivory tower to entrepreneurial paradigm. Research Policy, 29(2), 313–330.


European Commission and OECD. (2012). A Guiding Framework for Entrepreneurial Universities. OECD. Frondizi, R., Fantauzzi, C., Colasanti, N., & Fiorani, G. (2019). The evaluation of universities’ third mission and intellectual capital: Theoretical analysis and application to Italy. Sustainability, 11 (12), 3455. Fuster, E., Padilla-Meléndez, A., Lockett, N., & del-Aguila-Abra, A. R. (2019). The emerging role of university spin-off companies in developing regional entrepreneurial university ecosystems: The case of Andalusia. Technological Forecasting and Social Change, 141, 219–231. Graham, R. (2014). Creating university-based entrepreneurial ecosystems: Evidence from emerging world leaders. Massachusetts Institute of Technology. Guerrero, M., Cunningham, J. A., & Urbano, D. (2015). Economic impact of entrepreneurial universities’ activities: An exploratory study of the United Kingdom. Research Policy, 44(3), 748–764. Guerrero, M., Herrera, F., & Urbano, D. (2019). Strategic knowledge management within subsidised entrepreneurial university-industry partnerships. Management Decision, 57(12), 3280–3300. Guerrero, M., Urbano, D., Fayolle, A., Klofsten, M., & Mian, S. (2016). Entrepreneurial universities: Emerging models in the new social and economic landscape. Small Business Economics, 47 (3), 551–563. Gulbrandsen, M., & Slipersaeter, S. (2007). The third mission and the entrepreneurial university model. In A. Bonaccorsi & C. Daraio (Eds.), Universities and strategic knowledge creation: Specialization and performance in Europe (pp. 112–143). Edward Elgar Publishing. Gunasekara, C. (2006). Refraiming the role of Universities in the development of regional innovation system. Journal of Technological Transfer, 31(1), 101–113. Guthrie, J., & Neumann, R. (2007). Economic and non-financial performance indicators in Universities. Public Management Review, 9(2), 231–252. Hayden, F. G. (2006). The inadequacy of Forrester System dynamics computer programs for institutional principles of hierarchy, feedback, and openness. Journal of Economic Issues, 40 (2), 527–535. Hayter, C. S. (2013). Harnessing university entrepreneurship for economic growth factors of success among university spin-offs. Economic Development Quarterly, 27(1), 18–28. HEFCE (Higher Education Funding Council for England). (2008). Strategic plan 2006-11. HEFCE. Jongbloed, B. (2008). Indicators for mapping University-regional interactions. Paper presented at the ENID-PRIME Indicators Conference, Oslo, Norway, May 26–28. Jongbloed, B., Enders, J., & Salerno, C. (2008). Higher education and its communities: Interconnections, interdependencies and a research agenda. Higher Education, 56(3), 303–324. Laredo, P. (2007). Revisiting the third mission of universities: Toward a renewed categorization of university activities? Higher Education Policy, 20(4), 441–456. Leydesdorff, L., & Etzkowitz, H. (1996). Emergence of a Triple Helix of University-industrygovernment relations. Science and Public Policy, 23(5), 279–286. Lubik, S., Garnsey, E., Minshall, T., & Platts, K. (2013). Value creation from the innovation environment: Partnership strategies in university spin-outs. R&D Management, 43(2), 136–150. Manes Rossi, F., Nicolò, G., & Tartaglia Polcini, P. (2018). New trends in intellectual capital reporting: Exploring online intellectual capital disclosure in Italian universities. Journal of Intellectual Capital, 19(4), 814–835. Mastilak, C., Matuszewski, L., Miller, F., & Woods, A. (2012). 
Evaluating conflicting performance on driver and outcome measures: The effect of strategy maps. Journal of Management Control, 23(2), 97–114. McAdam, M., & Debackere, K. (2018). Beyond “Triple Helix” towards “Quadruple Helix” models in regional innovation systems. R&D Management, 48(1), 3–6. McAdam, M., Miller, K., & McAdam, R. (2016). Situated regional university incubation: A multilevel stakeholder perspective. Technovation, 50–51, 69–78.


McAdam, M., Miller, K., & McAdam, R. (2017). University business models in disequilibrium– engaging industry and end users within university technology transfer processes. R&D Management, 47(3), 458–472. Miller, B. A. (2007). Assessing organizational performance in higher education. Jossey-Bass. Miller, K., McAdam, M., & McAdam, R. (2014). The changing university business model: a stakeholder perspective. R&D Management, 44(3), 265–287. Molas-Gallart, J., Salter, A., Patel, P., Scott, A., & Duran, X. (2002). Measuring third stream activities. Final report to the Russell Group universities. SPRU, University of Sussex. Morecroft, J. D. W. (2015). Strategic modelling and business dynamics: A feedback systems approach. Wiley. Nicolò, G., Manes Rossi, F., Christiaens, J., & Aversano, N. (2020). Accountability through intellectual capital disclosure in Italian Universities. Journal of Management and Governance. https://doi.org/10.1007/s10997-019-09497-7 Olcay, G. A., & Bulu, M. (2017). Is measuring the knowledge creation of universities possible? A review of university rankings. Technological Forecasting and Social Change, 123, 153–160. Osborne, S. P., Radnor, Z., & Strokosch, K. (2016). Co-production and the co-creation of value in public services: A suitable case for treatment? Public Management Review, 18(5), 639–653. Perkmann, M., Tartari, V., McKelvey, M., Autio, E., Broström, A., D’Este, P., Fini, R., Geuna, A., Grimaldi, R., & Hughes, A. (2013). Academic engagement and commercialisation: A review of the literature on university–industry relations. Research Policy, 42, 423–442. Pilbeam, C. (2006). Generating additional revenue streams in UK universities: An analysis of variation between disciplines and institutions. Journal of Higher Education Policy and Management, 28(3), 297–311. Pinto, H., Cruz, A. R., & de Almeida, H. (2016). Academic entrepreneurship and knowledge transfer networks: Translation process and boundary organizations. In L. Carvalho (Ed.), Handbook of research on entrepreneurial success and its impact on regional development (pp. 315–344). IGI Global. Powell, W. W., & Snellman, K. (2004). The knowledge economy. Annual Review of Sociology, 30 (1), 199–220. Rhodes, R. A. W. (1990). Policy networks: A British perspective. Journal of Theoretical Politics, 2 (3), 293–317. Rhodes, R. A. W. (2017). Network governance and the differentiated polity. Oxford University Press. Ricci, R., Colombelli, A., & Paolucci, E. (2019). Entrepreneurial activities and models of advanced European science and technology universities. Management Decision, 57(12), 3447–3472. Rippa, P., & Secundo, G. (2019). Digital academic entrepreneurship: The potential of digital technologies on academic entrepreneurship. Technological Forecasting and Social Change, 146, 900–911. Sanchez, P., & Elena, S. (2006). Intellectual capital in universities. Journal of Intellectual Capital, 7 (4), 529–548. Sánchez-Barrioluengo, M. (2014). Articulating the ‘three-missions’ in Spanish universities. Research Policy, 43(10), 1760–1773. Secundo, G., Massaro, M., Dumay, J., & Bagnoli, C. (2018). Intellectual capital management in the fourth stage of IC research: A critical case study in university settings. Journal of Intellectual Capital, 9, 157–177. Secundo, G., Ndou, V., Del Vecchio, P., & De Pascale, G. (2019). Knowledge management in entrepreneurial universities: A structured literature review and avenue for future research Agenda. Management Decision, 57(12), 3226–3257. Secundo, G., Perez, S. 
E., Martinaitis, Z., & Leitner, K. H. (2015). An intellectual capital maturity model (ICMM) to improve strategic management in European universities. Journal of Intellectual Capital, 16(2), 419–442.


Secundo, G., Perez, S. E., Martinaitis, Z., & Leitner, K. H. (2017). An intellectual capital framework to measure universities’ third mission activities. Technological Forecasting & Social Change, 123, 229–239. Sicilia, M. F., Guarini, E., Sancino, A., Andreani, M., & Ruffini, R. (2016). Public services management and co-production in multi-level governance settings. International Review of Administrative Sciences, 82(1), 8–27. Sterman, J. (2000). Business dynamics: Systems thinking and modeling for a complex world. Irwin/ McGraw-Hill. Thorp, H., & Goldstein, B. (2013). Engines of innovation: The entrepreneurial university in the twenty-first century. UNC Press Books. Torfing, J., & Ansell, C. (2017). Strengthening political leadership and policy innovation through the expansion of collaborative forms of governance. Public Management Review, 19(1), 37–54. Trencher, G., Yarime, M., McCormick, K. B., Doll, C. N. H., & Kraines, S. B. (2014). Beyond the third mission: Exploring the emerging university function of co-creation for sustainability. Science and Public Policy, 41(2), 151–179. Urbano, D., & Guerrero, M. (2013). Entrepreneurial universities: Socioeconomic impacts of academic entrepreneurship in a European context. Economic Development Quarterly, 27, 40–55. Van Looy, B., Debackere, K., & Andries, P. (2003). Policies to stimulate regional innovation capabilities via university-industry collaboration: An analysis and an assessment. R&D Management, 33(2), 209–229. Watson, D., & Hall, L. (2015). Addressing the elephant in the room: Are universities committed to the third stream agenda. International Journal of Academic Research in Management, 4(2), 48–76. Webber, R., & Jones, K. (2011). Re-positioning as a response to government higher education policy development–an Australian case study. Journal of Higher Education Policy and Management, 33(1), 17–26. Wolstenholme, E. (1999). Qualitative vs quantitative modelling: The evolving balance. Journal of the Operational Research Society, 50, 422–428. Xavier, J. A., & Bianchi, C. (2019). An outcome-based dynamic performance management approach to collaborative governance in crime control: Insights from Malaysia. Journal of Management and Governance. https://doi.org/10.1007/s10997-019-09486-w

The Third Mission Strategies Disclosure Through the Integrated Plan

Natalia Aversano, Giuseppe Nicolò, Giuseppe Sannino, and Paolo Tartaglia Polcini

1 Introduction

In the last decades, Higher Education Institutions (HEIs) have undergone a process of profound social, economic, and political transformation mainly driven by the New Public Management (NPM), the Bologna Process, and the rise of the new Knowledge Economy (Sánchez & Elena, 2006; Guthrie & Neumann, 2007; Martin-Sardesai et al., 2020). Accordingly, universities have been propelled to adopt a more entrepreneurial attitude (Clark, 1998; Rubens et al., 2017). They have been encouraged to interact with the wider ecosystem from a broader economic perspective, explore new financing sources, and align their traditional academic duties with economic and social development objectives (Etzkowitz et al., 2000; Trencher et al., 2014). Therefore, alongside the two traditional academic missions of teaching and research, a third mission emerged, consisting mainly of "the generation, use, application and exploitation of knowledge and other university capabilities outside academic environments" (Molas-Gallart et al., 2002, pp. 2–3). As a result, universities started to abandon their historical ivory tower and to engage with multiple stakeholders such as industries, governments, and other public institutions to develop a common path of value creation via technology transfer and innovation, continuing education, and social engagement (Trencher et al., 2014; Secundo et al., 2016). In this way, the third mission has been progressively integrated with the other two missions from a holistic perspective.

N. Aversano (*) · G. Nicolò · P. Tartaglia Polcini Department of Management & Innovation Systems, University of Salerno, Fisciano, Italy e-mail: [email protected]; [email protected]; [email protected] G. Sannino Economics Department, University of Campania “Luigi Vanvitelli”, Capua, Italy e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2022 E. Caperchione, C. Bianchi (eds.), Governance and Performance Management in Public Universities, SIDREA Series in Accounting and Business Administration, https://doi.org/10.1007/978-3-030-85698-4_6


From this standpoint, the knowledge produced and transmitted in academia through education and research activities is also disseminated to the whole ecosystem through innovative mechanisms and networks, including licensing intellectual property, creating start-up and spin-off ventures, and developing synergies with private entities (Vorley & Nelles, 2009). Due to its relevance for the social and economic development, in most countries (e.g. USA, UK, Finland, Australia, Germany), policymakers have embarked on a regulatory process to elicit universities’ engagement in third mission activities (Rosli & Rossi, 2016). In such a way, they pay particular attention to the commercialization of research outcomes through patents, start-up and spin-off companies, collaborations, and partnerships (Rosli & Rossi, 2016; Rubens et al., 2017). In tune, performance measurement systems and public funds allocation schemes have been gradually shaped to include third mission activities evaluation (Kallio et al., 2016; Rosli & Rossi, 2016). As such, the third mission has acquired a pivotal relevance for universities, requiring adopting a long-term strategic perspective. In particular, universities have been urged to identify appropriate strategies, resources, and structures to effectively and efficiently implement, develop, and monitor third mission activities (Nelles & Vorley, 2008). Attuned, they are called to incorporate third mission functions and goals in the broader institutional strategy—in an integrated way— which contemplates both the interconnections and links with the other academic missions (research and teaching) and the economic and social impact exerted by the third mission activities on the wide ecosystem (Nelles & Vorley, 2008; Pinheiro et al., 2015). Nevertheless, despite its relevance, little research so far has been devoted to university strategic planning from a third mission perspective (Secundo, Lombardi, & Dumay, 2018; Lombardi et al., 2019). Therefore, the present research intends to enhance the understanding of third mission strategic relevance in HEIs’ context. In doing so, it attempts to offer newer insights on how Italian public universities are incorporating third mission strategies within the “Integrated Plan”, a strategic planning tool recently introduced in the Italian University context. To this aim, a content analysis—based on a disclosure index adapted from previous studies and integrated by the guidelines issued by the National Agency for Evaluation of Universities (ANVUR, 2018)—is conducted on the “Integrated plan” 2020–2022 prepared by a sample composed of the largest Italian Public universities. Italy represents a relevant case study opportunity, as Italian policymakers are devoting massive importance to universities’ third mission. Indicators to evaluate the efficiency and effectiveness of third mission activities have been introduced in both: (1) the universities’ periodic accreditation process, where since 2015 the “Third Mission and Social Impact Annual Document”—“Scheda SUA TM-IS” has been considered as an integral part of the periodic evaluation process of the quality of universities’ study courses and sites (ANVUR, 2015a, 2017) and (2) the national “Research Quality Assessment” (VQR), where, after a trial period covering the first


two assessments (VQR1 and VQR2), the VQR3 (planned to evaluate the period 2015–2019) will also include the evaluation of third mission activities, following guidelines prepared by ANVUR (2018). Accordingly, the study provides an innovative contribution to the international debate on the third mission in universities, investigating the Integrated Plan to assess the extent to which Italian public universities are disclosing information about third mission-related strategies.
The chapter is organized in six sections. After this introduction, the next section contains an overview of the rise of the third mission and of the entrepreneurial character of universities; the third section outlines the Italian context, while the fourth explains the data collected and the methodology applied; the fifth section presents and discusses the results of the content analysis; last, the concluding remarks focus on the implications and limitations of the research.

2 Literature Review

2.1 Moving Towards the Third Mission and the Entrepreneurial Character of Universities

As a result of a backdrop of social, political, and economic changes driven by NPM, Bologna Process, and the rise of new Knowledge Economy, universities’ role has dramatically changed (Guthrie & Neumann, 2007; Kallio et al., 2016; Aversano, Nicolò, et al., 2020). Universities have been spurred to pursue higher standards of quality in teaching and research and to compete—in the global arena—implementing innovative processes and recruiting the best students and researchers (Sánchez & Elena, 2006; Parker, 2011; Kallio et al., 2016). In tune, they have been provided with more financial and organizational autonomy in a context of increasing budget constraints and—in turn—encouraged to rethink their strategies and operations to find new and alternative funding sources (Parker, 2011; Sam & Van Der Sijde, 2014). As a result, universities have started to adopt a more open perspective to the external environment and the global economy, engaging with business and industry to exploit—in a more profitable way—the knowledge they internally produce (Parker, 2011; Trencher et al., 2014; Kallio et al., 2016). In this evolving context, the rise of the knowledge economy further challenged universities to be at the forefront for national economic growth and social progress by means of crafting innovative solutions and technological paradigms (Etzkowitz & Leydesdorff, 2000; Rubens et al., 2017). They have been called to enhance the regional and national economic performance and ameliorate the quality of life and society’s services at large (Molas-Gallart et al., 2002; Trencher et al., 2014).


The Triple Helix Model, grounded on the illustration of the different institutional configurations of university–industry–government relationships, has been widely used to explain the evolution of the universities and the pivotal role acquired in the innovative context depicted by the knowledge-based society (Etzkowitz et al., 2000; Etzkowitz & Leydesdorff, 2000; Goldstein, 2010). According to the Triple Helix perspective, as accumulators, producers, and providers of knowledge, universities should become the backbone for national economic growth and social progress. This requires breaking down the organizational, cultural, and normative barriers which have traditionally abstracted them from the other important spheres of society such as industry and government (Etzkowitz et al., 2000; Etzkowitz & Leydesdorff, 2000). In such a way, universities should go beyond teaching and research activities, starting to be engaged with industry and government in developing common value creation paths that address society’s need. In this perspective, the sphere of universities should contribute to creating an integrated, innovative environment, based on a continuous exchange of knowledge, skills, information flows, and human resources with the other two societal spheres, government, and industry (Etzkowitz & Leydesdorff, 2000; Secundo et al., 2016). This should result in the creation of partnerships, collaborations, spin-off, start-up, patent licensing, and other tri-lateral initiatives necessary to foster knowledge-based economic development and social transformation (Etzkowitz & Leydesdorff, 2000; Goldstein, 2010). Therefore, according to the Triple Helix model, the university can be considered as “a cost effective and creative inventor and transfer agent of both knowledge and technology” (Etzkowitz et al., 2000, p. 314), which should necessarily act in tune with government and industry to the benefit of the wide ecosystem (Secundo et al., 2016). Combining these aspects portrays a snapshot underpinning the third mission and the entrepreneurial university’s rise. The entrepreneurial university configures a hybrid organization that integrates the third mission of economic development and social engagement alongside the two traditional missions of scientific research and higher education, adopting an economic rationalist approach to its policies and strategies (Etzkowitz et al., 2000; Etzkowitz & Leydesdorff, 2000; Parker, 2011). This university type represents a response to the evolving funding environment, innovation systems, and global economic scenario (Clark, 1998; Secundo, Beer, et al., 2017). Accordingly, it is identifiable as a “knowledge-based agent” (Lombardi et al., 2019, p. 3387), which enables the dialogue between science and society and creates value for the ecosystem by enhancing technology development, innovation, and economic growth (Clark, 1998; Secundo, Beer, et al., 2017; Lombardi et al., 2019). Entrepreneurial universities rely upon three main dimensions of the third mission: (1) technology transfer and innovation; (2) continuing education or lifelong learning; and (3) social engagement (Trencher et al., 2014; Secundo et al., 2016; Secundo, Beer, et al., 2017). Technology transfer is applying information, technological innovation, and research into practical use (Frondizi et al., 2019). It configures a process of exchange of knowledge, capabilities, and competencies between HEIs, which has the final


purpose of ensuring the development and commercialization of research outputs and the creation of socio-economic value (Secundo, Beer, et al., 2017). Continuing education or lifelong learning is the process of enlarging the educational offer by providing courses, such as postgraduate and professional development programmes, that may not be solely oriented towards an academic qualification but also towards fostering entrepreneurship and innovation (Molas-Gallart et al., 2002; Mahrl & Pausits, 2011; Secundo, Beer, et al., 2017). Last, social engagement embraces the set of mutual relationships and collaborations established between universities and the communities in which they operate, based on a free exchange of knowledge and resources (Mahrl & Pausits, 2011). It also includes the programmes realized to preserve the welfare of employees and students, as well as public initiatives such as cultural and scientific events, seminars, training days, meetings, and forums that HEIs organize to create networks and connections with the stakeholders' community (Secundo et al., 2016; Lombardi et al., 2019; Aversano, Di Carlo, et al., 2020).

2.2 Theoretical Background and Prior Research

Stakeholder theory describes the complex bundle of relationships surrounding organizations and all the actors who are likely to influence or be influenced by their activities (Freeman, 1984; Jongbloed et al., 2008; Kaur & Lodhia, 2018). According to this theory, the performance and the continued viability of any type of organization strictly depend on its ability to incorporate the views and expectations of its stakeholders in both strategies and processes (Boesso & Kumar, 2007). This discourse is particularly relevant in the HE context, where universities—as public sector entities mainly relying on public funds to deliver public services—should be concerned with the needs of a wide range of stakeholders (e.g. students, citizens, donors, research centres, private firms, other public institutions, foundations) interested in their activities and performance (Manes Rossi et al., 2018; Ramirez et al., 2019; Nicolò, 2020).
The third mission's rise has dramatically expanded universities' boundaries, questioning their canonical mission and roles (Pinheiro et al., 2015; Secundo et al., 2016; Nicolò, 2020). The traditional social contract between society and the university—according to which the latter was called to provide excellent education and research programmes to the benefit of the former—has been significantly challenged (Jongbloed et al., 2008). Universities have been increasingly called to foster their stakeholders' active engagement in their activities to draw common patterns of value creation that enable society's economic, social, and cultural growth (Jongbloed et al., 2008; Secundo et al., 2016; Secundo, Massaro, et al., 2018). Following this perspective, they are increasingly intertwined and interdependent with a forum of external stakeholders at the local, national, and international levels, whose interests and concerns need to be addressed through proactive engagement


pathways (Pinheiro et al., 2015; Secundo et al., 2016; Secundo, Massaro, et al., 2018; Kaur & Lodhia, 2018). Therefore, in this scenario, universities are strongly pressed “to shift from an administrative focus to a strategic one” (Secundo, Massaro, et al., 2018, p. 158). Their success largely depends on the extent to which the resources and the actors involved in the third mission activities are strategically integrated and interconnected with those related to the traditional missions of teaching and research, following a holistic approach oriented to the creation of value for the broad ecosystem (Nelles & Vorley, 2008; Vorley & Nelles, 2009). A strategic approach to the third mission is fundamental to bolster stakeholders’ engagement and is essential to transform public universities “into more enterprise-like institutions” (Secundo, Massaro, et al., 2018, p. 158). To this aim, a comprehensive strategic plan in which both long-term strategic and short-term operational objectives related to the three missions are identified is of utmost importance (Nelles & Vorley, 2008; Vorley & Nelles, 2009). It may help universities to improve the decision-making process and the accountability towards stakeholders, by means of portraying a clearer picture of: the objectives of the third mission activities; the related issues to be addressed; the stakeholders involved in processes; and the possible strategic alternatives to be pursued to fulfil the objectives (Nelles & Vorley, 2008; Mazzara et al., 2010; Sullivan & Richardson, 2011). However, despite its relevance, research is still lagging behind in investigating university strategic planning tools, especially regarding the third mission dimension. Abdulkareem et al. (2012) investigated the relationship between strategic plan implementation and internal efficiency in Nigerian universities, observing that the implementation of strategic plans—ensuring the participation of a high number of stakeholders in universities’ decision-making processes—allows universities to enhance efficiency in terms of increase in students’ graduation rate and decrease in wastage rate. Also Sullivan and Richardson (2011) recognized the need to foster the engagement of both internal and external stakeholders within the strategic plan implementation process; accordingly, they proposed a model of an integrated strategic plan which aligns institutional mission and vision-based strategic initiatives with continuing higher education practice and outcomes assessment. Further, other scholars focused the attention on strategic plans as tools to manage Intellectual Capital (IC)—through increased stakeholder engagement—and foster value creation through third mission activities such as social engagement, internationalization, and regional development (Secundo, Lombardi, & Dumay, 2018; Lombardi et al., 2019). Another specific strand of scholars, in the Italian context, focused on the threeyears university performance plan as a tool to define expected performance levels and formalize institutional mission, strategic and operational objectives to improve decision-making and discharging broader accountability and transparency, also regarding IC resources (Siboni et al., 2013; Allini et al., 2017, 2019). They observed that Italian universities still attach limited strategic value to such tool (Allini et al., 2017), although they recognize the importance of IC resources, such as developing relationships with external partners (Siboni et al., 2013). Moreover, the early


transition to accrual accounting, the geographical location, and the implementation of management accounting tools are identified as factors that affect the comprehensiveness of disclosure provided through performance plans (Allini et al., 2019). The introduction of the new Three-Years Integrated Plan within the Italian public university system (D.Lgs. 150/2009; ANVUR, 2015a) makes the scope for universities to adopt a holistic perspective in defining institutional objectives and strategies and promoting the stakeholders’ engagement (Allini et al., 2020; Aversano, Nicolò, et al., 2020). It is based on an extended and integrated concept of performance— linked to the three missions—which is conceived as “the ability of universities to interact dynamically with other entities (both private and public) in a mutually beneficial and sustainable relationship” (Aversano, Nicolò, et al., 2020, p. 6). In this way, such document marked a step forward in the strategic planning process of Italian universities as it requires systematic planning of administrative activities related to performance, transparency, and anti-corruption, in tune with both the strategies to be implemented in conducting institutional activities (teaching, research, and third mission) and the necessary economic and financial planning (ANVUR, 2015a). The introduction of the Integrated Plan is giving a new impulse to research in the university realm. Accordingly, scholars started to investigate the university ThreeYears Integrated Plan to examine disclosure patterns in a legitimacy (Allini et al., 2020) or IC (Aversano, Nicolò, et al., 2020) perspective. However, little is known about this tool’s potential to integrate information about third mission strategies alongside those related to the other two missions. To address this gap, this study investigated a sample of the largest Italian Public universities to assess to what extent they are integrating third mission-related strategies within the Three-Years Integrated plan 2020–2022.

3 The Italian Context: The Third Mission and the Integrated Plan

In Italy, the evaluation criteria for third mission activities are included in the qualitative evaluation of scientific research.1 The first concrete opportunity was represented by the 2004–2010 Research Quality Assessment (VQR 2004–2010),2 in which eight third mission indicators were defined: some related to the activities of economic valorization of knowledge (number of patents, number of spin-off companies, participation in incubators, and consortia for technology transfer), and others related to the enhancement of knowledge for the well-being of society (management of archaeological sites, museum poles).

1 https://www.anvur.it/en/activities/vqr/
2 https://www.anvur.it/attivita/vqr/vqr-2004-2010/


Table 1 Evaluation areas third mission/social impact

Strategic objectives of third mission/social impact:

Enhancement of research
1. Industrial property management (patents, plant rights)
2. Spin-off companies
3. Third-party activities (research/consultancy contracts with external clients, which were not considered among the revenues deriving from competitive projects)
4. Intermediary structures (technology transfer offices, placement offices, incubators, science parks, consortia and associations for the third Mission)

Production of public goods
5. Management of heritage and cultural activities (archaeological excavations, museum poles, musical activities, historic buildings and archives, historical libraries and newspaper libraries, theatres and sports facilities)
6. Public health activities (clinical trial, non-interventional studies and empowerment, support structures)
7. Continuing education, lifelong learning and open teaching (continuing education activities, continuing medical education activities, certification of skills activities, school-work alternation, massive open online courses)
8. Public engagement (organization of concerts, theatrical performances, sporting events, exhibitions, organization of research sharing initiatives, policy making)

Source: ANVUR (2018)

This first experimentation led to the drafting of the Manual for the evaluation of the third mission (ANVUR, 2015b), which entails the collection of standardized and comparable data related to both the research enhancement activities (patents, spin-offs, third-party contracts and agreements, intermediary structures) and the production of social and cultural public goods (public engagement, cultural heritage, continuous training, clinical trials). With the second Research Quality Assessment 2011–2014 (VQR 2011–2014),3 this manual was used as a reference for evaluating the third mission. Subsequently, in keeping with the third Research Quality Assessment 2015–2019 (VQR 2015–2019),4 new guidelines were developed in 2018 for "the Third Mission and Social Impact" (ANVUR, 2018). The new guidelines sought to maintain a conservative approach concerning the indicators (Scheda SUA-TM) used for the previous assessment cycle (VQR 2011–2014). However, based on the requests of the universities, some sections have been revised. Above all, the third mission activities defined as the production of public goods have been enlarged, strengthening the detection of their social, cultural, and economic impact. The new guidelines articulate the third mission's evaluation into eight areas related to the Enhancement of research and the Production of public goods (Table 1).

3 https://www.anvur.it/attivita/vqr/vqr-2011-2014/
4 https://www.anvur.it/attivita/vqr/vqr-2015-2019/


The guidelines require Italian universities to indicate in the Scheda SUA-TM/IS the strategic lines and their third mission activities’ main objectives. Universities must provide information on the third mission’s positioning within the main documents (statute, strategic plan, or other programmatic documents). Therefore, if the Scheda SUA-TM/IS is correctly filled, it allows reading the data on individual areas from a systemic perspective. This holistic approach is in line with the recent reforms that characterized Italian Higher Education, where more attention has been devoted to ensuring the coherence between planning activities and reporting outcomes to ensure transparency and broader accountability (Aversano et al., 2017; Allini et al., 2020). For these reasons, in 2015, ANVUR issued the Guidelines for the integrated management of Italian State universities’ performance cycle to prepare a new plan called the “Integrated Plan” (ANVUR, 2015a). Integrated Plan aims to ensure greater consistency with the strategic planning system (multi-year) and with the university’s economic-financial (annual) system. The guidelines identify the Integrated Plan as the tool to operationally decline both the indications in the strategic documents and the initiatives to improve the university’s management processes’ effectiveness and efficiency. The document, anchoring university performance with its three missions (teaching, research, and third mission), is composed of five sections (1. Strategic framing of the university; 2. Organizational performance; 3. Analysis of the risk areas; 4. Communication and transparency; 5. Individual performance). Also, the Integrated Plan guidelines provide a list of indicators related to research, teaching, and the third mission that recall the set of indicators developed by ANVUR within the VQR model and procedures of “Self-assessment Periodic Assessment and Accreditation” (AVA procedures5). In January 2019, ANVUR issued new “Guidelines for the integrated management of university performance and budget cycles”. This document represents an important step forward in defining an increasingly integrated programming of universities. Specifically, ANVUR suggests that universities should pursue ever more precise integration between strategic objectives and resource planning. This aspect was subject to an interesting experimental intervention in the management systems used to implement the 2019–2021 Budget, and it was possible to create a specific link between strategic objectives and budgets.

5 https://www.anvur.it/en/activities/ava/

4 Methodology and Data Collection

To ascertain the extent to which Italian public universities are disclosing information about third mission-related strategies, the Integrated Plans prepared by a sample of Italian public universities for the period 2020–2022 have been analysed. Within a total sample of 61 Italian public universities, the present research focuses on the mega and big state universities with over 20,000 students enrolled. More specifically, according to the last ranking6 of Italian universities (2019/2020) carried out by CENSIS (Center for Social Investment Studies), ten universities fall in the category of mega universities (over 40,000) and 16 are included in the category of big universities (from 20,000 to 40,000). We chose to focus on the largest Italian universities because, according to scholars (Gallego-Alvarez et al., 2011; Ramirez et al., 2019; Nicolò et al., 2020), larger universities are more politically visible and face a greater forum of stakeholders. They are more prone to disclose financial and non-financial information to mitigate external pressures and meet stakeholders' information needs (Manes Rossi et al., 2018; Aversano, Nicolò, et al., 2020; Nicolò et al., 2020). Accordingly, a higher amount of information about third mission-related strategies is expected to be found in the Integrated Plans produced by larger Italian public universities.
The Italian universities' Integrated Plans 2020–2022 were downloaded from each university's official website and examined through content analysis (Krippendorff, 2018) performed manually by two of the four authors of this chapter. Content analysis has been defined as a "research technique for obtaining replicable and valid inferences from data based on their context" (Krippendorff, 1980, p. 31). It is a mainstream research method in financial and non-financial disclosure studies as it allows published information to be examined in a convenient, objective, and systematic manner (Guthrie et al., 2004; Cuozzo et al., 2017). The content analysis is based on a coding list defined taking into consideration both previous studies that analysed the third mission from an IC perspective (Sangiorgi & Siboni, 2017; Secundo, Perez, et al., 2017; Di Berardino & Corsi, 2018; Ndou et al., 2018) and the information requested by the ANVUR guidelines on "the third mission and social impact" (ANVUR, 2018). As shown in Table 2, the final checklist comprises 26 third mission-related items grouped into 8 categories.
To quantify the information gathered through the content analysis and assess the level of third mission strategy disclosure that the sampled Italian public universities provide through the Integrated Plan, an unweighted disclosure index, based on a dichotomous approach, was developed (Manes Rossi et al., 2018; Aversano, Nicolò, et al., 2020). This choice considers all items equally important and limits the potential for subjectivity weaknesses (Manes Rossi et al., 2018; Aversano, Nicolò, et al., 2020). Accordingly, a score of one was attributed if the item was disclosed in the document, and a value of zero was assigned otherwise. Therefore, the disclosure index was calculated as follows:

6 https://www.censis.it/formazione/la-classifica-censis-delle-universit%C3%A0-italianeedizione-20192020


Table 2 Third mission categories

1. Intellectual property: Patents rights; Publications; No. of researchers; No. of PhD students; No. of research fellows; Growth of scientific staff
2. Spin-off: Spin-off; Start-up
3. Activities for third parties: Programme contracts/conventions/agreements with external entities
4. Public and social engagement: Social engagement and community involvement; Welfare policies
5. Heritage assets: Museums; Historical libraries; Theatres; Historical buildings
6. Continuing education, training, and open courses: Post-graduation, high education, and specialization programmes; Continuing training courses and vocational training courses; Massive Open Online Courses (MOOC)/e-learning
7. Intermediation structures: Third mission structures and resources; Incubators; Research consortia and cluster; Partnerships and collaboration; Placement
8. Internationalization: Internationalization of teaching staff (incoming and outgoing); Internationalization of students (incoming and outgoing); Joint degrees/double degree programmes

Source: Our elaboration

\[
\text{Third mission strategies disclosure index} = \frac{\sum_{i=1}^{l} d_i}{l}
\]

where \(d_i = 1\) if the item was disclosed and 0 otherwise, and \(l\) is the maximum number of items (26 items). The information was searched in the first section of the document, related to the university's strategic framing, because it is the area most coherent with the information being sought. In this initial section of the Integrated Plan, the university indicates the main development lines of its administrative activity. These lines also refer to the institutional mandate of the university in terms of research, teaching, and third mission, with explicit evidence of the actions programmed and undertaken to improve the quality of research and the positioning in the national and international context, as requested by evaluation (VQR) and self-evaluation (SUA-RD) rules.
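To make the scoring rule concrete, the following minimal Python sketch applies the dichotomous, unweighted index defined above to one hypothetical plan; the abbreviated item names follow the checklist in Table 2, while the 0/1 codings are invented for illustration and are not taken from the authors' dataset.

```python
# Illustrative sketch only: the dichotomous, unweighted disclosure index defined above.
# Item names are abbreviated from Table 2; the 0/1 codings below are invented, not the
# authors' actual data.

CHECKLIST = [
    # 1. Intellectual property
    "patent_rights", "publications", "n_researchers", "n_phd_students",
    "n_research_fellows", "growth_scientific_staff",
    # 2. Spin-off
    "spin_off", "start_up",
    # 3. Activities for third parties
    "third_party_contracts",
    # 4. Public and social engagement
    "social_engagement", "welfare_policies",
    # 5. Heritage assets
    "museums", "historical_libraries", "theatres", "historical_buildings",
    # 6. Continuing education, training, and open courses
    "postgraduate_programmes", "continuing_training", "mooc_elearning",
    # 7. Intermediation structures
    "tm_structures", "incubators", "research_consortia", "partnerships", "placement",
    # 8. Internationalization
    "intl_teaching_staff", "intl_students", "joint_degrees",
]

def disclosure_index(coded_items):
    """Share of the 26 checklist items disclosed in one Integrated Plan (d_i = 1 if disclosed)."""
    return sum(coded_items.get(item, 0) for item in CHECKLIST) / len(CHECKLIST)

# Hypothetical manual coding of one plan: 1 = item disclosed, 0 = not disclosed.
example_plan = {item: 0 for item in CHECKLIST}
for disclosed in ("n_researchers", "partnerships", "social_engagement",
                  "spin_off", "intl_students", "joint_degrees"):
    example_plan[disclosed] = 1

print(len(CHECKLIST))                            # 26 items
print(round(disclosure_index(example_plan), 2))  # 6 disclosed items -> 0.23
```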


Table 3 Content analysis results. Individual descriptive statistics

Third mission items | N. of items | Mean | Variance | Min | Max | Disclosure Index
Intellectual property | 6 | 3 | 2.23 | 0 | 6 | 0.47
Spin-off | 2 | 1 | 0.69 | 0 | 2 | 0.50
Activities for third parties | 1 | 0.7 | 0.23 | 0 | 1 | 0.70
Public and social engagement | 2 | 1.35 | 0.38 | 0 | 2 | 0.67
Heritage assets | 4 | 1.12 | 0.72 | 0 | 3 | 0.28
Continuing education, training, and open courses | 3 | 1.69 | 0.68 | 0 | 3 | 0.56
Intermediation structures | 5 | 2.38 | 1.78 | 0 | 5 | 0.48
Internationalization | 3 | 1.92 | 1.23 | 0 | 3 | 0.63
Total | 26 | 12.92 | 23.70 | 0 | 22 | 0.50

Source: Our elaboration

5 Findings

Results summarized in Table 3 show that the mean disclosure index was 0.50 and that the universities disclosed on average about 12.92 third mission strategy-related items through their Integrated Plans. The highest score is 0.85 (22 of 26 third mission items disclosed), and the lowest score is 0 (0 of 26 third mission items disclosed). The lowest score refers to a university that has not provided any information on third mission strategies. These general results show a medium level of disclosure, evidencing how the Integrated Plan may represent a useful tool to communicate to internal and external stakeholders information about the strategies, the actors, and the structures involved in third mission activities (Nelles & Vorley, 2008). However, as already noted by Aversano, Nicolò, et al. (2020), in some universities the Integrated Plan is still far from being accepted as a common reporting practice for the sake of both internal and external stakeholders. While some universities are exploiting the potential of this tool to incorporate important information about third mission strategies, others convey a minimal amount of information to comply with the minimum requirements expressed in the guidelines (ANVUR, 2015a).
Table 3 also evidences similar paths of disclosure for the eight categories considered: Activities for third parties (0.70) is the most disclosed category, followed by Public and social engagement (0.67) and Internationalization (0.63). These three categories are closely linked to each other and can be framed in the wider Social Capital concept (Secundo, Perez, et al., 2017). More specifically, the high disclosure index for "Activities for third parties" may be interpreted as the effect of the institutional pressure exerted by ANVUR through the guidelines for "the third mission and social impact" (ANVUR, 2018). On the other hand, results related to "Public and social engagement" underline the efforts made by Italian public universities to evidence the strategies implemented to increase community involvement via mutual relationships and collaborations based on a free exchange of knowledge and resources (Mahrl & Pausits, 2011) and public initiatives such as cultural and scientific events, seminars, training days, meetings, and forums (Ndou et al., 2018; Aversano, Nicolò, et al., 2020). The engagement with the territory also signals the willingness to achieve a consensus among the stakeholders to develop a shared strategic integrated plan (Secundo, Perez, et al., 2017). This is mirrored by the special attention devoted to the "Internationalization" category. Indeed, to pursue the social and regional development agenda, universities must adopt coherent internationalization strategies that allow them to shift the focus of knowledge transfer from a single organization to its ecosystem and remove cultural barriers among universities and external stakeholders (Secundo, Massaro, et al., 2018).
On the contrary, the least disclosed category is Heritage assets (0.28). This category's low disclosure index contrasts with the ANVUR guidelines (2018) requirements that ascribe great importance to this aspect. This result also conflicts with the study of Aversano, Di Carlo, et al. (2020), which recognized heritage assets as a relevant strategic area for assessing the third mission and social impact of the university. Aversano, Di Carlo, et al. (2020) assert that, in the light of the third mission, universities have the duty to preserve, protect, make accessible, and adequately manage the heritage assets they hold. However, the absence of information about heritage assets can be explained by the serious difficulty of identifying an adequate and transparent accounting treatment for this type of asset (Aversano, Di Carlo, et al., 2020). In fact, in Italy, several universities are still carrying out an identification process of the main heritage assets they hold, and they are still searching for the best way to recognize, evaluate, and disclose them.
Table 4 shows the results of the content analysis for each third mission item. The most disclosed third mission components are: Number of researchers (85%, Intellectual property), Partnerships and collaboration (77%, Intermediation structures); Social engagement and community involvement (77%, Public and social engagement); Joint degrees/double degree programmes (73%, Internationalization); Internationalization of students (69%, Internationalization). The least disclosed third mission items are: Theatres (0%, Heritage assets); Growth of scientific staff (8%, Intellectual property); and Research consortia and cluster (27%, Intermediation structures). These findings corroborate the results of Aversano, Nicolò, et al. (2020). Italian universities' Integrated Plans provide adequate information about the human resources involved in teaching activities and learning processes (e.g. researchers), representing the fundamental drivers of innovation and strategic renewal (Secundo, Massaro, et al., 2018). The results also show that the Integrated Plans do not provide enough information about the intermediation structures involved in technology transfer and innovation (e.g. research consortia and cluster). On the other hand, these findings confirm the results of the previous part of the analysis, evidencing that universities are widely disclosing social engagement and community involvement, international programmes for students' mobility,


Table 4 Content analysis results for individual third mission items

Third mission items | No. of Universities | % of Universities
Intellectual property
  Patents rights | 14 | 54%
  Publications | 14 | 54%
  No. of researchers | 22 | 85%
  No. of PhD students | 12 | 46%
  No. of Research fellows | 9 | 35%
  Growth of scientific staff | 2 | 8%
Spin-off
  Spin-off | 15 | 58%
  Start-up | 11 | 42%
Activities for third parties
  Programme contracts/conventions/agreements with external entities | 17 | 65%
Public and social engagement
  Social engagement and community involvement | 20 | 77%
  Welfare policies | 15 | 58%
Heritage assets
  Museum | 8 | 31%
  Historical libraries | 9 | 35%
  Theatres | 0 | 0%
  Historical buildings | 12 | 46%
Continuing education, training, and open courses
  Post-graduation, high education, and specialization programmes | 14 | 54%
  Continuing training courses and vocational training courses | 16 | 62%
  Massive open online courses (MOOC)/e-learning | 14 | 54%
Intermediation structures
  Third mission structures and resources | 11 | 42%
  Incubators | 8 | 31%
  Research consortia and cluster | 7 | 27%
  Partnerships and collaboration | 20 | 77%
  Placement | 16 | 62%
Internationalization
  Internationalization of teaching staff (incoming and outgoing) | 13 | 50%
  Internationalization of students (incoming and outgoing) | 18 | 69%
  Joint degrees/double degree programmes | 19 | 73%

Source: Our elaboration

internationalization of students and partnerships, and collaboration via integrated plans. In particular, social engagement disclosure is important because it creates a social entrepreneurship environment in which there is a mutually beneficial


exchange of knowledge and resources between universities and their larger communities (local, regional, national, and global) (Mahrl & Pausits, 2011). Further, the disclosure of partnerships and collaboration is fundamental, showing stakeholders how the university actively engages with the external environment to create socio-economic value and develop strong connections with enterprises and other public institutions (e.g. national and regional governments and other universities) in light of the triple helix paradigm (Ndou et al., 2018). Moreover, as already evidenced by the study of Aversano, Nicolò, et al. (2020), the disclosure of international programmes for students’ mobility demonstrates how a university moves the knowledge created by several countries’ communities. This is made by developing relationships with external entities and expanding its human capital over organizational boundaries by engaging external stakeholders (such as international students) who become part of the university’s human capital and the whole ecosystem (Secundo, Massaro, et al., 2018).
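As a complement to the discussion of Tables 3 and 4, the sketch below shows how item-level percentages and category-level disclosure indices of this kind can be derived from a universities-by-items matrix of dichotomous codes; the randomly generated matrix is only a stand-in for the authors' manually coded data, so the printed numbers will not match the published ones.

```python
# Illustrative sketch only: derives Table 3/Table 4-style statistics from a synthetic
# 26-university x 26-item matrix of 0/1 codes (random data, not the authors' dataset).
import random
import statistics

CATEGORIES = {  # category -> number of checklist items (see Table 2)
    "Intellectual property": 6,
    "Spin-off": 2,
    "Activities for third parties": 1,
    "Public and social engagement": 2,
    "Heritage assets": 4,
    "Continuing education, training, and open courses": 3,
    "Intermediation structures": 5,
    "Internationalization": 3,
}
N_UNIVERSITIES = 26
N_ITEMS = sum(CATEGORIES.values())  # 26

random.seed(0)
scores = [[random.randint(0, 1) for _ in range(N_ITEMS)]
          for _ in range(N_UNIVERSITIES)]

# Table 4 style: share of universities disclosing each individual item.
item_share = [sum(row[j] for row in scores) / N_UNIVERSITIES for j in range(N_ITEMS)]
print("Share of universities disclosing the first item:", f"{item_share[0]:.0%}")

# Table 3 style: per-category mean, variance, min, max and disclosure index.
start = 0
for category, n_items in CATEGORIES.items():
    per_univ = [sum(row[start:start + n_items]) for row in scores]
    index = sum(per_univ) / (N_UNIVERSITIES * n_items)
    print(f"{category}: mean={statistics.mean(per_univ):.2f}, "
          f"var={statistics.variance(per_univ):.2f}, "
          f"min={min(per_univ)}, max={max(per_univ)}, index={index:.2f}")
    start += n_items

# Aggregate figures of the kind discussed above (in the chapter: a mean of 12.92 items,
# a mean index of 0.50, and a maximum index of 22/26, i.e. about 0.85).
totals = [sum(row) for row in scores]
print(f"Mean items disclosed per plan: {statistics.mean(totals):.2f}")
print(f"Mean disclosure index: {statistics.mean(totals) / N_ITEMS:.2f}")
```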

6 Conclusion and Future Research

Teaching and research activities have traditionally been considered the foremost functions of universities. However, in recent years, universities have been progressively encouraged to play a more direct role in supporting economic development and directly impacting the local community and society at large, that is, to engage in the so-called third mission (Molas-Gallart et al., 2002). Accordingly, incorporating the third mission in the broader institutional strategy has become a fundamental success factor for modern universities. The three missions should not be treated as isolated spheres but integrated, from a holistic perspective, into the comprehensive institutional strategic plan (Nelles & Vorley, 2008; Vorley & Nelles, 2009).
The recent introduction of the Integrated Plan in the Italian public university context (ANVUR, 2015a, 2015b) creates scope for investigating to what extent universities are adopting an integrated approach to strategic management, sharing with both internal and external stakeholders their vision, objectives, and strategies that contemplate third mission relevance. Therefore, this study provides an innovative contribution to the international debate on third mission strategic relevance, offering fresh insights into how Italian public universities incorporate third mission strategies within the Integrated Plan. Connecting third mission-related strategies to the Integrated Plan can contribute to a deeper practical and theoretical understanding of universities as entrepreneurial entities.
Accordingly, the results show that the Integrated Plan can be considered a valid, although not exhaustive, tool to reflect third mission-related goals and strategies to the benefit of both internal and external stakeholders. However, while some universities are exploiting this instrument's potential to adopt a more proactive approach toward the third mission—including most of the strategic issues and policies to improve the engagement with the ecosystem—others still lag behind. A prevalent attitude to include information highlighting social engagement, community involvement, and internationalization commitment emerges, while limited attention towards heritage assets' strategic relevance is detected, despite the importance attached to them by recent ANVUR guidelines (ANVUR, 2018). This result echoes Allini et al. (2020, p. 88), who contended that these tools "still have to prove their ability to convey a full rethinking of the Universities' approach to the changes that nowadays influence their performance and accountability in its broadest sense".
The study has some practical implications for policymakers, public sector practitioners, and academics. The third mission is gaining progressive prominence in both universities' periodic accreditation process and research assessment exercises. Thus, policymakers might consider this study as a basis to examine to what extent Italian public universities are attributing importance to third mission-related strategies within their Integrated Plans. On the one hand, the overall findings may represent a stimulus for policymakers in pursuing their strategy to encourage the integration of the three missions' strategic planning, considering the interrelations and interdependencies between them. Accordingly, the Integrated Plan may stimulate universities to adopt a holistic approach in defining goals, objectives, and strategies related to research, teaching, and third mission results. On the other hand, our results evidence some pitfalls that may help policymakers and public sector practitioners in formulating further guidelines and recommendations that attribute greater attention to aspects (e.g. heritage assets) that Italian universities still neglect in their Integrated Plans. Last, academics may use or adapt the framework employed in this research to replicate the study in other geographical contexts or to apply it to other strategic planning tools adopted in other countries.
However, this study has some limitations that may represent starting points for future research. For example, the data collected were necessarily limited to a single period (2020–2022), so the study does not capture third mission disclosure trends. Future research could replicate the analysis with a focus on third mission disclosure trends, or determine how and to what extent contextual factors affect this kind of disclosure. Moreover, the analysis presented is based on an unweighted disclosure index, while future research might conduct a similar analysis by adopting a weighted method based on stakeholders' perceptions and comparing the results with those of the present study. Further analysis may also investigate stakeholders' perceptions of their third mission disclosure information needs through ad hoc surveys.

References

Abdulkareem, A. Y., Akinnubi, O. P., & Oyeniran, S. (2012). Strategic plan implementation and internal efficiency in Nigerian Universities. European Scientific Journal, 8(3), 244–257.


Allini, A., Caldarelli, A., & Spanò, R. (2017). La disclosure nei Piani della Performance delle università italiane. Intenti simbolici verso approcci sostanziali di legittimazione. Management Control, 1(1), 37–59. https://doi.org/10.3280/MACO2017-001003 Allini, A., Caldarelli, A., Spanò, R., & Zampella, A. (2019). Legitimating efforts in performance plans. Evidences on the thoroughness of disclosure in the Italian Higher Education setting. Management Control, 26, 143–168. Allini, A., Spanò, R., Zampella, A., & Meucci, F. (2020). Integrated performance plans in higher education as means of accounting change. Insights into the Italian context. Management Control, 1, 88–110. ANVUR. (2015a). Linee Guida per la gestione integrata del Ciclo della Performance delle università statali italiane. Retrieved from http://www.anvur.org/attachments/article/806/Linee %20Guida%20Atenei.pdf ANVUR. (2015b). La valutazione della terza missione nelle università italiane. Retrieved from http://www.anvur.it/attachments/article/26/Manuale%20valutazione%20terza~.pdf ANVUR. (2017). Accreditamento periodico delle sedi e dei corsi di studio universitari - linee guida. Retrieved from https://www.anvur.it/attivita/ava/accreditamento-periodico/linee-guidaper-laccreditamento-periodico ANVUR. (2018). Linee guida SUA-Terza Missione e Impatto Sociale delle Università italiane. Retrieved from www.anvur.it/wp-content/uploads/2018/11/SUA-TM_Lineeguida.pdf Aversano, N., Di Carlo, F., Sannino, G., Tartaglia Polcini, P., & Lombardi, R. (2020). Corporate social responsibility, stakeholder engagement, and universities: New evidence from the Italian scenario. Corporate Social Responsibility and Environmental Management. https://doi.org/10. 1002/csr.1934 Aversano, N., Manes-Rossi, F., & Tartaglia Polcini, P. (2017). Performance measurement systems in universities: A critical review of the Italian system. In E. Borgonovi et al. (Eds.), Outcomebased performance management in the public sector (pp. 269–287). Springer. https://doi.org/ 10.1007/978-3-319-57018-1_14 Aversano, N., Nicolò, G., Sannino, G., & Tartaglia Polcini, P. (2020). The integrated plan in Italian public universities: New patterns in intellectual capital disclosure. Meditari Accountancy Research, 28(4), 655–679. https://doi.org/10.1108/MEDAR-07-2019-0519 Boesso, G., & Kumar, K. (2007). Drivers of corporate voluntary disclosure: A framework and empirical evidence from Italy and the United States. Accounting, Auditing & Accountability Journal, 20(2), 269–296. Clark, B. (1998). Creating entrepreneurial universities: Organisational pathways of transformation. IAU Press and Pergamon. Cuozzo, B., Dumay, J., Palmaccio, M., & Lombardi, R. (2017). Intellectual capital disclosure: A structured literature review. Journal of Intellectual Capital, 18(1), 9–28. Di Berardino, D., & Corsi, C. (2018). A quality evaluation approach to disclosing third mission activities and intellectual capital in Italian universities. Journal of Intellectual Capital, 19(1), 78–201. Etzkowitz, H., & Leydesdorff, L. (2000). The dynamics of innovation: From national systems and ‘mode 2’ to a triple helix of university-industry-government relations. Research Policy, 29, 109–123. Etzkowitz, H., Webster, A., Gebhardt, C., Terra, B., & Cantisano, R. (2000). The future of the university and the university of the future: Evolution of ivory tower to entrepreneurial paradigm. Research Policy, 29(2), 313–330. Freeman, R. E. (1984). Strategic management: A stakeholder approach. Pitman. 
Frondizi, R., Fantauzzi, C., Colasanti, N., & Fiorani, G. (2019). The evaluation of universities’ third mission and intellectual capital: Theoretical analysis and application to Italy. Sustainability, 11(12), 3455. https://doi.org/10.3390/su11123455 Gallego-Alvarez, I., Rodríguez-Domínguez, L., & García-Sánchez, I. M. (2011). Information disclosed online by Spanish universities: Content and explanatory factors. Online Information Review, 35(3), 360–385.


Goldstein, H. (2010). The ‘entrepreneurial turn’ and regional economic development mission of universities. Annals of Regional Science, 50(2), 453–477. Guthrie, J., & Neumann, R. (2007). Economic and non-financial performance indicators in universities: The establishment of a performance-driven system for Australian higher education. Public Management Review, 9(2), 231–252. Guthrie, J., Petty, R., Yongvanich, K., & Ricceri, F. (2004). Using content analysis as a research method to inquire into intellectual capital reporting. Journal of Intellectual Capital, 5(2), 282–293. Jongbloed, B., Enders, J., & Salerno, C. (2008). Higher education and its communities: Interconnections, interdependencies and a research agenda. Higher Education, 56(3), 303–324. Kallio, K. M., Kallio, T. J., Tienari, J., & Hyvönen, T. (2016). Ethos at stake: Performance management and academic work in universities. Human Relations, 69(3), 685–709. Kaur, A., & Lodhia, S. (2018). Stakeholder engagement in sustainability accounting and reporting: A study of Australian local councils. Accounting, Auditing & Accountability Journal, 31(1), 338–368. Krippendorff, K. (1980). Content analysis: An introduction to its methodology. Sage. Krippendorff, K. (2018). Content analysis: An introduction to its methodology. Sage. Lombardi, R., Massaro, M., Dumay, J., & Nappo, F. (2019). Entrepreneurial universities and strategy: The case of the University of Bari. Management Decision, 57(12), 3387–3405. https://doi.org/10.1108/MD-06-2018-0690 Mahrl, M., & Pausits, A. (2011). Third Mission indicators for new ranking methodologies. Evaluation in Higher Education, 4(1), 43–65. Manes Rossi, F., Nicolò, G., & Tartaglia Polcini, P. (2018). New trends in intellectual capital reporting: Exploring online intellectual capital disclosure in Italian universities. Journal of Intellectual Capital, 19(4), 814–835. Martin-Sardesai, A., Guthrie, J., & Tucker, B. P. (2020). What you see depends on where you look: Performance measurement of Australian accounting academics. Accounting, Auditing & Accountability Journal. https://doi.org/10.1108/AAAJ-08-2019-4133 Mazzara, L., Sangiorgi, D., & Siboni, B. (2010). Public strategic plans in Italian local governments: A sustainability development focus? Public Management Review, 12(4), 493–509. Molas-Gallart, J., Salter, A., Patel, P., Scott, A. & Duran, X. (2002), Measuring third stream activities: Final report of the Russell group of universities, SPRU Science and Technology Policy Research Unity, University of Sussex. Ndou, V., Secundo, G., Dumay, J., & Gjevori, E. (2018). Understanding intellectual capital disclosure in online media big data: An exploratory case study in a university. Meditari Accountancy Research, 26(3), 499–530. Nelles, J., & Vorley, T. (2008). Entrepreneurial architecture in UK higher education institutions: Consolidating the third mission. In DRUID, 25th Celebration conference, June (pp. 18–20). CBS. Nicolò, G. (2020). The rise of intellectual capital disclosure from private to public sector: The case of Italian public universities. Collana Accounting & Business Studies. Nicolò, G., Manes-Rossi, F., Christiaens, J., & Aversano, N. (2020). Accountability through intellectual capital disclosure in Italian universities. Journal of Management and Governance. https://doi.org/10.1007/s10997-019-09497-7 Parker, L. (2011). University corporatisation: Driving redefinition. Critical Perspectives on Accounting, 22(4), 434–450. Pinheiro, R., Langa, P. V., & Pausits, A. (2015). One and two equals three? 
The third mission of higher education institutions. European Journal of Higher Education, 5(3), 233–249. Ramirez, Y., Merino, E., & Manzaneque, M. (2019). Examining the intellectual capital web reporting by Spanish universities. Online Information Review, 43(5), 775–798. Rosli, A., & Rossi, F. (2016). Third-mission policy goals and incentives from performance-based funding: Are they aligned? Research Evaluation, 25(4), 427–441.


Rubens, A., Spigarelli, F., Cavicchi, A., & Rinaldi, C. (2017). Universities’ third mission and the entrepreneurial university and the challenges they bring to higher education institutions. Journal of Enterprising Communities: People and Places in the Global Economy, 11(3), 354–372. Sam, C., & Van Der Sijde, P. (2014). Understanding the concept of the entrepreneurial university from the perspective of higher education models. Higher Education, 68(6), 891–908. Sánchez, M. P., & Elena, S. (2006). Intellectual capital in universities: Improving transparency and internal management. Journal of Intellectual Capital, 7(4), 529–548. Sangiorgi, D., & Siboni, B. (2017). The disclosure of intellectual capital in Italian universities: What has been done and what should be done. Journal of Intellectual Capital, 18(2), 354–372. Secundo, G., De Beer, C., Schutte, C. S. L., & Passiante, G. (2017). Mobilising intellectual capital to improve European universities’ competitiveness: The technology transfer offices’ role. Journal of Intellectual Capital, 18(3), 607–624. Secundo, G., Dumay, J., Schiuma, G., & Passiante, G. (2016). Managing intellectual capital through a collective intelligence approach: An integrated framework for universities. Journal of Intellectual Capital, 17(2), 298–319. Secundo, G., Lombardi, R., & Dumay, J. (2018). Intellectual capital in education. Journal of Intellectual Capital, 19(1), 2–9. Secundo, G., Massaro, M., Dumay, J., & Bagnoli, C. (2018). Intellectual capital management in the fourth stage of IC research: A critical case study in university settings. Journal of Intellectual Capital, 19(1), 157–177. Secundo, G., Perez, S. E., Martinaitis, Ž., & Leitner, K. H. (2017). An intellectual capital framework to measure universities’ third mission activities. Technological Forecasting and Social Change, 123, 229–239. Siboni, B., Nardo, M. T., & Sangiorgi, D. (2013). Italian state university contemporary performance plans: An intellectual capital focus? Journal of Intellectual Capital, 14(3), 414–430. Sullivan, T. M., & Richardson, E. C. (2011). Living the plan: Strategic planning aligned with practice and assessment. The Journal of Continuing Higher Education, 59(1), 2–9. Trencher, G., Yarime, M., McCormick, K. B., Doll, C. N., & Kraines, S. B. (2014). Beyond the third mission: Exploring the emerging university function of co-creation for sustainability. Science and Public Policy, 41(2), 151–179. Vorley, T., & Nelles, J. (2009). Building entrepreneurial architectures: A conceptual interpretation of the third mission. Policy Futures in Education, 7(3), 284–296.

Transferring Knowledge to Improve University Competitiveness: The Performance of Technology Transfer Offices

Pina Puntillo, Franco Rubino, and Stefania Veltri

1 Introduction

Recent studies in the literature have focused on the changing role of the university with the inclusion, beyond the first mission (teaching) and the second mission (research), of a third mission focused on universities’ contribution to social development and economic growth (Etzkowitz & Leydesdorff, 2000). Some authors have conceptualized a fourth mission, that is, the mission of today’s universities to be entrepreneurial (Kretz & Sá, 2013; Boffo & Cocorullo, 2019), and the literature has coined the term “entrepreneurial university” to describe the model towards which universities are moving (Etzkowitz & Leydesdorff, 2000; Guerrero et al., 2016; Philpott et al., 2011). Part of the literature has questioned the ideological underpinnings of the entrepreneurial university, speaking of the “McUniversity” (Hayes & Wynyard, 2002) or of “academic capitalism” (Slaughter & Leslie, 1997) because of the “commercialization” of the university mission it involves, but the majority of researchers have outlined the benefits of an entrepreneurial university for economic and social development (Etzkowitz & Leydesdorff, 2000; Fayolle & Redford, 2014; Guerrero et al., 2016; Lombardi et al., 2019). Generally, “third mission” activities comprise three dimensions performed by universities in relation to external environments: technology transfer and innovation, continuing education, and social engagement (Secundo et al., 2017).

All authors wrote the chapter, but their primary individual contributions are reflected as follows: Section 1 is to be ascribed to Franco Rubino, Sections 2 and 3 to Pina Puntillo, and Sections 4, 5, and 6 to Stefania Veltri.

P. Puntillo (*) · F. Rubino · S. Veltri
Department of Business Administration and Law, University of Calabria, Arcavacata di Rende, CS, Italy
e-mail: [email protected]; [email protected]; [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2022
E. Caperchione, C. Bianchi (eds.), Governance and Performance Management in Public Universities, SIDREA Series in Accounting and Business Administration, https://doi.org/10.1007/978-3-030-85698-4_7


Among these, the key role is played by technology transfer. Technology is the key component of innovation, and innovation is the single most important engine of long-term competitiveness, growth, and employment (EC, 2001). Universities are key actors in producing, sharing, and transferring knowledge to industry (EC, 2003). The term “technology transfer” (TT) is broad and not easily measurable (Agrawal, 2001). Bremer (1999) defines technology transfer as the transfer of the results of research from universities to the commercial sector. There is a variety of means to transfer research results from universities to society: the traditional form is the publication of research results, while the commercial forms are licensing, patenting, or spin-offs (Hsu et al., 2015). In order to diffuse an entrepreneurial culture of research, encourage the dissemination of scientific outcomes, and support scientists through the stages of commercialization of the results of their studies, several universities have established technology transfer offices (TTOs). The role of TTOs in transferring knowledge has been widely discussed by scholars, with contradictory opinions. Part of the literature emphasizes the key role of TTOs in promoting innovation (Siegel et al., 2004), while other authors underline a marginal role of TTOs in driving academics to start new ventures (Siegel & Wright, 2015; Huyghe et al., 2016; Meoli & Vismara, 2016). In our research, we change the perspective, switching from the role of TTOs to their efficiency in fostering university innovativeness, and thus to the appropriate way to measure the performance of TTOs in terms of their capability to achieve the knowledge transfer objectives of universities (Vinig & Lips, 2015).1 We applied a research model based on Cesaroni and Piccaluga (2016) to investigate which knowledge transfer strategy is adopted by the TTO of the University of Calabria (Unical) and how this strategy affects Unical’s third mission performance. The case study of Unical’s TTO is interesting to deal with, as Unical has recorded important performance results related to technology transfer in terms of patents, university spin-offs and industry income. These results are all the more important as Unical is located in a region (Calabria, in the south of Italy) whose parameters record negative values in terms of economic and entrepreneurial development. The tools used were document analysis and semi-structured interviews with Unical executives directly involved in technology transfer from the university to the territory. We highlight the paths that have led Unical to achieve success in technology transfer and show how this has improved the university’s innovation capability and, in turn, its competitiveness, as innovation is a strategic asset for the competitiveness of individuals, organizations, and countries (Secundo et al., 2017).

1 There are several measures used by studies focused on measuring TTO performance, and the majority of them base their measures on the income generated by knowledge transfer commercialization. Other studies measured outputs in terms of contributions via the labor force (Bessette, 2003; Chrisman et al., 1995; Elliott et al., 1988), revenues obtained from patents, R&D collaborations (Siegel et al., 2003), spillover effects (Audretsch & Lehmann, 2005), or total university earnings (Goldstein, 1990). A few studies related the economic impact of TTOs to the change in the gross domestic product (Martin, 1998; Roessner et al., 2013; Guerrero et al., 2015).


The paper is original in that it analyzes universities in the third mission area and in their innovation capability, employing particular performance measures, such as the number of patents and the number of spin-offs, which are still understudied compared with traditional (financial) performance measures or with performance measures specific to universities, such as performance in teaching and research.

2 Literature Review and Research Question

The knowledge produced in universities can spur business innovation, foster university competitiveness, and promote economic and social development through the creation of the next generations of entrepreneurial engineers for academic entrepreneurship (Algieri et al., 2013; Iacobucci et al., 2020). The results of university research need to be successfully “transferred” from the university to commercial application in order for the university and society to benefit (Agrawal, 2001). This process is also known as technology transfer or valorization of research results (Bremer, 1999). The majority of universities in the Western world have incorporated technology transfer among the university objectives, besides the traditional goals of education and research, and the need for universities to produce and transfer knowledge has been underlined at the political level (EC, 2003). University technology transfer is defined as a process in which science, knowledge, or capabilities are transferred or moved from one entity to another for the purpose of further development or commercialization (Lundquist, 2003; Swamidass & Vulasa, 2008). To assist and stimulate technology transfer through methods beyond the classical ones, the majority of universities have established technology transfer offices (TTOs), which achieve technology transfer through intellectual property, licensing, patenting, and the creation of start-up companies (Hsu et al., 2015). Furthermore, business incubators have been included in the university organizational structure, or have been established as independent structures outside a university but operating in its name, in order to facilitate the passage of knowledge and know-how from academia to business (Algieri et al., 2013). In the literature, there are two opposite positions regarding the role played by TTOs in fostering university innovation and competitiveness. While a large body of the literature focuses on how these offices create a structural internal environment able to foster the creation of academic spin-offs, thus finding a positive role of TTOs in driving academics to start new ventures (Algieri et al., 2013; Cesaroni & Piccaluga, 2016; Secundo et al., 2017; Iazzolino et al., 2019; Iacobucci et al., 2020), another stream of the literature shows that the presence of TTOs plays a marginal role in fostering innovation (Fini et al., 2009; Siegel & Wright, 2015; Meoli & Vismara, 2016), identifying factors other than TTOs, such as faculty quality and the subsequent involvement of the academic inventor (Di Gregorio & Shane, 2003), and highlighting that many TTOs operate inefficiently (Rasmussen et al., 2006).


In our opinion, the question is not whether TTOs play a key role in fostering university innovation and competitiveness, but how efficient TTOs are in the process of transferring knowledge and how the knowledge transfer (KT) strategy adopted affects the university’s third mission performance. Hence the research question (RQ):
RQ: How does the KT strategy adopted by TTOs affect the university’s third mission performance?

3 Research Methodology

To answer this research question, we used Yin’s (2014) case study methodology. Compared to quantitative approaches, the qualitative approach of a case study allows researchers to discover and understand the relationships between variables in complex processes (Lombardi et al., 2019). In detail, the type of case study applied in the paper is an explorative case study (Chiucchi, 2012), as it intends to explore a “how” question, investigated in the light of an interpretive model.2 The following subsections describe the research context, the main features of the TTO selected as a case study, and the methods used for data collection.

3.1 The Research Context

We focus on the TTOs of Italian universities. Italian universities have engaged in knowledge transfer activities since the 1990s, initially through a few pioneering institutions and more recently through almost all academic institutions; to date, Italy presents an organized system of university knowledge transfer, which reflects the strong motivation and widespread interest that Italian universities show in engaging in such activities (Cesaroni & Piccaluga, 2016). Formally, TTOs in Italy were established by national law D.L. 27/7/1999 no. 297 and then regulated by D.M. 8/8/2000. Italian universities have progressively increased the number of TTOs over time, from 3 in 2000 to 65 in 2017. The specific technology transfer policies set out by the TTOs mainly address the creation of university spin-offs (USOs) (96.8%), intellectual property management (95.2%) and licensing activities (87.3%) (Netval Surveys, 2018, 2019). This is why the growth of TTOs and USOs in a country with low R&D investment (as a percentage of GDP) but, at the same time, high performance of its researchers (in terms of publications per capita) makes Italy an interesting case, since its evolution can be paradigmatic for other countries involved in catching-up processes in the field of innovation and knowledge-based entrepreneurship (Cesaroni & Piccaluga, 2016; Iacobucci et al., 2020).

2 Explorative case studies are carried out to acquire useful indications in an area not (or only partially) explored, and their findings constitute only preliminary interpretations of a phenomenon. The theoretical paradigm underlying our research is the interpretive model. In the light of interpretivism, sociological phenomena cannot simply be observed, but must also be interpreted by the researcher (Ryan et al., 2002).


In the research, a single case study was used (Philpott et al., 2011; Lombardi et al., 2019). In detail, we selected the University of Calabria, a large public university (20,000–40,000 students) situated in Southern Italy, in which diverse disciplines from the sciences and the humanities co-exist as equals.3 Established in 1968 with the aim of playing a key role in the development of the surrounding territory, it is a residential university that includes third mission aims in its strategic plans (strategic plans of the University of Calabria 2016–2018, 2017–2019, 2018–2020). This particular case site was selected because it represents the traditional comprehensive university evident in the European context, because institutional policy acknowledges its third academic mission and, above all, because Unical’s LiO is a representative case of a TTO (Iazzolino et al., 2019).

3.2 The Liaison Office at the University of Calabria

The Liaison Office (LiO) of the University of Calabria is one of the oldest TTOs, having been established in 2003. Its mission is to valorize research. In detail, LiO carries out several activities to address its mission, such as (a) scouting funding opportunities to support research; (b) protecting intellectual property rights (IPR); (c) creating spin-off/start-up companies; and (d) promoting and supporting mutually beneficial relationships with government, industry, civil society organizations and other cultural, research and academic institutions.4 Over the years, LiO has achieved important results in terms of funded research projects (350, with almost €90 million in grants within the 2014–2020 framework program), patents (over 100 patents filed over the years, 37 active patents, 8 licensed) and spin-off companies launched (46 active spin-off/start-up companies), and it has also established several network relationships with the stakeholders of the innovation “ecosystem”. By the innovative ecosystem at the University of Calabria, we mean the 14 departments, 147 research groups, 127 research laboratories, about 30,000 students, the network relationships with investors (BA & VC), accelerators/incubators, local actors of the economic system (the Chamber of Commerce and Unindustria), banks, large companies and SMEs, and specific initiatives for stimulating the establishment of new companies, namely Start Cup Calabria and UniCaLab.5

3 In Italy, research, teaching and third mission activities, supported by administrative activities, are delivered by state (public) and non-state universities approved by the national Ministry of Education (Siboni et al., 2013). The public universities are almost equally distributed across Northern, Central and Southern Italy (36%, 29% and 35%, respectively), while most are medium- and small-sized (42% and 39%, respectively) (Mazzotta et al., 2020).
4 The data reported in this section derive from the interview with the head of LiO and from a brief presentation provided by the head of LiO, and are updated to 20 May 2020.


In 2010 LiO promoted the foundation of TechNest, the business incubator of the University of Calabria, whose mission is to promote the creation of new innovative companies based on advanced knowledge and technology, to accompany and accelerate the incubated companies, and to support entrepreneurship based on the excellence of research and innovation aimed at creating highly qualified jobs. TechNest provides several services, such as professional services (legal, financial, etc.), networking (partnerships, open innovation, etc.), funding and investment scouting, acceleration of process design, tutoring, coaching, mentoring, and training, as well as “hard” facilities (rooms, network services, etc.). Since 2010, TechNest has incubated over 30 companies (11 currently) and collected about €6 million in private equity investments; the currently incubated companies have achieved over €3 million in revenues and hired over 50 employees. In 2017, LiO included Public Engagement (PE), the other dimension of the third mission, in its aims (PE aims were included in the Unical strategic plan in 2018). LiO co-founded APEnet, the Italian academic network on Public Engagement, and deals with scientific communication and dissemination activities (newsletter, website, social networks, etc.), among which we can recall the initiative “SuperScienceMe – Research in your reach,” the European Researchers’ Night project funded by the European Commission (H2020) for both the 2018–2019 and 2020 editions, in which over 20,000 students visited 190 laboratories and experiments. LiO has dedicated human and financial resources. As for the human resources, with seven full-time equivalents (FTEs) of specialized staff, versus 5.8 FTEs at the national level, LiO is a large TTO. The main aim of LiO is research valorization, and its specialized staff is divided among the three main areas devoted to the exploitation of academic research, that is, intellectual property (20%), licensing and contracting (30%) and spin-off creation (20%); the remaining 30% is devoted to other tasks, among which training and public engagement (PE). As for the financial resources, LiO is entitled to a specific annual budget from Unical of almost €50,000 (54.1% of the universities belonging to the Netval association allocate a dedicated budget to their TTOs) (from the interview with the Head of LiO). Over the years, LiO’s capability to earn income autonomously has improved. Table 1 shows the composition by funding source of the total annual income credited to LiO.

5 Start Cup Calabria is a business competition among ideas with high technological value whose proponents are willing to create innovative enterprises. It is part of the National Award for Innovation; since 2009, 750 ideas have been evaluated and over 30 enterprises started. UniCaLab is a contamination lab in which students of different disciplines attend a one-year experience aimed at learning how to turn their innovative ideas into enterprises. The program consists of four phases (scouting, academy, pre-acceleration & competition, acceleration) and has achieved results in terms of students involved (around 100), teams activated (10), awards (3), and companies created per year (2).


Table 1 Composition of the total annual income of LiO

Resources provided by Unical (budget for LiO + costs for the FTEs): 2015: €357,000 (11 FTEs), 97.8%; 2016: €357,000 (11 FTEs), 89.9%; 2017: €273,000 (8 FTEs), 71.6%
Self-financing from projects + third-party entries: 2015: €0; 2016: €40,000, 10.1%; 2017: €100,000, 26.2%
Self-financing from patents/know-how: 2015: €8,000, 2.2%; 2016: €0; 2017: €9,000, 2.3%
Total annual income: 2015: €365,000; 2016: €397,000; 2017: €381,000

Source: Questionnaire draft from LiO for the Netval Surveys, 2018 and 2019

3.3 Data Collection

The research was carried out from March to July 2020. The qualitative tools used to analyze the case study were document analysis (Bowen, 2009) and interviews (Qu & Dumay, 2011), which were performed to analyze the perceptions, criticalities, areas of greater complexity, and benefits of the role of TTOs. The use of multiple sources of information allows the authors to triangulate information, making it more reliable (Chiucchi, 2012). Document analysis is a low-cost way to obtain empirical data that is not affected by the research process. However, since documents are created independently of the research agenda, they can lack detail and may also be biased; for this reason, we combined data from documents with data from interviews and observation to minimize bias and establish credibility (Bowen, 2009). As for our case study, we examined the microdata of the University of Calabria LiO related to the Netval surveys of 2018 (on 2016 data) and 2019 (on 2017 data),6 the strategic plans of Unical 2016–2018, 2017–2019 and 2018–2020, the complete 2018 and 2019 Netval surveys, and documents related to the key figures of the Unical Liaison Office (LiO), the University of Calabria’s TTO. As for the interviews, we established a research protocol to define the type of interview, the people to interview, the average expected length of each interview and the questions to ask, and thus to ensure reliability (Yin, 2014). As for the type of interview, semi-structured interviews were chosen due to their flexibility and their propensity to allow interviewees to disclose important yet sometimes hidden information. This approach stresses the interviewees’ perspective, as it enables them to express their opinions in their own style and words and to participate in the interview process together with the interviewer, thus producing questions and answers within a complex interpersonal discussion (Qu & Dumay, 2011). No materials or questions were shared prior to the interviews, while the transcript was shared only with the interviewee.

6 Netval is the Italian University Network for the Valorization of Research (see http://www.netval.it), which for 16 years has annually carried out a survey on the valorization of research results, addressed to universities’ technology transfer offices and research centers.


Table 2 Sources used for the case study (all sources were collected and analyzed in the period March–July 2020)

University documents: microdata of the University of Calabria LiO related to the Netval surveys of 2018 and 2019; documents related to the key figures of Unical LiO; the strategic plans of Unical 2016–2018, 2017–2019 and 2018–2020
National documents: the complete 2018 and 2019 Netval surveys
Interviews (1 h each): the current head of LiO; the current University of Calabria delegate to technology transfer (TT); the former University of Calabria delegate to TT

Semi-structured interviews were performed with the executives involved in the knowledge transfer strategy, management and communication of LiO and Unical. In detail, the interviews were undertaken with the current head of LiO, with the current University of Calabria delegate to technology transfer, and with the former University of Calabria delegate to technology transfer.7 The semi-structured interviews, approximately 1 h in duration, were conducted by the authors in meetings with the selected people of the organization and the general manager. Direct questions were asked regarding the personal background of each interviewee, their role in the university technology transfer process, and their view on the impact of the chosen technology transfer strategy on the university’s innovation and competitiveness, together with several questions intended to deepen our knowledge of the functioning of the LiO and to better comprehend the knowledge transfer strategy adopted (see Appendix). Table 2 lists the sources used for the case study. The interviews were fully transcribed and, together with the selected university and national documents, they were interpreted through a research framework adapted from Cesaroni and Piccaluga (2016) to address the research question, namely which KT strategy is adopted by LiO and how this strategy affects the university’s third mission performance. The results were then discussed among the authors to achieve investigator triangulation, and a first draft of the results was produced and discussed with the people interviewed to validate the results (Yin, 2014).

7 Mirroring the well-known separation in universities between academic and administrative staff, the head of the LiO belongs to the administrative staff, while the delegates to technology transfer are appointed by the Rector and are selected among the full professors of the university.

Fig. 1 The three-phase model of knowledge transfer. Source: our elaboration on Cesaroni and Piccaluga (2016)
Phase 1 – The generation of intellectual property. Aim: building a large and strong patent portfolio. Third mission conceived as: raising awareness of the relevance of KT.
Phase 2 – The valorization of research outcomes. Aim: focus on licensing, spin-off companies, or research contracts. Third mission conceived as: knowledge commercialization.
Phase 3 – The search for a balance among the different forms of KT. Aim: to identify interactions and synergies among the three different forms of KT (licensing, spin-offs, research contracts). Third mission conceived as: academic engagement.

4 The Research Framework

To analyze the KT strategy adopted by the LiO and how this strategy affects Unical’s third mission performance, we built a theoretical model based on Cesaroni and Piccaluga’s (2016) model. In their research, Cesaroni and Piccaluga (2016), drawing on the Balderi et al. (2007) model, identified three main clusters of universities with homogeneous knowledge transfer activities (measured by licenses, contracts, earnings, spin-offs, Science & Technology Parks, and incubators), and then used their model to position Italian universities with respect to knowledge transfer.8 Figure 1 illustrates the three-phase model of knowledge transfer (KT), adapted from Cesaroni and Piccaluga (2016). Cluster 1 and Cluster 3 are at the extremes of a continuum: the former identifies all universities in phase 1 of the KT process, characterized by lower outcomes of KT activities; the latter identifies all universities in phase 3 of the KT process, which are strongly involved in technological research and show the highest values of KT activities, also thanks to their longer-lasting experience in this field. Cluster 2 is in an intermediate position with respect to Cluster 1 and Cluster 3, in terms of both research and KT activities. Universities belonging to Cluster 2 are strongly involved in KT activity, but they still conceive the third mission as knowledge commercialization, and their main effort is directed towards the valorization of research outcomes, which can take three different forms: licensing, spin-off companies, and research contracts. In this second phase, the three forms of KT are thus perceived by universities as strict alternatives, so that each university might prefer one form over the others, according to the availability of resources and capabilities and the specific goal assigned to the KT strategy.

8 The authors also identified a residual fourth category made up of universities that are scarcely involved in KT, or that have engaged in KT activity too recently to generate significant outcomes (Cesaroni & Piccaluga, 2016). In their research, universities were distributed as follows: 20% belonged to Cluster 1, 24.3% to Cluster 3 and 34.3% to Cluster 2.


Table 3 Formal mechanisms of technology transfer

Licenses: An official agreement in which the legal rights to utilize an invention for commercial purposes are sold by the licensor (the university) to a licensee, usually an established company, in return for revenues.
Options: An agreement in which the licensor (university) allows a potential licensee the possibility of evaluating the technology and negotiating the terms of the license agreement.
Research agreements: Agreements with which external parties (mainly firms) finance research activities within universities, retaining the ownership of (possible) patents generated by these research activities.
Cooperative research agreements: Agreements funded by industry based on a real cooperation with the industry partner (intellectual property rights in co-ownership).
Patents/Intellectual property: The granting of a property right by a sovereign authority to an inventor. This grant provides the inventor exclusive rights to the patented process, design, or invention for a designated period in exchange for a comprehensive disclosure of the invention.
Spin-off firms: High-tech firms founded (1) by university professors/researchers and/or a PhD student/temporary research fellow/student involved in long-term research on the specific theme that is the object of the spin-off firm, and/or (2) based on IP of the university, and/or (3) whose equity is partially owned by the university.

Universities belonging to Cluster 3, instead, conceive the third mission as academic engagement, definable as “knowledge-related collaboration by academic researchers with non-academic organizations” (Perkmann et al., 2013, p. 424), and measurable by formal but also informal mechanisms of knowledge transfer, such as contract research, consulting, and networking with firms (Perkmann & Walsh, 2008). The distinction is relevant because academic engagement (and knowledge transfer in its broader sense) reflects a broader view of university knowledge-related activities, whose goal is to favor local and national economic development and only indirectly to gain financial returns (Cesaroni & Piccaluga, 2016). As can be seen from Fig. 1, each KT strategy is characterized by a different aim, which is reflected in the way in which each TTO implements its knowledge commercialization capability, that is, in the way in which academic research is exploited through formal mechanisms such as patenting (intellectual property), licensing/contracting/options and spin-off creation. Table 3 illustrates the formal mechanisms of technology transfer used to measure the KT performance of a TTO.


Table 4 Contextualizing the main technology transfer output data of Unical LiO (values refer to the Netval survey year, with the reference data year in brackets)

Intellectual property rights/patents
- Average number of invention disclosures: 2018 (2016): Italy 11.7, LiO 25; 2019 (2017): Italy n.a., LiO n.a.
- Average priority patent applications: 2018 (2016): Italy 6.8, LiO 7; 2019 (2017): Italy 7.8, LiO 8
- Average number of patents granted to universities: 2018 (2016): Italy 7.8, LiO 1; 2019 (2017): Italy 6.1, LiO 4
- Average active patents in portfolio: 2018 (2016): Italy 74, LiO 44; 2019 (2017): Italy 84, LiO 51
- Average amount of intellectual property (IP) expenditure: 2018 (2016): Italy €65,600, LiO €30,000; 2019 (2017): Italy €70,300, LiO €40,000

Licenses/options/contracts
- Average number of licenses/options executed: 2018 (2016): Italy 2.1, LiO 0; 2019 (2017): Italy 2.4, LiO 0
- Average number of active licenses in portfolio: 2018 (2016): Italy 9.1, LiO 9; 2019 (2017): Italy 11.1, LiO 12
- Average licensing/options/contracts revenues: 2018 (2016): Italy €36,000, LiO €62,000; 2019 (2017): Italy €45,000, LiO €78,000

Spin-off firms
- Average number of spin-offs created: 2018 (2016): Italy 1.7, LiO 3; 2019 (2017): Italy 1.9, LiO 3. Total number of spin-offs created: 110 (115)
- Average number of active spin-offs: 2018 (2016): Italy 22, LiO 37; 2019 (2017): Italy 29, LiO 39; half 2018: LiO 42. Total number of active spin-offs: 1373 (1749)

Source: Our elaboration from Netval Surveys (2018, 2019); LiO Questionnaire for the Netval Surveys (2018 and 2019)

5 Main Findings

To measure the LiO performance, we used output indicators for the three main KT activities, namely Intellectual property rights/patents; licenses/options/contracts; spin-off firms. The indicators used are those provided by LiO for the Netval Surveys 2018 and 2019.9 Table 4 compares the output knowledge transfer indicators calculated for LiO with the same indicators provided at the national level.

9 The fourteenth (fifteenth) edition of the Netval survey report, published in 2018 (2019), refers to data about 2016 (2017) from 62 (60) Italian universities, accounting for 75.5% (62.5%) of the total number of students and 85.7% (84.4%) of the total number of professors. It should be underlined that the answers to the Netval survey have been processed only for the public universities (excluding telematic universities, universities for foreigners and public research centers). Universities did not answer all the questions; thus, the number of respondents changes slightly from question to question. The questionnaire provides input measures, information on TTOs’ organization, and output measures, on which we focus.


As for the patents, the path starts from the invention disclosures (the first recording of the invention), passes through the priority patent application (the date of filing of the first application) and ends with the patents granted. Both invention disclosures and priority patent applications are in line with (or better than) the national data; only for the patents (the number of patents granted annually and the number of active patents in the portfolio) are the LiO figures lower than the national ones. The average amount of intellectual property (IP) expenditure is lower than the national one, but it increased over the three-year period (in 2016, +16.6% compared with 2015; in 2017, +33.3% compared with 2016).10 As for the licensing/options/contracts, we can observe a lack of licenses/options executed in 2018 and 2019, balanced out by a number of active licenses in the LiO portfolio in line with the average number at the national level, while the revenues from licenses/options/contracts are higher than the national average and increased in 2017 (+25.8%). They show the following composition: 17% of revenues comes from active licenses/options, 19% from active research and consulting agreements and 64% from active cooperative research agreements. As for the beneficiaries, half of the revenues return to the inventor(s), 35% to the inventor(s)’ department and 15% to LiO. As for the spin-offs, the average number of spin-offs created in the year and the average number of active spin-offs related to the LiO are higher than the national averages. As for the number of new spin-offs created, it should be underlined that this value appears consistent and stable over the years, as the value for 2009 was 3.2% (Algieri et al., 2013). Furthermore, the active spin-offs of the University of Calabria are spin-offs of consolidated experience (an average age of 8.4 years, the third-best result after Friuli-Venezia Giulia, with an average age of 9 years, and Emilia-Romagna, with 8.6 years). Spin-offs at the University of Calabria are the object of continuous investments, as testified by the project that won the call of the local Chamber of Commerce aimed at improving the competitiveness of Unical spin-offs through a digitalization process, and by the two Unical spin-offs that won the national prize for innovation in two consecutive years, 2017 and 2018. Looking more closely at the data, the majority of newly created firms belong to the ICT industry (38%), followed by service innovation (26%), life sciences (14%), automated industry (12%), and energy and environment (10%). Spin-offs are closed equity firms, which do not follow a dividend policy; instead, they reinvest the capital in the production process (from the interview with the head of LiO). The departments that contributed most to the development of spin-offs are the Department of Computer Science, Modeling, Electronics and Systems Engineering (DIMES) and the Department of Mechanical, Energy and Management Engineering (DIMEG), with a combined share of 60%, confirming that spin-offs are set up mainly by scientific departments. Owing to the background of the founders and their competencies, this led to a focus on technical aspects, important for the first phase, overshadowing the organizational, management, and marketing competencies that are important for the growth phase and for business performance (Iazzolino et al., 2019; strategic plans of the University of Calabria 2016–2018, 2017–2019, 2018–2020).

10 Patents are the main item among the IP rights, which also include utility models (3), trademarks (3) and software (1).


Summarizing, we can say that the performance of LiO is fair in two of the three areas of knowledge transfer (patents and licensing/options/contracts), while it is good in the third area (spin-offs). As for the patent indicators, the LiO values are good for the first two phases of the patent process (invention disclosure and priority patent application), while they are lower than the national values for the third phase (number of active patents in the portfolio) and for the average amount of IP expenditure.11 In any case, over the years, LiO has filed over 100 patents. As for the licensing/options/contracts, we can observe that LiO seems not to consider investing in licensing a priority, as shown by the lack of licenses/options executed in 2018 and 2019. Nevertheless, the number of active licenses in the LiO portfolio is in line with the average number of active licenses in the portfolio at the national level, and the revenues from licenses/options/contracts are higher than the national average. The University of Calabria’s result in spin-off creation, better than the national average, is particularly important given that the creation of spin-off firms is mainly concentrated in the North of Italy (47.3% in 2016; 44.9% in 2017), followed by the Centre (29% in 2016; 33.3% in 2017) and then by the South and Islands (23.7% in 2016; 21.8% in 2017), and given the lack of incentives for researchers to engage in spin-off creation. Unlike at the other universities in the Netval surveys, Unical researchers (and LiO employees) who generate income from research are not rewarded in monetary terms or with the attribution of additional research funds, nor is this considered a criterion for academic progression. They can only participate in the equity of the spin-offs, receive monetary incentives from the spin-off companies, and obtain a sabbatical year to work in a spin-off they contributed to creating. It is voluntary work, and they do it to find jobs for their students (from the interview with the current delegate to technology transfer). It should also be underlined that the percentage of active spin-offs for LiO (3.1%) almost coincides with the percentage of active spin-offs in the Calabria Region, while in the other Italian regions several universities contribute to the regional value. Moreover, the number of active spin-offs at the University of Calabria is higher than the national average value. If we analyze the LiO results through the lens of our three-phase model of knowledge transfer, we can say that the University of Calabria belongs to Cluster 2. In fact, universities belonging to this cluster focus on one specific formal mechanism of technology transfer, which for the University of Calabria is the creation of spin-offs. Unical’s KT strategy focuses on the valorization of research outcomes, namely on the launch and development of spin-off companies, and this aim is evident not only from the output knowledge transfer indicators: it is also a strategic objective of Unical, included in its strategic plans, and it is the main aim of LiO as well.

11 On this issue, we asked the head of LiO, who told us that the datum should be completed with the five patent applications filed in 2017, which will hopefully become granted patents in the near future. In any case, from the 6 patents licensed (1 in 2017, 2 in 2016), Unical does not earn a great amount of money (from the interview with the head of LiO).


On this point, the head of LiO commented that the survival rate of the created spin-offs is very high, as only 4% of the launched spin-offs have failed, and this is why spin-offs at the University of Calabria are the object of continuous investment. Our results find confirmation in recent research highlighting a positive association between TTO size in terms of employees (as in the case of LiO) and the number of spin-offs created (Iacobucci et al., 2020).

6 Conclusions

Creating and transferring knowledge is now considered crucial for universities to improve their competitiveness through a new mission that flanks teaching and research (the third mission), concerned with universities’ contribution to social development and economic growth. Universities, mainly through their TTOs, share the knowledge they create by transferring technology to industry, and efficiency in this process is essential to maintaining the competitive advantage of a university. In this research, we focus on the efficiency of the TTO of the University of Calabria, selected as a case study because Unical’s LiO is a representative case of a TTO, given its size in terms of employees and its age, a proxy for its experience in KT activities. Consistently with Vinig and Lips (2015), we do not use monetary value to measure TTO performance, as the current literature does; instead, we estimate TTO efficiency based on the potential of technology transfer in terms of research outputs, namely patents, licenses/contracts, and spin-offs. Consistently with Cesaroni and Piccaluga (2016), we believe that universities follow an evolutionary three-phase model of knowledge transfer, going from universities slightly involved in knowledge transfer activities (Cluster 1) to universities strongly involved in knowledge transfer activities (Cluster 3), and that they can be positioned in one of the three phases according to the way in which they conceive and implement knowledge transfer. We thus analyzed LiO’s KT performance indicators in the light of our three-phase model, developed for this research on the basis of Cesaroni and Piccaluga’s (2016) model, which positions universities along a continuum going from Cluster 1 to Cluster 3. According to our model, the University of Calabria belongs to Cluster 2, being a university strongly involved in knowledge transfer activities but still focused on the commercialization of its research outcomes. In this context, LiO plays a relevant role in the University of Calabria’s competitiveness in the third mission, valorizing the university’s research results, mainly by helping spin-offs in their start-up and development phases. We believe that the University of Calabria (and LiO) is at a turning point and that it has the potential to switch from phase 2 to phase 3, given its long experience in KT; however, it has to be underlined that this switch will require a change of strategy, culture, and actions carried out to achieve this strategic aim, as well as time to align top managers’ strategy with the internal resources and capabilities devoted to this goal.


In other words, university managers should modify the way in which they conceive of knowledge transfer: no longer as a means to raise additional financial resources, but as a broader activity whose goal is to favor local and national economic development and only indirectly to gain financial returns, thus switching from a knowledge commercialization culture to an academic engagement culture (Perkmann et al., 2013; Cesaroni & Piccaluga, 2016). The findings of our study contribute to the existing debate on the effectiveness of TTOs by adding the results of a specific case study. Several theoretical and practical implications could derive from the research. From a theoretical point of view, the research framework could be used by other researchers to position universities in one of the three phases according to the way in which they conceive and implement their knowledge transfer strategy. From an empirical point of view, the case study could offer interesting cues to other universities interested in investing in technology transfer. Furthermore, our results can also have implications for the technology transfer policies of universities and for the choice of indicators used to evaluate the third mission. It is also important to acknowledge the main limitation of the paper. Owing to the exploratory and qualitative nature of the study, the sample was limited to a single case study. Even though the use of a single case study provides in-depth and rich data, it also limits the generalizability of the observations to other TTOs. Additionally, the results obtained are based on the authors’ analysis and interpretation of qualitative data from the semi-structured interviews in the light of the interpretive approach. Future research on this topic could consider investigating and measuring the impact of LiO research outcomes and activities on the regional innovative entrepreneurial system, which is the highest level of academic engagement. In fact, only TTOs able to contribute deeply to the economic and social growth of the territory in which they operate truly realize the “third mission” of the university, completing their path towards the entrepreneurial ideal, which aligns the missions of teaching, research, and economic development. Obviously, we expect that such an evolution of strategic intention towards stronger engagement may take time, as universities need to recognize that their knowledge transfer actions can benefit the regional and national economic environment (and vice versa); only then will they adjust their strategies and actions accordingly.

Appendix: Guiding Interview Questions

For the current head of LiO


1. How specialized is the LiO staff?
2. How large are the revenues from knowledge transfer in the Unical consolidated report?
3. How many patents are there in the LiO portfolio?
4. Are the employees of the Technology Transfer Office specialized?
5. Do LiO’s relationships with the spin-offs go beyond the creation phase?
6. Can academic researchers participate in a spin-off?
7. Can academic researchers ask for a sabbatical year to work in a spin-off they contributed to creating?
8. Is academic researchers’ involvement in knowledge transfer activities taken into consideration for academic progression?
9. Which incentives are used to stimulate academic researchers’ participation in knowledge transfer activities?
10. Are academic researchers rewarded financially (or with additional research funds) if they generate earnings from applied research beyond an established level?
11. Are academic researchers rewarded financially (or with additional research funds) if they contribute to creating university spin-offs?
12. What is the percentage of LiO’s self-financing? And which sources contribute most (research projects, services to spin-off firms, patents, and so on)?
13. Does the LiO (or the academic researcher) involve all the potentially useful areas within the university (law, business, and so on), or are external consultants sought?

For the current and former University of Calabria delegates to technology transfer

14. Is the third mission, in terms of knowledge transfer, included within the university mission?
15. Does the university promote training courses and programs to support entrepreneurship?
16. Does the university have a specific strategy for technology transfer?
17. Does the university offer services to support spin-off creation involving external people such as legal or marketing consultants?
18. Which services does the university make available to researchers to support academic spin-off creation, business plan activities, the patenting process, licensing activities, and the search for external funds?
19. Is the definition of a knowledge transfer strategy a main objective of the university? Is a knowledge transfer strategy included in the University Strategic Plan?
20. How important are networking activities (e.g., agreements with other universities, shared guidelines, shared patents, interactions with local firms, and regional/national/international centers for innovation and knowledge transfer)?
21. Has the university dedicated a specific budget to the LiO?


References

Agrawal, A. (2001). University-to-industry knowledge transfer: Literature review and unanswered questions. International Journal of Management Reviews, 3(4), 285–302. Algieri, B., Aquino, A., & Succurro, M. (2013). Technology transfer offices and academic spin-off creation: The case of Italy. Journal of Technology Transfer, 38(4), 382–400. Audretsch, D., & Lehmann, E. E. (2005). Do university policies make a difference? Research Policy, 34(3), 343–347. Balderi, C., Butelli, P., Conti, G., Di Minin, A., & Piccaluga, A. (2007). Towards an Italian way in the valorisation of results from public research. Impresa Progetto: Electronic Journal of Management, 1, DITEA. Retrieved from http://www.impresaprogetto.it/essays/2007-1/balderi-butelli-conti Bessette, R. W. (2003). Measuring the economic impact of university-based research. Journal of Technology Transfer, 28(3–4), 355–361. Boffo, S., & Cocorullo, A. (2019). University fourth mission, spin-offs and academic entrepreneurship: Connecting public policies with new missions and management issues of universities. Higher Education Forum, 16, 125–142. Bowen, G. A. (2009). Document analysis as a qualitative research method. Qualitative Research Journal, 9(2), 27–40. Bremer, H. W. (1999). University technology transfer evolution and revolution. Council on Governmental Relations. Retrieved from http://www.cogr.edu Cesaroni, F., & Piccaluga, A. (2016). The activities of universities knowledge transfer offices: Towards the third mission in Italy. Journal of Technology Transfer, 41, 753–777. Chiucchi, M. S. (2012). Il metodo dello studio di caso nel Management Accounting. Giappichelli Editore. Chrisman, J. J., Hynes, T., & Fraser, S. (1995). Faculty entrepreneurship and economic development: The case of the University of Calgary. Journal of Business Venturing, 10(4), 267–281. Di Gregorio, D., & Shane, S. (2003). Why do some universities generate more start-ups than others? Research Policy, 32, 209–227. EC (European Commission). (2001, November 5). Building an innovative economy in Europe, a review of 12 studies of innovation policy and practice in today’s Europe. EC (European Commission). (2003, February 5). The role of universities in the Europe of knowledge. Communication from the Commission, Brussels. Elliott, D. S., Levin, P. S. L., & Meisel, J. B. (1988). Measuring the economic impact of institutions of higher education. Research in Higher Education, 28(1), 17–33. Etzkowitz, H., & Leydesdorff, L. (2000). The dynamics of innovation: From national systems and mode 2 to the Triple Helix of University – Industry – Government relations. Research Policy, 29, 109–123. Fayolle, A., & Redford, D. T. (Eds.). (2014). Handbook on the entrepreneurial university. Edward Elgar. Fini, R., Grimaldi, R., & Sobrero, M. (2009). Factors fostering academics to start up new ventures: An assessment of Italian founders’ incentives. Journal of Technology Transfer, 34(4), 380–402. Goldstein, H. A. (1990). Estimating the regional economic impact of universities: An application of input–output analysis. Planning for Higher Education, 18(1), 51–64. Guerrero, M., Cunningham, J. A., & Urbano, D. (2015). Economic impact of entrepreneurial universities’ activities: An exploratory study of the United Kingdom. Research Policy, 44(3), 748–764. Guerrero, M., Urbano, D., Fayolle, A., Klofsten, M., & Mian, S. (2016). Entrepreneurial universities: Emerging models in the new social and economic landscape. Small Business Economics, 47(3), 551–563. Hayes, D., & Wynyard, R. (Eds.).
(2002). The McDonaldization of higher education. Bergin & Garvey.


Hsu, D. W. L., Shen, Y. C., Yuan, B. J. C., & Chou, C. Y. (2015). Toward successful commercialization of university technology: Performance drivers of university technology transfer in Taiwan. Technological Forecasting and Social Change, 92, 25–39. Huyghe, A., Knockaert, M., Piva, E., & Wright, M. (2016). Are researchers deliberately bypassing the technology transfer office? Small Business Economics, 47, 589–607. Iacobucci, D., Micozzi, A., & Piccaluga, A. (2020). An empirical analysis of the relationship between university investments in technology transfer offices and academic spin-offs. R&D Management. https://doi.org/10.1111/radm.12434 Iazzolino, G., Greco, D., Verteramo, S., Attanasio, A. L., Carravetta, G., & Granato, T. (2019). An integrated methodology for supporting the development and the performance evaluation of academic spin-offs. Measuring Business Excellence, 24(1), 69–89. Kretz, A., & Sá, C. (2013). Third stream, fourth mission: Perspectives on university engagement with economic relevance. Higher Education Policy, 26(4), 497–506. Lombardi, R., Massaro, M., Dumay, J., & Nappo, F. (2019). Entrepreneurial universities and strategy: The case of the University of Bari. Management Decision, 57(12), 3387–3405. Lundquist, G. (2003). A rich vision of technology transfer. Journal of Technology Transfer, 28(3), 265–284. Martin, F. (1998). The economic impact of Canadian University R&D. Research Policy, 27(7), 677–687. Mazzotta, R., Nardo, M., Pastore, P., & Vingelli, G. (2020). Board composition and gender sensitivity approach in Italian universities. Meditari Accountancy Research. https://doi.org/10.1108/MEDAR-06-2019-0517 Meoli, M., & Vismara, S. (2016). University support and the creation of technology and non-technology academic spin-offs. Small Business Economics, 47(2), 345–362. Netval Surveys. (2018, 2019). Retrieved from http://www.netval.it Perkmann, M., Tartari, V., McKelvey, M., Autio, E., Broström, A., D’Este, P., et al. (2013). Academic engagement and commercialisation: A review of the literature on university–industry relations. Research Policy, 42(2), 423–442. Perkmann, M., & Walsh, K. (2008). Engaging the scholar: Three types of academic consulting and their impact on universities and industry. Research Policy, 37(10), 1884–1891. Philpott, K., Dooley, L., O’Reilly, C., & Lupton, G. (2011). The entrepreneurial university: Examining the underlying academic tension. Technovation, 31, 161–170. Qu, S., & Dumay, J. (2011). The qualitative research interview. Qualitative Research in Accounting and Management, 8(3), 238–264. Rasmussen, E., Moen, O., & Gulbrandsen, M. (2006). Initiatives to promote commercialization of university knowledge. Technovation, 26(4), 518–533. Roessner, D., Bond, J., Okubo, S., & Planting, M. (2013). The economic impact of licensed commercialized inventions originating in university research. Research Policy, 42(1), 23–34. Ryan, B., Scapens, R., & Theobald, M. (2002). Research method and methodology in finance and accounting. Thomson. Secundo, G., De Beer, C., Schutte, C. S. L., & Passiante, G. (2017). Mobilising intellectual capital to improve European universities’ competitiveness: The technology transfer offices’ role. Journal of Intellectual Capital, 18(3), 607–624. Siboni, B., Nardo, M. T., & Sangiorgi, D. (2013). Italian state university contemporary performance plans: An intellectual capital focus? Journal of Intellectual Capital, 14(3), 414–430. Siegel, D. S., Waldman, D. A., Atwater, L. E., & Link, A. N. (2004).
Toward a model of the effective transfer of scientific knowledge from academicians to practitioners: Qualitative evidence from the commercialization of university technologies. Journal of Engineering and Technology Management, 21(1), 115–142. Siegel, D. S., Waldman, D., & Link, A. (2003). Assessing the impact of organizational practices on the relative productivity of university technology transfer offices: An exploratory study. Research Policy, 32(1), 27–48.


Siegel, D. S., & Wright, M. (2015). Academic entrepreneurship: Time for a rethink? British Journal of Management, 26(4), 582–595.
Slaughter, S., & Leslie, L. L. (1997). Academic capitalism: Politics, policies, and the entrepreneurial university. Johns Hopkins University Press.
Swamidass, P. M., & Vulasa, V. (2008). Why university inventions rarely produce income? Bottlenecks in university technology transfer. Journal of Technology Transfer, 34(4), 343–363.
University of Calabria. Strategic Plans 2016–2018; 2017–2019; 2018; 2020. www.Unical.it
Vinig, T., & Lips, D. (2015). Measuring the performance of university technology transfer using meta data approach: The case of Dutch universities. Journal of Technology Transfer, 40, 1034–1049.
Yin, R. K. (2014). Case study research, design and methods. Sage.

Third Mission and Intellectual Capital External Dimension: The Implications in the European University Planning Process

Elisa Bonollo, Simone Lazzini, and Zeila Occhipinti

1 Introduction

Higher education institutions (HEIs)1 are knowledge-based institutions in which intellectual capital (IC) has a relevant role (Ramírez & Gordillo, 2014). “If a knowledge-based economy is mainly characterized by the production, transmission and dissemination of knowledge, universities are unique in all these processes” (Sánchez & Elena, 2006, p. 530). Furthermore, “a university’s most valuable resources are its researchers and students with their relations and organizational routines; their most important output is knowledge” (Leitner, 2004, p. 129).
In a knowledge-based economy, universities must promote knowledge exchange with the external world, moving beyond the two traditional missions (teaching and research) and including the third mission. The third mission represents the trait d’union between universities and the external world; it allows scholars to leave the ivory tower and to increase their coworking opportunities and exchanges with society. Scholars recognize the relevance of third mission activities for economic and social development and contribute to the development and evolution of the third mission conception. As such, the third mission has evolved from a productive conception, mainly focused on the economic exploitation of research, to a universalistic conception, which also includes the social impact of HEIs' activities on the community.

1 The terms higher education institution (HEI) and university are used interchangeably throughout this chapter.

E. Bonollo (*) Department of Economics and Business Studies, University of Genoa, Genoa, Italy e-mail: [email protected] S. Lazzini · Z. Occhipinti Department of Economics and Management, University of Pisa, Pisa, Italy e-mail: [email protected]; [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2022 E. Caperchione, C. Bianchi (eds.), Governance and Performance Management in Public Universities, SIDREA Series in Accounting and Business Administration, https://doi.org/10.1007/978-3-030-85698-4_8


In a universalistic conception, the third mission presents three dimensions, i.e. technology transfer innovation (TT), continuing education (CE), and public engagement (PE). Two European projects, PRIME-OEU and E3M, have attempted to develop indicators to measure the third mission's dimensions. Although both projects are aligned with the universalistic conception of the third mission, the E3M project contributes more effectively than the PRIME-OEU to defining the three dimensions of the third mission. Overall, despite the academic research and the European projects' attempts, a comprehensive definition and measurement framework for the third mission is still lacking (Göransson et al., 2009; Loi & Di Guardo, 2015; Marhl & Pausits, 2011; Molas-Gallart et al., 2002; Palla, 2016; Saad & Zawdie, 2011; Secundo et al., 2017).

As to IC, it represents both the input and the output of university activities (Castilla-Polo & Gallardo-Vázquez, 2016). On the one hand, IC drives value creation in HEI activities; it is the stock of knowledge used by the organization for value creation (Edvinsson, 2013; Secundo et al., 2010). On the other hand, it is the result of research activities (European Commission, 2006). IC's components are human capital, organizational capital, and social capital. Human capital is the knowledge embedded in researchers and non-scientific staff, organizational capital includes all processes within a university, and social capital includes the relationships between the university and all academic/nonacademic stakeholders (Leitner, 2004). Each component of IC is related to the three university missions, i.e. the first mission (teaching), the second mission (research), and the third mission. Specifically, IC offers a new way to capture third mission activities: each dimension (continuing education-CE, technology transfer innovation-TT, public engagement-PE) is closely related to the external dimension of each IC component. Previous literature on the relationship between IC and the third mission is limited and prioritizes the investigation of IC over the third mission (Aversano et al., 2020; Brusca et al., 2018; Di Berardino & Corsi, 2018; Lombardi et al., 2019; Mariani et al., 2018; Secundo et al., 2017, 2018). As such, the need emerges to deepen the investigation of the IC-Third mission relationship, as it could provide useful insights related to the third mission.

As a consequence of the profound transformations in the university sector at the end of the 1990s, HEIs have been increasingly required to reduce the information gap between them and their stakeholders, and IC disclosure plays an important role in this (Sangiorgi & Siboni, 2017). IC voluntary disclosure can reduce the information gap between organizations and stakeholders (Branswijck & Everaert, 2012; Brüggen et al., 2009; Castilla-Polo & Gallardo-Vázquez, 2016; Guthrie et al., 2004; Leuz & Verrecchia, 2000; Lev, 1992; Vergauwen & Van Alem, 2005) and allows the inclusion of intangible value in managerial decisions (Aboody & Baruch, 2000; Catasús & Gröjer, 2003; Giuliani & Marasca, 2011; Holland, 2001; Mouritsen et al., 2004; Starovic & Marr, 2003; Vergauwen & Van Alem, 2005). Most of the studies on IC disclosure analyze annual reports and websites as the primary data sources (Dumay & Cai, 2015; Manes Rossi et al., 2018), although in recent years many authors have cited the need for more research examining other dissemination channels.


The quantity and nature of ex post information depend on the extent, quality, and measurability of the information disclosed in the planning documents (Siboni et al., 2013). The planning phase thus highlights the attention paid to, or rather the tension directed towards, IC information.
Considering the need to deepen the study of the IC-Third mission relationship and the call for research on channels other than reports, we develop a comparative and explorative study aiming to verify the degree of formalization of the third mission in the planning documents of European universities. By verifying the formalization of the third mission as a unitary phenomenon in these planning documents, we can assess the formalization of the IC external dimension in their programming activities. The IC external dimension and third mission activities are intrinsic to universities, which manage IC to support the relationship between academics and the external world. The degree to which the third mission is formalized reflects the universities' awareness of the third mission and thus of their assets related to the IC external dimension. Furthermore, it provides evidence of how and to what extent universities take the third mission and the IC external dimension into account in their strategic decision-making process.
To achieve our aim, we verify the degree of third mission formalization among a sample of 25 European universities through a WordStat automated content analysis of the disclosures included in their strategic/integrated plans. The degree of third mission formalization may vary across European countries in relation to their engagement in the PRIME-OEU and E3M projects. As such, we select universities from three countries, i.e. Italy, France, and the UK, which are representative of different levels of engagement in these European projects: Italy participated in both the PRIME-OEU and the E3M projects, France participated only in the PRIME-OEU project, and the UK did not take part in either of them.
The research findings confirm that despite the relevance of third mission activities to economic and social development, the lack of a comprehensive definition and measurement framework for the third mission leads to its low formalization in university planning documents. As a consequence, universities offer limited disclosure of the IC external dimension. The third mission and the IC external dimension inform the strategic decision process inside universities only to a limited extent, potentially leading to suboptimal decisions.
This research contributes to the literature by linking the debate about voluntary IC information to third mission measurement. It offers empirical evidence about the potential for using third mission formalization as an indicator of the attention given to IC. Furthermore, our research can help policy makers and public managers understand the determinants for enhancing third mission disclosure in university planning documents.
The remainder of the chapter is organized as follows. Section 2 depicts the evolution of the third mission in universities. Section 3 provides a literature review of IC in universities. Section 4 describes the Italian, UK, and French university evaluation systems. Section 5 provides the research design. Section 6 presents the discussion of the results, and finally, Section 7 presents the conclusions.


2 The Evolution of the Third Mission in Universities: State of the Art

The third mission has evolved from a productive to a universalistic concept. In the productive conception, third mission activities are focused on the economic exploitation of research and on the relationship between universities and industry (Nedeva, 2007). The universalistic conception adds to this the social impact of HEIs' activities on the community and encompasses the three dimensions of the third mission (see Fig. 1): technology transfer innovation, continuing education, and public engagement (E3M Project, 2011).
According to the technology transfer innovation dimension, research is transformed into competitive capacity, generating a positive impact on the competitiveness of the industrial sector and on society (Bencardino & Napolitano, 2011; Cesaroni & Piccaluga, 2016). The public engagement dimension is related to the role of the university as a provider of social services for the community; it captures the collaboration between universities and the community for knowledge exchange. The continuing education dimension differs from the traditional educational mission of universities, which is more focused on general education: continuing education addresses education for entrepreneurial competences, aiming to generate relationships between academic research and the professional world (Marhl & Pausits, 2011).
As far as the third mission measurement framework is concerned, European projects have tried to define a set of third mission activity indicators that share the universalistic conception of the third mission.

Fig. 1 Third mission dimensions. Source: Our elaboration


A network of European experts from 15 universities in eight countries, the so-called PRIME-OEU, developed a third mission “radar” with eight dimensions of the third mission: “human resources, intellectual property, spin-offs, contracts with industry, contracts with public bodies, participation into policy making, involvement into social and cultural life and public understanding of science” (OEU PRIME, 2006). More recently, the Valencia HEI of Technology (UPV) supported a three-year research project (2009–2012), E3M “Indicators and Ranking Methodology for University’s Third Mission”, which involved partners from eight universities and seven countries (Carrión García et al., 2012; E3M Project, 2011; Marhl & Pausits, 2011; Secundo et al., 2017). In contrast to the PRIME-OEU, the E3M project contributes to formalizing the three dimensions of the third mission, i.e. technology transfer innovation, continuing education, and public engagement, by providing a set of indicators for each of them. Overall, despite the attempts of these European projects, a comprehensive definition and measurement framework for the third mission is still lacking.

3 Intellectual Capital and the Third Mission in Universities: A Literature Review

IC represents the main driver of value creation and competitive advantage in organizations (Lev, 2001). It consists of the intangible resources that organizations can put to use to create value, i.e. values, skills, competences, employees' knowledge, research and development activities, procedural systems for the organization, and intellectual property rights (Stewart, 2010). The several IC classifications (Boedker et al., 2008; Guthrie et al., 2006, 2007; Nahapiet & Ghoshal, 1998; Ricceri, 2008; Sangiorgi & Siboni, 2017; Secundo et al., 2017) result in definitions for three IC components: human capital, i.e. the knowledge embedded in people; organizational capital, i.e. the knowledge rooted in the organization and its systems; and social capital, i.e. the knowledge rooted in relationships external to the organization (Guthrie et al., 2012).
IC research has experienced four stages (Dumay et al., 2016; Edvinsson, 2013; Guthrie et al., 2012; Sangiorgi & Siboni, 2017). The first stage addressed IC meaning, measurement, and reporting, focusing on why it is important to recognize IC for creating a competitive advantage. The second stage dealt with enhancements in the guidelines related to IC management and reporting and with the formulation of IC classifications (Giuliani et al., 2016; Guthrie et al., 2012; Petty & Guthrie, 2000). The third and fourth IC research stages adopt a more performative approach, aiming to analyze how IC works in organizations and its impact on people, processes, and relationships (Cuganesan, 2005; Cuganesan et al., 2007; Mouritsen, 2006; Secundo et al., 2017). The concept of IC was first studied in relation to for-profit organizations (Edvinsson & Malone, 1997) and was then extended to public organizations, nonprofit organizations, and universities (Kong & Prior, 2008; Mouritsen et al., 2004; Ramírez, 2010).


IC represents a large proportion of university assets (Ramírez et al., 2011; Sánchez et al., 2009; Secundo et al., 2010, 2017), as universities are knowledgebased institutions in which IC plays a crucial role (Bezhani, 2010; Ramírez & Gordillo, 2014). The definition of the three IC components, i.e., human capital, organizational capital, and social capital, in the university setting is as follows (Leitner, 2004, p. 133; Secundo et al., 2017): – Human capital is “the knowledge of the researchers and non-scientific staff of the university” (Leitner, 2004); this includes ability, knowledge of research, professors, technical staff, students, and administrative staff. – Organization capital comprises “the routines and processes within a university, an often-neglected infrastructure of theirs” (Leitner, 2004); this consists of databases, intellectual property, research infrastructure, education and research project, university culture, and governance processes. – Social capital represents “the relationships and networks of the researchers as well as the entire organization” (Leitner, 2004); this includes relationships with public and private partners, partnerships with the business sector and regional governments, its academic prestige, its brand, its relations with nonprofit organizations and civil society, relationships with national and international research centers, networks and alliances, and its reputation as a place to study and to work. IC disclosure plays a relevant role in university voluntary disclosure (Ramírez et al., 2011). As a consequence of the profound changes in the university sector at the end of the 1990s, HEIs have been increasingly required to reduce the information gap between them and their stakeholders. Sangiorgi and Siboni (2017) grouped the aforementioned historic changes into four events. The first refers to the increasing number of enrolled students, which generates a transition from an industrial economy to a knowledge economy. The second addresses the consequences of the Bologna Process: universities obtained greater autonomy in relation to management, organization, and budget distribution. The third is the decline in public funds, which forces universities to apply for other financing resources. The fourth relates to the increasing competition among universities due to pressure from governments and external agencies in controlling academic outputs. HEIs are being forced to adopt new tools for results’ management, measurement, and disclosure. To reduce information asymmetry, universities should provide additional information, and IC plays a relevant role in this (Castilla-Polo & Gallardo-Vázquez, 2016; Sánchez & Elena, 2006; Sangiorgi & Siboni, 2017). IC voluntary disclosure allows us to reduce the information gap between organizations and stakeholders (Branswijck & Everaert, 2012; Brüggen et al., 2009; Castilla-Polo & GallardoVázquez, 2016; Guthrie et al., 2004; Leuz & Verrecchia, 2000; Lev, 1992; Vergauwen & Van Alem, 2005) and to include the intangible value in managerial decisions (Aboody & Baruch, 2000; Catasús & Gröjer, 2003; Giuliani & Marasca, 2011; Holland, 2001; Mouritsen et al., 2004; Starovic & Marr, 2003; Vergauwen & Van Alem, 2005). In addition, voluntary disclosure provides learning benefits to the organization, and organizations develop internal competences when they begin to


apply IC measurement and reporting (Castilla-Polo & Gallardo-Vázquez, 2016; Chiucchi, 2013; Lev & Zambon, 2003; Sveiby, 2001; Yu & Humphreys, 2013). In Europe, alongside IC literature development, guidelines for IC management and reporting practices have been formulated (Ramírez & Gordillo, 2014). The first guideline was proposed by the European Commission: the Observatory of the European University2 proposed the “Intellectual Capital Report for HEIs” (ICU report), which helps HEIs to formulate IC voluntary disclosure. However, the ICU reports do not result in regulations requiring mandatory IC reports (Ramírez et al., 2011; Sangiorgi & Siboni, 2017). The second initiative was represented by the Austrian University Act (2002), through which the Austrian government required all Austrian HEIs to produce a performance report describing IC (Sangiorgi & Siboni, 2017). In the Italian context, Italian Legislative Decree 150, 2009, invited HEIs to elaborate a strategic map with the aim of explaining their strategy by linking “objectives to indicators, intangible resources to tangible indicators and indicators to targets” (CIVIT, 2010) and of communicating this strategy both within the organization and to external stakeholders. Sangiorgi and Siboni (2017) investigate the amount of voluntary IC disclosure in the voluntary social reports (SRs) of Italian universities. Their research shows a relevant amount of IC disclosure in SRs, and university top managers demonstrate being aware of the benefits of IC management and reporting practices in line with previous IC voluntary disclosure research (Aboody & Baruch, 2000; Catasús & Gröjer, 2003; Chiucchi, 2013; Giuliani & Marasca, 2011; Holland, 2001; Mouritsen et al., 2004; Starovic & Marr, 2003; Vergauwen & Van Alem, 2005). The third mission represents the trait d’union between the university and the external world and thus is strictly related to the external dimension of IC (Manes Rossi et al., 2018). Previous literature about the relationship between IC and the third mission is limited. When searching Scopus,3 only 17 studies include both “third mission” and “intellectual capital” in the abstract as keywords. Scholars highlight that universities manage IC to achieve and support the third mission, contributing to social and economic development (Aversano et al., 2020; Brusca et al., 2018; Di Berardino & Corsi, 2018; Lombardi et al., 2019; Mariani et al., 2018; Secundo et al., 2017, 2018). Aversano et al. (2020) analyzed the level of IC disclosure in the integrated plan of Italian universities, and they included IC items related not only to the first and second missions but also to the third mission. They find that universities are aware of the relevance of disclosing information about IC components related to the third mission. Lombardi et al. (2019) developed an exploratory case study of the University of Bari (Italy) and concluded that HEIs use a wide set of intangible resources related to the third mission. Overall, previous literature about

2 The Observatory of the European HEI was established within the PRIME funded by the European Commission for the creation of management tools for research activity governance and to create a framework useful for comparing universities.
3 Scopus is one of the largest academic databases of peer-reviewed literature (Massaro et al., 2016).


the relationship between IC and the third mission prioritizes the investigation of IC over the third mission. As such, a need emerges to deepen the study of the IC–Third mission relationship, prioritizing insights related to third mission. Consistent with Di Berardino and Corsi (2018), we develop the IC-Third mission theoretical framework, which provides evidence supporting the third mission as an external dimension of IC components. From a well-established IC theoretical framework, we build a comprehensive IC-Third mission framework that assigns the related third mission activities to each IC component. We employ the IC theoretical framework developed by Low et al. (2015) and adjusted by Manes Rossi et al. (2018), and for each IC element we define the related third mission dimension, i.e. technology transfer innovation, continuing education, and public engagement. The assignment of third mission dimensions is supported by the third mission measurement framework of the Radar Third Mission—PRIME project and the E3M project. Figure 2 depicts the IC-Third mission theoretical framework. Human capital represents the knowledge embedded in the academic and nonacademic staff of universities, and it drives the value creation of the university activities related to the three university missions. Coworking and the exchange of the knowledge embedded in the academic and nonacademic staff represent the external dimension of human capital, which drives the value creation of third mission activities, in particular, of technology transfer innovation and continuing education dimensions. By representing the infrastructure of the university, organizational capital provides knowledge to fulfill all three university missions. The external dimension of organizational capital refers to the facilities, projects, and technologies that favor dissemination and interaction with public society; thus, it is related, in particular, to the technology transfer innovation and public engagement dimensions of third mission. Finally, social capital includes the knowledge embedded in the relationship between the organization and academics/practitioners, and it is a crucial driver of value creation for university missions. The external dimension of social capital is related to all three dimensions of the third mission, i.e. technology transfer innovation, continuing education, and public engagement. In fact, it refers to the whole range of relationships that promote the exchange of knowledge between universities and public/industry sectors. Figure 3 provides a visual representation of the IC-Third mission theoretical framework. The IC external dimension and thus the third mission activities are intrinsic to universities, which manage IC to support the relationship between academics and the external world. The degree to which the third mission is formalized provides evidence of HEIs’ awareness of the activities covered by the third mission and thus of the assets related to the IC external dimension. Considering the third mission as an expression of IC components, we intend to develop a comparative and explorative study with the aim of verifying the degree of formalization of the third mission in the planning documents of European universities and identifying the elements associated with this formalization. By verifying the formalization of the


Fig. 2 IC-Third Mission theoretical framework. Source: elaboration adapted from Di Berardino and Corsi (2018), Low et al. (2015), Manes Rossi et al. (2018), Radar Third Mission-PRIME project and the E3M project

Fig. 3 IC-Third mission theoretical framework. Source: Our elaboration


third mission as a unitary phenomenon in university planning documents, we assess the formalization of IC’s external dimension in university programming activities.

4 The Italian, UK, and French University Sectors: Context and Initiatives for Evaluation Systems

The evaluation of university activities has taken on such growing importance in recent years that it has become one of the central fields of research about universities (Tight, 2003). The important role assigned to the evaluation of universities is inspired, on the one hand, by public management theories, which see measurement and evaluation as fundamental to improving the efficiency and effectiveness of public administrations (Guthrie et al., 2005). On the other hand, the evaluation of universities responds to a need, felt across Europe, for unitary metrics that promote integration and cooperation at the supranational level (Turri, 2015). The fact that independent national university systems are called to a common path of convergence on quality assurance produces undoubted elements of homogeneity, which coexist with marked differences largely due to context variables. Starting from this premise, this section summarizes the characteristics of the university evaluation systems adopted in Italy, the UK, and France, paying particular attention to initiatives for the evaluation of research and third mission activities.

4.1 The Italian University Sector

In the Italian context, the Italian National Agency for the Evaluation of the University and Research Systems (ANVUR) is entrusted with the quality evaluation system for all three university missions. Established in 2006, ANVUR became operational in 2011, replacing the previous evaluation bodies, i.e. the National Committee for the Evaluation of the University System (CNVSU) and the Steering Committee for the Evaluation of Research (CIVR). Although it is a separate legal entity, ANVUR's operations are subject to the Ministry, which approves ANVUR's annual program, establishes the evaluation procedures, and regulates the financial resources for its functioning (Turri, 2014). ANVUR organizes the evaluation system at two levels: the system of periodic self-evaluation and accreditation (AVA) of all three university missions, and a cyclical evaluation process (VQR) concentrated only on research and the third mission. While first and second mission performances are tied to the university funding policy, the measurement of the third mission is still experimental and thus not yet tied to any funding policy (Turri, 2014; Di Berardino & Corsi, 2018; Aversano et al., 2020).


In relation to the aim of this chapter, the following paragraphs focus on the evaluation system for research and on the main steps in the process of institutional recognition and evaluation of the third mission.
As to research evaluation, ANVUR applies the VQR to evaluate universities and departments on the basis of the research products submitted by their academic staff. A national panel evaluates the research products according to a peer-review logic. The first VQR exercise covered the period 2004–2010, the second the period 2011–2014, and ANVUR recently launched the third VQR exercise, covering the years 2015–2019. Conversely, the AVA system works as an evaluation tool that collects information about universities' research activities through the compilation of Annual Self-evaluation Forms4 (Lumino et al., 2017). The information collected through the AVA system will be subject to further evaluation, which has not yet been defined.
The institutional recognition of the third mission follows a gradual process, which paves the way towards an informative system for evaluating it. The current Italian evaluation system for third mission activities is centered on two macro-areas, i.e. the development of research and the production of public goods. The first area relates to the third mission dimension of technology transfer innovation, while the second corresponds to the third mission dimensions of continuing education and public engagement (E3M Project, 2011; Anvur, 2015; Palla, 2016).
Before the third mission was officially institutionalized as a university mission, ANVUR made a first attempt to evaluate it within the first round of the Evaluation of Research Quality exercise, VQR 2004–2010. Critical issues emerged in this first attempt: the standardization of third mission indicators was sometimes not sufficient to systematically compare universities. The results of the VQR 2004–2010 were used to create ministerial guidelines on methodology and evaluation procedures for third mission activities (Anvur, 2015; Di Berardino & Corsi, 2018). Through the introduction of AVA (Legislative Decree No. 19 of 2012; Ministerial Decree No. 47/2013), the third mission was officially recognized as an institutional mission, alongside teaching and research. Following the ministerial guidelines, ANVUR together with the Ministry developed the Annual Third Mission Form SUA-TM, an informative system capable of collecting standardized and comparable data from all Italian universities. In the second round of the Evaluation of Research Quality exercise, i.e. VQR 2011–2014, the third mission evaluation benefited from the ministerial guidelines and the data collected through the Annual Third Mission Form SUA-TM.

4 AVA requires universities to compile two Annual Self-evaluation Forms (SUA) focused, respectively, on the evaluation of teaching programs (SUA-DID, in force by 2012) and on the evaluation of departmental research and third mission activities (SUA-RES-TM, in force by 2014) (Anvur, 2015; Lumino et al., 2017).


VQR 2011–2014 applied qualitative and quantitative indicators, focused mainly on technology transfer innovation and only partially on continuing education and public engagement (Di Berardino & Corsi, 2018). In 2018, ANVUR reviewed the Annual Third Mission Form SUA-TM and mapped all the statistical sources and databases that could be useful for collecting data about third mission activities and their impact on society. In January 2020, the third cycle, VQR 2015–2019, was launched. It requires universities to submit case studies about third mission activities implemented in the period 2015–2019, highlighting the impact of these activities on the social context in which the university operates (Anvur, 2019a).

4.2 The UK University Sector

The UK evaluation system for HEIs covers teaching, research, and knowledge transfer activities and is related to the annual public funding of universities. Given the aim of this chapter, the focus here is on the evaluation of research and knowledge transfer, leaving teaching activities aside.
University research is assessed in two different ways:
– The Research Councils for each discipline manage the ex ante evaluation of specific research projects for funding, after competitive bidding;
– The Funding Councils (Research England, Scottish Funding Council, Higher Education Funding Council for Wales, Department for Employment and Learning-Northern Ireland) allocate block grants based on periodical ex post research assessment.
Since 2014, the Research Assessment Exercise (the oldest research evaluation mechanism, introduced in 1986) has been replaced by the Research Excellence Framework (REF). Both evaluation exercises aim to inform the selective allocation of research funding, to provide accountability for public investments in research, to provide benchmarking information, and to establish reputational yardsticks (Hicks, 2012; Rebora & Turri, 2013). Furthermore, REF continues to use peer review based on disciplinary areas rather than adopting a metrics-based system as the primary indicator of research quality, although metrics are considered useful to inform the process of expert peer review. Unlike in the past, REF includes for the first time the assessment of the broader impact of research beyond academia (Greco, 2014; REF, 2011).
In detail, in REF 2014 (covering research outputs from 2008 to 2013), ratings mainly focus on research quality in terms of originality, significance, and rigor, with reference to international quality standards (65%); 20% of the rating is assigned according to impacts on the “economy, society, culture, public policy or services, health, environment or quality of life”; finally, 15% is the weight assigned to the “vitality and sustainability” of the environment that supports research, in relation to national, international, and world-leading perspectives (REF, 2011).


This last element, already present in past research assessment exercises, highlights that the assessment also covers future perspectives (e.g., universities have to specify the research projects planned for the next 5 years linked to the current research outputs). The most notable change is, as already said, the assessment of the impact of university research, with the aim of rewarding university departments that engage with businesses, the public sector, and civil society organizations to benefit them with new ideas. To this aim, universities provide impact case studies to describe these positive effects beyond academia (e.g., quotations in official documents of international organizations, newspapers and media, etc.). The next REF will be undertaken in 2021 and, unlike REF 2014, research quality will carry a weight of 60%, the impact beyond academia 25%, and the research environment 15%, highlighting the increasing relevance assigned to the effects of research activities on stakeholders, something similar to the so-called public engagement of the third mission (REF, 2019).
As to knowledge transfer activities, the evaluation system is based on the quality of specific research proposals and on the results of the annual Higher Education Business and Community Interaction survey. Since 1999, this annual survey has collected financial and output data on a range of activities, such as the provision of continuing education courses and the commercialization of intellectual property. However, only the income received by an HEI from businesses, public and third sector services, the community, and the wider public is considered as a proxy measure for the impact of its knowledge exchange activities and informs the funding. Alongside the next REF in 2021, Research England is now developing the Knowledge Exchange Framework, a new assessment exercise on the knowledge exchange activities of English HEIs. It aims to map these activities through a dashboard of standardized indicators regarding patents, academic entrepreneurship, third-party research funding, local development projects, and public and community involvement, but so far without any connection with funding.
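To make the REF weighting described above concrete, here is a minimal sketch, assuming the overall outcome can be read as a simple weighted combination of the three component scores; the function, the example profile, and this way of combining sub-profiles are illustrative assumptions rather than the official REF methodology.

```python
# Illustrative only: combines REF component scores with the weights reported
# in the text (REF 2014 vs. REF 2021); not the official REF calculation.
REF_WEIGHTS = {
    "REF2014": {"outputs": 0.65, "impact": 0.20, "environment": 0.15},
    "REF2021": {"outputs": 0.60, "impact": 0.25, "environment": 0.15},
}

def overall_score(component_scores: dict, exercise: str = "REF2021") -> float:
    """Weighted sum of component scores (e.g., average star ratings 0-4)."""
    weights = REF_WEIGHTS[exercise]
    return sum(weights[c] * component_scores[c] for c in weights)

# Hypothetical submission profile: average star ratings per component.
profile = {"outputs": 3.2, "impact": 3.5, "environment": 3.0}
print(round(overall_score(profile, "REF2014"), 3))  # 3.23
print(round(overall_score(profile, "REF2021"), 3))  # 3.245
```

Under these assumptions, the shift from the REF 2014 to the REF 2021 weights slightly increases the contribution of impact relative to research outputs, consistent with the growing attention to effects on stakeholders noted above.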

4.3 The French University Sector

Since 2007 the French higher education system has undergone one of the deepest reforms in its history, starting with the creation of AERES (Higher Education and Research Evaluation Agency). Before AERES, the responsibility for the evaluation of higher education and research was distributed among different agencies (Chevaillier, 2013).
In 2013, Law No. 660 of July 22 created the High Council for the Evaluation of Research and Higher Education (Hcéres), with the ambitious purpose “to uphold France’s ranking among the world’s leading research powers, and to enable French research to take up the scientific, technological, environmental and societal challenges of the 21st century” (Hcéres, 2020). It replaced AERES on November 17, 2014. The new evaluation body maintains the same competences as the previous one.


The main innovations concern the intention to increase evaluators' independence, transparency, and fairness. In this sense, the use of evaluation experts drawn from the relevant scientific fields of research has also increased. In order to carry out its activities, Hcéres relies on a pool of more than 10,000 experts, of which about 4,500 are engaged every year; about 20% of them are foreign experts. Its governance structure and operating mode were established by Decree No. 1365 of November 14, 2014.
The mission of Hcéres consists in: evaluating all the higher education and research bodies in France (universities, schools, institutions, higher education clusters, research units, doctoral schools, programs pertaining to the Bachelor/Master/Doctorate system); validating the evaluation procedures designed by other bodies; evaluating, when they wish to do so, the higher education bodies of other countries; providing analyses and indicators at both the domestic and international levels by utilizing the studies of the Observatory of Sciences and Technology (OST), a branch of Hcéres; and contributing to defining a national policy on scientific integrity, observing practices and providing support to the actions of key actors by relying on the French office of scientific integrity (Ofis), a branch of Hcéres (Hcéres, 2020).
The evaluation methodology adopted by Hcéres complies with international best practices concerning teaching and research activities. The strong propensity for internationalization is reflected in the membership of Hcéres in ENQA (the European Association for Quality Assurance in Higher Education) and in its convinced adherence to the principles expressed by the Bologna Process.
The evaluation covers three fields of analysis. The first concerns the evaluation of universities and of clusters of higher education and research institutions. This process is based on an analysis of the university's management, of its leadership and, in particular, of its ability to demonstrate the effectiveness of its activities. Based on the European Standards and Guidelines (ESG) applied to the European Higher Education Area, this evaluation focuses on an examination of the continuous improvement processes implemented by the university. The evaluation takes account of the different features of institutions; it is based on a self-assessment report followed by an external visit and concludes with the publication of a final report. The second field concerns the evaluation of Research Units, Clinical Investigation Centers, Federative Structures and Health Institutions. It assesses their research activities and describes their position in the regional, national, and international environment (Barats & Née, 2020). The third area of analysis focuses on programs of study. This evaluation takes into consideration the relevance of the training offer; it involves a self-evaluation document drawn up by the university under evaluation, the drafting of an assessment by an external committee of disciplinary experts, and finally the publication of the report (Turri, 2015).
The outcome of the evaluation is an integrated report that takes into account, first of all, the quality of the scientific production.


In France there is therefore no specific area of analysis concerning the third mission of the universities, but some of its aspects are indicated in the integrated final report, such as the contribution to industrial research and the inclusion in the international academic community. Universities are therefore evaluated on their ability to generate spillover effects from their research activities on the industrial sector.

5 Research Design

The method adopted to implement this comparative and exploratory study comprises three main stages.
First stage. We addressed the selection of European HEIs to be included in the sample for the analysis. We selected HEIs from Italy, the UK, and France following selection criteria based on the academic ranking and size of the university. The study of Brusca et al. (2019) provides evidence of the relationship between IC disclosure and academic ranking, highlighting that a higher index for ranking is associated with a higher index for IC disclosure. Additionally, Brusca et al. (2019) identify a positive relationship between IC and university size. In light of this evidence, to achieve the aim of our research, we selected HEIs with a higher academic ranking and a greater size. In relation to the academic ranking, we adopted the Times Higher Education World University Rankings, which lists the top HEIs in the world, making it the biggest international table to date5 (Times Higher Education, 2018). HEIs in the first 351 positions of the Times Higher Education World University Ranking are included in the sample. The size of the university is measured through the number of enrolled students. Size data are obtained from the European Tertiary Education Register6 (European Tertiary Education Register, 2018). Italian HEIs with more than 40,000 students are included in the sample, while French and British HEIs with more than 20,000 students are included. The final sample includes a total of 25 HEIs.
Second stage. We analyzed university disclosure through automated content analysis, implemented with the WordStat software (Krippendorff, 2004; Provalis Research, 2015). We took into account the disclosure provided by HEIs in their most recent strategic plans.7 We created a third mission dictionary that captures both the explicit and implicit formalization of the third mission dimensions (technology transfer innovation, continuing education, and public engagement) and of the third mission as a whole. The dictionary contains four main categories, i.e., three categories related to the three dimensions of the third mission and a fourth corresponding to the third mission as a whole. For each category, the dictionary lists the common expressions HEIs use to refer explicitly or implicitly to the third mission and its dimensions (see Table 1). The extant third mission research supports our identification of the terms related to the explicit formalization of the third mission, whereas we adopted an inductive approach to select the terms related to its implicit formalization. Specifically, we grouped the expressions resulting from the automated content analysis into the dictionary categories.8 An expression is considered an implicit formalization of the third mission if it implicitly describes an activity related to the third mission. The third mission dictionary was validated by three independent researchers (Humphreys & Jen-Hui Wang, 2018). We conducted the content analysis based on the dictionary with the aim of counting the frequency of third mission terms in university disclosure and thus verifying the explicit and implicit formalization of the third mission dimensions (technology transfer innovation, continuing education, and public engagement) and of the third mission as a whole. Through the assessment of third mission formalization, we aimed to assess the formalization of IC's external dimension in university disclosures.

5 The Times Higher Education World University Ranking is the only table ranking university performance; it judges research-intensive universities across all core missions: teaching, research, knowledge transfer, and international outlook (third mission).
6 The European Tertiary Education Register (ETER) is a database that collects data about European universities' characteristics, providing data at the level of individual universities.
7 The extraction of the documents took place in January 2018. We downloaded the most recent documents available at the date of extraction; the documents are dated between 2009 and 2018. In the absence of a strategic plan, the integrated plan of the university has been considered in the automated content analysis.
8 WordStat calculates the expression frequencies for each of the retrieved documents. We restrict the number of included expressions to a maximum of 1000 items based on the computed TFxIDF index, which is the "term frequency weighted by inverse document frequency"; it is based on "the assumption that the more often a term occurs in a document, the more it is representative of its content yet, the more documents in which the term occurs, the less discriminating it is" (Provalis Research, 2015, p. 37). Expressions have a minimum of 2 and a maximum of 5 words.

Table 1 Third mission dictionary

Category TM (third mission as a whole)
Explicit items: THIRD_MISSION, TERZA_MISSIONE, TROISIEME_MISSION
Implicit items: ACADEMIC_AND_PROFESSIONAL, ECONOMIC_AND_CULTURAL, ECONOMIC_AND_SOCIAL, IMPACT_FOR_SOCIETY, IMPACT_ON_SOCIETY, IMPATTO_ECONOMICO, KNOWLEDGE_AND_SKILLS, KNOWLEDGE_EXCHANGE, SHARE_KNOWLEDGE, KNOWLEDGE_TRANSFER, STRATEGIC_PARTNERS

Category TT (technology transfer innovation)
Explicit items: INNOVAZIONE, TECHNOLOGY_TRANSFER_AND_INNOVATION, TECHNOLOGY_TRANSFER, TRASFERIMENTO_TECNOLOGICO, TRANSFERT_DE_TECHNOLOGIE, INNOVATION
Implicit items: ACADEMIC_INDUSTRY, BREVETTE, BREVETTO, BUSINESS_AND_INDUSTRY, COLLABORATIVE_PARTNERSHIPS, COMMERCIAL_AND_SOCIAL_ENTERPRISE_ACTIVITY, CONTRATTI_DI_RICERCA, CONTRIBUTION_A_L_'INNOVATION, DE_PROPRIÉTÉ_INTELLECTUELLE, ENTI_DI_RICERCA, GRANT_APPLICATIONS, IMPACT_OF_OUR_RESEARCH, IMPRENDITORIALITÀ_ACCADEMICA, INCUBATEUR, INCUBATOR, INCUBATORE, INCUBATORI, INDUSTRIAL_PROPERTY, INNOVATIVE_RESEARCH, INTELLECTUAL_PROPERTY, LE_PARTENARIAT, MEDICAL_RESEARCH, PARCHI_SCIENTIFICI, PARTENARIATS_AVEC, PARTENARIATS_PRIVILÉGIÉS, PATENT, PROPRIETE_INTELLECTUELLE, PROPRIETÀ_INDUSTRIALE, PROPRIETÀ_INTELLETTUALE, PROPRIÉTÉ_INDUSTRIELLE, PROPRIÉTÉ_INTELLECTUELLE, REGIONAL_ECONOMY, RESEARCH_AND_DEVELOPMENT, RESEARCH_AND_ITS_IMPACT, RESEARCH_COLLABORATIONS, RESEARCH_GRANT, RESEARCH_WITH_IMPACT, SMALL_AND_MEDIUM_SIZED_ENTERPRISES, SPIN_OFF, STRATEGIC_PARTNERSHIPS, SVILUPPO_E_RAPPORTI, SPIN_OUT_COMPANIES, UNIVERSITY_RESEARCH

Category CE (continuing education)
Explicit items: CONTINUING_PROFESSIONAL_DEVELOPMENT, FORMATION_CONTINUE, FORMAZIONE_PERMANENTE, LIFELONG_LEARNING, LIFE_LEARNING, CONTINUING_EDUCATION, FORMAZIONE_CONTINUA, EDUCAZIONE_PERMANENTE, EDUCAZIONE_CONTINUA
Implicit items: CORSI_FORMAZIONE, FORMATION_PROFESSIONNELLE, FORMAZIONE_PROFESSIONALE, INSERTION_PROFESSIONNELLE, MASSIVE_OPEN_ONLINE_COURSES, POST_GRADUATION_PROGRAMME, MOOC, PROFESSIONAL_DEVELOPMENT, PROGRAMME_POSTUNIVERSITAIRE, PROGRAMMA_POST_LAUREA, TRAINING_COURSES

Category PE (public engagement)
Explicit items: CIVIC_ENGAGEMENT, EGAGEMENT, EGAGEMENT_PUBLIQUE, EGAGEMENT_SOCIAL, ENGAGEMENT_ACTIVITIES, GLOBAL_ENGAGEMENT, PUBLIC_ENGAGEMENT, SOCIAL_ENGAGEMENT, REGIONAL_ENGAGEMENT
Implicit items: BENI_PUBBLICI, BIBLIOTHÈQUE, COLLECTIVITÉ, COMMUNITIES_AND_WIDER_SOCIETY, COMMUNITY_ENGAGEMENT, COMUNITÀ, CONTRIBUTE_TO_SOCIETY, DES_BIBLIOTHÈQUES, LA_SOCIÉTÉ, LES_RELATIONS, MUSÉE, OPEN_ACCESS, POLO_TERRITORIALE, PUBLIC_SERVICE, SERVICE_OF_SOCIETY, SISTEMA_BIBLIOTECARIO, SISTEMA_MUSEALE, MUSEUM, ARCHEOLOGICAL, SOCIAL_COMMUNITY, OPEN_DAY, SOCIAL_EVENT, EXHIBITION, MUSEO, ARCHEOLOGICO, ORCHESTRA, EVENTO_SOCIALE, MOSTRA, ARCHEOLOGIQUE, ORCHESTRE, EVENEMENT_SOCIAL, EXPOSITION, LIBRARY, BIBLIOTECA, SOCIETAL_CHALLENGE_THEMES, STAGE, TERRITORIO_LOCALE, TIROCINIO, WIDER_COMMUNITY

Source: Our elaboration

167

2 points if the sentence contains both numbers and tables or figures or graphs; and 3 points if the sentence includes at least one performance indicator. Thus, the ICF index is calculated as follows: n P

ICF Index ¼ q þ

n P

ql

i¼1

n

þ

m

i¼1

n:

where q is the “score for quantity” n P ql is the sum of scores obtained for the quality of information i¼1 Pn i¼1 m is the sum of scores obtained for the measurability of information n is the number of keywords included at least once in the planning documents The maximum score obtainable is 9. The information required for this index was obtained by performing a manual content analysis of the selected planning documents. To ensure the validity and reliability of the manual content analysis, two experts in the method were selected as coders. Then, a third researcher replicated the analysis on a pilot sample of planning documents. No major issues of differences were reported, confirming the reliability of the coding process.
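As a worked illustration of the formula above, the snippet below computes the ICF index from a quantity score and per-keyword quality and measurability scores; the function name and the example values are hypothetical, not taken from the study's data.

```python
def icf_index(quantity_score: int, quality_scores: list, measurability_scores: list) -> float:
    """ICF = quantity score + mean quality score + mean measurability score (max 9)."""
    n = len(quality_scores)  # number of keywords found at least once
    return quantity_score + sum(quality_scores) / n + sum(measurability_scores) / n

# Hypothetical plan with four keyword hits:
# quantity judged "high" (3 points), quality scores 1-3, measurability scores 0-3.
print(icf_index(3, [2, 1, 3, 2], [1, 0, 2, 1]))  # 3 + 2.0 + 1.0 = 6.0
```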

6 Findings and Discussion

The degree of formalization of the third mission in the university planning documents provides evidence of HEIs' awareness of the activities covered by the third mission and thus of their assets related to the IC external dimension. Furthermore, it provides evidence of how and to what extent universities take the third mission and the IC external dimension into account in their strategic decisions. Table 2 shows the results from the WordStat automated content analysis combined with the construction of the ICF index.

Table 2 WordStat content analysis and the ICF index (for each university: explicit, implicit, and total term counts for TM, TTI, CE, and PE; overall term count; and the ICF index components quantity, quality, and measurability with their total). The sample comprises 25 universities: Aix-Marseille Université, L'Université Paris Descartes, Université Paris Diderot, and Université Pierre Marie Curie (France); Cardiff University, King's College London, London Global University, Newcastle University, Queen's University Belfast, The University of Edinburgh, The University of Liverpool, The University of Manchester, The University of Sheffield, University of Birmingham, University of Glasgow, University of Nottingham, University of Oxford, University of Southampton, and University of Warwick (UK); Sapienza Università di Roma, Università di Bologna, Università di Milano, Università di Padova, Politecnico di Milano, and Università di Pisa (Italy). Across the whole sample, the totals are: TM 168 (102 explicit, 66 implicit), TTI 574 (333 explicit, 241 implicit), CE 134 (87 explicit, 47 implicit), and PE 439 (33 explicit, 406 implicit), for an overall total of 1,315 terms. Source: Our elaboration

The automated content analysis of HEI planning documents through the third mission dictionary reveals that the frequency of terms related to the explicit and implicit formalization of the third mission as a unitary phenomenon is limited. HEI disclosure mainly formalizes single dimensions of the third mission, in particular the technology transfer innovation dimension. As a result, the third mission and the IC external dimension inform strategic decisions inside HEIs only to a limited extent, potentially leading to a suboptimal decision-making process.
The institutional context contributes to determining the level of explicit and implicit formalization of the third mission and its dimensions in university planning documents. Italian universities present the highest ICF index and thus the highest level of formalization of the third mission and its dimensions. Their strategic plans contain a high level of both explicit and implicit formalization of the third mission dimensions and of the third mission as a whole. These results can be explained in light of the advanced Italian system of university evaluation, which officially includes the third mission among the institutional university activities subject to the evaluation process performed by ANVUR. All the activities carried out in recent years in the Italian context contribute to developing a solid informative system to evaluate third mission activities. These activities are also an expression of the Italian participation in the European projects on third mission measurement, i.e., the PRIME-OEU and E3M projects, which together contribute to the definition of the three dimensions of the third mission. As such, the Italian national university evaluation system promotes the clear formalization of the third mission as a unitary phenomenon distinct from the other missions.
By contrast, British and French HEIs present lower ICF index values than Italian HEIs, providing evidence of the limited formalization of the third mission and its dimensions. Their strategic plans contain a low level of explicit formalization of both the third mission dimensions and the third mission as a whole; the third mission and its dimensions are mainly disclosed in an implicit way. The UK and French national university evaluation systems do not include the third mission in the evaluation process and thus do not promote the formalization of the third mission in university disclosures. Their attitude towards the third mission also reflects their limited engagement in the European projects on third mission dimensions, compared with Italy: France participated only in the PRIME-OEU, while the UK did not participate in either of the two projects. As a consequence, the boundary of the third mission in relation to the first mission (teaching) and the second mission (research) is not clearly defined in these universities.
However, unlike French universities, British universities present a high level of implicit formalization of the third mission overall. UK universities are subject to the Research Excellence Framework (REF), which assesses and evaluates university research. In addition to research evaluation, the REF assesses the impact of research on society and thus implicitly assesses third mission activities. Recently, ANVUR and the English higher education research and knowledge exchange funding agency, Research England, have begun a partnership that aims at supporting the assessment of the three missions of the university (ANVUR, 2019b). Against this background, we expect that, in line with Italy, the UK will experience the institutionalization of the third mission among university missions, which will eventually result in an increased formalization of the third mission in UK universities.

7 Conclusion

The university is a knowledge-based institution in which IC plays a crucial role (Ramírez & Gordillo, 2014). IC represents the input and output of university activities (Castilla-Polo & Gallardo-Vázquez, 2016). In a knowledge-based economy, universities must exchange knowledge with the external world, pursuing not only the two traditional missions (teaching and research) but also the third mission. The third mission represents the connecting link between the university and the external context: it allows universities to increase collaboration and knowledge exchange with society. Considering the third mission as an expression of the external dimension of IC components (Secundo et al., 2017), we developed a comparative and explorative study with the aim of verifying the degree of formalization of the third mission in European university planning documents. By verifying the formalization of the third mission as a unitary phenomenon in the university planning documents, we assessed the formalization of the IC external dimension in their planning activity. The IC external dimension, and thus third mission activities, are intrinsic to universities, which manage IC to support the relationship between academia and society. The degree of formalization of the third mission provides evidence of HEIs' awareness of the activities covered by the third mission and thus of the assets related to the IC external dimension. Furthermore, it provides evidence of how and to what extent HEIs account for the third mission and the IC external dimension in their strategic decisions.

To reach our aim, we verified the third mission's degree of formalization for a sample of 25 European universities from Italy, the UK, and France through a WordStat automated content analysis of the disclosures included in their strategic/integrated plans. The content analysis provides empirical evidence of the lack of explicit formalization of the overall third mission. Universities formalize single dimensions of the third mission (technology transfer and innovation, continuing education, and public engagement), while references to the third mission as a whole in strategic planning documents are still limited. Specifically, the HEI planning documents mainly formalize the technology transfer and innovation dimension of the third mission. The result is that the formalization of the third mission is still mainly related to the productive conception (focused on the economic exploitation of research), regardless of the evolution of the third mission concept. In other words, although the literature has found a broad convergence on a unitary conception of the constituent elements of the third mission, their connection with the university planning system is still fragmented. This result is even more clearly manifested in the French and English systems, where the productive conception appears to be very marked: technology transfer and innovation represents the central element around which the university-business relational system develops. Continuing education is another aspect present in all three university systems, but the documentary analysis shows that it is still interpreted as an eminently didactic aspect rather than as an element of interaction with society.

Our findings confirm that, despite the relevance of third mission activities for economic and social development, the lack of a comprehensive third mission definition and measurement framework leads to the low formalization of the third mission in the planning documents. As a consequence, the IC external dimension is formalized and disclosed in a limited way. The third mission and IC external dimensions inform strategic decisions inside universities in a limited way, potentially leading to suboptimal decisions. The level of institutionalization of the third mission in the national evaluation systems of HEIs represents a crucial determinant of the level of formalization of the third mission in university disclosure. Countries whose national evaluation systems institutionalize the third mission along with the other two missions promote the clear formalization of the third mission as a unitary phenomenon distinct from those missions. By contrast, the lack of institutionalization of the third mission in national evaluation systems leads to limited awareness among universities of the activities covered by the third mission and thus of their assets related to the IC external dimension.

The examined planning documents also show limited use of indicators that would enable ex ante planning based on the measurability of the future actions to be undertaken. Even in Italy, where the third mission is the subject of an explicit evaluation activity carried out by a specific authority, an approach based on measurability remains unusual. Only a few universities plan third mission activities with measurable objectives, particularly those focused on engineering disciplines, which are typically geared towards technology transfer and a closer connection with the industrial world. Our study is based on a framework that shows the connections between IC and the third mission, highlighting that although the third mission appears to be mature at a conceptual level, its operationalization still suffers from marked gaps. This research advances the literature on the relationship between IC and the third mission, providing useful insights into the disclosure of the third mission and the IC external dimension in university planning documents. Our research can also help policy makers and public managers to understand the determinants for enhancing third mission disclosure.
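For illustration only, the short Python sketch below shows one way a keyword-dictionary count of explicit and implicit third mission references in a planning document could be organized, in the spirit of the automated content analysis described above. It is a minimal sketch under simplifying assumptions: the keyword lists, the per-dimension grouping, and the toy weighted score are hypothetical stand-ins, not the WordStat categorization dictionary or the ICF Index actually used in the study.

```python
import re
from collections import Counter

# Hypothetical keyword dictionary: "explicit" terms name the third mission or a
# dimension directly; "implicit" terms only hint at it. These lists are
# illustrative stand-ins, not the dictionary used in the chapter's analysis.
KEYWORDS = {
    "TM_explicit": ["third mission"],
    "TM_implicit": ["societal impact", "knowledge exchange"],
    "TTI_explicit": ["technology transfer", "patent", "spin-off"],
    "TTI_implicit": ["industry collaboration", "licensing"],
    "CE_explicit": ["continuing education", "lifelong learning"],
    "CE_implicit": ["professional training", "refresher course"],
    "PE_explicit": ["public engagement"],
    "PE_implicit": ["outreach", "community event"],
}

def count_references(document_text: str) -> Counter:
    """Count occurrences of each keyword category in a planning document."""
    text = document_text.lower()
    counts = Counter()
    for category, terms in KEYWORDS.items():
        counts[category] = sum(len(re.findall(re.escape(t), text)) for t in terms)
    return counts

def formalization_score(counts: Counter) -> float:
    """Toy aggregate: explicit references weigh more than implicit ones.
    The weights (1.0 / 0.5) are arbitrary and only illustrate the idea of a
    quality-weighted quantity measure, not the ICF Index formula."""
    explicit = sum(v for k, v in counts.items() if k.endswith("_explicit"))
    implicit = sum(v for k, v in counts.items() if k.endswith("_implicit"))
    return 1.0 * explicit + 0.5 * implicit

if __name__ == "__main__":
    sample_plan = (
        "The university will strengthen technology transfer and public "
        "engagement, and expand lifelong learning programmes as part of "
        "its third mission."
    )
    counts = count_references(sample_plan)
    print(counts)
    print("Illustrative formalization score:", formalization_score(counts))
```

In the study itself, the content analysis was performed with WordStat; the sketch only mirrors the general idea of classifying explicit versus implicit references per dimension and aggregating them into a formalization measure.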

References

Aboody, D., & Baruch, L. (2000). Information asymmetry, R&D, and insider gains. Journal of Finance, 55(6), 2747–2766. https://doi.org/10.1111/0022-1082.00305
Anvur. (2015). La valutazione della terza missione nelle università italiane. Manuale per la valutazione. Retrieved from www.anvur.it
Anvur. (2019a). BANDO Valutazione della Qualità della Ricerca 2015-2019 (VQR 2015-2019).
ANVUR. (2019b). Nasce la partnership tra ANVUR e Research England. Retrieved from https://www.anvur.it/news/nasce-la-partnership-tra-anvur-e-research-england-2/
Aversano, N., Nicolò, G., Sannino, G., & Tartaglia Polcini, P. (2020). The integrated plan in Italian public universities: New patterns in intellectual capital disclosure. Meditari Accountancy Research, 28(4), 655–679. https://doi.org/10.1108/MEDAR-07-2019-0519
Barats, C., & Née, E. (2020). Structuring, supervising, and neutralizing the assessment of research entities: The writings of the French agency AERES/HCERES. Communication & Langages, 203(1), 5–25.


Bencardino, F., & Napolitano, M. R. (2011). L'Università nei processi di sviluppo economico e sociale. In A. Bianchi (Ed.), Le Università del Mezzogiorno nella Storia dell'Italia Unita 1861-2011.
Bezhani, I. (2010). Intellectual capital reporting at UK universities. Journal of Intellectual Capital, 11(2), 179–207. https://doi.org/10.1108/14691931011039679
Boedker, C., Mouritsen, J., & Guthrie, J. (2008). Enhanced business reporting: International trends and possible policy directions. Journal of Human Resource Costing & Accounting, 12(1), 14–25. https://doi.org/10.1108/14013380810872734
Branswijck, D., & Everaert, P. (2012). Intellectual capital disclosure commitment: Myth or reality? Journal of Intellectual Capital, 13(1), 39–56. https://doi.org/10.1108/14691931211196204
Brüggen, A., Vergauwen, P., & Dao, M. (2009). Determinants of intellectual capital disclosure: Evidence from Australia. Management Decision, 47(2), 233–245. https://doi.org/10.1108/00251740910938894
Brusca, I., Labrador, M., & Larran, M. (2018). The challenge of sustainability and integrated reporting at universities: A case study. Journal of Cleaner Production, 188, 347–354. https://doi.org/10.1016/j.jclepro.2018.03.292
Brusca, I., Cohen, S., Manes-Rossi, F., & Nicolò, G. (2019). Intellectual capital disclosure and academic rankings in European universities. Meditari Accountancy Research, 28(1), 51–71. https://doi.org/10.1108/medar-01-2019-0432
Carrión García, A., Carot, J., Soeiro, A., Hämäläinen, K., Boffo, S., Pausits, A., Murphy, M., Marhl, M., Vidal, J., Mora, J., & Padfield, C. (2012). Green Paper. Fostering and Measuring 'Third Mission' in Higher Education Institutions.
Castilla-Polo, F., & Gallardo-Vázquez, D. (2016). The main topics of research on disclosures of intangible assets: A critical review. Accounting, Auditing and Accountability Journal, 29(2), 323–356. https://doi.org/10.1108/AAAJ-11-2014-1864
Catasús, B., & Gröjer, J. E. (2003). Intangibles and credit decisions: Results from an experiment. European Accounting Review, 12(2), 327–355. https://doi.org/10.1080/0963818032000089418
Cesaroni, F., & Piccaluga, A. (2016). The activities of university knowledge transfer offices: Towards the third mission in Italy. Journal of Technology Transfer, 41(4), 753–777. https://doi.org/10.1007/s10961-015-9401-3
Chevaillier, T. (2013). Evaluation in French higher education: History, policy and debates. Scuola Democratica, 4(2), 619–627.
Chiucchi, M. S. (2013). Measuring and reporting intellectual capital: Lessons learnt from some interventionist research projects. Journal of Intellectual Capital, 14(3), 395–413. https://doi.org/10.1108/JIC-03-2013-0036
CIVIT. (2010). Delibera n. 89/2010 - Indirizzi in materia di parametri e modelli di riferimento del sistema di misurazione e valutazione della performance.
Cuganesan, S. (2005). Intellectual capital-in-action and value creation: A case study of knowledge transformations in an innovation project. Journal of Intellectual Capital, 6(3), 357–373. https://doi.org/10.1108/14691930510611102
Cuganesan, S., Boedker, C., & Guthrie, J. (2007). Enrolling discourse consumers to affect material intellectual capital practice. Accounting, Auditing & Accountability Journal, 20(6), 883–911. https://doi.org/10.1108/09513570710830281
Di Berardino, D., & Corsi, C. (2018). A quality evaluation approach to disclosing third mission activities and intellectual capital in Italian universities. Journal of Intellectual Capital, 19(1), 178–201. https://doi.org/10.1108/JIC-02-2017-0042
Dumay, J., & Cai, L. (2015). Using content analysis as a research methodology for investigating intellectual capital disclosure: A critique. Journal of Intellectual Capital, 16(1), 121–155. https://doi.org/10.1108/JIC-04-2014-0043
Dumay, J., Bernardi, C., Guthrie, J., & Demartini, P. (2016). Integrated reporting: A structured literature review. Accounting Forum, 40(3), 166–185. https://doi.org/10.1016/j.accfor.2016.06.001


E3M Project. (2011). Final Report of Delphi Study. E3M Project - European Indicators and Ranking Methodology for University Third Mission.
Edvinsson, L. (2013). IC 21: Reflections from 21 years of IC practice and theory. Journal of Intellectual Capital, 14(1), 163–172. https://doi.org/10.1108/14691931311289075
Edvinsson, L., & Malone, M. S. (1997). Intellectual capital: Realizing your company's true value by finding its hidden brainpower. Harper Business.
European Commission. (2006). RICARDIS: Reporting intellectual capital to augment research, development and innovation in SMEs.
European Tertiary Education Register. (2018). ETER project. Retrieved from https://www.eterproject.com/search/filtered
Giuliani, M., & Marasca, S. (2011). Construction and valuation of intellectual capital: A case study. Journal of Intellectual Capital, 12(3), 377–391. https://doi.org/10.1108/14691931111154698
Giuliani, M., Chiucchi, M. S., & Marasca, S. (2016). A history of intellectual capital measurements: From production to consumption. Journal of Intellectual Capital, 17(3), 590–606. https://doi.org/10.1108/JIC-08-2015-0071
Göransson, B., Maharajh, R., & Schmoch, U. (2009). New activities of universities in transfer and extension: Multiple requirements and manifold solutions. Science and Public Policy, 36(2), 157–164. https://doi.org/10.3152/030234209X406863
Greco, G. (2014). Una comparazione internazionale tra i sistemi di valutazione della ricerca scientifica. Management Control, 1, 87–99.
Guthrie, J., Petty, R., Yongvanich, K., & Ricceri, F. (2004). Using content analysis as a research method to inquire into intellectual capital reporting. Journal of Intellectual Capital, 5(2), 282–293. https://doi.org/10.1108/14691930410533704
Guthrie, J., Humphrey, C., Jones, L. R., & Olson, O. (2005). International public financial management reform: Progress, contradictions, and challenges. IAP.
Guthrie, J., Petty, R., & Ricceri, F. (2006). The voluntary reporting of intellectual capital: Comparing evidence from Hong Kong and Australia. Journal of Intellectual Capital, 7(2), 254–271. https://doi.org/10.1108/14691930610661890
Guthrie, J., Petty, R., & Ricceri, F. (2007). Intellectual capital reporting: Lessons from Hong Kong and Australia. Institute of Chartered Accountants of Scotland.
Guthrie, J., Ricceri, F., & Dumay, J. (2012). Reflections and projections: A decade of intellectual capital accounting research. The British Accounting Review, 44(2), 68–82. https://doi.org/10.1016/j.bar.2012.03.004
Hcéres. (2020). Hcéres: Contributor to the enhancement of higher education. Retrieved from https://www.hceres.fr/en
Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2), 251–261. https://doi.org/10.1016/j.respol.2011.09.007
Holland, J. (2001). Financial institutions, intangibles and corporate governance. Accounting, Auditing and Accountability Journal, 14(4), 497–529.
Humphreys, A., & Jen-Hui Wang, R. (2018). Automated text analysis for consumer research. Journal of Consumer Research, 44(6), 1274–1306. https://doi.org/10.1093/jcr/ucx104
Kong, E., & Prior, D. (2008). An intellectual capital perspective of competitive advantage in nonprofit organizations. International Journal of Nonprofit and Voluntary Sector Marketing, 13(2), 119–128. https://doi.org/10.1002/nvsm.315
Krippendorff, K. (2004). Content analysis: An introduction to its methodology (2nd ed.). Sage.
Leitner, K.-H. (2004). Intellectual capital reporting for universities: Conceptual background and application for Austrian universities. Research Evaluation, 13(2), 129–140. https://doi.org/10.3152/147154404781776464
Leuz, C., & Verrecchia, R. E. (2000). The economic consequences of increased disclosure. Journal of Accounting Research, 38, 91–124. https://doi.org/10.2307/2672910
Lev, B. (1992). Information disclosure strategy. California Management Review, 34(4), 9–32. https://doi.org/10.2307/41166701
Lev, B. (2001). Intangibles: Management, measurement, and reporting. Brookings Institution Press.


Lev, B., & Zambon, S. (2003). Intangibles and intellectual capital: An introduction to a special issue. European Accounting Review, 12(4), 597–603. https://doi.org/10.1080/0963818032000162849
Loi, M., & Di Guardo, M. C. (2015). The third mission of universities: An investigation of the espoused values. Science and Public Policy, 42(6), 855–870. https://doi.org/10.1093/scipol/scv012
Lombardi, R., Massaro, M., Dumay, J., & Nappo, F. (2019). Entrepreneurial universities and strategy: The case of the University of Bari. Management Decision, 57(12), 3387–3405. https://doi.org/10.1108/MD-06-2018-0690
Low, M., Samkin, G., & Li, Y. (2015). Voluntary reporting of intellectual capital: Comparing the quality of disclosures from New Zealand, Australian and United Kingdom universities. Journal of Intellectual Capital, 16(4), 1469–1930. https://doi.org/10.1108/JIC-03-2015-0022
Lumino, R., Gambardella, D., & Grimaldi, E. (2017). The evaluation turn in the higher education system: Lessons from Italy. Journal of Educational Administration and History, 49(2), 87–107. https://doi.org/10.1080/00220620.2017.1284767
Manes Rossi, F., Nicolò, G., & Tartaglia Polcini, P. (2018). New trends in intellectual capital reporting: Exploring online intellectual capital disclosure in Italian universities. Journal of Intellectual Capital, 19(4), 814–835. https://doi.org/10.1108/JIC-09-2017-0119
Marhl, M., & Pausits, A. (2011). Third mission indicators for new ranking methodologies. Evaluation in Higher Education, 5(1), 43–64.
Mariani, G., Carlesi, A., & Scarfò, A. A. (2018). Academic spinoffs as a value driver for intellectual capital: The case of the University of Pisa. Journal of Intellectual Capital, 19(1), 202–226. https://doi.org/10.1108/JIC-03-2017-0050
Massaro, M., Dumay, J., & Guthrie, J. (2016). On the shoulders of giants: Undertaking a structured literature review in accounting. Accounting, Auditing and Accountability Journal, 29(5), 767–801. https://doi.org/10.1108/AAAJ-01-2015-1939
Molas-Gallart, J., Salter, A., Patel, P., Scott, A., & Duran, X. (2002). Measuring third stream activities. Final report to the Russell Group of Universities. SPRU, University of Sussex.
Mouritsen, J. (2006). Problematising intellectual capital research: Ostensive versus performative IC. Accounting, Auditing & Accountability Journal, 19(6), 820–841. https://doi.org/10.1108/09513570610709881
Mouritsen, J., Nikolaj Bukh, P., & Marr, B. (2004). Reporting on intellectual capital: Why, what and how? Measuring Business Excellence, 8(1), 46–54. https://doi.org/10.1108/13683040410524739
Nahapiet, J., & Ghoshal, S. (1998). Social capital, intellectual capital, and the organizational advantage. Academy of Management Review, 23(2), 242–266. https://doi.org/10.5465/amr.1998.533225
Nedeva, M. (2007). New tricks and old dogs? The "third mission" and the re-production of the university. In World Yearbook of Education 2008 (pp. 105–123). Routledge.
OEU PRIME. (2006). Methodological guide.
Palla, P. G. (Ed.). (2016). La terza missione dell'Università. In Universitas Studi e Documentazione di Vita Universitaria (Vol. 141).
Petty, R., & Guthrie, J. (2000). Intellectual capital literature review: Measurement, reporting and management. Journal of Intellectual Capital, 1(2), 155–176.
Provalis Research. (2015). WordStat user's guide. Retrieved from https://provalisresearch.com/Documents/WordStat7.pdf
Ramírez, Y. (2010). Intellectual capital models in Spanish public sector. Journal of Intellectual Capital, 11(2), 248–264. https://doi.org/10.1108/14691931011039705
Ramírez, Y., & Gordillo, S. (2014). Recognition and measurement of intellectual capital in Spanish universities. Journal of Intellectual Capital, 15(1), 173–188. https://doi.org/10.1108/JIC-05-2013-0058


Ramírez, Y., Penalver, J. F. S., & Ponce, A. T. (2011). Intellectual capital in Spanish public universities: Stakeholders' information needs. Journal of Intellectual Capital, 12(3), 356–376. https://doi.org/10.1108/14691931111154689
Rebora, G., & Turri, M. (2013). The UK and Italian research assessment exercises face to face. Research Policy, 42(9), 1657–1666. https://doi.org/10.1016/j.respol.2013.06.009
REF. (2011). Assessment framework and guidance submission.
REF. (2019). Guidance on submission.
Ricceri, F. (2008). Intellectual capital and knowledge management: Strategic management of knowledge resources. Routledge.
Saad, M., & Zawdie, G. (2011). Introduction to special issue: The emerging role of universities in socio-economic development through knowledge networking. Science and Public Policy, 38(1), 3–6. https://doi.org/10.3152/030234211X12960315267453
Sánchez, M. P., & Elena, S. (2006). Intellectual capital in universities: Improving transparency and internal management. Journal of Intellectual Capital, 7(4), 529–548. https://doi.org/10.1108/14691930610709158
Sánchez, M. P., Elena, S., & Castrillo, R. (2009). Intellectual capital dynamics in universities: A reporting model. Journal of Intellectual Capital, 10(2), 307–324. https://doi.org/10.1108/14691930910952687
Sangiorgi, D., & Siboni, B. (2017). The disclosure of intellectual capital in Italian universities: What has been done and what should be done. Journal of Intellectual Capital, 18(2), 354–372. https://doi.org/10.1108/JIC-09-2016-0088
Secundo, G., Margherita, A., Elia, G., & Passiante, G. (2010). Intangible assets in higher education and research: Mission, performance or both? Journal of Intellectual Capital, 11(2), 140–157. https://doi.org/10.1108/14691931011039651
Secundo, G., Elena Perez, S., Martinaitis, Ž., & Leitner, K. H. (2017). An intellectual capital framework to measure universities' third mission activities. Technological Forecasting and Social Change, 123, 229–239. https://doi.org/10.1016/j.techfore.2016.12.013
Secundo, G., Massaro, M., Dumay, J., & Bagnoli, C. (2018). Intellectual capital management in the fourth stage of IC research: A critical case study in university settings. Journal of Intellectual Capital, 19(1), 157–177. https://doi.org/10.1108/JIC-11-2016-0113
Siboni, B., Nardo, M. T., & Sangiorgi, D. (2013). Italian state university contemporary performance plans: An intellectual capital focus? Journal of Intellectual Capital, 14(3), 414–430. https://doi.org/10.1108/JIC-03-2013-0033
Starovic, D., & Marr, B. (2003). Understanding corporate value: Managing and reporting intellectual capital. CIMA.
Stewart, T. A. (2010). Intellectual capital: The new wealth of organization. Currency.
Sveiby, K. E. (2001). A knowledge-based theory of the firm to guide in strategy formulation. Journal of Intellectual Capital, 2(4), 344–358.
Tight, M. (2003). Researching higher education. Open University Press.
Times Higher Education. (2018). World University Rankings. Retrieved from https://www.timeshighereducation.com/world-university-rankings
Turri, M. (2014). The new Italian agency for the evaluation of the university system (ANVUR): A need for governance or legitimacy? Quality in Higher Education, 20(1), 64–82. https://doi.org/10.1080/13538322.2014.889429
Turri, M. (2015). La valutazione nelle università europee. Scuola Democratica, 6(1), 83–102.
Vergauwen, P. G. M. C., & Van Alem, F. J. C. (2005). Annual report IC disclosures in the Netherlands, France and Germany. Journal of Intellectual Capital, 6(1), 89–104. https://doi.org/10.1108/14691930510574681
Yu, A., & Humphreys, P. (2013). From measuring to learning? Probing the evolutionary path of IC research and practice. Journal of Intellectual Capital, 14(1), 26–47. https://doi.org/10.1108/14691931311289002

Institutional Logics to Unveil Entrepreneurial Universities' Performances: A Cross-Country Comparative Study

Canio Forliano, Paola De Bernardi, Alberto Bertello, and Francesca Ricciardi

C. Forliano (*)
Department of Political Science and International Relations, University of Palermo, Palermo, Italy
Department of Management, University of Turin, Turin, Italy
e-mail: [email protected]

P. De Bernardi · A. Bertello · F. Ricciardi
Department of Management, University of Turin, Turin, Italy
e-mail: [email protected]; [email protected]; [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2022
E. Caperchione, C. Bianchi (eds.), Governance and Performance Management in Public Universities, SIDREA Series in Accounting and Business Administration, https://doi.org/10.1007/978-3-030-85698-4_9

1 Introduction

Universities are adaptive and dynamic organisms that accomplish three different missions. The first mission concerns teaching activities and implies knowledge preservation and transmission. The second mission includes research activity and, consequently, knowledge production and advancement. The third mission consists of every activity that universities are asked to perform besides teaching and research in order to contribute to economic and societal development (Battaglia et al., 2017; Chau et al., 2017). These three missions are the result of two academic revolutions that have led universities to open their doors to a broader set of stakeholders while increasing, at the same time, the complexity of the organizational field wherein they operate and interact (Geuna & Muscio, 2009; Philpott et al., 2011; Ardito et al., 2019).

Adopting the institutional logics perspective, we could say that universities operate in organizational fields that shape and are shaped by multiple actors (e.g., governments, private companies, and civil society) and where different institutional logics coexist. Institutional logics provide actors with socially shared, deeply held rules of the game, assumptions, and values that shape cognitions and behaviors and form the basis for legitimacy (Pierce et al., 2017). The literature has provided several examples of the different logics embedded in organizational fields and, more recently, attention has moved to how organizations respond to institutional complexity when conflicting logics are at stake (Ocasio et al., 2017). Most of these studies have focused on the combination of two conflicting logics, such as commercial logic versus social logic in social enterprises (Pache & Santos, 2013), market logic versus academic science logic in biotechnological companies (Murray, 2010), financial logic versus community logic in community banks (Almandoz, 2012), and market logic versus public value logic in debt collection companies (Forliano et al., 2020). However, there is still a paucity of studies analyzing logic multiplicity when more than two logics are at stake.

The purpose of this study is to tackle this challenge by examining the set of institutional logics underlying the universities' missions. Hence, this chapter aims to answer the following two research questions:

RQ1. "What are the institutional logics embedded in universities' organizational field?"
RQ2. "How do universities respond to this logic multiplicity by developing specific sets of key performance indicators (KPIs)?"

To answer these questions, we developed exploratory research based on qualitative data from a Continental European university, an Anglo-Saxon university, and an Asian university. As a first step, we identified nine logics (three for each mission) by conducting semi-structured interviews at the universities under analysis. The logics identified are the inclusiveness, vocational, and excellence logics as regards the first mission; the focalization, materiality, and excellence logics as regards the second mission; and the dissemination, translational, and entrepreneurial logics as regards the third mission. As a second step, we conducted a document analysis to develop a framework that identifies which institutional logics are covered and which ones are overlooked by the current indicators.

In this way, this study aims at providing two main theoretical contributions. First, this research contributes to the literature on institutional logics by considering a complex context in which several logics are at stake; indeed, scholars have largely ignored logic multiplicity to date. Second, this chapter also contributes to studies that address universities' performance management by providing a comprehensive mapping of the KPIs used to assess performance in the three different academic missions. Practitioners and decision makers can find useful implications and insights as well.

The remainder of this chapter is structured as follows. Section 2 provides the theoretical background of the work. Section 3 describes the adopted methodology. Section 4 presents the results. Finally, Section 5 discusses the results and the implications, limitations, and further developments of this study.


2 Theoretical Background

2.1 Entrepreneurial Universities and the Third Mission

Over the last decades, the role and the mission of universities have undergone several transitions requiring changes in content, structure, governance, and strategies (Guerrero et al., 2016). More specifically, universities have gradually turned themselves from ivory towers into entrepreneurial agents (Etzkowitz et al., 2000; Forliano et al., 2021) that are expected to develop a wide range of relationships with multiple stakeholders in order to continually adapt the university model (Miller et al., 2014). With the advent of the knowledge-based economy (Secundo et al., 2019), universities are therefore called to play different interconnected roles that respond to three different missions, namely teaching, research, and the third mission (Etzkowitz et al., 2000).

For a long time, the teaching mission has been the traditional key reputation factor (Maassen, 2017). It consists of an educational offer that mainly comprises courses at all levels, postgraduate courses, masters, and refresher and professional training courses (Marginson & Van der Wende, 2007). Then, institutional pressures (as a consequence of governmental initiatives and reforms) have led universities to pay greater attention to research activities, which have become increasingly important for both overall and individual assessments. The research mission, usually characterized by a cross-disciplinary nature, has thus been fostered by research centers and labs (Etzkowitz et al., 2000) and by participation in specific projects (Secundo et al., 2017). Moreover, in the 1980s, these two missions were no longer considered sufficient and the so-called third mission started to emerge (Etzkowitz & Leydesdorff, 2000). Although the third mission comprises several different activities, these can be grouped into three main categories (E3M, 2010), namely (i) technology transfer and innovation, (ii) continuing education, and (iii) public engagement.

Although the third mission encompasses any activity oriented to foster the development of economies and societies, universities have adopted the label of entrepreneurship as a crucial concept to explain this revolutionary path (Etzkowitz et al., 2000). In this sense, the entrepreneurial university concept was first coined to indicate that universities should consider "the possibilities of new sources of funds to come from patenting the discoveries made by scientists holding academic appointments, from the sale of knowledge gained by research done under contract with commercial firms, and from entry into partnership with private business enterprises" (Etzkowitz, 1983, p. 198). A couple of decades later, Etzkowitz et al. (2000, p. 313) defined entrepreneurial universities as any university undertaking entrepreneurial activities with the objective of "improving regional or national economic performance as well as the university's financial advantage and that of its faculty." Over the years, other definitions have been provided, also considering the capacity of entrepreneurial universities to support the creation of new ventures or perform other technology transfer activities (Etzkowitz, 2003), to contribute to the economic and social development of a given territory (Guerrero et al., 2015), to promote the entrepreneurial orientation of students and align their curricula to new employability demands (Grimaldi & Fernandez, 2017), to disseminate knowledge through public engagement activities (Rinaldi et al., 2018), or to promote a sustainability transition of modern economies (Trencher et al., 2014). In short, it could be said that entrepreneurial universities are adaptive institutions that complement the missions of knowledge preservation and transmission (through education) and knowledge production and advancement (through basic research) with the third mission of contributing widely to the economic development of societies (Geuna & Muscio, 2009; Philpott et al., 2011).

2.2 Logic Multiplicity in Universities

Recent developments in neo-institutional theory have argued that society is made up of inter-institutional systems in which multiple institutional orders coexist simultaneously (Qiu et al., 2019), and each of these institutional orders may influence individuals' and organizations' behaviors in different ways. This theoretical underpinning resulted in the development of the concept of institutional logics, defined by Thornton and Ocasio (1999, p. 804) as "the socially constructed, historical patterns of material practices, assumptions, values, beliefs, and rules by which individuals produce and reproduce their material subsistence, organize time and space, and provide meaning to their social reality." According to the institutional logics perspective, the interaction between society and a social organization is mediated by the organizational field wherein the organization is immersed (Greenwood & Suddaby, 2006; Wooten & Hoffman, 2008; Zietsma & Lawrence, 2010). Strictly speaking, an organizational field is "a community of organizations that partakes of a common meaning system and whose participants interact more frequently and fatefully with one another than with actors outside the field" (Scott, 1995, p. 56). It includes any constituent that may have an influence on the organization, such as public bodies, professional and trade associations, critical exchange partners, sources of funding, special interest groups, and the general public (Powell & DiMaggio, 2012; Scott, 1991). Consequently, studying these fields is fundamental for recognizing, understanding, and abstracting the expectations and practices that characterize a given organization (Pierce et al., 2017; De Bernardi et al., 2019). According to the advancements in the institutional logics perspective, organizational fields may be characterized by the presence of multiple logics. These logics can be more or less complementary (i.e., institutional pluralism, see Kraatz & Block, 2008, 2017) or conflicting and competing (i.e., institutional complexity, see Greenwood et al., 2011). Logic multiplicity has been investigated in education fields (Ezzamel et al., 2012), as well as in universities, as a result of the increasing number of stakeholders and the transition from the teaching mission to the research and third missions (Guerrero et al., 2016). The aim of this study is, therefore, to extend the knowledge of how universities respond to organizational field-level multiple logics, shedding light on how this logic multiplicity is reflected in their KPIs.


3 Methodology

In our study, we aim to investigate which institutional logics are embedded in a university's organizational field and how such institutions monitor their performance in response to these logics. Previous research on how universities and educational fields deal with different and often contradictory institutional logics has called for more empirical analysis on the topic (Dumay et al., 2017; Fini et al., 2019). In this sense, after selecting three cases to be investigated, we performed a two-step inductive qualitative analysis (Eisenhardt, 1989; Yin, 2017). First, we conducted semi-structured interviews to identify which institutional logics shape the pursuit of the three academic missions. Second, we analyzed secondary sources of data to map the KPIs used for monitoring universities' performance in such activities.

3.1 Case Selection

In line with McGrath et al. (1982), we decided to take a closer look at three cases. These cases were purposefully selected (Patton, 1990; Siggelkow, 2007) as good representatives of the entrepreneurial model in different institutional contexts and cultures. In doing so, we aimed to provide results that can be generalized regardless of the geographical location of entrepreneurial universities, rather than offering the nuanced overview that is a common characteristic of case studies. The selected cases are one university from Continental Europe, one Anglo-Saxon university, and one Asian university, all globally considered to be particularly relevant in the field of entrepreneurship and technology transfer.

The European university (University of Milano-Bicocca, U.MIB) is one of the largest generalist and multidisciplinary universities in Italy. In the last national evaluation of research activity and results, 8 out of 14 departments were considered excellent at the national level, resulting in 60 million euros of funding over 5 years (University of Milan-Bicocca, 2020). Moreover, according to the Italian network for the valorization of public research, it accounts for 1.1% of Italian academic spin-offs (Netval, 2018). It has also largely redesigned its courses through an experience-oriented approach able to significantly foster the entrepreneurial competences of its students, besides providing them with technical competencies. The Anglo-Saxon university (University of Birmingham, U.BIRM) is considered one of the universities with the greatest impact on the regional advancement and economic growth of the Midlands. Indeed, a recent report from Oxford Economics estimated that this university contributes around 1 billion pounds to the region's economy and considered it one of the region's most important job providers (Jelfs, 2016). Moreover, it is part of the Russell Group, which includes the top UK universities in terms of research activities. Finally, the Asian university (University of Hong Kong, U.HK) was chosen since it is considered one of the most internationalized universities in the world, able to foster its regional innovation system through its research activity and thanks to several connections with world-class scientific and scholarly communities (Tang, 2018; Wang, 2018).

To ensure comparability, all three are public universities with deep historical roots and are considered medium-large universities (i.e., they have around 30–40 thousand enrolled students). Moreover, all of them are recognized as leading in terms of performance in their home countries according to the Quacquarelli Symonds World University Rankings (QS ranking, 2020), produced by one of the leading international agencies involved in universities' evaluation and activity monitoring.

3.2 Data Collection and Analysis

After having selected the cases to be investigated, as the first step of our analysis we conducted ten in-depth semi-structured interviews (Bryman & Bell, 2011). The interview protocol aimed at gaining insights into the values, beliefs, and normative expectations behind the three university missions, as well as how these missions were accomplished within the context of each university. The objective of this phase was explorative: identifying the institutional logics that underlie the investigated organizations in the pursuit of the three academic missions. The interviews involved at least two people from each of the selected universities, working there as scholars or as heads of research and third mission activities. On average, they lasted 60 minutes each and were conducted both face-to-face and through Skype meetings. In order to capture the interviewees' point of view while ensuring naturalness, we made interviewees aware only of the general aim of the research and not of the specific questions in advance (Easton, 2010). Moreover, by using the list of questions as a tool for supporting the interview without hindering a spontaneous conversation, we could discuss any new issue that emerged during the dialogue (Gillham, 2005) and analyze the investigated topic on the basis of the opinions, experiences, and perspectives of each interviewee (Silverman, 2013).

Then, as the second step of our analysis, we transcribed the interviews and triangulated them with secondary data sources (Eisenhardt, 1989). Specifically, we reviewed the strategic and/or performance plans of the three selected universities. These plans, usually covering 3–5 years, were preferred to short-term operational plans since they are less likely to be changed; in our case, they covered, on average, the period from 2015 to 2020. This analysis enabled us to identify which KPIs are used for evaluating the activities related to the three academic missions at both organizational and individual levels. At least two authors independently coded each document to ensure the accuracy and reproducibility of the analysis (Strauss & Corbin, 1990). At first, we adopted an open coding technique for discovering, labeling, and categorizing the topics under investigation. Then, after having achieved a shared interpretation of the data, we used an axial coding technique (Bryman & Bell, 2011). In this way, the links existing between categories and subcategories could be established and the data reorganized, enabling us to outline three institutional logics for each mission, resulting in nine institutional logics in total.
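As a purely illustrative aid, the Python sketch below shows one possible way to organize the outcome of this coding step: the nine logics grouped by mission, and a simple count of how many KPIs each university devotes to each (mission, logic) pair. The mission-logic grouping follows the chapter, whereas the example KPI rows and the helper function are hypothetical placeholders and do not reproduce the actual coded dataset.

```python
from collections import defaultdict

# Nine institutional logics, three per academic mission, as identified in the
# interviews; the grouping follows the chapter, everything else below is a
# hypothetical illustration of how the coded mapping could be organized.
LOGICS_BY_MISSION = {
    "teaching": ["inclusiveness", "vocational", "excellence"],
    "research": ["focalization", "materiality", "excellence"],
    "third_mission": ["dissemination", "translational", "entrepreneurial"],
}

# Example coded KPIs: (university, mission, logic, KPI label).
# These rows are invented placeholders for demonstration only.
CODED_KPIS = [
    ("U.MIB", "teaching", "inclusiveness", "Grants for deserving students"),
    ("U.HK", "research", "excellence", "% of papers with international co-authors"),
    ("U.BIRM", "third_mission", "entrepreneurial", "Incomes from incubated companies"),
    ("U.HK", "third_mission", "translational", "% of licensed patents"),
]

def coverage_matrix(coded_kpis):
    """Count how many KPIs each university devotes to each (mission, logic) pair."""
    matrix = defaultdict(int)
    for university, mission, logic, _kpi in coded_kpis:
        if logic not in LOGICS_BY_MISSION[mission]:
            raise ValueError(f"Unknown logic '{logic}' for mission '{mission}'")
        matrix[(university, mission, logic)] += 1
    return dict(matrix)

if __name__ == "__main__":
    for key, n_kpis in sorted(coverage_matrix(CODED_KPIS).items()):
        print(key, "->", n_kpis, "KPI(s)")
```

Such a coverage count mirrors, in spirit, the framework reported in Tables 1-9: logics with no or very few associated KPIs are the ones the chapter identifies as overlooked.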

4 Results

Triangulating the results of the interviews with the analysis of the relevant literature and secondary sources of data, we could identify nine different institutional logics. We thus found that three different institutional logics shape each of the three academic missions and their related KPIs, as follows.

4.1 Teaching Institutional Logics

Several scholars have criticized the shift toward the entrepreneurial university model and opposed the pursuit of third mission activities because they subtract time and resources from teaching and research (e.g., Slaughter & Leslie, 1998; Hayes, 2017). However, the performance achieved in pursuing the first mission (i.e., teaching activities) is still one of the main criteria through which funds are allocated to universities (Maassen, 2017). In this sense, tensions can occur at organizational or individual levels, between different missions and even within the same mission, with scholars asked to pursue entrepreneurship and pure academic activities at the same time. Nonetheless, according to Etzkowitz et al. (2000), the third mission is not meant to weaken the other missions but rather to leverage them to their full potential.

The Inclusiveness Logic In Table 1, the KPIs related to the inclusiveness logic are presented. Notably, this logic is linked to the need for universities to foster social cohesion and integration, enabling as many people as possible to participate in the higher education system (CEC, 2005; Scott, 2001). In this sense, KPIs such as the resources allocated for helping physically and/or learning-disabled students or grants for deserving students emerged. Furthermore, this logic is linked to the capacity of the university to enhance its reputation, for example by hiring "star professors" (Macfarlane, 2013) or by innovating teaching activities through digital technologies (Huynh et al., 2003).

The Vocational Logic This logic is associated with the idea that students' employability chances depend on the capability of universities to transmit the right knowledge to them and on the network of partnerships that such institutions are able to build (Clark, 1998; Gibb et al., 2009; Grimaldi & Fernandez, 2017). Table 2 shows the KPIs associated with this logic, such as the percentage of graduates with a curricular internship, the number of courses that have been revised to meet the needs and expectations of the labor market, or the number of companies involved in job placement activities.


Table 1 KPIs of the teaching inclusiveness logic (coverage assessed for U.BIRM, U.HK, and U.MIB): % of incoming visiting students; % of limited enrolment courses; % of professors recruited from other universities; % of students attending e-learning or blended learning courses; % of students satisfied with the infrastructure; Grants for deserving students; N of "star" professors; N of courses in e-learning or blended learning; N of incoming visiting professors; N of MOOCs; Partnerships with other leading organizations in the same area; Resources allocated for helping physically and/or learning-disabled students; Total number of students. Source: Authors' own elaboration

Table 2 KPIs of the teaching vocational logic (coverage assessed for U.BIRM, U.HK, and U.MIB): % of graduated PhD students currently employed; % of graduated with a curricular internship; % of students employed in 1 year; Companies' satisfaction of employed graduates; Number of companies involved in job placement initiatives; Number of degree courses revised according to employment opportunities; Number of students accessing career guidance services; Partnerships with other leading organizations in the same area; Students' satisfaction with the service provided by the university. Source: Authors' own elaboration

The Excellence Logic The logic of excellence is founded on the assumption that universities need to attract and/or create talented and skilled people (i.e., both faculty and students), promoting the development of human capital and contributing to the competitiveness of the territorial system in which they are located (Secundo et al., 2017). In this regard, the internationalization rate also seems to play a crucial role (Tang, 2018; Wang, 2018). Table 3 shows the KPIs associated with this logic; examples are the percentage of incoming students and professors, the staff employed, or students' perceived quality of teaching.


Table 3 KPIs of the teaching excellence logic (coverage assessed for U.BIRM, U.HK, and U.MIB): % of academic staff with recognized professional teaching accreditation; % of ECTS registered abroad; % of incoming visiting students; % of limited enrolment courses; % of outgoing visiting students; Extra-regional mobility; Faculties/Number of students; Number of double degrees taken through exchange programs; Number of incoming visiting professors; Number of on-time graduates; Number of teaching assistants/Number of enrolled students; Students' perceived quality of teaching. Source: Authors' own elaboration

Table 4 KPIs of the research focalization logic (coverage assessed for U.BIRM, U.HK, and U.MIB): Participation in joint research centers; Partnerships with other leading organizations in the same area. Source: Authors' own elaboration

4.2 Researching Institutional Logics

Some authors have criticized the entrepreneurial shift of universities, claiming that basic research is no longer a priority of such institutions and pointing to the introduction of private logics in knowledge production and commercialization (Slaughter & Leslie, 1998; Hayes, 2017). However, at the individual level, most academics engage in a combination of basic and applied research (Bentley et al., 2015). Indeed, the career progression of academics is often linked to high performance in research rather than in teaching or third mission activities. The demand is to produce knowledge that is relevant and useful at both international and local levels, considering its impact on society (Owen et al., 2012; von Schomberg, 2012). According to the analysis of the primary and secondary sources, we found that only the excellence logic is well covered by entrepreneurial universities as regards performance assessment, with a lack of KPIs for mapping the performance achieved in pursuing the focalization and materiality logics.

The Focalization Logic This logic is founded on the idea that universities should become a reference point, at both regional and international levels, by focusing on some specific areas (Etzkowitz et al., 2000; Siegel & Zervos, 2002). Judging from the KPIs in Table 4, this logic seems to be essentially neglected as regards performance evaluation. Indeed, we found only two KPIs related to this logic: participation in joint research centers and the establishment of partnerships with other leading organizations in the same area.


Table 5 KPIs of the research materiality logic (coverage assessed for U.BIRM, U.HK, and U.MIB): % of papers co-authored with non-academics; % of private funding for dissemination activities; Incomes from competitively financed research; Number of financed research projects. Source: Authors' own elaboration

Table 6 KPIs of the research excellence logic (coverage assessed for U.BIRM, U.HK, and U.MIB): % of completed research projects out of planned projects; % of foreign PhD students; % of multidisciplinary PhD programs; % of papers published in highly rated peer-reviewed journals; % of papers with international co-authors; Investment in research infrastructures and equipment; Number of incoming visiting professors; Number of PhD students; Number of professors who have received international awards; Number of received citations. Source: Authors' own elaboration

The Materiality Logic This logic is based on the assumption that universities should contribute to solving real-world problems through their research activity, going beyond the ivory tower and attracting new funding from external parties (Etzkowitz et al., 2000). Owen et al. (2012) defined this as transitioning "from science in society to science for society, with society." However, this logic also seems to be only partially covered in the strategic plans of the selected universities, due to a paucity of KPIs (see Table 5). Examples are the percentage of papers co-authored with non-academics and the funding attracted for financed research or for promoting dissemination activities.

The Excellence Logic As can be observed in Table 6, this is the most extensively covered logic as regards research activity. It is linked to the aim of pursuing the second mission at the highest level possible, in terms of both relevance and recognition by the international academic community (Secundo et al., 2017; Tang, 2018; Wang, 2018). Some KPIs that emerged from the analysis are the percentage of papers published in highly rated peer-reviewed journals, the number of citations received by scholars affiliated with the university, the capacity to attract international PhD students and professors, and the number of international awards received for research excellence.

4.3 Third Mission Institutional Logics

The third mission includes every activity besides teaching and research. Due to the vast heterogeneity of activities that universities perform, this mission is highly dependent on the field that each university or department covers (Guerrero et al., 2016). Thus, STEM departments (i.e., science, technology, engineering, and mathematics) tend to perform more activities related to technology transfer and research commercialization (Etzkowitz, 2003), such as granting and licensing patents or participating in spin-offs and start-ups. Conversely, social science departments tend to focus more on knowledge dissemination and outreach (Rinaldi et al., 2018), such as organizing public engagement initiatives, participating in advisory boards, or providing continuing education programs. Although this is the most recent mission, the analysis of the sources enabled us to identify three institutional logics that are broadly covered by the investigated universities in terms of KPIs. However, it should be noted that most indicators are still focused on capturing the tangible returns of such activities, measured in terms of economic benefits.

The Dissemination Logic Part of the third mission consists of disseminating knowledge outside the boundaries of academia, reaching a broader public than just students or the scientific community (Gleeson, 2010). In this regard, academics can play either a proactive or a reactive role, depending on whether they actively promote dissemination initiatives or let themselves be involved in them (e.g., organizing or participating in public events or interviews, or contributing to nonscientific publications). This logic seems to be better monitored by the generalist universities (Table 7), through KPIs such as the number of MOOCs or public engagement initiatives, the economic returns from continuing education programs or sponsored research, and the implementation of tools and/or metrics for evaluating the sustainability impact.

Table 7 KPIs of the third mission dissemination logic (coverage assessed for U.BIRM, U.HK, and U.MIB): % of continuing training activities funded during the year; % of private funding for dissemination activities; Implementation of tools and/or metrics for evaluating the sustainability impact; Income from continuing education activities; Number of active continuing education programs; Number of collaborative projects with third parties; Number of companies participating in continuing education activities; Number of consultancy services for knowledge exchange; Number of faculties engaged as members of external advisory bodies; Number of MOOCs; Number of partnerships for providing sponsored continuous training programs; Number of public engagement initiatives. Source: Authors' own elaboration


Table 8 KPIs of the third mission translational logic (coverage assessed for U.BIRM, U.HK, and U.MIB): % of private funding for sponsored research; % of granted university patents; % of licensed patents; % of PhD students with previous experience in a company; Number of financed research projects; Number of registered patents in the last 10 years; Revenues from granted patents. Source: Authors' own elaboration

The Translational Logic Universities should be able to translate their activities, especially research, to address real-world issues and to incentivize companies, institutions, and communities to adopt new solutions at an operational level (Owen et al., 2012). In this sense, patents are one of the best examples of how research can address industrial needs, potentially also generating new sources of income for the university (Etzkowitz & Leydesdorff, 2000). Table 8 suggests that universities should pay more attention to this logic, in which only U.HK seems to excel. Some KPIs that emerged from the analysis are the percentage of granted and licensed university patents (and the revenues generated from them) and the number of patents registered in the last 10 years.

The Entrepreneurial Logic In Table 9, the KPIs associated with the entrepreneurial logic are shown. In this regard, it is assumed that universities are actively called to foster innovation and contribute to the development of economies and societies, co-developing the sustainability paradigm through multi-stakeholder partnerships (Trencher et al., 2014; Rinaldi et al., 2018). Some of the most relevant KPIs that emerged from the analysis are the income generated from incubated companies and the percentage of spin-offs that have been acquired or listed on the stock market, as well as the revenues generated from granted patents, sponsored research, and other technology transfer activities. We can therefore affirm that this logic is broadly covered, but mainly through KPIs that capture the economic return of such activities.


Table 9 KPIs of the third mission entrepreneurial logic (coverage assessed for U.BIRM, U.HK, and U.MIB): % of acquired or listed spin-offs; Implementation of tools and/or metrics for evaluating the sustainability impact; Incomes from incubated companies; Incomes from sponsored research; Number of active spin-offs and start-ups; Number of collaborations with spin-offs; Number of spin-offs' employees; Participation in incubators, science parks, consortia, or associations for technology transfer; Revenues from granted patents; Revenues from technology transfer activities; Revenues from sponsored research. Source: Authors' own elaboration

5 Discussion and Conclusion

5.1 Discussion

Building an effective system for managing universities' performance is undoubtedly a tough challenge (Meissner & Shmatko, 2017; Secundo et al., 2017). Indeed, such institutions are subject to continuous changes that also require adapting their management and organizational structures. In this study, we developed a framework for classifying and mapping the KPIs used to assess entrepreneurial universities' performance according to different institutional logics. Notably, we identified nine different logics (three for each academic mission) that are linked to nine different views about what role entrepreneurial universities should assume in society. The first mission is the foundational mission that universities have been asked to pursue, and it represented for a long time a benchmark against which their reputation was measured (Maassen, 2017). Accordingly, the performance associated with teaching activities appears to be covered by a significant number of KPIs in each of the three logics that we identified (i.e., inclusiveness, vocational, and excellence). Conversely, the performance of the second mission is well covered only as regards the excellence logic. Indeed, the reputation of universities and the careers of scholars are nowadays mainly built on this logic. It is therefore not surprising that the two logics associated with the potential impact of universities on society, namely the focalization and materiality logics, show a lower coverage rate. Finally, entrepreneurial universities pay considerable attention to mapping their performance in conducting third mission activities, even if most of the related KPIs capture the tangible aspects of those activities. However, in line with the suggestions from Owen et al. (2012) and von Schomberg (2012), more attention should be paid to the translational logic, which shows lower coverage than the dissemination and entrepreneurial logics.

5.2 Implications

In this study, we aimed at providing both theoretical and practical contributions. From a theoretical point of view, this research contributes to institutional logics theory by analyzing a context in which more than two logics are at stake. Indeed, to date, most studies have addressed the combination of two reciprocally conflicting or complementary logics (e.g., Murray, 2010; Almandoz, 2012; Pache & Santos, 2013), ignoring the coexistence of several logics (i.e., logics multiplicity) in complex and dynamic contexts such as the academic one. Indeed, entrepreneurial universities are asked to pursue three different missions that can be reciprocally complementary or conflicting, resulting in paradoxical tensions and competition over the allocation of key resources (e.g., faculty time) if not effectively governed. Moreover, this study also contributes to the literature on universities' performance management by providing a comprehensive mapping of the KPIs used to assess such performances across the three academic missions. Practitioners, and especially decision makers, would benefit from the results of this work as well, having the possibility to better align their strategies with the different logics underlying their decisions. Moreover, understanding the system of practices, values, rules, and beliefs that characterize the organizational field in which entrepreneurial universities operate would help decision makers increase cooperation inside an entrepreneurial university's ecosystem.

5.3 Limitations and Further Steps

This study is not free from limitations, all of which represent possible future developments. First of all, we considered the organizational field of the investigated universities only with regard to the institutional logics that shape it (and their associated KPIs). Future studies could therefore investigate the organizational fields of such complex structures by considering, for example, the definition of novel strategies, practices, or governance models, decisions on how to allocate resources, the design of adaptive performance management systems, or the active involvement of new actors in executing different tasks (Gibb et al., 2009; Ezzamel et al., 2012; Secundo et al., 2017). Moreover, further research is needed to understand how the normative context and the related coercive pressures influence universities and their institutional logics. Second, we analyzed the strategic and/or performance plans of a specific timespan (2015–2020). We therefore encourage scholars to develop longitudinal studies to evaluate how decision makers monitor the performances of universities in pursuing the three academic missions and their related institutional logics. Moreover, future analyses could verify whether the different KPIs are affecting the organizational arrangements of the investigated universities. Third, some indicators turned out to cut across different missions and logics, so it is still not clear how to properly consider and represent them. The proposed framework therefore still needs to be revised to give greater importance to these cross-cutting elements.

References Almandoz, J. (2012). Arriving at the starting line: The impact of community and financial logics on new banking ventures. Academy of Management Journal, 55(6), 1381–1406. Ardito, L., Ferraris, A., Petruzzelli, A. M., Bresciani, S., & Del Giudice, M. (2019). The role of universities in the knowledge management of smart city projects. Technological Forecasting and Social Change, 142, 312–321. Battaglia, D., Landoni, P., & Rizzitelli, F. (2017). Organizational structures for external growth of university technology transfer offices: An explorative analysis. Technological Forecasting and Social Change, 123, 45–56. Bentley, P. J., Gulbrandsen, M., & Kyvik, S. (2015). The relationship between basic and applied research in universities. Higher Education, 70(4), 689–709. Bryman, B., & Bell, E. (2011). Business research methods (3rd ed.). Oxford University Press. CEC. (2005). European universities: Enhancing Europe’s research base. European Commission. Chau, V. S., Gilman, M., & Serbanica, C. (2017). Aligning university–industry interactions: The role of boundary spanning in intellectual capital transfer. Technological Forecasting and Social Change, 123, 199–209. Clark, B. R. (1998). The entrepreneurial university: Demand and response. Tertiary Education and Management, 4(1), 5–16. De Bernardi, P., Bertello, A., & Forliano, C. (2019). Unpacking Higher Educational Institutions (HEIs) performances through the institutional logics lens. In IFKAD 14th international forum on knowledge assets dynamics-knowledge ecosystems and growth (pp. 1537–1555). Institute of Knowledge Asset Management (IKAM)-Arts for Business Institute-University of Basilicata. Dumay, X., Draelants, H., & Dahan, A. (2017). Organizational identity of universities: A review of the literature from 1972 to 2014. Theory and Method in Higher Education Research, 3, 99–118. E3M. (2010). Needs and constraints analysis of the three dimensions of third mission activities. E3M: European Indicators and Ranking Methodology for University Third Mission. Easton, G. (2010). Critical realism in case study research. Industrial Marketing Management, 39(1), 118–128. Eisenhardt, K. M. (1989). Building theories from case study research. Academy of Management Review, 14(4), 532–550. Etzkowitz, H. (1983). Entrepreneurial scientists and entrepreneurial universities in American academic science. Minerva, 21(2–3), 198–233. Etzkowitz, H. (2003). Research groups as ‘quasi-firms’: The invention of the entrepreneurial university. Research Policy, 32(1), 109–121. Etzkowitz, H., & Leydesdorff, L. (2000). The dynamics of innovation: From National Systems and “mode 2” to a Triple Helix of university–industry–government relations. Research Policy, 29 (2), 109–123. Etzkowitz, H., Webster, A., Gebhardt, C., & Terra, B. R. C. (2000). The future of the university and the university of the future: Evolution of ivory tower to entrepreneurial paradigm. Research Policy, 29(2), 313–330.


Ezzamel, M., Robson, K., & Stapleton, P. (2012). The logics of budgeting: Theorization and practice variation in the educational field. Accounting, Organizations and Society, 37(5), 281–303. Fini, R., Rasmussen, E., Wiklund, J., & Wright, M. (2019). Theories from the lab: How research on science commercialization can contribute to management studies. Journal of Management Studies, 56(5), 865–894. Forliano, C., De Bernardi, P., Bertello, A., & Temperini, V. (2020). Innovating business processes in public administrations: Towards a systemic approach. Business Process Management Journal., 26, 22. Forliano, C., De Bernardi, P., & Yahiaoui, D. (2021). Entrepreneurial universities: A bibliometric analysis within the business and management domains. Technological Forecasting and Social Change, 165, 120522. Geuna, A., & Muscio, A. (2009). The governance of university knowledge transfer: A critical review of the literature. Minerva, 47(1), 93–114. Gibb, A., Haskins, G., & Robertson, I. (2009). Leading the entrepreneurial university. University of Oxford. Gillham, B. (2005). Research interviewing: The range of techniques: A practical guide. McGrawHill. Gleeson, R. E. (2010). The third mission and the history of reform in American higher education. The community engagement and service mission of universities, pp. 121–137. Greenwood, R., & Suddaby, R. (2006). Institutional entrepreneurship in mature fields: The big five accounting firms. Academy of Management Journal, 49(1), 27–48. Greenwood, R., Raynard, M., Kodeih, F., Micelotta, E. R., & Lounsbury, M. (2011). Institutional complexity and organizational responses. Academy of Management Annals, 5(1), 317–371. Grimaldi, D., & Fernandez, V. (2017). The alignment of University curricula with the building of a Smart City: A case study from Barcelona. Technological Forecasting and Social Change, 123, 298–306. Guerrero, M., Cunningham, J. A., & Urbano, D. (2015). Economic impact of entrepreneurial universities’ activities: An exploratory study of the United Kingdom. Research Policy, 44(3), 748–764. Guerrero, M., Urbano, D., Fayolle, A., Klofsten, M., & Mian, S. (2016). Entrepreneurial universities: Emerging models in the new social and economic landscape. Small Business Economics, 47 (3), 551–563. Hayes, D. (Ed.). (2017). Beyond McDonaldization: Visions of higher education. Taylor & Francis. Huynh, M. Q., Umesh, U. N., & Valacich, J. S. (2003). E-learning as an emerging entrepreneurial enterprise in universities and firms. Communications of the Association for Information Systems, 12(1), 3. Jelfs, P. (2016). Financial performance analysis of spin-off companies from a UK ‘regional’ university: A case study of the University of Birmingham. International Journal of Entrepreneurship and Small Business, 29(2), 271–286. Kraatz, M. S., & Block, E. S. (2008). Organizational implications of institutional pluralism. In The Sage handbook of organizational institutionalism (Vol. 840, pp. 243–275). Sage. Kraatz, M. S., & Block, E. S. (2017). Institutional pluralism revisited. In The Sage handbook of organizational institutionalism (Vol. 2, pp. 635–662). Sage. Maassen, P. (2017). The university’s governance paradox. Higher Education Quarterly, 71(3), 290–298. Macfarlane, B. (2013). Intellectual leadership in higher education: Renewing the role of the university professor. Routledge. Marginson, S., & Van der Wende, M. (2007). To rank or to be ranked: The impact of global rankings in higher education. Journal of Studies in International Education, 11(3–4), 306–329. 
McGrath, J. E., Martin, J. M., & Kulka, R. A. (1982). Judgment calls in research (Vol. 2). Sage.


Meissner, D., & Shmatko, N. (2017). “Keep open”: The potential of gatekeepers for the aligning universities to the new knowledge triangle. Technological Forecasting and Social Change, 123, 191–198. Miller, K., McAdam, M., & McAdam, R. (2014). The changing university business model: A stakeholder perspective. R&D Management, 44(3), 265–287. Murray, F. (2010). The oncomouse that roared: Hybrid exchange strategies as a source of distinction at the boundary of overlapping institutions. American Journal of Sociology, 116(2), 341–388. Netval. (2018). XIV Rapporto Netval. In L. Ramaciotti & C. Daniele (Eds.), La rete del trasferimento tecnologico si rafforza con la clinical innovation. Edizioni ETS. Ocasio, W., Thornton, P. H., & Lounsbury, M. (2017). Advances to the institutional logics perspective. In The Sage handbook of organizational institutionalism. Sage. Owen, R., Macnaghten, P., & Stilgoe, J. (2012). Responsible research and innovation: From science in society to science for society, with society. Science and Public Policy, 39(6), 751–760. Pache, A. C., & Santos, F. (2013). Inside the hybrid organization: Selective coupling as a response to competing institutional logics. Academy of Management Journal, 56(4), 972–1001. Patton, M. Q. (1990). Qualitative evaluation and research methods. Sage. Philpott, K., Dooley, L., O’Reilly, C., & Lupton, G. (2011). The entrepreneurial university: Examining the underlying academic tensions. Technovation, 31(4), 161–170. Pierce, P., Ricciardi, F., & Zardini, A. (2017). Smart cities as organizational fields: A framework for mapping sustainability-enabling configurations. Sustainability, 9(9), 1506. Powell, W. W., & DiMaggio, P. J. (Eds.). (2012). The new institutionalism in organizational analysis. University of Chicago Press. Qiu, Y., Chen, H., Sheng, Z., & Cheng, S. (2019). Governance of institutional complexity in megaproject organizations. International Journal of Project Management, 37(3), 425–443. QS ranking. (2020). QS World University Ranking. Retrieved from https://www.topuniversities. com/qs-world-university-rankings Rinaldi, C., Cavicchi, A., Spigarelli, F., Lacchè, L., & Rubens, A. (2018). Universities and smart specialisation strategy. International Journal of Sustainability in Higher Education. Scott, W. R. (1991). Unpacking institutional arguments. In W. W. Powell & P. J. DiMaggio (Eds.), The new institutional in organizational analysis (pp. 162–182). University of Chicago Press. Scott, W. R. (1995). Institutions and organizations. Sage. Scott, P. (2001). Conclusion: Triumph and retreat. The state of UK higher education–managing change and diversity (pp. 186–204). Routledge. Secundo, G., Ndou, V., Del Vecchio, P., & De Pascale, G. (2019). Knowledge management in entrepreneurial universities. Management Decision, 57(12), 3226. Secundo, G., Perez, S. E., Martinaitis, Ž., & Leitner, K. H. (2017). An intellectual capital framework to measure universities’ third mission activities. Technological Forecasting and Social Change, 123, 229–239. Siegel, D. S., & Zervos, V. (2002). Strategic research partnerships and economic performance: Empirical issues. Science and Public Policy, 29(5), 331–343. Siggelkow, N. (2007). Persuasion with case studies. Academy of Management Journal, 50(1), 20–24. Silverman, D. (2013). Doing qualitative research: A practical handbook. Sage. Slaughter, S., & Leslie, L. L. (1998). Academic capitalism: Politics, policies, and the entrepreneurial university. The Johns Hopkins University Press. Strauss, A., & Corbin, J. (1990). 
Basics of qualitative research, grounded theory procedures and techniques. Sage. Tang, H. H. H. (2018). Academic profession, entrepreneurial universities and scholarship of application: The imperative of impact. Journal of Comparative and International Higher Education, 10(3), 3–5.


Thornton, P. H., & Ocasio, W. (1999). Institutional logics and the historical contingency of power in organizations: Executive succession in the higher education publishing industry, 1958–1990. American Journal of Sociology, 105(3), 801–843. Trencher, G., Yarime, M., McCormick, K. B., Doll, C. N., & Kraines, S. B. (2014). Beyond the third mission: Exploring the emerging university function of co-creation for sustainability. Science and Public Policy, 41(2), 151–179. University of Milano-Bicocca. (2020). Piano strategico 2020/2022. Retrieved from https://www. unimib.it/sites/default/files/allegati/piano-strategico-2020_1.pdf Von Schomberg, R. (2012). Prospects for technology assessment in a framework of responsible research and innovation. In M. Dusseldorp & R. Beecroft (Eds.), Technikfolgen abschätzen lehren (pp. 39–61). VS Verlag für Sozialwissenschaften. https://doi.org/10.1007/978-3-53193468-6_2 Wang, J. (2018). Innovation and government intervention: A comparison of Singapore and Hong Kong. Research Policy, 47(2), 399–412. Wooten, M., & Hoffman, A. J. (2008). Organizational fields: Past, present and future. In The Sage handbook of organizational institutionalism (Vol. 1, pp. 131–147). Sage. Yin, R. K. (2017). Case study research and applications: Design and methods. Sage. Zietsma, C., & Lawrence, T. B. (2010). Institutional work in the transformation of an organizational field: The interplay of boundary work and practice work. Administrative Science Quarterly, 55 (2), 189–221.

Third Mission in Universities from a Performance Management Perspective: A Comparison Between Germany and Italy

Pasquale Ruggiero, Patrizio Monfardini, Dieter Wagner, and Dominik Bartsch

1 Introduction

This chapter was written during a moment of global crisis caused by the spread of the Coronavirus, declared a pandemic by the World Health Organization, which has brought two fundamental aspects back to the center of the scientific and public debate: the role of public administrations and the role of science. Both aspects, and above all their appropriateness, are identified as essential in order to overcome the crisis. Above all, it is their interaction that can constitute a fundamental element for facing and overcoming the current moment of crisis. In this chapter, attention is focused on science, and specifically on the knowledge produced and on the way in which it can spread within society to form the basis for its advancement. A fundamental source of new knowledge production is surely the Higher Education system, specifically universities and their so-called third mission (TM henceforth).

P. Ruggiero (*)
Department of Business and Law, University of Siena, Siena, Italy
School of Business and Law, University of Brighton, Brighton, UK
e-mail: [email protected]

P. Monfardini
Dipartimento di Scienze economiche ed aziendali, University of Cagliari, Cagliari, Italy
e-mail: [email protected]

D. Wagner
Potsdam Centrum für Politik und Management, University of Potsdam, Potsdam, Germany
e-mail: [email protected]

D. Bartsch
Schumpeter School of Business and Economics, University of Wuppertal, Wuppertal, Germany
e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2022
E. Caperchione, C. Bianchi (eds.), Governance and Performance Management in Public Universities, SIDREA Series in Accounting and Business Administration, https://doi.org/10.1007/978-3-030-85698-4_10


Universities are institutions in which individual States directly or indirectly invest a relevant percentage of their budget. Therefore, because of both the investment of public resources and the effects in terms of economic and social development that this investment could generate, it is essential to analyze the ways adopted for governing the TM at both the university and the system level. Despite its importance, the attention that has been dedicated to the study of the TM is still scarce (Dal Molin et al., 2017). Notwithstanding the reform process that has affected higher education systems at the international level, universities still seem to attribute little relevance to their TM, still favoring the two institutional activities historically attributed to them, namely research and teaching (Deem, 2001; Dobija et al., 2019; Gebreiter & Hidayah, 2019). In fact, the reform process has mainly focused on the development and implementation of reforms aimed at introducing managerial logics and models within higher education institutions that would increase the effectiveness and efficiency of the processes connected to teaching, such as co-production and student satisfaction (Bunce et al., 2017), and to research, such as rankings (Ashton et al., 2009; Martin-Sardesai et al., 2017). The TM, on the other hand, constitutes a fundamental dimension for universities, both for the external value that, as previously mentioned, it could generate and for the contribution that its effective and efficient implementation could provide to teaching and research (Ćulum et al., 2013). The development of the TM could help bring the academic body down from the ivory tower and remedy the main criticism addressed to it.

In the face of this situation, this chapter aims to investigate, through a comparative analysis between Germany and Italy, the planning and control processes adopted at the university system and individual university level to manage the TM. Even if the effects, especially the outcomes, of any policy on universities' TM take a long time to materialize, the focus of this chapter on governance, programming, and control activities derives mainly from the performative character attributed to them even in the short term (Skærbæk & Tryggestad, 2010). Therefore, an analysis of these activities could make it possible to identify ways of governing the TM that could be one of the fundamental elements of its success or failure. From a methodological point of view, the research objective is pursued by carrying out four case studies, two for each country analyzed. The case studies have been developed using both secondary sources (documents produced at the system level and by individual universities) and primary sources deriving from the authors' direct knowledge of the roles they play within their universities.

After this brief introduction, the remaining part of the chapter is organized into three sections. The following section reports the main problems identified in the literature relating to the development of the TM in universities, with particular reference to the aspects of planning and control. The third section reports the current governance structure of the TM of universities at the system level in Germany and Italy and the findings coming from the development of the four case studies (two per country). In particular, the programming and control systems of the TM existing in Germany and Italy at the university system level and at the level of individual university institutions are presented. The last section contains the discussion of the cases analyzed and some final reflections.


2 From the Ivory Tower to Services and Engagement: The Evolution of Universities' Third Mission

The TM of the university system is more a black box than a concept for which a common interpretation and implementation can be found, especially following the evolution that has affected universities over the past decades (Berghaeuser & Hoelscher, 2020). Although the TM is nowadays considered an institutional function of universities, the university system is often seen as a phenomenon totally separate from the environment within which it is located and from the other institutions that surround it: a system inside which academics have perched to defend a series of their supposed privileges (Etzkowitz & Leydesdorff, 1998). Over the last decades, however, universities have been facing a vast reform process that, on the one hand, asks them to increase their levels of efficiency and effectiveness from a managerial point of view and, on the other, requires them to open up and adopt an outward-looking perspective in order to get closer to their origins and generate a direct and fruitful relationship between two worlds of knowledge, the theoretical and the operational (Parker, 2011). Already in the first half of the twentieth century, universities began to present both internal weaknesses, linked more and more to self-referential processes resulting from the growing organizational autonomy of academics, and little external transparency on the performance of the activities under their competence (Roper & Hirth, 2005). These problems, associated with the constant growth of the need for resources to finance research activities, constituted the fundamental push toward the birth of universities financed by the economic realm within which they were located, of which the Massachusetts Institute of Technology represents the most famous example. In this type of university, the relationship between academic research and the industrial world has become increasingly close, resulting in the development of large economic regions.

This initial change drove a transformation of university institutions defined by some scholars as the transition from Mode 1 to Mode 2. In particular, in Mode 1 new knowledge was produced mainly within individual disciplines, especially in universities and other academic institutions. The connection with social needs was weak, and the results of research activities were transferred at the end of research processes to end users, whether or not they were able to exploit them. Researchers had a wide level of autonomy in organizing their activities and turned their attention to autonomously chosen research problems (the Humboldtian model of university). In Mode 2, by contrast, research activities focus on "objects" in which several fields of study, and a growing number and variety of institutions, have shown interest. The boundaries between universities and industries have become increasingly blurred, and universities produce knowledge adopting a more practical perspective; in other words, research in universities is influenced by, and responds to, the needs expressed by the surrounding economic and social environment (Mascarenhas et al., 2018).


From this perspective, also to provide for their economic sustainability, universities have played a service role in the development of the economic sectors and socioeconomic realities within which they were established. According to Mascarenhas et al. (2018, p. 710), "the triple helix model of academia–industry–government relations is an emerging entrepreneurial paradigm in which universities play an enhanced role in technological innovation. Governments have encouraged this academic transformation as an economic development strategy, which also reflects changes in the relationship between knowledge producers and users." In particular, the relationship between universities and industries can be framed in four different clusters characterized by a growing level of interaction. In the first cluster, the relationship and the transfer of knowledge are mainly based on the absorption capacity of companies with respect to the knowledge produced by universities. In the second cluster, the university–industry relationship is based on the knowledge spillovers produced by universities through the publication of reports, the running of public conferences and meetings, and consultancy. The third cluster is characterized by the establishment of alliances between the parties involved. The last cluster is characterized by very close cooperation between universities and industries, which is managed through specific bodies such as technology transfer offices and which could lead to the creation of start-ups, also in the form of university and/or academic spin-offs (Grossi & Ruggiero, 2008; Shah & Pahnke, 2014).

The activities implemented for pursuing TM objectives could differ between technical and humanities universities. Technical universities could find it easier to link with the economic system compared to humanities universities. Despite this potential difference, performance measurement and control systems should contribute to pursuing the TM's objectives in both types of university. Compagnucci and Spigarelli (2020) state that social sciences and humanities facilitate both the understanding of the complexity of the economic, social, and environmental issues involved and the generation of collaborative strategies between university and industry labs, while technical universities are more focused on, and facilitated in, running a continuous process of entrepreneurial discovery and exploration of market opportunities, which can more easily foster collaboration between university, industry, and government.

More recently, another approach is trying to replace the service model adopted by universities, one which sees the engagement of universities' stakeholders as a way of making the boundaries of all these organizations more permeable. Instead of adopting an outward-looking perspective, universities are trying to increase the participation of stakeholders in their decision-making and operational processes (Roper & Hirth, 2005; Bond & Paterson, 2005). This model could represent the transition to a so-called Mode 3 (Carayannis & Campbell, 2009). Faced with this further process of reform, it is essential to make the TM of universities a dimension that is governed not only effectively and efficiently but also in the awareness that it is an institutional activity of universities, which implies the necessity to make decisions and therefore conscious choices. In other words, the TM must become a strategic dimension of universities (Hölzle et al., 2014).
This new approach to the TM of universities is pushing the legislators of many countries, the governing bodies of university systems, as well as individual universities, to define and implement new governance models and control tools.


To this end, particularly relevant is the conceptualization of research relevance and how it is perceived. According to Tucker and Parker (2020, p. 1247), research is perceived by research leaders "as a means of responding to government and political imperatives, as a pathway to ensuring university legitimacy, and as a means of generating further resources." This conceptualization of research relevance shows that the link between research and the economic realm is still weak because of the mediating role played by governmental and political requests, thus reinforcing the idea that research participates in universities' TM only for compliance purposes or because it is functional to making universities' fundraising more successful. Moving to the practice of the TM, even if according to Tucker and Parker (2020) the relevance of academic research is defined fundamentally through the implementation of performance measurement and control systems (i.e., rankings of universities), in many cases the tools used for measuring and controlling the activities carried out within universities' TM seem to have a more symbolic than rational use. Measures and reports are used mainly for external purposes and to gain legitimacy in front of external evaluators (Dobija et al., 2019), rather than for making and implementing effective decisions. The TM in universities seems to be at a turning point because of both the increasing rhetoric on this university function and the increasing need for financial resources of university systems, mainly due to the scarce and increasingly limited resources allocated by governments.

3 The Case Studies

The case study aims at analyzing and understanding the main elements of the performance evaluation systems that have been developed for governing and controlling universities' TM. We have focused attention on the governance and control aspects of universities' TM and on the documents used for planning and controlling this apparently new institutional activity of universities. The study has been developed using four cases (two universities in Germany and two in Italy). The reason for this choice is mainly due to the different governance systems used in Germany and Italy and to the possibility of collecting information in the selected universities (by discussing with colleagues or analyzing documents), as the authors are part of the staff of these universities. Therefore, the sample is a purposeful sample (Royse et al., 2010).

3.1 Italy

3.1.1 The TM at the National (System) Level in Italy

TM’s planning and evaluation in the Italian Higher Education System is actually composed of two institutional levels: the national one in which definitions, criteria,


Table 1 Areas of evaluation of the third mission and social impact of universities

Research promotion:
1. Management of industrial property (patents; plant variety protection)
2. Spin-off companies
3. Activities for third parties
4. Intermediation structures (technology transfer offices, placement offices, incubators, scientific parks, consortia, and associations for TM)

Public goods production:
5. Management of properties and cultural activities (archeological sites, museums, musical activities, historical buildings and archives, historical libraries, theaters, and sports facilities)
6. Activities for public health (clinical tests, non-interventional studies, patients' empowerment, supporting structures)
7. Life-long training, open teaching
8. Public engagement

Source: ANVUR (2018), Guidelines for drafting the SUA-TM/IS. Available at https://www.anvur.it/wp-content/uploads/2018/11/SUA-TM_Lineeguida.pdf. SUA-TM/IS is an acronym that refers to the "Scheda Unica Annuale Terza Missione ed Impatto Sociale" (Single Annual Third Mission and Social Impact Card), a card used annually for collecting information and evaluating universities' third mission and social impact

TM evaluation has been one of the duties of ANVUR, the national agency for the evaluation of universities and research, since its establishment in 2010. The review of the TM activities carried out by universities was included among the activities subject to evaluation by ANVUR only recently, in 2013, with Ministerial Decree n. 47/2013. Since then, the definition and criteria of assessment have become more focused: at the beginning, the definition of the TM was quite broad, such as "the openness toward the external socio-economic environment through the knowledge promotion and transfer."1 After several rounds of improvement, the content of the TM is now defined as composed of two main areas: (a) research promotion and (b) public goods creation. The related activities are under the responsibility of the universities, while research and teaching duties belong to each researcher. ANVUR monitors all the activities at the level at which they are produced; this means mostly at the universities' top administrative level (to understand the general strategy adopted by universities) but also at the department level, in order to evaluate its implementation. Table 1 summarizes the different dimensions in which the two areas are organized.

The area named "Research promotion" includes all the forms of using the results of the research conducted in the university to exploit them commercially, including the use of special-purpose entities such as spin-off companies and intermediation structures.

1 ANVUR, 2004–2010 Evaluation report, available at: https://www.anvur.it/rapporto/main.php?page=intro


According to the guidelines, a spin-off company is defined as a company exploiting some of the university's research results or maintaining a stable research relationship with the university. These companies must be accredited by the Board of Directors of the university to gain this status. The area named "Public goods production" is highly differentiated and includes both cultural activities and the promotion of historic properties. Such properties enhance the social, economic, and cultural life of the cities where they are located. Moreover, the same area includes public health activities and training initiatives that universities provide to the public at large. Finally, public engagement includes all the activities aimed at educating the community.

Since 2012, all universities and their teaching programs have been subject to an accreditation/assessment system based on three main pillars: (1) an initial, and then periodic, accreditation scheme for programs and venues; (2) a quality, efficiency, and effectiveness assurance and assessment system for teaching and research activities; and (3) a reinforced system of university self-assessment to enhance the quality and effectiveness of research and teaching activities. The related regulations and guidelines have changed several times since 2012, resulting in increased uncertainty and in the impossibility of having a stable and constant strategy to pursue. The current version dates to January 2019. This system includes specific prescriptions for TM activities. Universities are required to draft annually a document called SUA-TM/IS; it contains in detail all the information to be sent to ANVUR to allow the evaluation. To obtain initial accreditation, each department must show that the SUA-TM/IS is complete in terms of information about TM activities. Initial accreditation is the first compulsory step for all university departments to have their TM activities evaluated. Then, to gain periodic accreditation, each university must prove that it has in place a quality assurance system dealing with the TM, in both the planning and reporting phases. Thus, each university must plan TM activities in its strategic plan, which are then monitored and measured. Guidelines provided by ANVUR help universities draft the SUA-TM/IS correctly and, whenever possible, data are automatically gathered from existing databases.

The evaluation of the activities carried out in the years 2015–2019 is currently running. ANVUR is using several panels of experts, half coming from universities and half from outside academia. In the previous evaluation, standardized indicators were adopted to rank the different performance achievements without computing an overall TM index for each university.2 Data were collected through the SUA-TM/IS and analyzed by the panels of experts, and performance results were classified into four bands in order to differentiate each university's performance. Presumably, the current evaluation will adopt a similar methodology.

2 See https://www.anvur.it/rapporto-2016/static/VQR2011-2014_TerzaMissione.pdf
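Purely as an illustration of how standardized indicators and band classifications of this kind can work in general (this sketch uses hypothetical data and a simple quartile split; it does not reproduce ANVUR's actual procedure), raw indicator values could be standardized and the universities then assigned to four bands:

```python
import statistics

# Hypothetical raw values of a single TM indicator for a set of universities.
raw = {"Uni1": 12.0, "Uni2": 3.5, "Uni3": 7.2, "Uni4": 9.8,
       "Uni5": 1.1, "Uni6": 15.4, "Uni7": 6.3, "Uni8": 4.9}

mean = statistics.mean(raw.values())
stdev = statistics.stdev(raw.values())

# Standardize each university's value (z-score).
standardized = {u: (v - mean) / stdev for u, v in raw.items()}

# Classify into four bands of equal size by ranking the standardized scores.
ranked = sorted(standardized, key=standardized.get, reverse=True)
band_size = len(ranked) / 4
bands = {u: f"Band {int(i // band_size) + 1}" for i, u in enumerate(ranked)}

for u in ranked:
    print(f"{u}: z={standardized[u]:+.2f} -> {bands[u]}")
```

The choice of standardization method and of band boundaries is, of course, a methodological decision in itself; the sketch only shows why banding allows performances to be differentiated without producing a single overall index.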

3.1.2 The TM at the University of Cagliari

Over the years, the planning and evaluation of the TM at the University of Cagliari have become increasingly relevant. Nevertheless, such activities are not under the responsibility and supervision of a unitary body or central Direction within the University, nor is there a Rector's delegate in charge of the whole TM; similarly, there is no specific section of the official website grouping together all the related information. Thus, at the moment, responsibilities and information are dispersed and fragmented. The University's planning documents include strategies and objectives for the TM, with related indicators, on a yearly and multiyear basis.3 The implementation of the activities is under the responsibility of several central Directions, with a prominent role played by the Direction for Research and Territory. Each University Department, which groups together the teachers and the research activities related to the different disciplinary fields (e.g., medicine, engineering, economics, and business), deals with public engagement and training activities. Research promotion, as defined in the national regulations, is particularly relevant for the University of Cagliari since it is the biggest university in the region, with a clear leading role in research; thus, the related activities have been distributed among specific offices, such as the liaison office, in charge of spin-offs and patenting procedures, and purposely constituted bodies such as CREA, a service unit for Innovation and Entrepreneurship headed by the Rector's delegate for Innovation and Territory, which runs all the initiatives and programs in these fields. Since 2019, the University has been running a project named "The Shifters, the TM," aimed at communicating the University's TM activities to the general public through a series of short movies. Twice a year, the Direction for Research and Territory asks the Departments to fill in a form summarizing the TM activities carried out by each of them, so as to aggregate the overall activities carried out. Similarly, offices and special bodies report their activities and results so that the University can aggregate the overall TM results and publish them in its planning and reporting documents.4

3.1.3 The TM at the University of Siena

At the University of Siena, the TM is recognized as an institutional activity even within its statute. In particular, in defining the university's mission, the statute states that one of the fundamental objectives to be pursued is the transfer of technology and knowledge.

3 See as an example: https://trasparenza.unica.it/files/2020/02/Allegato-1obiettivi-strategici-e-indicatori-1.pdf
4 As an example, see https://trasparenza.unica.it/files/2019/06/Allegato-1-Monitoraggio-annualeobiettivi-strategici-di-Ateneo.pdf


To coordinate the activities necessary to achieve this objective, from the organizational point of view a specific division has been structured, which consists of two offices: the liaison office and the public engagement office. The TM Division5 represents the node of aggregation, coordination, and promotion of activities aimed at technology and knowledge transfer between the University and the economic sectors located in the local area. To this end, the Division has constant interactions with the university's research structures. Additionally, the TM Division supports and enhances public engagement activities, the dissemination of research results, and training on third mission topics. More specifically, the liaison office promotes and coordinates initiatives between the university and institutions and companies in relation to the processes of technology transfer and the enhancement of research. It also carries out economic analyses and provides support services to SMEs. In coordination with the Research and Grants Management Division, it provides administrative assistance for industrial research financing projects based on regional and national calls. It takes care of the technical and administrative procedures for the management and enhancement of the intellectual property of the University and for the establishment of academic spin-offs, supporting and monitoring their activities. In addition to the TM Division, over the past few years the Rector has identified delegates with the specific mandate of governing aspects related to the TM.

Also relevant for its positive effects in terms of the TM is a project launched at the University of Siena in 2013, which led to the establishment of the Santa Chiara Lab in 2016.6 This organizational structure has as its mission the cross-fertilization of ideas and the hybridization of different knowledge and skills coming from actors and disciplinary areas inside and outside the University, also with the help of innovative support services and technologies, and the definition of strategic multidisciplinary projects. Moving from the central to the departmental level, the organization and management of the TM is less structured and depends on the choices made by each individual department. In some departments, the Directors appoint specific figures intended to support the development of TM-related activities, while in others this figure does not exist.

From a planning and control point of view, the TM of the University of Siena is mainly managed through the three-year strategic program.7 This document has the dual function of aligning the University's activities with the guidelines provided by the Ministry of University and Research and, at the same time, of defining, while respecting the autonomy assigned to them, a minimum level of coordination of the activities of the various university structures in terms of research, teaching, and TM.

5 See https://www.unisi.it/amministrazione-centrale/divisione-terza-missione
6 See https://santachiaralab.unisi.it/
7 See https://www.unisi.it/ateneo/programmazione-triennale

3.2 Germany

3.2.1 The TM at the National (System) Level in Germany

Generally, scientific matters are primarily organized by the 16 German Länder, but the federal level plays an important role by financing projects of strategic relevance. In the following sections on the German case studies, it will be highlighted that the TM of universities depends on political responsibilities at two different governmental levels: the federal government and the decentralized level of the respective State. In the case of the two German universities analyzed in this chapter, these are Brandenburg (Potsdam, in the East of Germany) and North Rhine-Westphalia (Wuppertal, in the West of Germany). Besides that, accreditation rules and quality management systems exist based on common regulations. All teaching programs must nowadays be accredited. TM regulations, instead, are organized separately by rules at the decentralized level.

3.2.1.1 Federal and Provincial Initiatives

The Federal Ministry of Education and Research's (BMBF) innovation initiative "Entrepreneurial Regions"8 aims to develop regional alliances with a particular innovation profile into regional innovation clusters. Since 1999, the BMBF has pursued this objective through the systematic development of a range of programs for the new German states that joined the Federal Republic of Germany after reunification in 1990. The program policies are based on four guidelines:

• Innovations: creative, strategic, and relevant.
• Only the best from the region: innovating from regional strengths.
• Innovations with a market orientation.
• Establishment of regions with an innovation profile, based on outstanding technological platforms.

Collectively, these programs have contributed significantly to increasing the competitiveness of the East German economy, despite an ongoing need to catch up with the West. This represents approximately 20 years of support programs, which are also relevant for both of the German universities analyzed in this chapter. An important role for the TM of German universities is also played by the Federal Ministry for Economic Affairs and Energy (BMWi). Within the framework of the "SIGNO" program ("Safeguarding Ideas for Commercial Usage"), the BMWi has supported technology transfer between universities, small and medium-sized businesses (SMBs), and freelance inventors, in addition to aiding in the legal protection and economic realization of their innovative ideas.

8 See https://www.bmbf.de/de/innovation-strukturwandel-5516.html


This support program ran from 1999. It has now been replaced by "WIPANO", which encompasses three activity lines, one of which is specifically directed at universities: WIPANO-Universities supports efficient collaboration between universities, businesses, and other publicly financed research institutions by ensuring the rapid transfer of the most recent patented technical findings to the appropriate business entities.9 Another important program is the EXIST Program (University-Based Business Start-Ups). It is a support program of the BMWi, started in 1998, with the aim of improving the entrepreneurial environment at universities and research institutions and of increasing the number of technology- and knowledge-based business start-ups. The EXIST program is part of the German government's "High-Tech Strategy for Germany" and is co-financed by funding from the European Social Fund (ESF).10 A new program started in June 2020, EXIST-Potentials, in which more than 150 German universities succeeded in a two-phase competitive procedure. EXIST-Potentials has three different tracks: Promoting Potentials, Regional Networking, and Convincing Internationally. The following explanations concentrate on entrepreneurship and technology transfer.

3.2.1.2 Start-Up Support in the Brandenburg Universities

The State of Brandenburg provides start-up support in addition to the federal funding. It is not so much competitive as related to the size of the different institutions. Not only universities and polytechnics11 are funded, but also chambers of commerce and chambers of crafts, as well as the regional towns and districts in Brandenburg. With reference to universities, this initiative aims at supporting individuals wishing to start up a business who either study at a university in the State of Brandenburg, have completed their studies at a university in the State of Brandenburg within the last 7 years (alumni), or are employed as academic staff at a university in the State of Brandenburg, and who intend to set up a business in the State of Brandenburg.12 The universities, as publicly funded institutions, have to submit a declaration with the application for the promotion of start-up services showing that the requested funding is only used for projects that go beyond the area funded by the public sector. The funds are only to be used for additional or supplementary projects.

9 See https://www.bmwi.de/Redaktion/DE/Publikationen/Technologie/wipano.html
10 See https://www.ptj.de/exist
11 In Italy, polytechnics are quantitatively fewer than in Germany. In the text, to avoid complexity, we use the two terms interchangeably.
12 See https://www.existenzgruender-helfer.de/gruendungszuschuss-foerderungen/foerderungbundesland/brandenburg/


For this reason, in order to ensure additionality (Article 95(2) of Regulation (EU) No. 1303/2013), the allocated funds must be kept separate from the state budget by either opening a project account or setting up a separate cost unit within the budget. The start-up services perform the following tasks:

(a) Improving the start-up climate and sensitizing potential founders.
(b) Examination of the suitability for funding.
(c) Provision of individual and specific advice as well as qualification, individual and specific coaching, and further support to prepare for successful business start-ups.
(d) Support for the use of the EXIST start-up grant and EXIST research transfer by the Federal Ministry for Economic Affairs and Energy.

3.2.1.3 Start-Up Support in North Rhine-Westphalian Universities

Due to the federalist (educational) structure, various support strategies, competitions, and programs supporting start-ups from universities can also be identified at the federal state level in North Rhine-Westphalia (NRW). These activities are currently mainly fostered by the Ministry of Economic Affairs, Innovation, Digitalization, and Energy of the State of North Rhine-Westphalia (MWIDE) to support the universities in realizing the task of knowledge transfer. The transfer of technology and the promotion of spin-offs is often emphasized, as it is also anchored in the Higher Education Act of the State of NRW (Hochschulgesetz des Landes NRW) (see § 3 (1) HG NRW 2019). Firstly, there is currently the funding competition START-UP transfer.NRW, which was jointly initiated in 2015 by the former Ministries of Economic Affairs, Energy, Industry, SMEs, and Handicraft of the State of North Rhine-Westphalia (MWEIMH) and of Innovation, Science, and Research of the State of North Rhine-Westphalia (MIWF) (cf. MWIDE, 2019a). Through this program, university graduates and scientists have received financial support for conceptualizing and launching their start-ups (cf. MWIDE, 2019b). The overall objective of this project has been to strengthen the third mission in terms of improved transfer of knowledge and technology from the universities. Furthermore, the projects were supposed to increase the practical orientation of the universities, so that both industry and science in NRW could benefit from this (cf. MWIDE, 2019b).

A further funding program, announced in 2016, was START-UP-Innovationslabore NRW. This program was initiated by the former MWEIMH together with MIWF. Among other things, the innovation laboratories shall promote cooperation between several universities in a specific region and serve as a link between universities' research and the economy or entrepreneurship, i.e., the realization of inventions on the market (innovations), in order to advise and support university graduates and scientists ready to set up a business in the realization of potential start-up projects (cf. MWEIMH, 2016).

The third program deserving mention is the competition Exzellenz Start-up Center, which was constituted by MWIDE in 2018.


Since 2019, a total of six universities from NRW have been supported with a total financial volume of 150 million euros to strengthen the role of universities in the regional start-up ecosystem in the long term (cf. MWIDE, 2018). These three funding programs of the State of NRW can be regarded as examples of further, new, and in some cases complementary strategies and competitions compared to the federal system level, which support the transfer of knowledge, research, and technology in the sense of a TM at universities.

3.2.2 The TM at the University of Potsdam

The TM at the University of Potsdam is mainly based on a "Higher Education Development Plan" (Hochschulentwicklungsplan).13 A special chapter of this plan is related to "Innovations, Knowledge and Technology Transfer." At least from a conceptual point of view, these fields are named as very important, especially in a rural region with a limited economic base and limited research capacity. Therefore, close cooperation between universities and their socioeconomic environment is considered necessary and, within universities, applied research and development in particular should be fostered. In this connection, the following indicators appear to be particularly important:

– Professorships in entrepreneurship at every academic institution.
– Professorships in applied research at every university of applied sciences.
– A relatively high amount of external grants for this type of academic institution.
– The achievement of top positions by all research institutions, universities included and especially the University of Potsdam, in the Entrepreneurship Monitoring in Germany.

The University of Potsdam and all the other academic institutions in the region have the mandate to cooperate with regional enterprises and to build development clusters for different sectors together with specialized transfer units. In addition, the Development Plan stresses the importance of the academy in continuing education and in building career services and mentoring programs. The transfer strategy of the University of Potsdam, dated 2018, is closely connected with the "Higher Education Development Plan" of Brandenburg.14 The transfer strategy is mainly focused on providing support to start-ups, increasing applied research and research cooperation, and exploiting intellectual property. Meanwhile, the University of Potsdam has established an office for governing and controlling all these activities. The guidelines for the transfer activities are based on a threefold organizational structure:

13 See https://mwfk.brandenburg.de/sixcms/media.php/9/Hochschulentwicklungsplan.pdf
14 See https://www.uni-potsdam.de/fileadmin/projects/innovative-hochschule/dokumente/Informationsmaterialien/UP_Transferstrategie_2017_1_.pdf


(a) Potsdam Transfer, a central scientific unit with direct reporting lines to the president of the university, in which all university-related public services are combined (e.g., guidance on financing opportunities for transfer projects, technology scouting, and validation).
(b) UP Transfer GmbH, a private limited company carrying out pre-market-related activities.
(c) GO:Incubator GmbH, a private limited company that delivers market-based services in a Science Park as a joint venture with the municipality of Potsdam.

In June 2020, a new project started: the Potsdam International Transfer Collaboration Hub (PITCH), funded by the Federal Ministry of Economics. The vision is to enlarge the activities in international entrepreneurship and to enhance the international start-up culture (the international start-up ecosystem). In this respect, a continuous expansion of the international and regional partnership network is an important issue. To this end, it is primarily important that actors (universities and non-universities) focus on the services that are especially relevant during the start-up phase of any initiative and on the training of the partners' coaches, but also on attracting funders to invest in the Potsdam area.

3.2.3 The Evaluation of the Third Mission at the University of Wuppertal

Throughout the past decade, the third mission has become more and more important at the Bergische Universität Wuppertal (BUW). At present, transfer in its various forms is seen as an essential component at the central university level and therefore as a further core task alongside research and teaching (cf. BUW, 2017). This becomes apparent when looking at the current mission statement (Leitbild) as well as the current university development plan (Hochschulentwicklungsplan) of the University of Wuppertal. In those documents, transfer is understood as a social mission of the university, which is reflected in regional, supra-regional, and international interactions (cf. BUW, 2014). The focus is on promoting and establishing constructive and synergetic cooperation between civil society and science (cf. BUW, 2019). TM at BUW is thereby divided into three categories of transfer: communication, consultancy, and application. More detailed information regarding these three categories can be found in the current transfer strategy of the university (cf. BUW, 2017). As a result of the increased relevance of the third mission at BUW, a Vice-President for Finance, Planning and Transfer was installed at the central university level (Rectorate) in 2008, and a central, university-wide research and knowledge transfer office was also created. This office records the transfer activities of the individual chairs, faculties, and university institutes, makes them visible for internal and external communication, and finally controls them on the basis of different criteria (cf. BUW, 2017; BUW, 2020).


An important part of these transfer activities in Wuppertal is the field of entrepreneurship and the promotion of start-ups from the university. These topics have a long tradition at BUW, which is mainly due to funding from the EXIST initiative presented in Sect. 3.2.1. The University of Wuppertal secured EXIST I funding through the Faculty of Economics and Business Administration in 1998 and was also financially supported in the subsequent rounds, EXIST II and III, until 2013. Initiated by these subsidies, numerous structures have been created and developments in research, teaching, and transfer have been realized to this day. Among other things, several chairs related to entrepreneurship have been established, and various entrepreneurship modules as well as the master’s degree course “Entrepreneurship and Innovation” have been created at the Faculty of Management and Economics. Furthermore, the field of Business, Innovation, and Economic Change has become one of the six main profile lines in research and teaching at BUW. Through the integrative combination of research, teaching, and transfer, institutions and services for university start-up support have also been established and expanded. One of many results is, for example, the start-up network bizeps, a competence network of BUW and regional partners. To develop university-wide start-up and transfer potential even further in the future, the University of Wuppertal applied for and was accepted into the current EXIST V project (EXIST-Potentiale). One of the main aims of this project is to permanently institutionalize a start-up center at the central university level (Rectorate) to enhance the university-wide effect of start-up support throughout all departments.

4 Some Final Remarks

The case studies presented in the previous section highlight that the TM has become a fundamental element within university systems, so much so that it is now part of universities’ institutional activities together with teaching and research. There may be several reasons for this recognition. One is the desire to contribute to general socioeconomic development through the implementation of the so-called triple helix model, which regards the interaction among the state, the economic sector, and the university system as the best strategy for achieving the desired results in terms of socioeconomic development. Another reason is the need to recover a level of legitimacy adequate to justify the investment of large public resources in research and teaching. These reasons are all valid, but they can lead to different understandings of, and different ways of framing, how the TM is carried out within universities. The two university systems analyzed in this chapter seem to show different levels of interest in and focus on the TM. The Italian university system is driven mainly by the central government and therefore focuses more on the governance and strategy setting of universities’ TM, above all on sensitizing individual universities to undertake strategic paths for governing their TM. In addition, the system is more centralized at the university level as well: most of the activities are planned and coordinated at the central level of each university.
This implies a potentially better capacity to plan and control the TM activities carried out, but at the same time it could be perceived as an obstacle to free initiatives at the departmental level. The German university system seems to be more active and coordinated in governing its TM, which appears to be governed at several levels: federal, regional, and that of the single university institution. Another relevant difference between Italy and Germany concerns the actors involved in the governance and control of TM activities. In Italy, it is almost exclusively the Ministry of University and Research that governs the TM of the university system, making it appear a closed perimeter of activity to which other ministries cannot contribute. In Germany, ministries other than the ministry responsible for universities, such as the Ministries for Economic Affairs and Energy at the federal and regional levels, contribute to the governance of the TM, especially through the preparation and financing of specific initiatives. Despite these differences, in both systems research still seems to prevail over the TM, above all because TM results have little influence on the career paths of researchers within the two university systems. This makes it necessary to rethink the relationship between research and the TM in universities and to ensure that the results potentially arising from these links are also evaluated in researchers’ career processes. A final issue to underline is the different focus that the TM seems to have in the universities analyzed in Germany and Italy. In Germany, the TM of universities seems to be more focused on activities related to technology transfer processes, particularly start-ups and spin-offs. In Italy, universities’ TM seems, especially in recent years, to be broader, not specifically focused on technology transfer but increasingly oriented towards public engagement.15 These differences are also reflected in the design of the organizational structures responsible for the TM within universities. The TM is a phenomenon that is certainly gaining ground within universities, but it has not yet found a common definition, either per se or in relation to the other institutional activities of universities, especially research. These aspects require a broader and more in-depth discussion at both political and academic levels to avoid embarking on reform processes of university systems driven by objectives of legitimacy rather than real purposes. Similarly, at the single-university level, the TM should not be driven by specific individual academic interests but by organizational objectives.

15 The University of Potsdam’s Center for Policy and Management (PCPM; see https://www.uni-potsdam.de/de/pcpm/) has for years run successful teaching programs such as the “Master of Public Management” or the “European Master of Governance and Administration.” Last but not least, new program topics such as “Citizen Science” or “Society Campus” are being pursued on a temporary basis until 2022; see https://www.uni-potsdam.de/de/potsdam-transfer/ueber-uns/drittmittelprojekte-deszentrums/innovative-hochschule


References

ANVUR. (2018). Guidelines for drafting the SUA-TM/IS for universities. Available at https://www.anvur.it/wp-content/uploads/2018/11/SUA-TM_Lineeguida.pdf
Ashton, D., Beattie, V., Broadbent, J., Brooks, C., Draper, P., Ezzamel, M., Gwilliam, D., Hodgkinson, R., Hoskin, K., Pope, P., & Stark, A. (2009). British research in accounting and finance (2001–2007): The 2008 research assessment exercise. British Accounting Review, 41(4), 199–207.
Berghaeuser, H., & Hoelscher, M. (2020). Reinventing the third mission of higher education in Germany: Political frameworks and universities’ reactions. Tertiary Education and Management, 26, 57–76.
Bond, R., & Paterson, L. (2005). Coming down from the ivory tower? Academics’ civic and economic engagement with the community. Oxford Review of Education, 31(3), 331–351.
Bunce, L., Baird, A., & Jones, S. E. (2017). The student-as-consumer approach in higher education and its effects on academic performance. Studies in Higher Education, 42(11), 1958–1978.
Carayannis, E. G., & Campbell, D. F. J. (2009). ‘Mode 3’ and ‘quadruple helix’: Toward a 21st century fractal innovation ecosystem. International Journal of Technology Management, 46(3/4), 201–234.
Compagnucci, L., & Spigarelli, F. (2020). The third mission of the university: A systematic literature review on potentials and constraints. Technological Forecasting and Social Change, 161. https://doi.org/10.1016/j.techfore.2020.120284
Ćulum, B., Rončević, N., & Ledić, J. (2013). Facing new expectations—Integrating third mission activities into the university. In B. M. Kehm & U. Teichler (Eds.), The academic profession in Europe: New tasks and new challenges (pp. 163–195). Springer.
Dal Molin, M., Turri, M., & Agasisti, T. (2017). New public management reforms in the Italian universities: Managerial tools, accountability mechanisms or simply compliance? International Journal of Public Administration, 40(3), 256–269.
Deem, R. (2001). Globalisation, new managerialism, academic capitalism and entrepreneurialism in universities: Is the local dimension still important? Comparative Education, 37(1), 7–20.
Dobija, D., Górska, A. M., Grossi, G., & Strzelczyk, W. (2019). Rational and symbolic uses of performance measurement: Experiences from Polish universities. Accounting, Auditing & Accountability Journal, 32(3), 750–781.
Etzkowitz, H., & Leydesdorff, L. (1998). The endless transition: A “Triple Helix” of university-industry-government relations: Introduction. Minerva, 36, 203–208.
Gebreiter, F., & Hidayah, N. N. (2019). Individual responses to competing accountability pressures in hybrid organisations. Accounting, Auditing & Accountability Journal, 32(3), 727–749.
Grossi, G., & Ruggiero, P. (2008). Lo spin-off accademico. Attori ed ambiente nella fase di gestazione aziendale. Cedam.
Hölzle, K., Puteanus-Birkenbach, K., & Wagner, D. (Eds.). (2014). Entrepreneurship education. BoD.
Martin-Sardesai, A., Irvine, H., Tooley, S., & Guthrie, J. (2017). Organizational change in an Australian university: Responses to a research assessment exercise. British Accounting Review, 49(4), 399–412.
Mascarenhas, C., Ferreira, J. J., & Marques, C. (2018). University-industry cooperation: A systematic literature review and research agenda. Science and Public Policy, 45(5), 708–718.
Ministry of Economic Affairs, Energy, Industry, SMEs and Handicraft of the State of North Rhine-Westphalia (MWEIMH). (2016). Start-UP-Innovationslabore NRW. Gesucht: Nachhaltige Unterstützungsstrukturen für technologie- und wissensbasierte Gründungen in NRW. https://www.ptj.de/lw_resource/datapool/systemfiles/cbox/3230/live/lw_bekdoc/start-upinnovationslabore_nrw_bekanntmachung_pdf
Ministry of Economic Affairs, Innovation, Digitalization and Energy of the State of North Rhine-Westphalia (MWIDE). (2018). Exzellenz Start-up Center. https://www.wirtschaft.nrw/exzellenz-start-center
Ministry of Economic Affairs, Innovation, Digitalization and Energy of the State of North Rhine-Westphalia (MWIDE). (2019a). Landesregierung weitet Förderung von Start-ups an Hochschulen und Forschungseinrichtungen in Nordrhein-Westfalen aus. https://www.wirtschaft.nrw/pressemitteilung/landesregierung-weitet-foerderung-von-start-ups-hochschulenund
Ministry of Economic Affairs, Innovation, Digitalization and Energy of the State of North Rhine-Westphalia (MWIDE). (2019b). Start-up Transfer.NRW. Gesucht: Innovative Gründungskonzepte aus Wissenschaft und Forschung. https://www.ptj.de/lw_resource/datapool/systemfiles/cbox/3037/live/lw_bekdoc/bro_startup_transfer_10_web.pdf
Parker, L. (2011). University corporatisation: Driving redefinition. Critical Perspectives on Accounting, 22, 434–450.
Roper, C. D., & Hirth, M. A. (2005). A history of change in the third mission of higher education: The evolution of one-way service to interactive engagement. Journal of Higher Education Outreach and Engagement, 10(3), 3–21.
Royse, D., Thyer, B., & Padgett, D. (2010). Program evaluation. An introduction (5th ed.). Cengage Learning.
Shah, S. K., & Pahnke, E. C. (2014). Parting the ivory curtain: Understanding how universities support a diverse set of startups. The Journal of Technology Transfer, 39, 780–792.
Skærbæk, P., & Tryggestad, K. (2010). The role of accounting devices in performing corporate strategy. Accounting, Organizations and Society, 35, 108–124.
Tucker, B. P., & Parker, L. D. (2020). The question of research relevance: A university management perspective. Accounting, Auditing & Accountability Journal, 20(6), 1247–1275.
University of Wuppertal (BUW). (2014). Hochschulentwicklungsplan 2014–2020. https://www.uni-wuppertal.de/fileadmin/bu/01/images/VeroeffentlichungenBroschueren/HEP_WEB_140902.pdf
University of Wuppertal (BUW). (2017). Transferstrategie der Bergischen Universität Wuppertal. https://www.transfer.uni-wuppertal.de/fileadmin/forschung/PDFs/Transferstrategie_2017_aktualisiert.pdf
University of Wuppertal (BUW). (2019). Rektoratsbericht 2018. https://www.uni-wuppertal.de/fileadmin/bu/01/pdf/VeroeffentlichungenBroschueren/Rektoratsbericht/Rektoratsbericht_2018_web_190705.pdf
University of Wuppertal (BUW). (2020). UniService Transfer. Mission Gesellschaft. Aktuelle Transferprojekte. https://www.transfer.uni-wuppertal.de/de/mission-gesellschaft/aktuelletransferprojekte.html