OCTOBER 2023
PRODUCT DEVELOPMENT REPORT

EXECUTIVE SUMMARY
This report reflects on the activities to date to create systems of data collection and analysis that could effectively capture institutions' progress in the Interdisciplinary Science Rankings (ISR), benchmarked worldwide. The report explains the findings of the initial feasibility study, the creation of the metric pillars and metrics, the data collection process, and an analysis of the initial findings of that process at both country and institutional level.

Overall, with nearly 1,200 institutions participating, there is an evident desire among institutions to understand how their activity and output in interdisciplinary research can be measured. University leaders and academics confirmed that metrics can effectively act as performance indicators, driving forward standards and incentivising change in institutional research cultures towards greater collaboration and interdisciplinarity.

The report shows how three areas are key to understanding the progress being made by institutions on interdisciplinary research. Firstly, the inputs into interdisciplinarity, including dedicated funding and jobs. Secondly, the internal processes and structures that support and incentivise interdisciplinarity, including physical space, administrative support and rewards. Finally, the outputs of interdisciplinary research, as measured by a variety of bibliometrics and global reputation. In these areas, the analysis of the data shows that there is a greater focus on inputs and processes among universities from the Global South, whereas universities in the Global North perform well on output measurements related to research quality, volume and reputation. However, overall participation is strongly rooted in the Global South, demonstrating a keen focus on interdisciplinarity as a priority for solving problems that are specific to those regions.

Schmidt Science Fellows (SSF) is committed to advancing interdisciplinary science for global benefit and is dedicated to changing the way science is done. Seeing the potential for a new international ranking of universities which recognises excellence in interdisciplinary science, and which can be used to highlight best practice, influence career development, and incentivise structural change in the sector toward the pursuit of interdisciplinary science, SSF commissioned Times Higher Education (THE) to explore the viability, scope, and proposed metrics of such a ranking.

CONTENTS
Executive Summary
1. Introduction
2. Developing the Interdisciplinary Science Rankings
2.1 Research and Feasibility Study
2.2 Recommendations for the Ranking
2.3 Collecting the Data
3. Analysis of Year One Data
3.1 Input Pillar Overview
3.2 Process Pillar Overview
3.3 Output Pillar Overview
3.4 Institutional Analysis
Conclusions
INTRODUCTION
Interdisciplinary science is regarded as vital for addressing the world's contemporary challenges. Whilst the pursuit of interdisciplinary science is far from new, it has become increasingly central to both academic interest and government science policies (Davé et al., 2016). There are a number of drivers for this, including: an increased appreciation of the inherent complexity of nature and society; an increased desire to explore problems and questions which are not confined to a single discipline; the need to solve grand societal challenges; and the need to fully exploit the power of new technologies (National Academy of Sciences et al., 2005).

The pursuit of interdisciplinary science nevertheless has its own obstacles. Individual disciplines tend to have defined research practices, languages, philosophies and communities, which may often seem incompatible. Given these challenges, there is scope for developing the incentives for universities to support and pursue interdisciplinary science research. A ranking that captures measurable indicators of the maturity and progress being made in interdisciplinary science can provide transparent benchmarks to develop those very incentives.

Times Higher Education (THE) is a leader in developing university rankings and performance metrics. THE's World University Rankings are a robust ranking of the world's research-intensive universities and have acted as a stimulus for change in higher education policy (Hazelkorn & Ryan, 2013). The development of THE's Impact Rankings, which measure universities' contributions toward achieving the UN's Sustainable Development Goals (SDGs), has brought to the fore the vital role universities play in sustainable development. Other bespoke rankings, including the Sub-Saharan African Rankings, are designed with metrics specifically to meet the needs and aspirations of universities in that region.

THE consultants and data scientists have worked closely with university leaders and academics to ensure the interests of the sector are represented in rankings, and have created advisory boards to ensure that there is continuous feedback and buy-in. Overall, THE rankings can support positive change in research; the creation of an Interdisciplinary Science Ranking (ISR) can likewise highlight key areas for improvement across interdisciplinary research, investment and funding, and career development.
SECTION 2: DEVELOPING THE INTERDISCIPLINARY SCIENCE RANKINGS

2.1 Research and Feasibility Study
Undertaken between April and June 2022, THE consultants produced a feasibility study for SSF through research and consultation with the global higher education sector. Five roundtables were held with leading academics and senior university staff to gain their insight into what kind of measures and metrics could capture the progress made by universities in interdisciplinary science. Two roundtables included a mix of global leaders, and three were geography specific:
- Global Roundtable: hosted at THE's Innovation and Impact Summit, April 2022
- European Roundtable: hosted at THE's European University Summit, May 2022
- Asian Roundtable: hosted at THE's Asia Universities Summit, May 2022
- North American Roundtable: held in parallel to the SSF Global Meeting, June 2022
- Virtual Global Roundtable: hosted by THE, June 2022

A total of 34 university leaders and subject matter experts participated in the roundtable discussions, supplemented with three one-to-one discussions. The outcomes of the roundtables and discussions can be summarised under the following five areas.

1. A positive sentiment towards creating the interdisciplinary science ranking
- Roundtable participants all agreed that collaboration across different disciplines is vital to solving global societal challenges.
- There was a belief that the metrics used to underpin rankings can show the value of the overall contribution of interdisciplinary science research to education.
- There was an understanding that rankings can be an effective driver of behaviour and attitudes for academic institutions and other parties, including policymakers.

2. Definitions of interdisciplinary research
- Interdisciplinary research can refer to settings where experts from distinct disciplines come together to research and solve a problem. It can also, increasingly, refer to single individuals who work across different disciplines, developing new mindsets through constant multi-disciplinary engagement.
- There was debate as to whether interdisciplinary research could be measured if the focus were only on STEM subjects. The example of COVID-19 was highlighted to show how non-science disciplines such as the social sciences, business and law were also needed to provide a global solution.

3. Key methods that can lead to producing more interdisciplinary science research
- Funding is a big incentive for conducting interdisciplinary research.
- Creating interdisciplinary spaces (physical and virtual) with the purpose of bringing faculty from across disciplines together to facilitate collaboration.
- Recruiting skilled staff specifically for interdisciplinary research.
- Creating environments and structures to encourage diversification of research and rewards for interdisciplinary research, particularly through interdisciplinary research centres.
- Greater demand from students to see interdisciplinarity in their curriculum; education systems need to change to incorporate more interdisciplinary study in response to the skills required in the workplace.

4. Challenges
- Differences in the way data is gathered and measured, not only across different universities but even across faculties at a single university.
- Bias in terms of individual academics' own interest in and pursuit of interdisciplinary research.
- Academics may feel siloed within their own academic fields and lack exposure to other fields.

5. Existing data on interdisciplinary research
- The National Science Foundation (NSF) and Higher Education Research and Development (HERD) rankings potentially touch on some areas of interest and may have utility in an ISR ranking (e.g. R&D expenditure and square footage afforded to different disciplines).¹

¹ https://ncsesdata.nsf.gov/profiles/site?method=ranking

Based on the findings of the roundtables, and THE's own history and experience in developing rankings, it was determined that to best capture and measure interdisciplinary research, a framework of measurement should include the following metric pillars: inputs, processes and outputs. Figure 1 shows the 12 metrics selected for the ISR under each metric pillar.

Figure 1: Metrics Selected for the ISR Under Three Metric Pillars
INPUTS
- Proportion of research funding dedicated to ISR
- Amount of research funding from industry
- Recruitment of ISR researchers
PROCESSES
- Does your university have measures of interdisciplinary success?
- Does your university provide specific physical facilities for interdisciplinary teams?
- Does your university provide specific administrative support for interdisciplinary teams?
- Does your university have a tenure or promotion system in place that recognises interdisciplinary research?
OUTPUTS
- Amount of ISR publications
- Proportion of ISR to overall outputs
- Utility of ISR (out-of-discipline citations)
- Quality of ISR (FWCI)
- Reputation for interdisciplinarity

2.2 Recommendations for the Ranking
The research and feasibility phase put forward a set of metrics that might measure interdisciplinary research. However, a range of challenges related to the practicality and efficacy of developing a ranking also needed to be considered. These included:
- How metrics might encourage greater interdisciplinarity (positive behaviours and attitudes).
- Any potential unintended adverse consequences of the measurement.
- The potential to collect data effectively, including the capacity of universities to respond to data requests.

All metrics need to be:
1. Powerful: they measure something with meaning.
2. Sufficiently accurate: they are complex enough to measure within context.
3. Understandable: explicable to a reasonable person in plain language.
4. Universal: they should be widely reportable and applicable.

Furthermore, choices were made regarding the scope of inclusion in the rankings. It was determined at this stage that research in the sciences was a necessary condition for inclusion in the ranking.
2.3 Collecting the Data
There are three ways in which the data for the ISR is collected:
1) The THE data collection portal (a system used in other THE rankings)
2) Surveys of institutions
3) Bibliometric data

The portal is a repository for institutions to submit both quantitative data and qualitative evidence for the purposes of being assessed for rankings. For the ISR, institutions are required to provide three quantitative inputs related to the research funding dedicated to ISR, research funding from industry sources, and the recruitment of ISR researchers. Institutions also use the portal to supply evidence for the four qualitative processes. This includes evidence regarding measures of success for ISR, facilities for ISR teams, dedicated administrative support for ISR, and tenure/promotion specifically for ISR.

The survey used for the ISR was the general THE Academic Survey of global university academics that is also used for the THE World University Rankings, which gauges university reputation. For the ISR, there were additional questions regarding how academics are encouraged, enabled and rewarded for ISR, to ensure that the output reputation metric was specific to ISR.

Using OpenAlex, bibliometric data was sourced for the four other output metrics, covering the volume of ISR publications, the overall proportion of ISR publications, the out-of-field FWCI of ISR publications, and the quality of ISR publications. This drew from a field of 47 million publications between 2018 and 2022, including 32.2 million journal articles, and 142 million citations.

Figure 2: Data Collection Processes
- THE Data Collection Portal. Quantitative: research funding; industry funding; recruitment. Qualitative: measures of success; facilities; administrative support; tenure/promotion.
- Survey. General: reputation. Institution-specific: encouragement; enablement; reward.
- Bibliometric Data: ISR publications; proportion of ISR; out-of-field citations; quality of ISR.
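To make the output calculations concrete, the sketch below illustrates how the four bibliometric metrics described above might be derived from OpenAlex-style publication records. It is a minimal illustration under stated assumptions, not THE's actual methodology: the record fields (is_isr, fwci, primary_field, citing_fields), the classification of a publication as ISR, and the averaging of FWCI across ISR publications are all assumptions made for the example.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Publication:
    """Simplified stand-in for an OpenAlex work record (field names are illustrative)."""
    institution: str
    is_isr: bool                # assumed flag: publication classified as interdisciplinary
    fwci: float                 # field-weighted citation impact of the publication
    primary_field: str
    citing_fields: list = field(default_factory=list)  # fields of the works that cite it

def output_metrics(pubs):
    """Compute the four bibliometric output metrics for one institution (illustrative)."""
    isr_pubs = [p for p in pubs if p.is_isr]
    volume = len(isr_pubs)                                    # amount of ISR publications
    proportion = volume / len(pubs) if pubs else 0.0          # proportion of ISR to overall outputs
    # Utility of ISR: share of citations arriving from outside the publication's own field
    citations = [(cf, p.primary_field) for p in isr_pubs for cf in p.citing_fields]
    out_of_field = (sum(1 for cf, own in citations if cf != own) / len(citations)
                    if citations else 0.0)
    quality = mean(p.fwci for p in isr_pubs) if isr_pubs else 0.0  # quality of ISR via FWCI
    return {"volume": volume, "proportion": proportion,
            "out_of_field_citation_share": out_of_field, "fwci": quality}

# Toy usage with two fabricated records
pubs = [
    Publication("Example University", True, 1.8, "biology", ["chemistry", "biology", "medicine"]),
    Publication("Example University", False, 0.9, "physics", ["physics"]),
]
print(output_metrics(pubs))
```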
SECTION 3: ANALYSIS OF YEAR ONE DATA

Overall, 1,169 institutions submitted quantitative data to the year one collection for the ISR. Of these, 629 were valid, meaning the institutions did not withhold submissions for any fields. A total of 761 institutions submitted qualitative data around processes, with 710 of those answering yes to at least one of the questions. The following analysis summarises some of the key high-level insights from each of the metric pillars that underpin the ISR: inputs, processes and outputs.
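The validity rule described above (a quantitative submission counts only if no field is withheld) can be expressed as a simple completeness check. The sketch below is illustrative only; the field names and the representation of a withheld value as None are assumptions, not THE's portal schema.

```python
# Required quantitative input fields for an ISR submission (names are illustrative)
REQUIRED_FIELDS = ["isr_research_funding", "industry_research_funding", "isr_recruitment"]

def is_valid_submission(submission: dict) -> bool:
    """A submission is valid only if every required field is present and not withheld."""
    return all(submission.get(f) is not None for f in REQUIRED_FIELDS)

submissions = [
    {"isr_research_funding": 0.21, "industry_research_funding": 0.05, "isr_recruitment": 12},
    {"isr_research_funding": 0.30, "industry_research_funding": None, "isr_recruitment": 4},
]
valid = [s for s in submissions if is_valid_submission(s)]
print(f"{len(valid)} of {len(submissions)} submissions are valid")
```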
3.1 Input Pillar Overview
Figure 3 shows that India has the highest number of institutions participating in the ISR with valid submissions, followed by Russia. Only three European countries (Russia, Spain and Italy; Turkey is counted as Western Asia) are represented in the top fifteen. Nigeria, Egypt and Algeria represent Africa, and there is one sole representative from Latin America and the Caribbean, Brazil. Surprisingly, there are no representatives from the Anglosphere, with the UK, US, Canada and Australia all outside the top 15. This shows that there is real diversity in the geographical spread of institutions submitting data to the ISR, with a strong performance from countries in the Global South.

Figure 3: Number of Institutions per Country Submitting Valid Data for the ISR, Including Quantitative Metrics Related to the Recruitment of ISR Researchers and ISR-Specific Funding (Top 15 Countries)
India: 55; Russia: 42; Pakistan: 31; Turkey: 29; Japan: 27; Uzbekistan: 24; Indonesia: 20; Iraq: 17; Iran: 17; Nigeria: 17; Egypt: 15; Spain: 15; Italy: 15; Algeria: 13; Brazil: 12

Figure 4 shows that Egypt has the highest proportion of research income dedicated to ISR, followed by Uzbekistan and Saudi Arabia. Six of the top 10 countries for research income dedicated to ISR are in Asia. Overall, the analysis for the input metric pillar shows that countries from the Global South predominate in terms of submitting data for the ISR, and the findings likewise demonstrate that those countries have the highest number of research positions dedicated to ISR as well as the highest proportion of funding dedicated to ISR. Egypt is a notable representative from Africa, being in the top three for both ISR job adverts and ISR-dedicated research funding. India, Pakistan, Iraq and Indonesia all have significant representation in this metric pillar from Asia, with Russia, Spain and Romania leading the way from Europe.

Figure 4: Proportion of Research Income Dedicated to ISR per Country (Top 10 Countries with at Least 10 Valid Institutional Submissions)
Egypt: 47.7%; Uzbekistan: 38.6%; Saudi Arabia: 32.0%; Romania: 31.6%; Russia: 25.7%; Chile: 24.8%; Pakistan: 23.8%; Spain: 22.5%; Indonesia: 22.4%; Thailand: 20.2%
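As an illustration of how a country-level figure like the one above might be aggregated from institutional submissions, the sketch below averages institution-level proportions and applies the minimum threshold of 10 valid submissions per country. The use of a simple mean is an assumption made for the example; the report does not specify the aggregation method.

```python
from collections import defaultdict

def country_isr_income_share(institutions, min_submissions=10, top_n=10):
    """Aggregate institutional ISR income proportions into a per-country figure.

    `institutions` is an iterable of (country, proportion) pairs, where proportion is the
    percentage of research income dedicated to ISR in one valid institutional submission.
    """
    by_country = defaultdict(list)
    for country, proportion in institutions:
        by_country[country].append(proportion)
    # Keep only countries meeting the minimum-submission threshold, then average (assumed method).
    averages = {c: sum(v) / len(v) for c, v in by_country.items() if len(v) >= min_submissions}
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Toy usage with fabricated values for two hypothetical countries
data = [("Country A", 40.0)] * 12 + [("Country B", 25.0)] * 9
print(country_isr_income_share(data))  # Country B is excluded: only 9 valid submissions
```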
3.2 Process Pillar Overview
For the process pillar, submitting institutions were asked to answer four questions reflecting on how their institutions support ISR. These questions relate to the physical facilities of the university dedicated to ISR, dedicated administrative support for ISR, institutional measures of success for ISR, and systems of tenure and promotion for ISR. Institutions were asked to answer yes/no to these questions and to provide evidence of processes or policies where they had answered yes.

Figure 5 shows the overall responses from institutions to the process pillar metrics, with those answering yes to having these processes in place for ISR. Having physical facilities dedicated to ISR received the most yes responses, from 650 institutions. The process with the fewest institutions saying yes was having systems of tenure or promotion related to ISR. Figure 5 also shows that, across all four questions, much of the evidence submitted by institutions to show that they do have these ISR-dedicated processes in place was not relevant. This was particularly the case for ISR-related tenure or promotion, where only 4% of institutions had specific evidence to show that processes were in place. There is therefore significant scope for improving the evidence submitted for ISR-related processes, which could be an incentive for institutions to develop more robust evidencing mechanisms.

Figure 5: Evidence Submissions for the Process Pillar Metrics (evidence classified as specific, generic or not relevant for each of the four process questions)
3.3 Output Pillar Overview
The output metric pillar has five metrics, four of which are measured using bibliometric data sourced from OpenAlex. The bibliometrics measure total ISR-related publications; the proportion of ISR-related publications relative to overall output; the utility of ISR (using out-of-discipline citations); and the quality of ISR, using the field-weighted citation impact (FWCI). The other metric for this pillar, the reputation of interdisciplinary teams, is sourced through the survey method.
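For reference, FWCI is conventionally defined as a publication's citation count normalised by the citations expected for publications of the same field, publication year and document type, so that 1.0 represents the world average. The formula below states this standard definition; the exact normalisation used for the ISR dataset is not detailed in this report.

```latex
% Conventional definition of the field-weighted citation impact (FWCI) for a publication p
\[
\mathrm{FWCI}(p) \;=\;
\frac{c(p)}{\mathbb{E}\!\left[\, c \mid \text{field}(p),\ \text{year}(p),\ \text{type}(p) \,\right]}
\]
% A value above 1 indicates the publication is cited more than the world average for
% comparable publications; an institution's score is typically an average over its papers.
```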
Figure 6 shows that India, China and Russia lead the way in terms of the proportion of ISR to total research, all with over 30%. In terms of volume (indicated by the size of the bubble), China, the USA, India and Japan produced the most ISR-related research. In terms of the quality of ISR publications, as measured through FWCI, Hong Kong and Singapore have the highest quality, with Hong Kong also reaching over 30% ISR of total research. As the bibliometric data covers all countries and institutions, unlike the input and process metric pillars, these metrics offer a more global understanding of the ISR research landscape in terms of volume, focus and quality. The data shows that India and Russia, which both feature heavily in the input and process metrics, also produce a high level of ISR relative to their total volume of research, with higher quality from India. Overall, Figure 6 shows that there is a greater presence of European countries than in the input and process pillars, with thirteen European nations having more than 25% of all research dedicated to ISR. Other European countries, such as the UK and Denmark, have less than 25% ISR but relatively high quality, above an FWCI of 1.5.

Figure 6: The Percentage of ISR to Total Research Versus ISR Quality (for countries/territories with over 100,000 scientific publications between 2018 and 2022). Bubble chart with axes showing the ISR percentage of total research and ISR FWCI P75 (75th percentile), bubble size indicating the volume of ISR, and countries grouped by region (Africa, North America, South America, Europe, Asia, Oceania).
3.4 Institutional Analysis
Country-level data so far has shown greater participation from the Global South, and less presence from countries and institutions from the Anglosphere and Europe. However, data at an institutional level shows some strong performances from these countries, where there has been a long history of producing high-impact research. Outside of the traditional research powerhouses, there are some diamonds in the rough that showcase how institutions in countries facing challenges unique to the Global South are also performing well.

From the United States, the California Institute of Technology performed exceptionally well across the input pillar metrics, and sets the global standard in the output pillar metrics (four of which are bibliometric). The Massachusetts Institute of Technology and the University of Illinois Urbana-Champaign also offer stellar examples in the output metrics. In the US, Boston University demonstrates the best evidence of processes to support ISR. A similar case can be found in the United Kingdom, with the University of Glasgow and the University of Bristol performing well in the output metric pillar. IMT Atlantique in France performs the best in the input pillar metrics in Europe.

An emerging trend can be seen whereby institutions in countries that have traditionally dominated the global research landscape perform well on the bibliometric data showcasing the volume, proportion and quality of ISR, as well as on the reputation of those universities in ISR. However, evidence from the data shows that they perform less well in the input and process pillar metrics, which reveal more about an institution's intention and dedication towards ISR.

In other areas of the world, there is great diversification of high-performing universities from many countries, many located in Asia or Russia. Air University in Pakistan leads the global standard for the input pillar metrics, and a similar case is found with Visayas University in the Philippines. National Taiwan University (NTU) is notable for being an Asian university that performs very strongly across all three metric pillars. Overall, universities in Hong Kong and Singapore have the strongest performances in Asia, including The Hong Kong University of Science and Technology (HKUST) and Nanyang Technological University in Singapore.

With the highest number of participating universities, India has many representatives that perform well in the metrics, though very rarely to the highest global standard across all three pillars. Lovely Professional University is an exception, leading the global standard for the process pillar metrics, as well as performing excellently in the other pillars. In Africa, Cairo University in Egypt, Covenant University in Nigeria and the University of the Witwatersrand perform very well in the output metrics, with the latter also setting the regional standard for the input metrics.

Overall, institutions with historic research strengths tend to perform better in outputs than in inputs and processes, whereas universities in the Global South lead on participation, inputs and processes. This suggests that whilst there is a strong dedication to ISR in the Global South, as evidenced by its inputs and processes, the next stage is to improve outputs around research and reputation for ISR; something that can potentially be achieved with greater collaboration with universities in the Global North.
CONCLUSIONS
The feasibility study for this project demonstrated that university leaders and leading scholars worldwide believe that interdisciplinary research has an important role to play in solving global problems. Furthermore, they agreed that a ranking of institutions for interdisciplinary research could provide performance indicators and incentives to strive further for greater collaboration between academic disciplines. Using feedback from the roundtables on what kind of metrics could be used to measure excellence in interdisciplinary research, combined with the THE data team's expertise in building rankings and collecting and analysing data, a rubric of three metric pillars focusing on inputs, processes and outputs was created, covering a total of 12 metrics.

The data collection for year one saw 1,169 institutions submit data, with varying numbers of institutions providing valid responses across the different quantitative and qualitative evidence submissions. The data clearly shows that participation is being driven by institutions from countries in the Global South, with India having the most participating institutions with valid submissions.

Across the three metric pillars, there are different trends. The input pillar, which includes metrics for the proportion of research income dedicated to ISR and ISR-specific job adverts, shows a strong performance from Asian countries, with Romania and Russia performing strongest among the European nations. Reflecting the broader trend of participation, the input metric pillar suggests a higher level of dedication to ISR in Asia, with Egypt also performing strongly in this metric pillar.

For the process metric pillar, there is some room for improving the quality of the data submissions that provide evidence of the processes that support ISR. Better evidence can help empower and incentivise institutions to increase the supporting infrastructure for ISR, including physical facilities, dedicated administrative support, measures of success, and systems of tenure and promotion reserved for ISR staff.

For the output metric pillar, which primarily reviews bibliometric data, the use of OpenAlex means that all institutions across all countries can be assessed for their ISR contributions and impact, through metrics such as the volume of ISR-related publications, the proportion of ISR compared to total research output, and the quality of ISR. As these metrics include all institutions, rather than just those that voluntarily submit data, there is a greater diversity of countries represented. The US and China, both conspicuously absent from the input and process metrics, demonstrate relatively high-quality ISR output, though India and Russia still lead the way in terms of the proportion of ISR out of total output.

Finally, institutional-level data analysis shows that universities in the Global North tend to perform better in output metrics than in input or process metrics; there is therefore room for greater dedication to ISR. In the Global South, the converse is true: there is real dedication, as evidenced in the inputs and processes, but room for developing global-standard research outputs and enhanced reputation.

This report offers some early indicators of how universities could be assessed on ISR. From these initial findings, three areas of potential improvement for universities on ISR include:
1. More dedicated policy processes to enhance ISR, including physical facilities, administrative support, and staff incentives such as promotion.
2. More funding for ISR as a percentage of overall research funding.
3. Greater visibility for ISR outputs from the Global South, potentially through further collaboration with the Global North, to raise impact and reputation.

Furthermore, it is recommended that THE engage a larger range of universities in the Global North to ensure greater transparency in the inputs into ISR and evidence of the processes that support the pursuit of ISR.