Fair Research Partnerships in European Commission Funded Research

The EU is clearly not the only research funder that struggles with ‘partnerships’ – in fact, we are not aware of any widely accepted framework relating to the effectiveness, efficiency, impact or ‘fairness’ of research partnerships anywhere. There is also no systematic learning happening: we are not sharing best practices, and we are not learning what happens in other parts of the world.

In response, COHRED has developed the Research Fairness Initiative (RFI), aimed at creating a due diligence instrument and compliance tool for exactly this: ensuring that partnerships work and are ‘fair’.

By Carel IJsselmuiden, Executive Director

and Kirsty Klipp, Research Fairness Initiative – RFI Implementation Manager

Council on Health Research for Development – COHRED

Fair Research Partnerships in European Commission Funded Research – Do We Know What is Actually Happening with Public Funds?


The EU Directorate-General for Research and Innovation (DG-RTD) is responsible for EU policy and action on research and innovation, aiming to make the EU globally competitive, create jobs and economic growth, build EU-wide research infrastructure, tackle the big societal challenges, and support the EC’s general mission to promote justice and human rights and to become a global actor.

That is quite a portfolio. Commensurate with the task, the resources available for achieving the mission and strategies of DG-RTD are substantial. The Horizon 2020 programme alone has more than €70 billion allocated to it. With the many special funds and special interests of member states added, the total is certainly much higher.

In the context of the pursuit of justice and of Europe as a global actor, special programmes such as the EDCTP (European and Developing Countries Clinical Trials Partnership) add substantially to the total funds available to low- and middle-income countries to help build their research and innovation systems and resilience. Similar aims within Europe are pursued with programmes like WIDESPREAD, which aim to bring ‘underperforming’ EU and Associated countries to a higher level of research performance.

Irrespective of the specific focus of the programme, almost all EU-funded research programmes and research calls aim for two specific outcomes. The first is improvement in knowledge and understanding – advancing specific scientific fields that have been prioritized by DG-RTD.

The second focus – implied and hardly visible, but nevertheless a cornerstone of all EU funding – is to bring EU institutions together in research partnerships: within the EU, to improve Europe’s research and innovation infrastructure; and with external countries, either to access the expertise needed to make the EU more globally competitive (in the case of collaboration with high-income countries) or to support low- and middle-income countries in building up their own research and innovation systems and becoming more economically resilient.

In brief – EU funded research focuses on specific scientific advancement and on supporting institutional research partnerships.

The first – specific scientific advancement – is well measured. DG-RTD holds EU-wide consultations, involves citizens, pays consultants, holds meetings – and, above all, has extensive metrics with which to assess cutting-edge research, researchers and research institutions. These metrics are based on globally accepted standards and on EU-developed criteria that are well worked out, public, and obligatory for reviewers judging proposals submitted in response to DG-RTD research calls. So far, no problem.

It is the second – supporting institutional research partnerships – where problems appear. A quick look at the H2020 call page today (25 April 2018) shows long lists of calls, almost every one featuring ‘collaboration’, ‘partners’, ‘joint’, ‘regional’ and other expressions showing the centrality of partnership to achieving research and innovation goals.

Yet there are hardly any criteria by which to measure impact, nor with which to equip reviewers of calls to make an informed and transparent selection of the applications that will succeed in this second core aspect of EU calls.

The best summary is informal. We have ‘discovered’ three criteria – one of which is ‘hard’ but largely meaningless, while the other two cannot really be interpreted objectively and seem to run counter to reality.

Criterion 1 – if the call specifies a certain number of partners, then the check is simple: meets or does not meet. Very objective and very accurate, but of hardly any relevance to the goal.

Criteria 2 and 3 focus on ‘approximately similar budgets’ and ‘approximately similar responsibilities’, to safeguard against calls serving only one or a few institutions while the others are added pro forma. This may be especially important in joint research with low- and middle-income institutions, as it reduces the massive resource imbalance between partners. However, it is not clear how this rewards partnerships in which some partners really do have greater expertise, equipment and facilities while others are just starting out. There is no ‘right figure’ for an ‘approximately similar budget’ or ‘approximately similar responsibility’ – and, in fact, these are not criteria that reviewers can reasonably apply in a transparent way. Nor can the EU really measure the impact of the partnership component – for example, in achieving competitiveness, or in building research systems in low- and middle-income countries.

The EU is clearly not the only research funder that struggles with ‘partnerships’ – in fact, we are not aware of any widely accepted framework relating to the effectiveness, efficiency, impact or ‘fairness’ of research partnerships anywhere. There is also no systematic learning happening: we are not sharing best practices, and we are not learning what happens in other parts of the world. It seems that science has deserted its own core – there is no systematic study of, and learning about, the second pillar of a successful and competitive science infrastructure: partnerships.

In response, the Council on Health Research for Development – COHRED has developed the Research Fairness Initiative (RFI) aimed at creating a due diligence instrument and compliance tool for exactly this: ensuring that partnerships work and are ‘fair’.

Essentially, the RFI proposes a global reporting system for academic and research institutions, government agencies, research funders and businesses engaged in research – in fact, it is applicable to all key stakeholders in global (health) research. The RFI Report is written around pragmatic and universally applicable indicators of the quality and fairness of research collaborations. Originally aimed at research collaborations involving low- and middle-income countries, it is now clear that it applies across the sciences and across countries of all socio-economic strata.

The concept is simple: every institution prepares its own RFI Report once every two years. The report consists of answers to a series of questions focusing on the quality, fairness and equitability of research collaborations. The questions are simple – the answers are usually simple – but the action needed to improve may be intensive: (i) what is your institution’s current policy or practice related to …; (ii) if you have examples of good policies and practices, please share them; and (iii) what improvements are envisaged in the short term? This is repeated for 15 key topics, each assessed by three indicators.

This will achieve transparency in how research partnerships are set up and managed, and create a pool of shared practices and systematic learning from which new standards and benchmarks can be developed.

There is no doubt that institutions at the beginning of the research excellence curve need international partnerships to develop their science base further – and there is also no doubt that many current partnerships are far from optimal for these institutions, for example in terms of sharing intellectual property, authorship, data ownership, decision-making, and access to funding – all issues on which the RFI requests answers.

A first RFI Report has now been published by the WHO Special Programme for Research and Training in Tropical Diseases (WHO/TDR). It shows how the RFI works and what an RFI Report can look like. The RFI Reports of three Senegalese institutions and of the Institute of Hygiene and Tropical Medicine of the NOVA University of Lisbon are nearing completion. Others are in the pipeline.

It is time for one of the world’s biggest research funders to focus seriously on improving the metrics, transparency and impact of the thousands of partnerships it promotes and supports through its funding, by adopting the RFI as a key due diligence tool – one that is readily available, increasingly used, and creates the first systematic learning platform for improved research and innovation partnerships.

Why not insist that the lead organization in any EU-funded partnership submit its own institutional Research Fairness Initiative Report as evidence that it has thought seriously about 15 of the most frequently mentioned aspects that make or break partnerships?

Surely, €70+ billion in partnership funding requires such a tool.