Paper-to-Podcast

Paper Summary

Title: The Costs of Competition in Distributing Scarce Research Funds

Source: arXiv

Authors: Gerald Schweiger et al.

Published Date: 2023-03-25

Podcast Transcript

Hello, and welcome to Paper-to-Podcast.

In today's episode, we're diving into the riveting world of research funding, and folks, let me tell you, the figures are more jaw-dropping than seeing a cat do calculus. A recent paper, titled "The Costs of Competition in Distributing Scarce Research Funds," authored by Gerald Schweiger and colleagues, takes a hard look at the economic high jump researchers have to clear just to get their projects off the ground.

Published on the 25th of March, 2023, the paper reveals the staggering economic costs associated with competitive research funding. Picture this: writing a single grant proposal can take anywhere from 25 to 50 days. With acceptance rates hovering around a measly 10 to 25 percent, divide that writing time by the odds of success and you land at a whopping 100 to 500 person-days of collective effort for each funded project. That's right, scientists might be spending more time writing proposals than doing actual science!
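For the spreadsheet fans out there, here's a back-of-the-envelope sketch of that arithmetic. The day counts and success rates are the paper's headline figures; the expected-effort formula and the Python wrapper are just our illustration:

```python
# Back-of-the-envelope: expected person-days of proposal writing per funded project.
# The day counts and success rates are the paper's figures; the formula is our illustration.

def person_days_per_funded_project(days_per_proposal: float, success_rate: float) -> float:
    """On average, 1/success_rate proposals are written for every one that gets funded."""
    return days_per_proposal / success_rate

# Best case: quick proposals, generous scheme.
print(person_days_per_funded_project(25, 0.25))  # 100.0 person-days
# Worst case: long proposals, tight scheme.
print(person_days_per_funded_project(50, 0.10))  # 500.0 person-days
```

In plain terms: at a 10 percent success rate, every funded project quietly absorbs the writing effort of ten proposals.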

And get this: sometimes, the costs for applicants, decision-making processes, and administration can equal the entire amount of funds given out. That's like running a marathon and then being told the finish line was actually the starting point. In schemes with a ton of applicants and slim success rates, researchers might as well be trying to squeeze blood from a stone.

But wait, there's more. The predictability of grant peer review outcomes is about as reliable as a chocolate teapot. The paper states there's no strong evidence that the percentile rankings of grant applications are the crystal ball that predicts the future citation impact of the resulting research. This casts some serious doubt on whether the peer review process is picking the best of the bunch or just playing eeny, meeny, miny, moe.

And if you're into high-risk, high-reward research, you might find the funding process as welcoming as a porcupine in a balloon factory. Yes, this paper suggests that the current system might be inadvertently stifling the innovative breakthroughs we all desperately wait for.

Now, how did Schweiger and his band of merry researchers come to these conclusions? They conducted a comprehensive analysis that looked at research funding from every angle imaginable: economic, epistemic, social, and even ethical dimensions. They scoured the literature and proposed empirical experiments to understand the reliability of funding decision processes. They also called for data transparency from funding organizations. Imagine that—asking for data to be as available as kittens on the internet!

The methodology was as meticulous as a cat grooming itself. The authors suggested controlled experiments to test the reliability and predictive validity of funding decisions and to experimentally compare different evaluation systems. They also wanted to look into the economic costs, pondering existential questions like, "How much does proposal writing really affect research quality?"

The research team didn't stop there. They were also interested in the epistemic costs of competitive funding and the social and ethical price tags that come with it. They recommended moving beyond just theory and actually testing hypotheses through collaborations with funding organizations. Basically, they wanted to try out new funding models in the wild to see what actually works.

The strengths of this paper are as clear as a freshly cleaned window. It's a multidimensional analysis that doesn't just scratch the surface but digs deep, and it delivers a treasure trove of actionable recommendations for policymakers and funding agencies, aiming for a more efficient and fair spread of the research wealth.

However, the paper isn't without its limitations. It seems to lean quite a bit on existing literature and theoretical analysis. There's a risk that they might be missing some of the nitty-gritty details that only empirical data can reveal. It's like trying to understand cats without ever actually observing one trying to fit into a box that's clearly too small.

Potential applications of this research are as broad as a cat's whiskers. Policymakers and funding agencies could use these insights to create funding models that are fairer and more efficient, reduce administrative burdens, and support groundbreaking research that could change the world—or at least make it a better place for cats and humans alike.

That wraps up our episode for today. I hope you've enjoyed this peek into the costly conundrum of science funding. You can find this paper and more on the paper2podcast.com website.

Supporting Analysis

Findings:
One of the most intriguing findings from this paper is the revelation of the high economic costs involved in competitive research funding. Astonishingly, the study found that writing a single grant proposal can take 25 to 50 days, which, at average acceptance rates of 10 to 25%, translates to an enormous 100 to 500 person-days of effort per funded project. The research also highlights that, in some cases, the costs for applicants, decision-making processes, and administration may equal the total amount of funds awarded, effectively reaching a point of zero net financial gain, particularly in schemes with numerous applicants and low success rates.

Another surprising aspect is the low predictability of grant peer review outcomes. The paper indicates that there is no strong evidence linking the percentile rankings of grant applications to the future citation impact of the resulting research, which challenges the effectiveness of the peer review process in identifying truly impactful research proposals. Additionally, the paper points out that the process may be biased against high-risk, high-reward research, potentially stifling innovative scientific breakthroughs.
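To make the zero-net-gain scenario concrete, here is a minimal break-even sketch. The cost categories (applicant writing time, review, administration) follow the paper, but every numeric value below is a hypothetical assumption chosen only to show how the ratio can approach 100%:

```python
# Hypothetical break-even sketch for a single competitive funding call.
# Cost categories follow the paper; all numbers are illustrative assumptions.

n_applicants = 400                 # proposals submitted (assumed)
days_per_proposal = 35             # writing effort, within the paper's 25-50 day range
cost_per_day = 500                 # fully loaded cost of one researcher-day, EUR (assumed)
review_cost_per_proposal = 1_500   # peer review + administration per proposal (assumed)
funds_awarded = 8_000_000          # total money distributed by the call (assumed)

applicant_cost = n_applicants * days_per_proposal * cost_per_day
system_cost = applicant_cost + n_applicants * review_cost_per_proposal

print(f"Applicant writing cost: {applicant_cost:,} EUR")             # 7,000,000 EUR
print(f"Total system cost:      {system_cost:,} EUR")                # 7,600,000 EUR
print(f"Funds awarded:          {funds_awarded:,} EUR")
print(f"Cost / funds ratio:     {system_cost / funds_awarded:.0%}")  # 95%
```

With many applicants and a low success rate, the ratio climbs toward (and past) 100%, which is exactly the regime the paper flags.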
Methods:
The authors of this paper undertook a comprehensive analysis that viewed research funding from various perspectives such as economic, epistemic, social, and ethical dimensions. They conducted a literature review to examine the reliability of decision processes for funding, the economic costs of competitive funding, the impact of competition on conducting risky research, and the effects of competitive funding environments on scientists themselves, including ethical considerations.

To understand these aspects, the authors proposed conducting empirical experiments on decision processes and collecting data related to these processes. They highlighted the need for data transparency and accessibility from funding organizations, suggesting that non-sensitive application and decision data should be publicly available for analysis. The methodology also suggested the use of controlled experiments to test the reliability and predictive validity of funding decisions, as well as to compare different evaluation systems and distribution models experimentally.

The approach included a detailed inquiry into economic costs, considering the time invested in writing proposals and the effects of proposal writing on research quality. The paper also called for an investigation into the epistemic costs of competitive funding and explored the social and ethical costs associated with current funding mechanisms. The authors recommended moving beyond descriptive research to empirically test hypotheses through collaborations with funding organizations, which would allow for experimentation with alternative funding models and a deeper understanding of their broader consequences.
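The paper proposes such experiments rather than reporting one, so the following toy simulation is purely illustrative: the score-equals-merit-plus-noise model, the panel setup, and every parameter are our assumptions, not the paper's. It shows the kind of quantity a controlled experiment on reliability and predictive validity would estimate:

```python
# Toy Monte Carlo: how consistently do independent panels rank proposals, and how
# well do panel scores predict later impact, if reviewer scores are true merit
# plus noise? The generative model and all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_proposals, n_reviewers, noise = 500, 3, 1.0

merit = rng.normal(size=n_proposals)  # latent proposal quality

def panel_scores():
    """Average of noisy reviewer scores for each proposal."""
    scores = merit[:, None] + rng.normal(scale=noise, size=(n_proposals, n_reviewers))
    return scores.mean(axis=1)

panel_a, panel_b = panel_scores(), panel_scores()
impact = merit + rng.normal(scale=noise, size=n_proposals)  # later "citation impact"

# Reliability: do two independent panels rank the same proposals highly?
print("panel-panel correlation: ", np.corrcoef(panel_a, panel_b)[0, 1])
# Predictive validity: do panel scores predict later impact?
print("panel-impact correlation:", np.corrcoef(panel_a, impact)[0, 1])
```

With noise comparable to the merit signal, both correlations land well below 1, which is exactly the kind of quantitative statement the proposed experiments would pin down with real funding data.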
Strengths:
The most compelling aspects of the research are its comprehensive and multidimensional analysis of the research funding system and its effects on various outcomes, including economic, epistemic, social, and ethical dimensions. The paper stands out for its critical examination of the reliability and fairness of the peer review process, as well as its predictive validity in the context of grant funding.

A noteworthy best practice is the researchers’ call for a more evidence-based approach to understanding and optimizing research funding. They advocate for increased transparency and data accessibility, which allows for a thorough analysis of grant allocation processes. This includes making non-sensitive data publicly available and using controlled experiments to test the reliability and fairness of funding decisions.

The paper also emphasizes the importance of exploring alternative evaluation systems and distribution methods to address the limitations of current funding practices. The researchers propose a holistic approach to examining funding systems by combining different study designs, including ecological studies, simulation models, and causal analysis. Overall, the research is compelling for its rigorous examination of a complex system and its effort to propose actionable recommendations for policymakers and funding agencies, contributing to a more efficient and fair allocation of scarce research funds.
Limitations:
One possible limitation of the research is that it seems to rely heavily on existing literature, expert opinion, and theoretical analysis. While these are valuable sources of information, they may not capture the full complexity of research funding systems and the behavior of individuals within those systems. The study may lack empirical data from controlled experiments, which could provide concrete evidence for the proposed recommendations.

Moreover, the analysis might not fully account for the diversity of research fields and the different impacts competitive funding could have across these fields. The paper also discusses the potential for bias and subjectivity in peer review processes, but it may not provide a clear solution to these issues. It's important to recognize that the effectiveness of the recommended policy changes would need to be tested in various real-world settings to understand their true impact, and such testing might be challenging due to the complex and multifaceted nature of research funding systems.
Applications:
The research explores the effectiveness of current research funding systems and their impact on scientific progress. Potential applications of this research include informing policy decisions and funding agency strategies to optimize the allocation and management of research funds. Insights from the study could lead to the development of more equitable and efficient funding models that encourage a diverse range of high-quality scientific investigations, including high-risk/high-reward research projects. The findings could also drive changes to reduce the administrative burden on scientists, allowing them more time to dedicate to actual research.

Moreover, the research could influence the creation of new evaluation criteria and peer-review processes that minimize bias and encourage responsible research practices. This could help in addressing ethical concerns and improve the overall health of the scientific community by fostering an environment that supports mental well-being and work-life balance for researchers. Lastly, the study's recommendations might be applied to design funding systems that better identify and support groundbreaking research with significant societal impact, aligning with goals such as the United Nations Sustainable Development Goals.