Metascience Research Grants Round 2 (UK, 2026): How to Win Up to £350,000 to Study How Science Works
If you’ve ever stared at a grant application and thought, “There has to be a better way to do this,” this scheme is literally for you.
The Metascience Research Grants (Round 2) from UK Research and Innovation (UKRI) are all about researching research itself — how we fund it, organise it, measure it, and increasingly, how artificial intelligence warps or improves every step of that process.
This isn’t about doing more physics, biology, or social science per se. It’s about asking sharper questions like:
- Are current peer review systems fit for purpose?
- How should AI tools be used (or contained) in academic work?
- What incentives actually produce trustworthy, high-impact R&D?
- How do you measure “research excellence” without just rewarding hype and citation games?
In other words: if normal research is studying the world, metascience is studying how we study the world.
Round 2 is a UK-based funding call, backed by multiple UKRI councils, offering up to £250,000 in full economic cost (or £350,000 if you have an international partner), of which UKRI funds 80%. The opportunity will open on 12 February 2026, with a deadline of 23 April 2026 at 16:00 UK time.
It’s still a pre‑announcement, which means some details may shift. But you can already start doing the smart thing: shaping a sharp, fundable idea well before the portal opens.
If you care about making research less wasteful, more honest, and more effective, this is one of the most interesting pots of money on the table.
Metascience Research Grants Round 2: At a Glance
| Detail | Information |
|---|---|
| Scheme | Metascience Research Grants – Round 2 |
| Funder | UK Research and Innovation (UKRI), via ESRC, NERC, BBSRC, AHRC, STFC, EPSRC, MRC |
| Status | Upcoming (pre-announcement) |
| Call Opens | 12 February 2026 |
| Application Deadline | 23 April 2026, 16:00 UK time |
| Funding Available | Up to £250,000 FEC, or up to £350,000 FEC with an international partner |
| UKRI Contribution | 80% of full economic cost |
| Research Focus | Metascience: how research and R&D are conducted, supported, evaluated; impact of AI; research institutions; measuring excellence |
| Eligibility (Host) | UK research organisations eligible for UKRI funding |
| International Collaboration | Strongly encouraged; higher funding cap if you include an international partner |
| Disciplines | Open across councils – social sciences, humanities, natural sciences, engineering, medicine, etc., if focused on metascience questions |
| Contact | [email protected] |
| Official Page | https://www.ukri.org/opportunity/metascience-research-grants-round-2/ |
What This Opportunity Actually Offers (Beyond the Money)
On paper, this is a grant for up to £250k–£350k FEC. In practice, it’s a rare chance to experiment with the plumbing of the research system itself.
Serious Funding for Methodical Experiments on Research
With up to £250,000 FEC (or £350,000 with an international partner), you can do more than run a small survey and call it a day. You can:
- Run randomised experiments on peer review or grant assessment.
- Analyse large bibliometric or grant datasets to see what actually predicts good outcomes.
- Trial new institutional policies (e.g., narrative CVs, open review, team-based recognition) and study the consequences.
- Examine how AI tools reshape productivity, creativity, and integrity in research teams.
Because UKRI funds 80% of the FEC, your institution covers the remainder, as usual for UKRI grants. For a metascience project, that can translate into:
- A postdoc or two working full- or part-time on the project.
- Data acquisition and management (including large-scale data processing, licences, or storage).
- Stakeholder engagement – workshops with funders, research leaders, policymakers.
- Software development, if you’re building tools or dashboards to test new models for evaluation or transparency.
Cross‑Council Support = Wide Disciplinary Scope
This call is backed by heavyweights: ESRC, NERC, BBSRC, AHRC, STFC, EPSRC, MRC.
That’s a pretty loud signal that metascience is not niche anymore. You can come at this from:
- Social sciences (e.g., sociology of science, economics of research funding, behavioural incentives).
- Computer science / data science (e.g., AI in research workflows, large-scale text analysis, open source tooling).
- Humanities (e.g., philosophical foundations of “excellence”, ethics of AI-mediated discovery).
- Medical and life sciences (e.g., clinical trial reproducibility, research waste, publication biases).
- Physical and environmental sciences (e.g., collaborative models for large facilities, open data norms).
As long as your central question is about improving how we do and support research, you’re in the right neighbourhood.
Real‑World Influence
Unlike a theoretical paper that only your subfield reads, metascience results can turn into:
- Changes in funding policy (how grants are scored, how panels work, how CVs are read).
- Institutional reforms (how promotion is assessed, what counts as “output”, how AI is regulated).
- Community norms (data sharing, open methods, team science recognition).
If you’re tired of complaining about “the system” and want to test concrete alternatives, this call gives you both a budget and a megaphone.
Who Should Apply (and Example Project Types)
You must be based at a UK research organisation eligible for UKRI funding. Typically that includes:
- UK universities and approved higher education institutions
- Some independent research organisations and institutes
- Certain public sector research bodies
If you’re unsure whether your organisation qualifies, check with your research office or consult UKRI’s eligibility guidance.
Ideal Applicants
You’re a strong candidate if:
- You are PI-eligible at your UK institution (or can be by the time the call fully opens).
- You have a track record or clear interest in metascience, research policy, AI in research, or related analysis.
- You can assemble a multi‑disciplinary team, for example:
- A social scientist + data scientist + institutional partner
- A philosopher of science + AI researcher + funder liaison
- An economist + research office professional + international collaborator
International Collaborations
An international partner is:
- Not mandatory, but
- Financially incentivised – with them, your FEC cap rises from £250k to £350k.
You might team up with:
- A US-based group running trials on new peer review processes
- A European centre working on research assessment reforms
- A global open science initiative testing policies across institutions
They don’t need to receive direct UKRI funds (check the detailed rules when the call text is out), but they can significantly deepen your dataset, comparisons, and credibility.
Example Project Ideas
To spark your thinking:
- AI in grant writing and reviewing – How do AI tools change proposal quality, novelty, or bias? Randomise applicants or reviewers to AI‑assisted vs manual processes and test outcomes.
- Rethinking research excellence metrics – Compare narrative CVs, citation metrics, and qualitative assessments to see which actually predict future impact or reproducibility (sketched below).
- Institutional experiments in incentive design – Partner with a university to change promotion criteria (e.g., valuing open data, replication, team science) and track how behaviour and outputs shift.
- Replication and robustness in AI-heavy fields – Examine how easy it is to reproduce AI-driven research across labs and what policies improve reliability.
If your instinctive reaction is: “We could run a clean experiment on that and actually get data,” you’re exactly who this call is targeting.
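To show how small that first analytical step can be, here’s a minimal sketch of the “which metrics predict impact” idea in Python. The numbers and variable names are purely illustrative; a real project would pull panel scores and outcome measures from historical grant data.

```python
# Minimal sketch of a predictive-validity check on panel scores,
# using made-up numbers purely for illustration.
from scipy.stats import spearmanr

# Hypothetical data: panel score at award vs a 5-year outcome measure
# (e.g., field-normalised citations or an independent impact rating).
panel_scores = [6.5, 4.0, 8.0, 5.5, 7.0, 3.5, 9.0, 6.0]
outcome_5yr = [1.2, 0.8, 1.1, 1.5, 0.9, 0.7, 2.0, 1.0]

# Rank correlation: does a higher panel score predict a better outcome?
rho, p = spearmanr(panel_scores, outcome_5yr)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
# A rho near zero would suggest panel scores add little predictive signal.
```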
Insider Tips for a Winning Metascience Application
This will not be an easy grant to win. You’ll be competing with some of the sharpest people who think about research systems for a living. That said, smart strategy goes a long way.
1. Make the Problem Concrete and Costly
Don’t just say “measuring research excellence is difficult.” Everyone knows that.
Spell out the specific problem in painful detail:
- Where does it currently go wrong?
- Who loses out? Early‑career researchers? Interdisciplinary work? High‑risk ideas?
- What kinds of waste or misallocation does this produce?
- Why does this matter for UKRI’s goals, not just your personal frustration?
When reviewers feel the pain you’re describing, they’re much more likely to care about your solution.
2. Propose Real Experiments, Not Vague Commentary
Metascience that shrugs and says “things are complex” is not very helpful.
You’ll stand out if you propose:
- Testable hypotheses – “Narrative CVs reduce gender disparities in shortlisting” is testable; “narrative CVs might be better” is not enough.
- Clear experimental or quasi‑experimental designs – randomised controlled trials, difference‑in‑differences designs, stepped‑wedge rollouts across departments, or at least pre‑registered observational studies.
- Pre‑specified outcome measures – e.g., diversity of funded applicants, predictive validity of panel scores, rates of data sharing, replication success. A minimal example of how these pieces fit together follows this list.
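Here’s what the randomisation and a pre-specified primary analysis could look like for the narrative‑CV hypothesis above. This is a sketch only: the IDs, arm sizes, and outcome rates are invented for illustration, and the fabricated outcomes stand in for what panels would actually produce.

```python
# Sketch of a two-arm trial on CV format, with a pre-specified
# primary analysis. All IDs, sizes, and rates are illustrative.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(seed=42)  # fixed seed keeps assignment auditable

app_ids = [f"APP-{i:04d}" for i in range(200)]  # stand-ins for real IDs

# Randomise each application to an arm *before* any panel sees it.
arms = rng.permutation(["narrative_cv"] * 100 + ["standard_cv"] * 100)
assignment = dict(zip(app_ids, arms))

# Panels review; here we fabricate shortlisting outcomes purely to
# demonstrate the analysis step (real outcomes come from the panels).
shortlisted = {
    a: rng.random() < (0.30 if assignment[a] == "narrative_cv" else 0.22)
    for a in app_ids
}

# Pre-specified primary analysis: 2x2 table and Fisher's exact test.
table = np.zeros((2, 2), dtype=int)
for a in app_ids:
    row = 0 if assignment[a] == "narrative_cv" else 1
    col = 0 if shortlisted[a] else 1
    table[row, col] += 1

odds_ratio, p_value = fisher_exact(table)
print(table)
print(f"Odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```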
3. Build in Collaboration with Real Decision‑Makers
Don’t do this research about funders and institutions from the outside; do it with them.
- Bring in your university’s research office or HR as a partner.
- Secure informal buy‑in from funders or learned societies who might pilot your interventions.
- Show letters of support that commit to implementing or testing the changes you propose, not just cheering from the sidelines.
Reviewers will ask: If this works, will anything actually change? Your answer should be: Yes, because X, Y, Z are already lined up.
4. Treat AI Seriously (Neither Hype nor Panic)
Because the call explicitly mentions the impact of AI, you’ll score well if you go beyond hand‑waving.
- Be specific about which AI tools, models, or use‑cases you’re considering.
- Address risks (e.g., plagiarism, over‑reliance, bias) and opportunities (e.g., screening, summarisation, code review).
- If you’re using AI in your own analysis (e.g., large‑scale text mining), describe validation and safeguards.
Reviewers have had enough of “AI will change everything” with no plan. Give them numbers, designs, and guardrails.
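If your own pipeline uses AI (say, a model labelling thousands of abstracts), one concrete safeguard is to double-code a random sample by hand and report chance-corrected agreement. A minimal sketch, with invented labels:

```python
# Sketch of validating AI-assisted coding against human coders.
# The labels below are invented; in practice you'd double-code a
# random sample of the real corpus.
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Whether each abstract reports a replication study, per coder.
human_labels = ["yes", "no", "no", "yes", "no", "yes", "no", "no"]
model_labels = ["yes", "no", "yes", "yes", "no", "no", "no", "no"]

# Chance-corrected agreement, reported alongside raw accuracy so the
# safeguard is visible, not just the headline number.
kappa = cohen_kappa_score(human_labels, model_labels)
print(f"Cohen's kappa: {kappa:.2f}")
print(confusion_matrix(human_labels, model_labels, labels=["yes", "no"]))
```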
5. Mix Quantitative Rigour with Qualitative Insight
The strongest metascience doesn’t just count things; it understands context.
Pair:
- Quantitative work – datasets, metrics, experiments
- Qualitative work – interviews, focus groups, ethnography, document analysis
For example, if you spot that certain groups are consistently under‑funded, use qualitative methods to understand why. That dual approach reads as thoughtful, not naive.
6. Show How Your Results Scale
A project that only produces a PDF report is a missed opportunity.
Think in terms of:
- Reusable tools – open‑source code, dashboards, templates for institutions to run their own analyses.
- Guidelines or policy briefs – concise recommendations for funders, universities, or departments.
- Open data and preprints – so the community can critique, reuse, and build on your work.
You don’t need a full impact plan like an EU grant, but you do need a plausible story from “findings” to “change”.
Application Timeline: Working Backwards from 23 April 2026
The call opens on 12 February 2026 and closes on 23 April 2026 at 16:00. That’s around ten weeks. You’ll want most of your thinking done before the portal opens.
Here’s a realistic backward plan.
November–December 2025: Shape the Idea and Team
- Clarify your core question and why it matters for UKRI and UK research.
- Identify:
- PI and Co‑Is
- Possible international partner(s) (if aiming for the higher cap)
- Institutional collaborators (e.g., research office, HR, library, ethics board)
- Roughly map:
- Methods
- Data sources
- Likely intervention sites (departments, calls, units)
January 2026: Early Engagement and Design
- Talk to:
- Your research development office for internal timelines and approvals.
- Potential institutional or funder partners to confirm they’re willing to implement the experiments or changes you’re proposing.
- Sketch:
- Work packages
- Gantt chart
- Early budget ballpark
12 February 2026: Call Opens – Start Writing in Earnest
From here to mid‑March:
- Draft the case for support, methodology, and impact plan.
- Refine your budget to fit within the £250k/£350k FEC cap.
- Confirm roles, responsibilities, and time commitments for all partners.
Mid‑March to Early April 2026: Review and Polish
- Circulate drafts to:
- Someone in metascience
- Someone who knows UKRI language and expectations
- A smart non‑specialist who can tell you where the argument is muddy
- Tighten logic, strip jargon, sharpen hypotheses.
By 16 April 2026: Internal Sign-Off
Most institutions require:
- Final documents a week or more before the external deadline.
- Institutional approval for the budget and compliance checks.
Lock in everything by mid‑April so you’re not scrambling.
18–21 April 2026: Final Checks and Submission
- Upload all documents to the UKRI system (UKRI Funding Service or successor).
- Check page limits, formatting, CV rules.
- Submit at least 48 hours before 16:00 on 23 April. Systems crash; Wi‑Fi fails. Don’t let that be the reason you miss out.
Required Materials (and How to Make Them Strong)
The final call text will give exact requirements, but UKRI calls usually include some combination of:
1. Case for Support / Project Description
This is your main narrative. Expect sections like:
- Background and rationale – why this problem matters specifically for UK research and UKRI.
- Aims and research questions – precise, testable.
- Methods – data sources, sampling, experimental design, analysis plan.
- Work plan – who does what, when.
- Risks and mitigation – data access, institutional buy‑in, ethical issues.
Treat this as both scientific proposal and policy pitch.
2. Budget and Justification
You’ll specify:
- Staff (PI time, Co‑Is, postdocs, research assistants)
- Travel and subsistence (e.g., workshops, partner visits)
- Data costs, software, transcription
- Dissemination (open access, events)
- Any costs related to international collaboration
Use the justification to show you understand how to get the most insight out of a finite pot of money.
3. CVs / Track Records
Usually short, focused CVs or track record statements:
- Highlight any previous work on:
- Metascience
- Science policy
- Research evaluation
- AI and research
- If you’re newer to the area, emphasise relevant skills and experience more than topic‑specific publications.
4. Letters of Support / Institutional Commitments
These can be very powerful here:
- From a university committing to pilot your intervention.
- From an international partner confirming their role and contributions.
- From funders or learned societies who are prepared to test your findings.
Vague “we think this is great” letters are less persuasive than specific commitments.
5. Ethics and Data Management
You’re quite possibly handling:
- Sensitive staff info
- Peer review records
- Grant application data
- Potentially controversial policy experiments
Outline:
- How you’ll secure approvals (ethics, data access)
- How you’ll anonymise or aggregate data
- How (and when) you’ll share what you can
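One common building block for the anonymisation plan is keyed pseudonymisation: replace raw IDs with tokens that stay consistent across datasets but can’t be reversed without a separately held key. A minimal sketch, with an illustrative key and ID format:

```python
# Sketch of pseudonymising applicant IDs before analysis. The key and
# ID format are illustrative; keep the real key outside the codebase.
import hashlib
import hmac

SECRET_KEY = b"store-me-in-a-vault-not-in-code"

def pseudonymise(applicant_id: str) -> str:
    """Keyed hash: the same ID always maps to the same token, but
    tokens can't be reversed or re-derived without the key."""
    digest = hmac.new(SECRET_KEY, applicant_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability

print(pseudonymise("APP-2026-0042"))
```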
What Makes an Application Stand Out to Reviewers
Reviewers won’t say it this bluntly, but most are essentially asking:
“If this project works, will we actually know something new and actionable about how to run research better?”
You can help them say “yes” by hitting these points.
1. Sharp, Non‑Obvious Questions
Don’t regurgitate generic critiques. Aim for:
- Clear, bounded research questions – e.g., “Does using narrative CVs change who gets shortlisted in mid‑career fellowship schemes?”
- A sense that you’ve read widely beyond your discipline, including:
- Previous metascience work (e.g., on replication, peer review)
- Policy reports from UKRI and other funders
- Comparative examples from other countries or sectors
2. Feasible but Ambitious Design
They want ambition, but not fantasy.
- Show that your design is doable in the grant period.
- Don’t propose to overhaul the entire UK funding system; instead, propose well-scoped experiments or analyses with clear endpoints.
- Demonstrate access to:
- Data
- Institutional sites
- Technical expertise
3. Credible Team Composition
Reviewers will check whether:
- The team has the methods covered (quantitative, qualitative, AI, policy).
- There’s at least one person who knows how to navigate institutional politics.
- The international collaboration (if any) adds real value, not just extra logos on the application.
4. Clear Path to Impact
You don’t need a glossy brochure, but you do need:
- Defined audiences – UKRI programme managers, university leaders, department chairs, early-career researchers.
- Sensible channels – policy briefs, invited talks, targeted workshops, engagement with existing initiatives.
- A realistic timeline for when interim findings will be shared (don’t keep everything under wraps until the end).
Common Mistakes to Avoid
You can have a brilliant idea and still miss out if you stumble into these traps.
1. Being Too Vague About Methods
Hand‑waving like “we will collect data from several institutions” is a red flag.
Fix it by:
- Naming the types of institutions (and, where possible, actual partners).
- Outlining your sampling strategy.
- Being honest about what’s observational vs experimental.
2. Confusing Advocacy with Research
This is not a proposal to advocate for open science, diversity, or AI; it’s a proposal to study what actually works.
If you’re clearly pushing a predetermined conclusion, reviewers will be sceptical. Frame your project as an honest test, not a campaign.
3. Underestimating Access and Ethics
Working with grant data, HR records, or internal decision‑making is politically and ethically sensitive.
Don’t assume access will be trivial. Show:
- Prior conversations with gatekeepers.
- A plan for approvals and anonymisation.
- Awareness of potential risks to participants and institutions.
4. Ignoring the UK Context
Global comparisons are good, but remember who’s paying.
Tie your rationale and impact tightly to:
- UKRI’s mission
- UK policy debates on research assessment, AI, productivity, or integrity
- The practical choices faced by UK universities and institutes
5. Writing Only for Specialists
Your reviewers may include policy people, methodologists, and folks from entirely different disciplines.
- Explain niche terms.
- Use clear, direct language.
- Provide intuitive examples for your key ideas.
Frequently Asked Questions
Is this only for social scientists?
No. Social scientists will obviously be heavily involved, but the call is cross‑council.
A strong project could involve:
- Economists + computer scientists
- Philosophers of science + medical researchers
- AI researchers + research managers
The key is that the topic is metascience – how research is done, evaluated, or supported – not regular disciplinary research.
Do I need an international partner to apply?
No.
You can apply for up to £250,000 FEC without an international partner.
However, if you include an international collaborator (again, check final rules for costing), the maximum rises to £350,000 FEC, and the comparative perspective can make your proposal richer.
What does “UKRI funds 80% of FEC” mean in practice?
In typical UKRI style:
- You cost your project at full economic cost (FEC) – staff, overheads, estates, etc.
- UKRI covers 80% of that figure.
- Your institution covers the remaining 20%.
Your research office will help compute this. For you as PI, the key thing is to stay within the FEC cap (£250k/£350k).
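A quick worked example of the split, using illustrative figures:

```python
# Worked example of the 80% FEC split, using illustrative figures.
FEC_CAP = 250_000        # £350,000 if you have an international partner
project_fec = 250_000    # what the project is costed at, within the cap
assert project_fec <= FEC_CAP

ukri_contribution = 0.80 * project_fec               # what UKRI pays
institution_share = project_fec - ukri_contribution  # your institution's share

print(f"UKRI pays £{ukri_contribution:,.0f}; "
      f"institution covers £{institution_share:,.0f}")
# UKRI pays £200,000; institution covers £50,000
```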
Can early‑career researchers lead a project?
Yes, if your institution deems you PI‑eligible on UKRI grants.
For early‑career PIs:
- Surround yourself with experienced Co‑Is, especially those with metascience or policy experience.
- Highlight your independence and prior work, even if that’s mainly methods or smaller pilot projects.
Can I include non‑academic partners?
Absolutely, and they may strengthen your application.
Examples:
- A research funder or charity
- A learned society
- A think tank or policy institute
- Companies providing AI or research analytics tools (with proper conflict‑of‑interest management)
Just make sure their role is clearly defined and genuinely helps answer your research questions.
Will my findings have to be openly available?
UKRI has strong expectations around open access and data sharing, within ethical and legal limits.
Assume you’ll need to:
- Publish main articles via open access routes.
- Share anonymised data or at least detailed synthetic or summary datasets, where possible.
- Make your code publicly accessible.
Factor these into your budget and data management plan.
Can I use this grant to fund a massive new AI infrastructure?
Unlikely, unless it’s tightly aligned with the metascience questions and within the cost cap.
This isn’t an infrastructure or equipment scheme. Any software or tooling you build should serve well-specified research questions about how research is conducted or evaluated.
How to Apply and Next Steps
You can’t submit just yet, but you can absolutely start planning.
1. Read the official page (and bookmark it): Metascience research grants round 2 – UKRI
2. Talk to your research office now. Ask:
- Are we UKRI-eligible? (almost certainly yes if you’re at a UK university)
- What are our internal deadlines for a 23 April 2026 call?
- Who can advise on cross‑council proposals and metascience?
3. Sketch a one‑page concept note. Include:
- The core problem
- Your proposed method
- Key partners
- Why it matters for UKRI
Use this to get early reactions from colleagues and potential co‑investigators.
4. Identify potential partners and champions.
- Institutional offices that could be your experimental sites.
- International collaborators who bring data or comparative insight.
- Policy stakeholders who might implement your findings.
5. Sign up for updates, and contact UKRI if needed. If you have very specific eligibility or scope questions, write to:
[email protected]
When the call opens on 12 February 2026, you’ll want to be drafting, not still brainstorming. Metascience grants like this are rare, influential, and fiercely contested. If you care about fixing the way research is done, this is absolutely worth the effort.
