
Win Up to £4.7 Million to Build the UK National PET Imaging Data Platform (2026 Grant): A Practical Guide for Consortium Teams


JJ Ben-Joseph
📅 Deadline Jun 17, 2026
🏛️ Source GCRF Opportunities

If you’ve ever tried to do serious medical imaging research across multiple institutions, you already know the villain of the story isn’t a lack of scanners or brilliant clinicians. It’s data chaos. Files in different formats. Half-documented metadata. “Final_v3_reallyfinal” spreadsheets. Access rules that change depending on who’s asking and what year it is. And that’s before you even touch the hairiest part: linking imaging data to sensitive health records responsibly.

This UKRI/MRC opportunity is about fixing that—at national scale, with grown-up money. The National PET Imaging Platform (NPIP) Data Platform is a cornerstone piece of infrastructure: a home for imaging and research data that can actually be found, accessed appropriately, connected to other datasets, and reused without everyone reinventing the wheel.

The headline number matters: up to £4.7 million (full economic cost), funded at 100% by MRC, for 42 months, with a fixed start date of 1 October 2026. That’s not a small “pilot” that runs out of steam after a year; it’s a real build.

But here’s the catch—and it’s a big one. You can’t just wander in off the street and apply. You must be invited, and you must be part of the consortium bid developed and approved by NPIP, based at a UK research organisation eligible for MRC funding. So this is less “open call” and more “if you’re already in the room, here’s how to win.”

What follows is a no-nonsense guide to what this grant is trying to achieve, who it’s for, what a strong application tends to look like, and how to plan your work backwards from the deadline so you’re not writing governance sections at 3 a.m. in June.


At a Glance: NPIP Data Platform Funding Snapshot

  • Funding type: UKRI / MRC infrastructure funding to host a national data platform
  • Opportunity focus: Data Platform for the National PET Imaging Platform (NPIP)
  • Maximum budget: up to £4.7 million FEC
  • Funder contribution: MRC funds 100% of FEC (unusual and extremely helpful)
  • Project length: 42 months
  • Fixed start date: 1 October 2026
  • Deadline: 17 June 2026, 16:00 (UK time)
  • Eligibility: must be invited; must be in the NPIP-approved consortium bid; must be hosted at a UK organisation eligible for MRC funding
  • Core platform goals: make data findable, accessible, interoperable, reusable; provide user tools/services; support sensitive data linkage
  • Official page: https://www.ukri.org/opportunity/national-pet-imaging-platform-data-platform/

What This Opportunity Offers (And Why It Matters)

Let’s be clear: this isn’t “fund a study.” This is “fund the plumbing that makes many studies possible.”

The Data Platform is expected to make NPIP imaging and research data FAIR—that’s shorthand for Findable, Accessible, Interoperable, and Reusable. Think of FAIR as the difference between a library and a pile of books. A pile technically contains knowledge. A library lets people locate it, understand it, and use it without begging someone for directions.

A strong national imaging data platform typically includes:

  • A data catalogue where researchers can discover what exists (modalities, cohorts, acquisition parameters, phenotypes, derived outputs).
  • Access controls that handle sensitive data properly (permissions, audit trails, role-based access, secure environments).
  • Interoperability standards so datasets from different sites don’t behave like different species.
  • Reusable data and tools, meaning not only storage but also documentation, versioning, pipelines, and ideally reproducible workflows.
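To make "findable" less abstract, here is a minimal Python sketch of a catalogue record and search. Every field name here (modality, tracer, site) is an illustrative assumption, not NPIP's actual schema:

```python
from dataclasses import dataclass, field

# Illustrative only: field names are assumptions, not NPIP's real schema.
@dataclass
class CatalogueRecord:
    dataset_id: str                     # persistent identifier (findable)
    modality: str                       # e.g. "PET", "PET-CT"
    tracer: str                         # e.g. "FDG"
    site: str                           # acquiring site
    keywords: list = field(default_factory=list)

def search(catalogue, **criteria):
    """Return records whose fields match every supplied criterion."""
    return [r for r in catalogue
            if all(getattr(r, k) == v for k, v in criteria.items())]

catalogue = [
    CatalogueRecord("npip-0001", "PET", "FDG", "Site-A", ["oncology"]),
    CatalogueRecord("npip-0002", "PET-CT", "FDG", "Site-B", ["cardiology"]),
]
hits = search(catalogue, modality="PET", tracer="FDG")
```

The point is not the code but the design choice: structured fields plus persistent identifiers are what turn a pile of files into something searchable.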

On top of that, NPIP specifically calls out services and tools for users and support for sensitive data linkage. Translation: this platform should not be a static archive where data goes to nap. It should be an active service that helps users do real work—while treating privacy and governance as design requirements, not afterthoughts.

Also worth appreciating: 100% FEC funding changes the internal politics of building infrastructure. You can resource the platform properly—engineering, security, data stewardship, user support—without playing whack-a-mole with underfunded roles that quietly become single points of failure.

If you’re in the consortium, this is a chance to set national norms: how PET imaging data is described, accessed, linked, and reused across the UK. Do it well, and you’ll save the research community thousands of hours and unlock analyses that only happen when datasets can talk to each other.


Who Should Apply (Eligibility, Interpreted Like a Human)

This opportunity is designed for a very specific crowd: the consortium already assembled and approved by NPIP. If you’re not in that consortium, your first step isn’t writing an application—it’s understanding whether you’re even eligible to be at the table.

You should be considering this call if you’re part of the NPIP-backed consortium and can credibly host and operate national-grade research data infrastructure. That usually means some combination of secure data environments, imaging informatics expertise, data linkage experience, and the operational capacity to run a service for years.

Being based at a UK research organisation eligible for MRC funding is non-negotiable. In practice, that often includes UK universities, certain research institutes, and other approved research organisations. If your organisation has never held MRC infrastructure funding, don’t panic—but do make sure your research finance team can confirm eligibility early. Waiting until a week before submission to discover an eligibility mismatch is the academic version of stepping on a rake.

The “must be invited” condition matters too. It signals that NPIP and the funder want alignment and coordination, not a dozen competing platforms. So “Who should apply?” really means: Who in the consortium is best placed to host the platform and be accountable for delivery?

Here are real-world examples of applicants who tend to fit:

  • A university-based team running a well-established secure data environment (SDE) that already supports health data access, with proven auditability and user onboarding.
  • A research institute with deep experience in imaging informatics, metadata standards, and multi-site data ingestion pipelines.
  • A consortium-led host that can credibly manage operations: service desk, uptime, change control, incident response, and long-term maintenance planning.
  • A partner with demonstrated experience in data linkage (for example, connecting imaging datasets to NHS or cohort data through approved routes), and the governance maturity to do it safely.

And here’s who’s likely not a fit (even if they’re brilliant scientists):

  • A small team with a clever prototype but no operational capacity to run a national service for years.
  • Groups that treat governance, security, and user support as “we’ll figure it out later.”
  • Anyone outside the NPIP-approved consortium or without an invitation to apply.

This is a build-and-run job, not a weekend hackathon.


Insider Tips for a Winning Application (The Stuff That Actually Moves the Needle)

This is infrastructure funding, which means reviewers will read your application with a slightly different mindset than they would a scientific proposal. They’re asking: Will this platform work, will it be used, and will it be safe and sustainable? Here are practical ways to make that “yes” easy.

1) Write like a service owner, not a researcher

A common mistake is treating the Data Platform as a research output. Instead, frame it as a product with users. Who are they? What do they need on day one? What will they need by year three? Spell out the user journey: discover data → request access → analyse securely → publish → deposit derived outputs.

2) Make FAIR concrete with examples

Don’t just promise “FAIR.” Show it.

Explain what “findable” means in your build: searchable metadata catalogue, persistent identifiers, structured fields. Show what “interoperable” means: support for standard formats, harmonised metadata, APIs. Offer a simple scenario: “A researcher wants all NPIP PET scans acquired with parameter X and linked to clinical outcome Y.” Then show how your platform makes that possible without a month of emails.
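That scenario can be sketched in a few lines. The field names and the stand-in for "parameter X" (reconstruction method) are hypothetical; the point is that harmonised metadata plus a pseudonymous subject ID makes the query trivial rather than a month of emails:

```python
# Hypothetical sketch: find PET scans matching an acquisition parameter
# and join them to linked clinical outcomes via a pseudonymous subject ID.
scans = [
    {"scan_id": "s1", "subject": "p01", "reconstruction": "OSEM"},
    {"scan_id": "s2", "subject": "p02", "reconstruction": "FBP"},
]
outcomes = {"p01": "responder", "p02": "non-responder"}  # "outcome Y"

linked = [
    {**scan, "outcome": outcomes[scan["subject"]]}
    for scan in scans
    if scan["reconstruction"] == "OSEM"   # the "parameter X" filter
]
```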

3) Treat data linkage as a first-class feature with guardrails

Sensitive linkage is where good intentions go to die if governance is vague. Be explicit about how linkage requests will be evaluated, approved, executed, and audited. If you’re using a trusted research environment model, say so and describe the controls. If linkage uses external providers, map responsibilities clearly.
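One way to make "evaluated, approved, executed, and audited" concrete is a small state machine in which every transition is logged. This is a sketch under assumed states and actor names, not a prescribed governance workflow:

```python
from datetime import datetime, timezone

# Assumed states for the sketch; a real workflow would be defined by the
# platform's governance board, not hard-coded like this.
ALLOWED = {"submitted": {"approved", "rejected"}, "approved": {"executed"}}

class LinkageRequest:
    def __init__(self, request_id, requester):
        self.request_id = request_id
        self.state = "submitted"
        self.audit = []                       # who did what, and when
        self._log("submitted", requester)

    def _log(self, action, actor):
        self.audit.append({"action": action, "actor": actor,
                           "at": datetime.now(timezone.utc).isoformat()})

    def transition(self, new_state, actor):
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state
        self._log(new_state, actor)

req = LinkageRequest("LR-001", "dr_smith")
req.transition("approved", "governance_board")
req.transition("executed", "platform_ops")
```

The design choice worth copying is that the audit trail is produced by the mechanism itself, so "who approved what, and when" never has to be reconstructed from email threads.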

4) Prove you can operate, not just build

Infrastructure dies in operations. Reviewers will want confidence in:

  • uptime and monitoring
  • incident management
  • change control
  • documentation
  • user support
  • onboarding and training

Include an operations plan that sounds like you’ve done this before. If you have service metrics from existing platforms (response times, number of users supported), include them.

5) Budget like you intend to succeed

£4.7m FEC is generous, but platforms are hungry. Under-budgeting key roles (data engineers, security, product management, user support, data stewards) is a red flag. Explain why each major cost exists and what risk it removes. Reviewers don’t hate spending; they hate spending without justification.

6) Show adoption will happen because the consortium will use it

The easiest way to prove uptake is to show that NPIP partners will rely on the platform for their workflows. Include a phased onboarding plan for sites, and describe incentives: standardised deposition requirements, shared pipelines, access to tools, reduced admin burden.

7) Build for reuse and extensibility, not a bespoke snowflake

Avoid designing something only your team understands. Use established standards where possible. Document everything. Make it easy for new datasets and new partners to join without a heroic engineering effort. Reviewers love platforms that reduce friction over time.


Application Timeline: Work Backwards from 17 June 2026

The deadline is 17 June 2026 at 16:00, and the start date is fixed at 1 October 2026, which means you’re building something that needs to be ready to mobilise quickly after award. A realistic schedule is less about “writing the application” and more about coordinating governance, technical plans, and consortium sign-off.

Around 10–12 weeks before the deadline (late March to early April 2026), lock your platform architecture and governance approach. This is when you decide what you’re hosting, what you’re integrating, and what you’re not promising in phase one. Ambition is good; uncontrolled ambition is expensive.

By 8 weeks out (mid-April 2026), get serious about budget and roles. Recruit internal buy-in from IT/security, research governance, and finance. Infrastructure proposals fail in the final stretch when a security reviewer asks a question the team can’t answer.

At 6 weeks out (early May 2026), draft the core narrative: user needs, platform functions, data flows, linkage model, and delivery plan. Circulate early to consortium partners—silence is not agreement, it’s just delayed disagreement.

At 3–4 weeks out (late May 2026), you should be polishing, not inventing. Validate every claim: hosting capacity, security controls, staffing, and timelines.

In the final 10 days, focus on compliance: eligibility confirmations, final partner approvals, and submission checks. Treat the deadline time as real. Submitting at 15:58 is a personality type, not a strategy.


Required Materials: What to Prepare (And How to Make It Less Painful)

The official listing doesn’t spell out every attachment, because UKRI opportunities often use structured application forms with supporting documents. Still, you can predict the workload for an infrastructure host call like this.

Expect to prepare:

  • A project and delivery plan that breaks the 42 months into phases (build, onboarding, operations, enhancements), with milestones that sound achievable.
  • A clear technical description of the data platform: ingestion, storage, metadata, access mechanisms, tooling/services, and how you’ll handle interoperability and reuse.
  • A governance and security approach, especially for sensitive data and linkage—who approves access, where analysis happens, what gets exported, and how auditing works.
  • A budget at FEC, with narrative justification that shows you understand the costs of running a service, not just building software.
  • Evidence of consortium alignment and approvals, since you must be part of the NPIP-developed and approved bid and invited to apply.

Preparation advice: don’t treat documentation as a “writing task” at the end. Infrastructure is trust-based. The clearest writing often comes from diagrams, decision logs, and process descriptions created while designing the system. Start those early, then turn them into application prose.


What Makes an Application Stand Out (How Reviewers Tend to Think)

Reviewers for infrastructure hosting are essentially asking three questions: Can you deliver? Will people use it? Will it be safe and compliant? Your application should make each answer obvious.

Standout applications usually do the following well:

They describe a platform that is usable on day one, not a grand blueprint with everything arriving in month 41. Reviewers like phased delivery: an early minimum viable service, followed by incremental improvements.

They show mastery of the “boring” stuff: identity management, access requests, data deposit processes, documentation, training, and support. That’s not boring to users—it’s the difference between a platform people trust and one they avoid.

They connect FAIR principles to practical mechanics. “Reusable” isn’t a slogan; it’s versioning, provenance, standardised pipelines, and clear licensing/terms of use.

They treat sensitive data linkage with respect. A strong application will specify where linkage happens, what identifiers are used, what is retained, and how risks are mitigated. Vague language (“we will ensure compliance”) reads like someone hoping the reviewer won’t ask follow-ups.

Finally, standout proposals feel co-owned by the consortium. Even if one organisation hosts, the plan should show shared governance, clear responsibilities, and a roadmap shaped by user needs across NPIP—not just the host’s preferences.


Common Mistakes to Avoid (And How to Fix Them Fast)

1) Promising a magical platform with no trade-offs

If your plan implies unlimited features, instant onboarding, and perfect linkage, reviewers will assume you haven’t run a platform before. Fix it by stating what’s in scope for phase one, what comes later, and what depends on external approvals.

2) Treating security like a paragraph instead of a system

Sensitive data demands specific controls. Fix it by describing concrete measures: secure analysis environment, role-based access, auditing, export controls, incident response, and governance committees.
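As a toy illustration of the role-based access piece, permissions can be modelled as explicit role-to-capability mappings so that access decisions are checkable and testable. The roles and permissions below are assumptions for the sketch, not the platform's actual model:

```python
# Illustrative roles and permissions only.
ROLES = {
    "researcher":   {"read_metadata", "run_analysis"},
    "data_steward": {"read_metadata", "approve_access", "deposit_data"},
}

def can(role: str, permission: str) -> bool:
    """Default-deny: unknown roles and unlisted permissions are refused."""
    return permission in ROLES.get(role, set())
```

An explicit table like this is also what makes auditing possible: reviewers can read the policy instead of reverse-engineering it from scattered if-statements.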

3) Underestimating user support and training

A platform without support becomes a ghost town. Fix it by resourcing onboarding, documentation, a helpdesk function, and training sessions—especially during multi-site rollout.

4) Forgetting interoperability is as much social as technical

Standards don’t enforce themselves. Fix it by describing how you’ll work with sites on metadata requirements, data deposit standards, and QA processes. Include a plan for handling exceptions and legacy data.

5) Writing a budget that looks like wishful thinking

If the numbers don’t match the ambition, reviewers will worry about delivery risk. Fix it by aligning staffing, cloud/on-prem costs, security, and operations with a realistic service model—and justify the big lines.

6) Leaving consortium agreement to the last minute

If partners don’t feel heard, they’ll surface concerns late. Fix it by setting structured review points: architecture sign-off, governance sign-off, budget sign-off, final narrative sign-off.


Frequently Asked Questions (FAQ)

1) Is this grant open to anyone in the UK imaging community?

No. You must be invited and be part of the NPIP-developed and approved consortium bid. If you’re not in that group, your best move is to connect with NPIP partners and understand future routes to contribute.

2) What does it mean that MRC funds 100% of FEC?

Most UKRI research grants fund 80% of full economic cost, leaving institutions to cover the remainder. Here, MRC covers the full economic cost, which makes it easier to staff properly and include the real operational expenses of running a national service.

3) How long does the funding last, and when does the project start?

The award supports 42 months of activity, with a fixed start date of 1 October 2026. Plan staffing and procurement accordingly—fixed start dates can create tight mobilisation windows.

4) What kinds of data will the platform handle?

The call points to NPIP imaging and research data, with an emphasis on making it FAIR and enabling sensitive data linkage. In practice, that likely includes PET imaging datasets plus associated research metadata and potentially linked clinical/cohort variables, handled under appropriate governance.

5) What is sensitive data linkage in plain English?

It means connecting datasets in a way that could identify individuals if handled poorly—like linking imaging data to health records or other personal data. Doing it properly requires strong governance, controlled environments, and auditability so privacy is protected.
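A common building block for safe linkage is keyed pseudonymisation: both datasets derive the same opaque token for the same person without ever exchanging the raw identifier. A minimal sketch, assuming a secret key held by a trusted party (real designs add key management, rotation, and governance on top):

```python
import hashlib
import hmac

# Assumption for the sketch: the key lives with a trusted third party,
# never with the analysts. The identifiers below are made-up examples.
SECRET_KEY = b"held-by-trusted-third-party"

def pseudonymise(identifier: str) -> str:
    """Derive a stable, opaque pseudonym from a raw identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()
```

Because the hash is keyed, someone holding only the pseudonyms cannot brute-force them back to identifiers without the key, yet the same person yields the same token in both datasets.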

6) Does the platform need to provide tools, or just storage and access?

The opportunity explicitly mentions services and tools for users. Storage alone is rarely enough. Think workflows: discovery, access, analysis support, and reuse—ideally including documentation and user-facing utilities that reduce friction.

7) What if our consortium has multiple candidates who could host?

That’s a consortium governance question, but funders generally prefer clear accountability. If there are multiple strong contenders, consider a model where one host holds responsibility while others provide defined technical components or governance roles—spelled out crisply.

8) How competitive is this likely to be?

Because it’s invitation-based and tied to an NPIP-approved consortium, the competition is less “hundreds of applicants” and more “one or a small number of eligible bids.” The real competition is against failure modes: unclear governance, unrealistic delivery, weak security planning, and undercooked operations.


How to Apply: Next Steps (Do This Now, Not Later)

First, confirm the basics: you are invited, you are part of the NPIP-approved consortium bid, and your host organisation is eligible for MRC funding. If any of those are uncertain, resolve them immediately—this call doesn’t reward hopeful interpretation.

Second, run a short internal “platform reality check” meeting with the people who will actually carry delivery: platform engineering, information security, data governance, and user support/operations. Ask two blunt questions: What will break? and What will we need on day one to keep this safe and usable? Then build your plan around those answers.

Third, start writing early and circulate drafts to consortium partners on a schedule. Infrastructure proposals improve through criticism—especially the parts about governance and data linkage, where different organisations have different risk tolerances.

Ready to apply? Visit the official opportunity page for full details and submission instructions: https://www.ukri.org/opportunity/national-pet-imaging-platform-data-platform/