Opportunity

Secure Up to GBP 1.97 Million for UK Business Expectations Data Infrastructure: ESRC Decision Maker Panel Grant 2026 to 2029

JJ Ben-Joseph
Deadline: 10 March 2026
Source: UKRI Opportunities

Some research projects feel like fireworks: thrilling, bright, over quickly. The Decision Maker Panel (DMP) is the opposite. It’s more like the national power grid—quietly essential, easy to take for granted, and absolutely catastrophic to lose.

If you’ve ever read an economic forecast, a Bank of England analysis, or a policy briefing that tries to answer “What are firms planning to do next?” there’s a good chance DMP data was somewhere in the plumbing. The panel tracks expectations and uncertainty among business decision makers across sectors and sizes—exactly the kind of information that gets weird, volatile, and politically relevant at the worst possible times (inflation spikes, supply chain chaos, interest rate swings… pick your decade).

This ESRC (Economic and Social Research Council) invite-only funding opportunity is built for one job: keep the DMP running at high quality for another three years (2026–2029), including the survey operations, continuity of the time series, and—crucially—making the data available to other researchers in line with ESRC policy.

And yes, the money is real: up to £1.97 million for up to 36 months. This is a tough one to get (and you can’t casually stroll into it anyway), but it’s the kind of infrastructure award that can define a research team’s reputation for years.


At a Glance: Decision Maker Panel 2026 to 2029 (ESRC, Invite Only)

Funding type: Grant (data infrastructure / longitudinal survey continuation)
Funder: Economic and Social Research Council (ESRC)
Maximum award: Up to £1.97 million
Duration: Up to 36 months (three financial years)
Opportunity status: Upcoming
Deadline: 10 March 2026, 16:00 (UK time)
Who can apply: Invite only (via the UKRI Funding Service)
Core purpose: Sustain DMP data collection, continuity of the series, and researcher access
Key activities: Surveying business decision makers; data availability aligned with ESRC data policy
Primary links: Official opportunity page (see end)
Contacts (from listing): [email protected]; [email protected]; [email protected]

What This Funding Actually Supports (And Why It Matters)

Let’s translate the polite UKRI phrasing into plain English: this isn’t “a research project.” This is keeping a national-grade measurement instrument calibrated and running.

Over three years, the award is intended to sustain the machinery that makes DMP valuable: consistent sampling, repeatable questions, stable fieldwork operations, and the behind-the-scenes quality controls that stop a survey from slowly turning into a statistical folk tale.

At minimum, your proposal should convincingly cover two big commitments.

First: collecting data from key decision makers in businesses—not just any respondents with a spare minute, but the people whose expectations become actual hiring plans, pricing decisions, investment choices, and wage settlements. The value here is breadth (different sizes and industries) and continuity (a time series you can trust).

Second: making the data available to other researchers, in line with ESRC’s data infrastructure strategies and research data policy. Think of this as the public-service obligation of the award. You’re not only running a survey; you’re running a data pipeline that other academics and government analysts can use without needing to reverse-engineer your methods or beg for access.

That second part is where infrastructure projects either shine or faceplant. It’s not enough to say “we’ll share the data.” Reviewers will want to see the practicalities: documentation, metadata, governance, disclosure control, user support, and a plan that respects the realities of business-sensitive information.

Finally, the scale of the award (up to £1.97m) signals something else: ESRC expects a professional operation. That can include project management, survey operations, data stewardship, and the kind of risk planning that keeps the series alive even when staff change, costs rise, or response rates wobble.


Who Should Apply (And Who Should Not Waste Their Time)

Because this is invite only, the “who should apply” question is really about internal fit: if you’ve been invited, can your team genuinely carry this for three years without dropping the baton?

The strongest candidates tend to look like a small coalition rather than a lone genius. You’ll likely need people who understand business survey methodology, people who can manage fieldwork and panel maintenance, and people who take data access and governance seriously enough to make it painless for external users.

A good-fit organisation typically has experience with at least one of these worlds:

You might be part of a university-based group with a track record running large-scale surveys or administrative data services, where “data quality” is a daily job rather than a slide in a deck. Or you may be embedded in a research institute that already collaborates closely with government and understands what policy users need: stability, timeliness, and clarity.

You’re also a strong fit if you can show you understand the people behind the data: business decision makers are busy, sceptical, and allergic to wasted time. If your operational plan treats respondents like prized partners—clear comms, smart questionnaire design, sensible contact strategies—you’ll sound like you’ve done this before. Because, frankly, you should have.

Who should hesitate (even if invited)? Teams that mainly want to “add a module” to answer their own research question, with data sharing as an afterthought. This opportunity is about continuity and infrastructure, not building a bespoke dataset for one lab’s publications. You can absolutely do excellent research alongside it—but the central promise is maintaining DMP as a trusted resource for the wider community.


The Non-Negotiables: Continuity, Quality, and Data Access

Continuity of the series

Longitudinal data is like sourdough starter: the value comes from not killing it. Reviewers will be alert to anything that could break comparability across waves—big question rewrites, erratic sampling, inconsistent weighting, or changes in respondent profile that aren’t understood and corrected.

High-quality data collection infrastructure

“Infrastructure” is the unglamorous stuff that saves your results from being politely ignored. Think: response management, rigorous QA, version control for instruments, and clear processes so the survey doesn’t depend on one heroic staff member who remembers where everything is.

Data availability aligned with ESRC policy

This is where you show you’re a good citizen of the ESRC ecosystem. Expect to address how data will be prepared for other researchers, how documentation will be produced and maintained, what access routes will exist, and how you’ll manage confidentiality concerns around firm expectations and uncertainties.
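
One concrete piece of the confidentiality story is primary disclosure control. As a purely illustrative sketch (the threshold and rule here are assumptions, not ESRC policy), a minimum-cell-count check might look like this:

```python
# Minimal sketch of a primary disclosure check (assumed rule):
# suppress any published cell based on fewer than `min_count` firms.
# The threshold of 10 is a placeholder, not an official standard.

def suppress_small_cells(cell_counts, min_count=10):
    """cell_counts: {cell_label: number of contributing firms}.
    Returns the labels that must be suppressed before release."""
    return sorted(label for label, n in cell_counts.items() if n < min_count)

cells = {"retail/large": 42, "retail/small": 7, "mining/large": 3}
print(suppress_small_cells(cells))  # ['mining/large', 'retail/small']
```

Real disclosure control for business microdata is considerably richer (dominance rules, secondary suppression), but reviewers will want evidence that checks like this are built into the release pipeline rather than improvised.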

If you get these three right, you’re speaking the language of an infrastructure award: dependable, transparent, and useful to people outside your immediate circle.


Insider Tips for a Winning Application (The Stuff People Learn the Hard Way)

1) Treat respondent time like money—because it is

Business decision makers aren’t a captive audience. If your approach hints at survey bloat, unclear questions, or frequent unnecessary contacts, expect trouble. Build in discipline: explain how you’ll keep the instrument focused, how you’ll test question wording, and how you’ll avoid “just one more item” syndrome.

2) Make quality assurance sound routine, not heroic

Reviewers get nervous when quality is described as a last-minute scramble. Instead, describe QA as a steady set of checkpoints: monitoring response rates by strata, detecting mode effects, tracking item nonresponse, flagging odd distributions early, and documenting changes clearly.

A good infrastructure proposal reads like: “We have systems.” Not: “We have hopes.”
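
To make that concrete, the routine checkpoints above can be sketched as simple, repeatable computations. This is an illustrative toy (the strata, threshold, and data shape are all assumptions), not a description of the DMP's actual QA stack:

```python
# Illustrative QA checkpoint sketch: compute response rates by stratum
# each wave and flag strata that fall below a floor. The 0.5 floor and
# the stratum labels are hypothetical placeholders.

from collections import defaultdict

def response_rates_by_stratum(records):
    """records: iterable of (stratum, responded: bool) pairs."""
    totals = defaultdict(lambda: [0, 0])  # stratum -> [responded, invited]
    for stratum, responded in records:
        totals[stratum][1] += 1
        if responded:
            totals[stratum][0] += 1
    return {s: r / n for s, (r, n) in totals.items()}

def flag_low_strata(rates, floor=0.5):
    """Return strata whose response rate falls below `floor`, sorted."""
    return sorted(s for s, rate in rates.items() if rate < floor)

records = [
    ("manufacturing", True), ("manufacturing", True), ("manufacturing", False),
    ("services", True), ("services", False), ("services", False),
]
rates = response_rates_by_stratum(records)
print(flag_low_strata(rates, floor=0.5))  # ['services']
```

The point is not the code itself but the posture: checks that run on every wave, produce a log, and trigger follow-up — systems, not hopes.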

3) Show your plan for maintaining panel health

Panel surveys are living organisms. People churn. Firms restructure. Contacts leave. Your proposal should show practical strategies to keep participation stable across time—contact maintenance, engagement messaging, and procedures for replacing or updating respondents without destroying representativeness.

4) Build a serious data management and sharing plan

Data availability isn’t a line item; it’s a product. Spell out who will produce documentation, how quickly data will be processed after collection, what disclosure checks are expected, and how users will understand variable definitions across waves.

If your plan relies on “we’ll write some docs near the end,” you’ll sound like someone who has never tried to create usable datasets under deadline pressure. Instead, propose continuous documentation that grows wave by wave.
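
One way to picture continuous documentation is a codebook that accrues a change log wave by wave. This is a hypothetical sketch (variable names and definitions are invented), not a prescribed format:

```python
# Hypothetical sketch: a codebook that grows wave by wave, recording
# when each variable first appeared and when its definition changed.

def update_codebook(codebook, wave, variables):
    """variables: {name: definition}. Mutates `codebook` in place and
    returns a human-readable change log for this wave."""
    log = []
    for name, definition in variables.items():
        entry = codebook.get(name)
        if entry is None:
            codebook[name] = {"definition": definition, "since_wave": wave}
            log.append(f"wave {wave}: added {name}")
        elif entry["definition"] != definition:
            log.append(f"wave {wave}: changed {name}")
            entry["definition"] = definition
    return log

codebook = {}
update_codebook(codebook, 1, {"price_exp": "Expected price growth, next 12m (%)"})
log = update_codebook(codebook, 2, {
    "price_exp": "Expected price growth, next 12m (%, CPI-consistent)",
    "wage_exp": "Expected wage growth, next 12m (%)",
})
print(log)  # ['wave 2: changed price_exp', 'wave 2: added wage_exp']
```

Whatever tooling you actually use, the principle is the same: documentation and the data are versioned together, so external users can see exactly which definition applied in which wave.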

5) Write for two audiences: survey experts and data users

Some reviewers will care about sampling frames and weighting methods. Others will care about whether external researchers can actually use the data without emailing you twelve times. A strong proposal bridges both: methodological credibility plus user experience.

6) Budget like an operator, not a dreamer

With up to £1.97m on the table, vague budgets stand out—in a bad way. Make your costs match your operating model: survey operations, staffing continuity, data processing, governance, and user support. If something is essential (for example, documentation and archiving support), fund it properly.

7) Prove you can survive staff turnover and curveballs

Three years is long enough for people to leave and vendors to change terms. Include resilience: documented processes, cross-training, and clear responsibilities. Nobody expects zero risk; they expect you to have thought about it like an adult.


Application Timeline: Working Backwards from 10 March 2026

Even if your team is experienced, infrastructure proposals have a lot of moving parts. A realistic timeline keeps you from producing a technically accurate but oddly unconvincing application—the kind reviewers describe as “underdeveloped” even when it’s not.

From mid-February 2026, you should be in finalisation mode: polishing the narrative, validating the budget, confirming partners’ contributions, and checking your data sharing plan against ESRC expectations. Leave time for the Funding Service admin steps and for internal approvals—those processes have a talent for expanding to fill all available time.

In January 2026, aim to have a complete draft that includes your operational model: sampling and fieldwork plan, QA processes, governance, data pipeline, and a clear statement of what will be delivered and when. This is also the month to do “hostile review”: get someone outside the core team to read it and point out what’s missing or unclear.

In December 2025, you should be locking down who is responsible for what. Infrastructure projects fail when responsibilities are vibes-based. Assign named roles for survey operations, data stewardship, user support, and project management—and make sure those people have time allocated.

In November 2025, start with the spine of the proposal: what continuity means for DMP, what will be maintained, what will be improved (carefully), and how you’ll keep the series comparable. If you can’t articulate that cleanly, the rest becomes expensive decoration.
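
Working backwards from the deadline is straightforward date arithmetic. The sketch below turns the schedule above into concrete dates; the lead times are assumptions drawn from the timeline in this article, not UKRI guidance:

```python
# Illustrative back-planning sketch: milestone dates counted backwards
# from the 16:00 (UK time) deadline on 10 March 2026. Lead times are
# assumptions matching the rough schedule described in the text.

from datetime import date, timedelta

DEADLINE = date(2026, 3, 10)

milestones = {
    "Internal approvals complete": 7,      # days before the deadline
    "Final draft frozen": 21,
    "Hostile review round": 49,
    "Complete draft with operational model": 63,
    "Roles and responsibilities locked": 100,
}

for name, lead in milestones.items():
    print(f"{(DEADLINE - timedelta(days=lead)).isoformat()}  {name}")
```

Adjust the lead times to your institution's approval cycle — the useful habit is committing to dated checkpoints rather than a single looming deadline.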


Required Materials (What You Should Prepare, Even If the Portal Has Its Own Format)

UKRI applications can vary in exact fields and attachments, but for an award like this, you should expect to prepare a coherent package that covers operations, governance, and value.

At minimum, plan for:

  • A core project narrative explaining why DMP continuity matters, what work will be done over the three years, and how success will be measured. Write this like you’re explaining how you’ll keep a plane in the air while also upgrading the cockpit. Calmly. Specifically.
  • A detailed workplan and milestones that make the three-year period feel manageable: wave schedules, processing deadlines, release points, documentation updates, and planned reviews.
  • A budget with justification that connects costs to delivery. If there are subcontractors or survey vendors, explain procurement/management assumptions and QA oversight.
  • A data management and sharing plan aligned to ESRC research data policy, with clarity on documentation, metadata, access conditions, and any disclosure controls needed for business data.
  • Team capability statements (CVs or roles) showing that the people responsible for survey operations and data stewardship have done comparable work before.

If you’re invited, you likely already know the “official” list will be more precise. The trick is to make these elements feel integrated, not stapled together at the end.


What Makes an Application Stand Out (Beyond Being Competent)

A merely competent infrastructure proposal says: “We will continue the survey.” A standout proposal says: “We will continue the survey and protect what makes it scientifically and publicly valuable.”

Expect the strongest applications to demonstrate four things.

First, credibility of operations: the plan reads like it has been run through real constraints—response fatigue, staffing limits, timing conflicts, and the mundane truth that data pipelines break in boring ways.

Second, clarity on continuity vs. change: reviewers will want reassurance that any updates (question tweaks, process improvements, documentation upgrades) will not wreck comparability. If you propose changes, justify them carefully and show how you’ll document and test them.

Third, commitment to external usability: datasets that other researchers can’t understand might as well be locked in a cupboard. Great proposals treat documentation, metadata, and access pathways as first-class outputs, not chores.

Fourth, alignment with ESRC data infrastructure strategy: you don’t need to quote policy at length, but you should clearly behave like someone who understands the broader system—archiving, standards, and the expectation that publicly funded data should be discoverable and usable.


Common Mistakes to Avoid (And How to Fix Them)

Mistake 1: Writing like continuity is automatic

Continuity is work. Say what you’ll do to preserve it: instrument control, change logs, consistent sampling methods, and explicit governance for approving modifications.

Mistake 2: Overpromising “improvements” without protecting comparability

It’s tempting to propose a shiny new set of questions or a major redesign. Unless there’s a compelling case, this can read like you’re about to break the time series. If you want to improve something, frame it as targeted, tested, and documented.

Mistake 3: Treating data sharing as a compliance checkbox

If your data availability plan is thin, reviewers will assume external users will struggle. Add substance: timelines for releases, documentation standards, support channels, and clarity on access conditions.

Mistake 4: A budget that doesn’t match the workload

If you budget heavily for senior investigators but lightly for operational roles (survey management, data processing, documentation), you’ll look like you’re funding prestige rather than delivery. Rebalance toward the people who make the machine run.

Mistake 5: Vague governance and accountability

Infrastructure work needs decision-making structures. Who approves questionnaire changes? Who signs off releases? Who handles issues with data quality? If the answer is “we’ll discuss as a team,” reviewers will imagine chaos.

Mistake 6: Ignoring the human side of respondent engagement

Response rates don’t stay healthy by accident. Explain how you’ll communicate value to participating firms, reduce burden, and handle churn. Treat respondents as partners, not sample units.


Frequently Asked Questions

Is this opportunity open to everyone?

No. It’s explicitly invite only. If you haven’t been invited, this one is not a standard open call you can apply to “just in case.”

What is the funding amount and duration?

ESRC will contribute up to £1.97 million. Funding can last up to three financial years (36 months).

What kinds of activities need to be included?

Your proposal should cover continuing DMP survey operations, especially collecting data from business decision makers about expectations and uncertainty, and ensuring data availability to other researchers consistent with ESRC data infrastructure strategy and ESRC research data policy.

What does data availability mean in practice?

At minimum, it means the data is prepared, documented, governed, and shared in a way that other researchers can actually use. That typically includes clear metadata, user documentation, and an access approach appropriate for potentially sensitive business information.

Who are the likely users of the data?

The listing highlights availability to researchers across academia and government. In reality, users can include policy analysts, economists, and research groups studying investment, pricing, hiring, productivity, and uncertainty.

What if we want to add new questions or modules?

Be careful. Changes can be valuable, but they can also damage comparability. If you propose additions, keep them disciplined, justify them, and explain how you’ll test and document them so the time series remains interpretable.

What if we have questions about submission or the system?

UKRI provides a support contact for the Funding Service. Use it early—technical issues have a habit of appearing when deadlines are close and patience is low.

Who should we contact about data sharing expectations?

The listing includes [email protected], which is a sensible starting point if you need clarity on sharing routes, standards, or related expectations.


How to Apply (Invite Only, But Still Worth Getting Right)

If you’re invited, treat the period before submission like a mini-project in itself. Start by confirming your internal roles and resourcing—who owns survey operations, who owns the data pipeline, who owns documentation and releases, and who has final authority when tradeoffs appear (because they will).

Next, map your three-year plan into deliverables a reviewer can picture: wave schedule, processing timeline, release cadence, and clear milestones that prove continuity is being actively managed rather than passively hoped for.

Then pressure-test your data availability plan. Ask a blunt question: “If I were an external researcher, could I use this dataset without personal tutoring?” If the answer is no, improve the documentation and access story until it becomes yes.

Finally, leave yourself enough runway for the Funding Service steps and institutional approvals. Submitting at the last minute is a terrible tradition. You don’t need to keep it alive.

Apply Now: Official Opportunity Page (Full Details)

Ready to apply (or confirm requirements and contacts)? Visit the official UKRI listing here: https://www.ukri.org/opportunity/decision-maker-panel-2026-to-2029-invite-only/