Research themes and methods

UK economic development research built for real decisions

This page summarises the areas we research most often and how we turn evidence into usable outputs. Our approach is practical: we focus on questions that matter to delivery teams and governance committees, document uncertainty, and avoid overstating what data can prove. If you need support turning research into programme design, see our policy support overview.


What makes our research usable

  • Traceable sources: we list datasets and references so teams can replicate and update the work.
  • Clear uncertainty: where evidence is weak, we explain the limits and avoid turning assumptions into facts.
  • Decision format: findings are structured around choices, trade-offs, and implications for delivery.

Looking for a research brief or a diagnostic? Contact details are available on the home page. For privacy details, see Privacy.

Place

Labour markets, connectivity, housing, and firm dynamics.

Productivity

Constraints, adoption, skills, and sector composition.

Research themes

UK economic development spans many interacting systems. Rather than attempting to cover everything, we focus on themes that regularly appear in growth strategies, programme business cases, and evaluation plans. Each theme is approached with the same principles: start from the decision, define what would change if the evidence shifts, and separate what can be measured from what must be judged. We also keep language accurate and avoid claims that would not stand up to scrutiny.

The sections below describe what we typically examine under each theme and the kinds of outputs teams use. If you need help moving from research to actionable policy options, our Policy page outlines common engagement types.

Place-based growth

We analyse how geography shapes opportunity through labour markets, transport connectivity, housing availability, and the mix of firms and sectors. This work often supports local growth strategies and investment prioritisation. Outputs can include a baseline chapter, a comparative peer set, and a short list of constraints with evidence on what has worked in similar contexts.

Sector and cluster dynamics

We examine sector strengths, supply chains, innovation intensity, and workforce pipelines. The goal is to avoid treating every sector as a priority and instead identify a small number of plausible focus areas. We include considerations like substitution risk, displacement, and whether a proposed cluster has the scale to justify targeted intervention.

Skills and participation

We explore participation, qualification profiles, shortages, and employer demand signals. Where possible, we distinguish between cyclical pressures and structural constraints. Outputs often include an indicator dictionary and a measurement plan so stakeholders can track whether interventions change outcomes over time rather than relying on one-off snapshots.

Productivity and firm performance

Productivity work benefits from a careful definition: output per hour is not the same as firm profitability or wages. We examine plausible drivers such as management practices, digital adoption, capital intensity, and innovation. Research is then translated into practical levers that can be tested through programme design and evaluation rather than assumed.

Investment and delivery capacity

We assess how investment proposals interact with delivery realities: procurement, partner capacity, timelines, and reporting obligations. This theme is often used to strengthen business cases by clarifying dependencies and risks. We also support teams in selecting a manageable portfolio of interventions that can be governed properly.

Evaluation and measurement

We produce evaluation-ready research notes: logic models, indicator sets, data collection plans, and realistic approaches to attribution. We pay attention to burden and feasibility so plans can be implemented. Where a counterfactual is not practical, we propose contribution-based methods and triangulation rather than overpromising causal claims.

Methods, transparency, and limitations

Economic development decisions rarely depend on one dataset or one academic paper. They depend on triangulation: official statistics, operational knowledge, and a realistic view of implementation constraints. Our research methods are chosen to match the decision at hand. Where we use modelling, we document assumptions, show sensitivity where possible, and provide interpretation that does not exceed what the inputs support.

We also treat measurement as a product, not an afterthought. A programme can only learn if indicators are defined clearly, data is feasible to collect, and governance understands what is and is not being measured. If you need support developing an evaluation plan or KPI framework, the Policy page sets out typical deliverables.

Data selection

We prioritise reliable sources and explain definitions, time periods, and geography. If a metric is a proxy, we say so and describe what it might miss. This reduces the risk of misinterpretation when results are reused in slides or submissions.

Comparisons

Peer comparisons can clarify context but can also mislead. We explain how peers are selected and avoid implying a single ranking determines success. Where relevant, we show ranges and distributions rather than only point estimates.
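To make the idea of a transparent peer selection method concrete, here is a minimal sketch: each candidate place's indicators are standardised to z-scores, and peers are the places nearest the target in that standardised space. This is an illustrative approach, not a description of any specific method we use; the place names and indicator names (`employment_rate`, `median_pay`) are hypothetical.

```python
def zscores(values):
    """Standardise a list of raw indicator values to z-scores
    so indicators on different scales are comparable."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    sd = var ** 0.5 or 1.0  # guard against a zero-variance indicator
    return [(v - mean) / sd for v in values]

def select_peers(places, indicators, target, k=3):
    """Rank places by Euclidean distance to `target` across
    standardised indicators and return the k nearest peers.
    `places` maps place name -> {indicator name -> value}."""
    names = list(places)
    cols = [zscores([places[n][i] for n in names]) for i in indicators]
    profile = {n: [col[j] for col in cols] for j, n in enumerate(names)}
    t = profile[target]
    dist = {n: sum((a - b) ** 2 for a, b in zip(v, t)) ** 0.5
            for n, v in profile.items() if n != target}
    return sorted(dist, key=dist.get)[:k]

# Hypothetical data for four places and two indicators.
places = {
    "Place A": {"employment_rate": 74.0, "median_pay": 610},
    "Place B": {"employment_rate": 73.5, "median_pay": 605},
    "Place C": {"employment_rate": 80.0, "median_pay": 700},
    "Place D": {"employment_rate": 74.5, "median_pay": 615},
}
peers = select_peers(places, ["employment_rate", "median_pay"], "Place A", k=2)
```

Writing the method down like this is what makes it transparent: the indicator list, the standardisation step, and the distance measure are all visible choices that can be challenged, which is harder when a peer set is asserted without a stated method.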

Readable outputs

We produce plain-English summaries alongside technical appendices. This helps committees understand what is driving recommendations while allowing analysts to check the details. It also improves consistency between narrative and metrics.

Limits and risks

We highlight common pitfalls such as double counting, displacement, selection bias, and attributing economy-wide trends to one intervention. This protects decision-makers from overinterpreting weak signals and improves the credibility of future reporting.

Using research in funding and governance

When evidence is used in funding submissions or board papers, the standard of proof and the clarity of claims matter. We write research in a way that supports defensible narratives: what the evidence suggests, what remains uncertain, and what should be tested through monitoring or evaluation. This reduces the risk of mismatch between promotional language and what the work actually supports.


Frequently requested research outputs

Teams often need research outputs that can be used immediately within governance and delivery. A common pattern is a short brief that establishes the baseline, a set of evidence-backed options, and a monitoring plan that supports learning. We also support review of existing documents to improve clarity, ensure consistent definitions, and reduce the risk of overstated claims. Each output is structured so readers can identify: the question, what evidence was used, what was concluded, and what remains uncertain.

If your primary need is to design or refine an intervention, we recommend pairing research with a policy options note. The Policy page describes how we support scoping, programme logic, evaluation design, and funding-ready documentation.

Baseline and peer set

A structured baseline with a transparent peer selection method, designed to support targets and monitoring without implying that a single ranking tells the whole story.

Evidence summary note

A concise synthesis of what the evidence indicates, what it does not, and what should be tested through monitoring or evaluation.

Indicator dictionary

Definitions, sources, update frequency, and owner for each KPI, plus guidance on interpretation and common misreadings.
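The structure described above can be sketched as a simple record per KPI. This is an illustrative data shape, not our actual template; the field names and the example indicator are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One entry in an indicator dictionary: what a KPI means,
    where it comes from, how often it updates, and who owns it."""
    name: str
    definition: str
    source: str
    update_frequency: str       # e.g. "quarterly", "annual"
    owner: str
    is_proxy: bool = False      # flag proxies so readers know what may be missed
    caveats: list = field(default_factory=list)

    def summary(self) -> str:
        """One-line description suitable for a governance paper."""
        flag = " (proxy)" if self.is_proxy else ""
        return (f"{self.name}{flag}: {self.definition} "
                f"[{self.source}, {self.update_frequency}, owner: {self.owner}]")

# Hypothetical entry.
employment = Indicator(
    name="Employment rate",
    definition="Share of residents aged 16-64 in employment",
    source="Official labour market survey",
    update_frequency="quarterly",
    owner="Analyst team",
    caveats=["survey sample size at small geographies"],
)
```

Keeping the proxy flag and caveats in the same record as the definition means the interpretation guidance travels with the number wherever it is reused.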

Document review

A structured critique to improve clarity, consistency, and defensibility, including a list of claims that need stronger evidence or reframing.

Data minimisation reminder

If you share information with us by email, please avoid sending sensitive personal data. For most research and scoping, high-level programme details and organisational context are sufficient. Privacy details are on our Privacy page.