
Review of the analytical framework supporting financial policy at the Bank of England

Oliver Bush, Anne-Caroline Hüser, Pippa Lowe, Rhiannon Sowerbutts and Matthew Waldron footnote [1] footnote [2]

1: Introduction

The Financial Policy Committee (FPC) was established as a statutory body in 2013, with a primary objective to protect and enhance the stability of the UK financial system. The FPC was created as part of sweeping reforms to financial services regulation following the Global Financial Crisis (GFC).

Reflecting these origins, the early focus of the FPC was on the stability of the banking sector. Since then, the nature of threats to stability has evolved significantly. UK banks are now demonstrably more resilient. They were able to continue supporting the real economy through several recent stresses, such as the collapse of several mid-sized US banks, the failure of Credit Suisse, the pandemic and the sharp increase in interest rates that followed.

But as threats to stability from other sources have grown, the FPC’s focus has shifted accordingly. Notably, the role of non-banks, including market-based finance (MBF), has grown substantially. MBF has also been at the heart of several recent episodes of stress, including the global dash-for-cash in 2020 and the UK liability-driven investment (LDI) crisis in 2022. The financial system has also become more interconnected, including globally, and digitised – which has made operational resilience more important while, arguably, increasing risks to it at the same time.

The analytical framework supporting the FPC has adapted to these developments and is at the forefront of efforts to identify and tackle risks across the system.

The tools available to support financial policy analysis have evolved too. There has been material progress in data analytics, alongside advances in the academic literature on risk assessment and policy evaluation.

Finally, there is increasing focus on the financial sector’s role in supporting economic growth.

Over a decade of experience makes now a good time to take stock of the analytical framework that supports the FPC’s policymaking, and to ready it for the next decade.

This paper is structured around three key elements common to such analytical frameworks, stylised in Figure 1 (in white).

The first is policy objectives and instruments: the aims of the policymaker and the tools available to it in pursuit of those aims. Second, a transmission framework that describes how the variable(s) targeted by policy can be affected by shocks and policy. In practice, separate frameworks are often needed for risk assessment and policy evaluation. Third, data (and data analytics) are used to describe the state of the system, help calibrate models and inform modelling.

The FPC’s primary policy objective is financial stability. Its responsibilities in this regard are formally described as relating ‘primarily to the identification of, monitoring of, and taking of action to remove or reduce, systemic risks with a view to protecting and enhancing the resilience of the UK financial system’.footnote [3] While particular systemic risks are emphasised – relating to markets, risk distribution and unsustainable levels of leverage, debt or credit growth – the FPC’s primary objective is not further defined.footnote [4]

The FPC is prevented from exercising its functions in a way that would be expected to damage medium or long-term growth of the UK economy. This can be thought of as a constraint on what the FPC should do in pursuit of its primary objective.

Further, the FPC also has a secondary policy objective to support the Government’s economic policy, including for growth and employment.

Reflecting the myriad of ways in which financial stability can be appraised and undermined, the FPC’s objectives leave broad room for interpretation. This could make it harder for the FPC to scale threats to its objectives and performance against them.

To fulfil its functions, the FPC has a range of policy instruments available to it. These include legally-binding powers of direction to the PRA or FCA over certain macroprudential measures,footnote [5] and non-binding powers of recommendation to a broad set of UK bodies. The FPC also sets the UK’s countercyclical capital buffer (CCyB) applying to banks. Other instruments it can call upon include Bank of England (Bank) balance sheet tools, like market operations, and its own communications.

The Bank’s overall approach to risk assessment and policy evaluation draws on a comprehensive risk assessment framework. Like many other financial stability policy institutions, it has in practice focused on identifying financial system ‘vulnerabilities’ and the ‘resilience’ of key sectors and markets to shocks. But further developing understanding of the real economy effects of financial system propagation of shocks is a growing focus, as is expanding system-wide analysis and identifying the frictions underlying vulnerabilities (and inefficiencies). These developments are particularly relevant in the context of non-bank vulnerabilities for several reasons. These include that the frictions underlying them – and their impacts on the real economy – are typically less well understood than for the banking sector. The non-bank sector also often plays more of an indirect or intermediating role in supporting the real economy. Relatedly, it is also highly interconnected.

A further key challenge in the Bank’s approach to assessing risk and evaluating policy is measurement. There are naturally limits – including technical constraints – to quantifying risks and inefficiencies. But there is always scope to build on advances in the academic literature and practice.

Data (and data analytics) are used at the Bank to describe the state of the financial system, help calibrate models and inform analysis of shocks and policy interventions. There have been significant advances in data gathering, analysis and modelling since the FPC’s inception, in particular for non-banks. This includes the use of system-wide modelling and exercises, such as the Bank’s recent system-wide exploratory scenario (SWES). But gaps in data availability and use remain, and there is more to do to enhance understanding of system dynamics.

Some measurement limitations and data gaps will be inherently unresolvable. This means that quantitative analysis should be supplemented with alternative, complementary approaches such as supervisory and industry intelligence, horizon scanning and the use of SWES-style exercises to build and test understanding of the system. Bank staff already use these approaches in financial stability analysis, but more broadly and systematically integrating them in the analytical framework will enable them to be used more effectively.

The analytical framework described here supports the FPC’s key operating modes, set out in Figure 2.footnote [6] The first is to identify ways in which the FPC’s objectives can be undermined, for example by identifying risks, vulnerabilities and inefficiencies. Where an issue is identified as having the potential to undermine the FPC’s objectives, the FPC takes action, which can involve supporting actions by other authorities or taking direct action itself. If it is determined that the issue is not capable of undermining the FPC’s objectives, or is already being mitigated, the FPC shifts to monitoring mode.

All aspects of the analytical framework described here support the three operating modes of the FPC shown in Figure 2. For example, the transmission framework – supported by data (and data analytics) – enables identification and assessment of the ways that the FPC’s objectives can be undermined, the evaluation of policy to tackle them and monitoring of their incidence and scale. And policy objectives and instruments determine the basis and means for FPC action. Given the importance of the analytical framework to the FPC’s policymaking, it is essential that it continues to evolve with developments in the financial system and remains at the forefront of efforts to identify and tackle risks across the system.

This paper explores the issues discussed above in further detail. Its scope is limited to that of the analytical framework underpinning the FPC’s policymaking. Neither the institutional framework for UK financial policy nor the FPC’s governance processes or operation are in scope. These elements are taken as given for the purposes of this paper. This paper does not undertake a full review of the analytical frameworks underpinning macroprudential authorities’ policymaking in other jurisdictions. However, the work proposed here will draw on international experience and best practice, taking account of similarities and differences between institutional arrangements.

The remainder of this paper is structured according to the three key elements of financial policy analytical frameworks outlined here. Section 2 discusses the objectives (and instruments) of financial policy. Section 3 considers its transmission frameworks – ie risk assessment and policy evaluation. Section 4 looks at data (and data analytics) and Section 5 concludes.

2: The objectives (and instruments) of financial policy

As illustrated by Figure 1, objectives and instruments are defining elements of any policy framework; determining what a policymaker is trying to achieve and how. Section 2.1 examines key insights from the literature on financial policy objectives. It includes a non-technical summary of the relevant concepts (Box A). Section 2.2 turns to how analysis supports the effective use of policy instruments. Section 2.3 assesses how these elements are reflected in the UK’s financial policy analytical framework. Section 2.4 concludes and suggests some enhancements to this framework.

2.1: The rationale for and features of objectives in financial policy

The financial system delivers significant benefits to society by overcoming some of the costs and barriers to activity that would exist without formal financial services. In a world of purely bilateral transactions, such ‘pure’ frictions (eg asymmetric information) and distortions (eg credit rationing) would severely limit risk sharing and efficiency. But financial systems are not perfect. They cannot (and do not) fix all frictions, and they can themselves create or worsen frictions. These ‘derived’ frictions and associated distortions (eg externalities) can, alongside non-rational behaviour, drive vulnerabilities (eg excessive leverage) and inefficiencies (eg misallocations) in the financial system and real economy. These can pose substantial real economic, and so welfare, costs. This justifies a role for public policy. Box A sets out more detail.footnote [7] This section starts by discussing what the overarching ambition of such policy should be, and the limitations to achieving this. It then considers key insights from the literature on appropriate policy objectives.

2.1.1: The overarching ambition of financial policy and limitations to achieving it

Since public policy ultimately aims to enhance welfare, it is useful to frame its goals through the First Fundamental Theorem of Welfare Economics. This states that, under conditions including perfect competition, complete markets and full or perfect information, outcomes are Pareto efficient.footnote [8] This is the ‘first-best’ (or ‘unconstrained efficient’) outcome. For policymakers, this implies intervening to reduce the welfare costs of financial frictions.

But in practice, many of the conditions underlying the First Fundamental Theorem do not hold. For example, some frictions and non-rational behaviours are inherent to financial activity and cannot be fully eliminated. Other frictions and distortions may be associated with fundamental features of supporting activities, such as the form and functioning of the legal system, technology, and tax.

For these reasons, the first-best outcome is unattainable. In that case, the theory of the second best applies (Lipsey and Lancaster (1956)) and the role of the policymaker is to intervene to achieve a ‘constrained efficient’ second-best outcome. That is, trying to maximise welfare subject to unavoidable constraints.

Some second-best policy interventions may appear to lean against the (unachievable) first-best outcome. For example, it is well established that in some cases a policymaker can achieve better outcomes by limiting the excessive build-up of private debt (eg Lorenzoni (2008)), such as where borrowers’ reactions to collateral constraints in stresses impose negative externalities. This is despite the fact that, without the social planner’s intervention, private debt is typically sub-optimally low compared with the first-best outcome.

Consistent with this, second-best interventions that improve overall welfare are likely to involve trade-offs. Some agents may be made better off and others worse off. And economic welfare may be increased in some states/times but lowered in others. In some cases, interventions that improve welfare might exacerbate inefficiencies to reduce vulnerabilities – ie redistributing costs from stressed states of the world to less stressed ones.

Any trade-offs must be carefully balanced within the policy framework. Ultimately, interventions should only be made when they deliver net welfare gains. For example, building resilience ex ante is only justified where the benefits (in terms of reduced welfare costs of crises) are not outweighed by any costs (for example, to growth).footnote [9]

Many policy models targeting second-best or constrained efficient outcomes assume that the policymaker has perfect information (and perfect instruments), while relaxing assumptions such as perfect information for other agents. But in practice policymakers often have imperfect: i) information about frictions and behaviour and ii) tools to tackle them (Section 2.2.1 discusses this in more detail).

2.1.2: Insights from the literature on desirable features of financial policy objectives in the real world

As alluded to in the previous section, the literature is useful for understanding the role of particular frictions and distortions, like pecuniary externalities, in reducing welfare. But it is less useful in guiding what policy objectives should be in practice. This is because most papers assume that a single policymaker (or social planner) directly maximises welfare. There are two practical problems with doing this. First, it is not possible to measure economic welfare in a sufficiently robust way to guide policy. Second, different aspects of public policy are generally delegated to different policymaking bodies.

The practical solution to these problems is for a (delegated) policymaker to minimise a ‘loss function’footnote [10] that defines their policy objectives in an approximately welfare-consistent way, when accounting for the loss functions and actions of other policymakers. The resultant terms in the loss functions represent specific intermediate targets. There are two distinct, although connected, challenges to this approach: i) the form of delegated loss functions and ii) their delegation to multiple policymakers.

Form of delegated loss functions

Some of the literature has explored what financial stability terms a financial policy loss function could include. These papers often extend otherwise standard monetary macroeconomic (‘New Keynesian’) models by incorporating financial stability-relevant frictions or distortions. These result in terms beyond the conventional inflation and output gap terms typically included in loss functions delegated to monetary policymakers:

  • Ferrero et al (2024) incorporate a household collateral constraint and limits to risk sharing between borrowers and savers. These frictions give rise to two financial-stability relevant terms in the loss function. These are the gap between borrowers and savers in marginal utilities of: i) non-durable consumption and ii) housing. These terms reflect both the incomplete risk sharing and the pecuniary externality driven by the collateral constraint.
  • De Paoli and Paustian (2017) incorporate a moral hazard problem between banks and depositors, which leads to a binding leverage constraint and an endogenous spread between lending and deposit rates. These give rise to an additional effective interest rate term in the loss function.

While expanding the literature to map a broader range of frictions into loss functions would be valuable,footnote [11] it would not alone yield a usable financial policy loss function. As Ferrero et al (2024) show, welfare-consistent loss functions often include terms that are difficult to measure. Moreover, the number, complexity, and model-specific nature of financial frictions would require numerous tailored terms.footnote [12] These challenges likely explain why the literature has yet to: i) determine whether a simple loss function could approximate optimal welfare-consistent policy for financial stability and ii) reach a consensus on what terms it might contain. By contrast, these issues have been overcome for monetary policy with an acceptance that minimising a weighted average of deviations of inflation from target and the output gap is likely to be a good approximation to a welfare-consistent loss function.footnote [13]
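For illustration, the monetary policy approximation referred to above is often written in a stylised per-period form (the notation here is generic textbook convention, not taken from this paper):

```latex
% Stylised period loss for a monetary policymaker: squared deviation of
% inflation from target plus a weighted squared output gap.
L_t = (\pi_t - \pi^*)^2 + \lambda \, x_t^2
```

where \(\pi_t\) is inflation, \(\pi^*\) the inflation target, \(x_t\) the output gap and \(\lambda\) the relative weight on output stabilisation. The financial stability terms discussed above (marginal utility gaps, effective interest rate terms) would enter as additional, model-specific terms in such a function.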

Delegation to multiple policymakers

In the real world, policy is typically delegated to multiple policymakers. Reasons why include the benefits of institutional independence, specialisation and expertise. The sheer breadth of public policy also makes it impractical for a single body to manage effectively.

In general, microprudential policy targets the safety and soundness of individual financial institutions, while macroprudential policy targets system-wide stability.

The case for separation of micro and macroprudential policy is largely a practical one. It would be impractical for a single policymaker to oversee the financial system’s vast range of entities and activities, and their interactions with each other and the real economy.

However, theoretically distinguishing the roles of micro and macroprudential policy is difficult, partly because both contribute to system resilience and can interact with each other. These interactions are explored further in Section 3 and are central to understanding how micro and macroprudential goals may align or conflict. The absence of a clean separation in roles could lead to ambiguity in policymakers’ objectives and accountability.

Regardless of how the split is defined, this discussion suggests that practical issues around the delineation of roles and interactions between micro and macroprudential policy are inevitable. This emphasises the importance of co-ordination between them: failures can prevent delegated policymakers from achieving the level of welfare that a single, unified policymaker might. Useful insights can be drawn from papers that formally study the interactions between policymakers. For example, De Paoli and Paustian (2017) study macroprudential and monetary policymaker co-ordination. The authors assign welfare-consistent loss functions to the monetary and financial stability authorities, containing standard inflation and output gap terms for the former, and output gap and effective interest rate terms for the latter. They find that non-coordinated (Nash equilibrium)footnote [14] policy leads to welfare losses compared to the co-ordinated (ie social planner) equilibrium, with the size of these additional losses depending on the precise institutional arrangement between the two policymakers.footnote [15] Poor co-ordination between macro and microprudential policymakers might also be expected to lead to welfare losses. These losses could be high, given strong links between microprudential tools and macroprudential goals. Macroprudential authorities, with their system-wide perspective, are well placed to facilitate co-ordination and reduce these costs.
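The cost of non-co-ordination can be made concrete with a deliberately stylised numerical sketch. All functional forms, spillover coefficients and numbers below are hypothetical and are not drawn from De Paoli and Paustian (2017); the example simply shows how independent best responses (a Nash equilibrium) can deliver a higher combined loss than joint minimisation.

```python
# Two policymakers, each with one instrument (u1, u2) and a quadratic loss.
# Instruments spill over into both targets, and each authority also
# dislikes deviations in the other's target - the source of conflict.

def targets(u1, u2):
    x = u1 + 0.5 * u2   # eg an output gap, moved by both tools
    z = u2 + 0.5 * u1   # eg a financial-conditions gap
    return x, z

def loss1(u1, u2):
    x, z = targets(u1, u2)
    return (x - 1.0) ** 2 + 0.5 * z ** 2

def loss2(u1, u2):
    x, z = targets(u1, u2)
    return (z - 1.0) ** 2 + 0.5 * x ** 2

def argmin_1d(f, lo=-5.0, hi=5.0, iters=200):
    """Ternary search for the minimiser of a convex 1-D function."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

# Nash equilibrium: iterate best responses until a fixed point.
u1 = u2 = 0.0
for _ in range(100):
    u1 = argmin_1d(lambda u: loss1(u, u2))
    u2 = argmin_1d(lambda u: loss2(u1, u))
nash_total = loss1(u1, u2) + loss2(u1, u2)

# Co-ordinated outcome: jointly minimise the sum of losses
# (coordinate descent on a convex joint objective).
v1 = v2 = 0.0
for _ in range(100):
    v1 = argmin_1d(lambda u: loss1(u, v2) + loss2(u, v2))
    v2 = argmin_1d(lambda u: loss1(v1, u) + loss2(v1, u))
coop_total = loss1(v1, v2) + loss2(v1, v2)

print(f"Nash total loss:          {nash_total:.3f}")
print(f"Co-ordinated total loss:  {coop_total:.3f}")
assert coop_total < nash_total  # non-co-ordination is costly
```

In this parameterisation the Nash outcome delivers a strictly higher combined loss than the co-ordinated one, mirroring the qualitative result in the literature; the gap depends entirely on the assumed spillovers and weights.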

Overlaps and interaction between the prudential policies of different country authorities are also relevant. For example, BIS (2017) demonstrates that macroprudential tools – such as loan-to-value limits or countercyclical capital buffers – can have unintended effects on other economies. Moreover, MBF activities in particular tend to operate on a global basis, which increases the degree of international interconnectedness. For these reasons, international fora, which enable co-operation among macroprudential authorities from different countries, are very important.

2.2: Instruments of financial policy

This section first discusses the overarching ambition of policy instruments and constraints on achieving that. It then considers the necessary features of analytical frameworks supporting the use of policy instruments.

2.2.1: Overarching ambition of policy instruments

The previous section argued that financial policymakers’ overall aim should be to maximise economic welfare under ‘second-best’ conditions. They should do this by tackling frictions and their effects, accounting for trade-offs,footnote [16] such as between effects on efficiency and stability.

Doing this requires the policymaker to have enough instruments to target financial frictions and their effects effectively.footnote [17] But, as noted briefly in the previous section, there are limits to what policymakers can achieve in practice, because:

  • Policymakers have imperfect information about frictions, distortions and non-rational behaviour, and so face uncertainty about the effects of interventions. Frictions may also involve significant complexities. For example, they and their costs may be time-varying. In some cases, this is caused by constraints on agents (such as collateral constraints) which only bind occasionally. In others it is caused by the possibility of sudden runs, freezes or disruptions in intermediary funding markets.footnote [18] Some frictions and distortions may also interact with others. For example, asymmetric information can contribute to missing or incomplete markets. And financial services mechanisms that evolved to tackle one friction or distortion may worsen others. For example, the use of collateral may help mitigate the impacts of asymmetric information in lending, but this can create collateral constraints which may drive fire sales under certain conditions. These complexities can make choosing and calibrating an appropriate intervention difficult, particularly for ex-ante instruments.
  • Policymakers may have imperfect instrument sets and face constraints in using them. Imperfections include unclear objectives, gaps in tools’ (collective or individual) coverage, and implementation lags. Some tools may also affect other frictions or distortions. For instance, enhancing resilience at the firm level can sometimes worsen the impact of system-wide vulnerabilities (Section 3). Relatedly, some tools may be unable to tackle frictions and their effects on their own and so multiple instruments may be needed. All these issues can make policymaking more complex.
  • Unclear objectives can also make deploying policy effectively more difficult, as can – often relatedly – co-ordination problems between policymakers. Other factors that might also undermine policy effectiveness include: i) pressure for looser policies as memories of crises fade and create complacency (Palley (2011) for a broader discussion) and ii) ‘leakage’ to jurisdictions and sectors beyond policymakers’ reach, including resident entities domiciled elsewhere (Reinhardt and Sowerbutts (2015) and Frost et al (2019)).

2.2.2: Necessary features of analytical frameworks supporting the use of policy instruments

The challenges set out in the previous section highlight the need for a strong analytical framework to guide policy instrument use. This includes: i) a clear understanding of how instruments ‘work’, ie how they affect frictions, behaviours and/or their effects, ii) mitigants and responses to challenges in using instruments and iii) well-defined objectives and a coherent strategy for instrument use. These are explored in turn:

i) A clear understanding of how instruments ‘work’ is critical for their effective use. In this context, Box B summarises key financial policy instruments and the frictions, distortions and behaviours they target.

ii) Mitigants or responses to challenges in using policy instruments can help improve their effectiveness. For example:

  • Co-ordination problems between policymakers can be mitigated through clear objectives and supportive governance. This includes clear and agreed responsibilities, structured processes for information sharing and cooperation, and a clearly-articulated strategy for instrument use – all of which help clarify accountability.
  • International leakage can be reduced in several ways, including with international standards that address cross-border frictions, ensure a level-playing field, and reduce the risk of financial instability spilling over to other countries.footnote [19]
  • A cautious approach to policy action is not always appropriate where policymakers lack perfect information. Bahaj and Foulis (2017) show that the conventional assumption that policy should act more cautiously in response to uncertainty (as per Brainard (1967)) may not always apply in a financial stability context. If the costs of instability outweigh the benefits of looser policy in the policymaker’s objective function, then uncertainty can justify more aggressive intervention.footnote [20] Bahaj and Foulis (2017) also consider Knightian uncertainty, where policymakers cannot (reliably) assign probabilities to rare events like financial crises. In such cases, a ‘robust control approach’ (Hansen and Sargent (2001, 2008)) – applying the policy that minimises losses in the worst-case scenario – may be appropriate. But Bahaj and Foulis (2017) suggest that this approach has its limits if it leads to the financial system being overregulated and inefficient in normal times. Ultimately, uncertainty requires a pragmatic approach: focusing on interventions that matter most for policy objectives while remaining alert to unintended consequences (Lipsey (2007)).

iii) Well-defined objectives and a coherent strategy for deploying instruments can help address policy implementation challenges in several ways. These include: a) anchoring policy expectations and enhancing policy credibility, b) aiding prioritisation, c) guiding policy instrument selection where frictions and effects of policy on them may interact, and where there are complexities in the nature of frictions (such as non-linearities and variation over time), d) supporting co-ordination with other policymakers and e) strengthening the framework against undue pressure to loosen or tighten policy.
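The robust control idea discussed under ii) can be illustrated with a toy example. The policy stances and loss numbers below are entirely hypothetical: under Knightian uncertainty the policymaker ranks stances by their worst-case loss rather than by an expected loss, which typically pushes towards tighter policy than a probability-weighted calculation would.

```python
# Hypothetical losses L(stance, state) for three policy stances,
# under a normal state and a rare crisis state (illustrative numbers only).
losses = {
    # stance:    (normal-times loss, crisis loss)
    "loose":    (1.0, 100.0),
    "middling": (2.0, 25.0),
    "tight":    (6.0, 10.0),
}

def expected_loss(stance, p_crisis):
    normal, crisis = losses[stance]
    return (1 - p_crisis) * normal + p_crisis * crisis

# If a crisis probability could be assigned (say 2%), minimise expected loss...
p = 0.02
bayes = min(losses, key=lambda s: expected_loss(s, p))

# ...but under Knightian uncertainty no probability can be assigned, so the
# robust control approach picks the stance minimising the worst-case loss.
robust = min(losses, key=lambda s: max(losses[s]))

print("Expected-loss minimiser:", bayes)
print("Robust (min-max) choice:", robust)
```

In this sketch the robust choice is tighter than the expected-loss choice, and is costlier in normal times – the Bahaj and Foulis (2017) caveat that worst-case policy can leave the system overregulated when no crisis materialises.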

2.3: Objectives and instruments in the UK’s financial policy framework

The literature discussed in the previous section offers valuable insights and guidance in several areas. These include: the overarching aim of public policy, optimal policy in response to certain frictions and the costs of co-ordination problems. However, the literature offers little consensus in other areas, such as on specific and tangible financial stability objectives.

The formulation of the FPC’s objectives and the key institutional arrangements underpinning its activities are broadly consistent with many of the key features of this literature.

The FPC’s objectives are set out in its establishing Act.footnote [21] Its primary objective is financial stability.footnote [22] Specifically, the Act tasks the FPC with identifying, monitoring, and taking action to remove or reduce systemic risks (defined as risks to the stability of the UK financial system as a whole or a significant part of it). The Act highlights three key sources of systemic risk: i) structural features of financial markets, such as connections between financial institutions, ii) the distribution of risk within the financial sector and iii) unsustainable levels of leverage, debt, or credit growth. The Government must also provide a written remit to the FPC at least once a year which gives guidance on the FPC’s objectives.

However, beyond this, the Act does not further detail the FPC’s primary objective. In practice, the FPC interprets and communicates its objective via the Bank’s Financial Stability Strategy and other publications.footnote [23] The most recent Strategy, published in 2023, defines a stable financial system as one ‘that has sufficient resilience to be able to facilitate and supply vital services by financial institutions, markets and market infrastructure to households and businesses, in a manner that absorbs rather than amplifies shocks’.footnote [24]

The FPC also has a secondary objective – subordinate to the primary objective – to support the Government’s economic policy, including its objectives for growth and employment. The Government’s annual remit letter sets out what the FPC should consider as part of its secondary objective. The Act further specifies that the FPC should not ‘exercise its functions in a way that would, in its opinion, be likely to have a significant adverse effect on the capacity of the financial sector to contribute to the growth of the UK economy in the medium or long term.’ This serves as a constraint on how the FPC pursues its primary objective.

The allocation of a primary financial stability objective and secondary economic policy objective to the FPC is consistent with two key insights from theory discussed in the previous sections. First, the huge costs of financial crises justify a primary focus on financial stability for a policymaker with a system-wide perspective and purview. Second, it is arguably easier to assign a primary financial stability objective to a single policymaker than to do so for the objective of supporting the Government’s economic policy. This is because the latter can be materially influenced by many policy domains. While financial stability is also influenced by other policymakers, there are relatively fewer relevant actors, making it more feasible to allocate a coordinating role to a single policymaker (the FPC).

The FPC’s remit notes that its primary and secondary objectives will often be complementary. It encourages the FPC to support the secondary objective, where that is not in conflict with its primary objective. But the remit also notes that when conflicts between the two objectives do arise, they should be managed and communicated transparently.

Overall, the FPC’s objectives leave broad room for interpretation. This breadth reflects the myriad of ways in which financial stability (and, even more so, the financial sector’s contribution to the Government’s economic policy) can be appraised and undermined. In other words, it is not a deficiency in the FPC’s remit. Such room for interpretation is also consistent with the state of the literature, which offers no consensus on specific and concrete financial stability objectives. But it could create ambiguity around the FPC’s objectives and pose challenges to policymaking in certain circumstances.

The Act also sets out the FPC’s powers and instruments, and specifies several elements of FPC governance and processes, such as requirements around the publication of meeting records.

To fulfil its functions, the FPC has a range of policy instruments it can call upon.

Regular and ad-hoc communications by the FPC, Bank and PRA are an important part of the financial policy toolkit. Key tools here include: i) FPC meeting records, ii) Financial Stability Reports, iii) publications relating to stress tests on banks, CCPs, insurers and the broader market, iv) speeches and v) other publications such as Financial Stability in Focus papers. These communications variously intend to achieve aims such as supporting accountability, policy signalling and expectation management and raising awareness of risks and structural changes affecting them.

Other tools include the setting or application of capital buffers, powers of direction, and recommendation. The FPC can also advise the Bank on the use of its balance sheet for the purpose of protecting and enhancing the stability of the UK financial system.footnote [25] These are summarised in Table 2.A alongside some examples of their use.

Table 2.A: tools and instruments that the FPC can call upon, and examples of their use

Instrument: capital buffers and legally binding powers of direction to the PRA or FCA over specific macroprudential tools(a)

Details: the UK countercyclical capital buffer (CCyB) rate, which applies to UK exposures and is set quarterly; the Other Systemically Important Institutions (O-SII) buffer, for which the FPC is responsible for the framework and for reviewing it at least biennially;(b) and sectoral capital requirements, leverage ratio requirements and buffers, and mortgage lending limits.

Examples of use: the UK CCyB rate was lowered in March 2020 in response to the economic shock from the pandemic (and later raised in July 2022). A Direction in 2015 specified the minimum level of the leverage ratio for UK banks and building societies as a complement to minimum capital requirements.(c)

Instrument: non-binding powers of recommendation to the PRA, FCA, HMT and other public bodies

Details: recommendations can cover any area within these bodies’ remits but cannot relate to a specified regulated entity. They can be used to tackle risks outside the FPC’s defined instrument set, including for non-banks.

Examples of use: recommendations made in 2023 following the LDI crisis – including to The Pensions Regulator (TPR) – regarding the resilience of LDI funds, TPR’s remit and collaboration with other regulators.

Instrument: Bank of England balance sheet tools

Details: tools include liquidity facilities (eg repo facilities), funding schemes and asset purchases.

Examples of use: the Contingent Term Repo Facility and the Term Funding Scheme (with additional incentives for SMEs) were used in March 2020 amid market and economic stress driven by the pandemic.(d) A targeted and temporary government bond purchase programme was used in the LDI crisis in 2022 to stabilise the gilt market.

  • (a) These buffers and tools apply to UK-authorised banks, building societies, investment firms and regulated lenders (and in some cases, qualifying parent undertakings and subsidiaries) where relevant for the buffer or tool in question. For example, the CCyB applies to all banks, building societies and investment firms (other than those exempted by the FCA) incorporated in the UK, whereas the UK legislation implementing the O-SII buffer restricts its application to ring-fenced bodies and large building societies. And mortgage lending limits apply to regulated lenders above a de minimis threshold.
  • (b) The PRA applies this O-SII framework to firms. The FSB (in conjunction with the BCBS) is responsible for the framework for global systemically important banks (G-SIBs, G-SIIs in the UK framework), but it is applied to UK firms by the PRA. Technically, a firm can be both an O-SII and a G-SII; in such cases the higher buffer applies.
  • (c) The 2015 Direction (and accompanying Recommendation) were updated in 2021 with a new Direction (and Recommendation).
  • (d) The Contingent Non-Bank Financial Institution Repo Facility (CNRF) was introduced in 2025.

The Bank and PRA also have a role in financial intermediation, which can support both financial system stability and efficiency. These roles include the Bank’s operation of payment and settlement infrastructure (RTGS and CHAPS), its founding role in securities settlement infrastructure (CREST)footnote [26] and the PRA’s role in the governance and oversight of the Financial Services Compensation Scheme.

This set of tools and instruments is also consistent with many insights from the literature. For example – as considered optimal in response to certain time-varying frictions and distortions – the FPC has an explicit countercyclical tool, the CCyB, at its disposal. The breadth of tools that the FPC can call upon also reflects the many ways its objectives can be undermined and supported. For example, it has broad powers of recommendation over UK policymakers and can call upon both ex-ante tools (such as the CCyB) and ex-post tools (such as Bank market operations).

The UK’s financial stability framework also helps to address problems associated with the delegation of policy to multiple policymakers. Several bodies and committees have financial stability policy responsibilities or influence, including the FPC, PRA, Financial Market Infrastructure Committee (FMIC), FCA and HMT. And various elements of the institutional framework aim to clarify relative responsibilities and establish the basis of effective co-ordination. These include: i) statutory remits and accountability mechanisms that – where relevant – describe each body’s role in, or duty to consider, financial stability, ii) a coordinating role for the FPC in financial stability policymaking, iii) formal and informal co-ordination structures,footnote [27] iv) the housing of the FPC, PRA, FMIC and balance sheet operations together in the Bank and v) other collaboration, such as through joint surveys.

2.4: Summary and proposed enhancements to the analytical framework

Many of the general challenges in using policy to achieve a constrained-efficient or second-best outcome have been tackled by the frameworks underpinning UK financial policymaking. However, there remains room for improvement – not because of deficiencies, but because both the ways in which the FPC’s policy objectives can be undermined and the analytical tools available to address them have evolved.

These general challenges, how the UK’s financial policy framework and supporting analytical framework have helped to address them – and how the analytical framework could be further enhanced – are summarised as follows.

General challenge 1: inherent difficulty in specifying precise and concrete financial policy objectives.

The UK’s framework for the FPC sets out broad primary and secondary objectives that are necessarily open to interpretation. But, like other similar frameworks, this reflects many challenges including the wide-ranging, uncertain and hard-to-measure ways in which financial stability and contributions to the Government’s economic policy can be undermined. While some flexibility in objectives is understandable and unavoidable – and may even be beneficial as the financial system evolves – insufficient clarity could create ambiguity around the FPC’s role.

What the UK’s financial policy frameworks do to address these challenges.

Several communications, including publications and speeches, aim to clarify the FPC’s objectives and set out how threats to them can affect the provision of services,footnote [28] for example through their impact on the availability of credit for UK households and firms. But these communications stop short of implying quantitative or even very precise qualitative objectives.

Suggested enhancements to the analytical framework.

Efforts to specify the FPC’s objectives in precise and practical terms continue. Further work here can include:

  • Continuing to build and communicate a more specific articulation of the FPC’s financial stability goals, building on work already done to enhance mapping of the provision of financial services to real economic activities.
  • As a complement to that work, the FPC could interpret its objectives in further detail. Box C shows two ways in which this might be done. The first interprets the objectives by identifying the underlying conditions that characterise financial instability and inefficiency – termed ‘instability conditions’ and ‘inefficiencies’. The second proposes ‘financial stability outcome indicators’ which measure how the system performs when subjected to an adverse shock. Such indicators could be the basis for work on loss functions described in the next bullet.
  • Conducting research and analysis to work towards an approximately welfare-consistent financial policy loss function, recognising that this cannot be expected to produce anything like a ‘definitive’ loss function. But it would improve the theoretical underpinnings of the objectives of policy, including an understanding of their links to underlying frictions. And it should help rank different threats to the FPC’s primary and secondary objectives viewed together. Such a loss function should: i) be as parsimonious as possible, ii) contain interpretable and measurable (or estimable) terms, iii) contain terms that can be influenced by policy and iv) reasonably approximate welfare maximisation (in that minimising it supports financial policy goals).
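To make these criteria concrete, a loss function of the kind such research might work towards could take a stylised form such as the following. This is a purely illustrative sketch – the terms and weights shown are assumptions for exposition, not FPC proposals:

```latex
\mathcal{L} \;=\; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^{t}
\Big[\, \lambda_y \,\tilde{y}_t^{\,2}
\;+\; \lambda_c \,\mathbb{1}\{\mathrm{crisis}_t\}\, C_t
\;+\; \lambda_s \,\tilde{s}_t^{\,2} \,\Big]
```

where $\tilde{y}_t$ is the deviation of output attributable to financial disruption, $C_t$ the output cost incurred in a crisis state, $\tilde{s}_t$ a measure of disruption to the supply of core financial services, and the $\lambda$ weights would need to be derived from (or calibrated to approximate) household welfare. Each term is, at least in principle, parsimonious, interpretable, measurable and policy-sensitive, in line with criteria i) to iv).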

General challenge 2: policymakers may have imperfect tools and face constraints in using them.

Imperfections can include imperfect information, unclear objectives, gaps in coverage and imprecision. Policy effectiveness may also be undermined by ‘leakage’ and complacency over crises that leads to pressure for looser policy.

What the UK’s financial policy frameworks do to address these challenges:

  • Within the regulatory perimeter, the FPC can call upon a broad set of tools to address relevant financial frictions and their effects. Safeguards against policy leakage include CCyB reciprocity and monitoring risks beyond the regulatory perimeter.
  • Ongoing work aims to better link these tools to specific vulnerabilities and their impact on the real economy (Section 3), which should help further enhance instrument selection.
  • The FPC’s independence and formally defined powers, along with a clearly defined formal role for Government, should remove ambiguity over relative roles in financial policymaking.
  • The FPC has published its approach to using instruments including the CCyB, leverage ratio and housing tools.footnote [29]

Suggested enhancements to the analytical framework.

Enhancing the FPC’s strategy for intervening in a system-wide manner would help further address some of the challenges discussed here. It would also help ensure that the FPC deploys its tools as effectively as possible, enabling it to focus on what is most important to system-wide stability and to understand the system-wide implications of its policy response. Possible further work here includes: i) the FPC’s use of different instruments in different circumstances, including the use of ex-ante and ex-post tools and ii) accounting for interactions in both frictions – including between those that undermine the primary and secondary objectives – and the tools to address them. This could build on existing work to map frictions to vulnerabilities to effects on the real economy (Section 3).

General challenge 3: different aspects of financial policy are delegated to different policymaking bodies.

This necessarily reflects the breadth of policy scope. But the existence of multiple policymakers can hamper the effectiveness and consistency of policymaking. For example, it could result in risks ‘falling through the cracks’ or contradictory/inconsistent policy.

What the UK’s financial policy frameworks do to address these challenges.

As discussed in Section 2.3, several features of the UK institutional framework aim to clarify responsibilities and establish effective co-ordination between policymakers with financial stability policy roles or influence. The analytical framework underpinning financial stability policymaking supports co-ordination in several ways. These include how analysis is done – such as information sharing, joint analysis and exercises – and what analysis is done. For example, identifying vulnerabilities and their causes can help identify which policymakers should act and how. This also extends to the international level, where the UK plays a leading role in providing analysis around activities undertaken on a global basis, including through its membership of the Financial Stability Board.

Suggested enhancements to the analytical framework.

The enhancements proposed here – providing greater clarity over the interpretation of the FPC’s objectives and enhancing its policy strategy – should help further illustrate its role relative to other policymakers, and support co-ordination with them. And proposals to enhance the risk assessment and policy evaluation frameworks (Section 3) should help identify possible actions for different policymakers by supporting the comprehensive and systemic assessment of risks, their effects and causes.

3: Risk assessment (and policy evaluation)

The purpose of risk assessment is to quantify how financial frictions and behaviours transmit and amplify (together, ‘propagate’) shocks to the real economy and ultimately economic welfare. In thinking about how to enhance the existing framework it is useful to start with the features of the ideal risk assessment framework:

  • Includes all financial frictions, behaviours and vulnerabilities at the entity- and system-level relevant for financial stability.
  • Captures how these frictions, behaviours and vulnerabilities vary across time, horizons, and system states, such as stressed periods. For example, entities may amplify shocks more when their shock-absorbing capacity is already depleted. An ideal framework would reflect these dynamics, including by identifying contingent risks.
  • Captures the general equilibrium consequences of shocks, including the (two-way) interaction between different frictions, vulnerabilities and propagation mechanisms and their macroeconomic effect.
  • Is able to assess the impacts of different types of shock drawn from well-defined probability distributions, as well as to capture endogenous financial cycles. This includes shocks that are both exogenous and endogenous to the financial system (eg Covid and the GFC respectively).
  • Quantifies propagation mechanisms in an empirically-accurate way to enable scaling and ranking of risks.

A framework incorporating these elements would enable consistent tracking of systemic risks across time and scenarios. It would make it easier to monitor the impact of changes in vulnerabilities – such as bank or corporate leverage – on overall risk. It would also help identify which frictions and vulnerabilities matter most, supporting prioritisation.footnote [49]

In its purest form, the ideal risk assessment framework is not achievable in practice. It would require a fully specified, empirically grounded, system-wide general equilibrium ‘macrofinancial’ model (‘macrofinancial’ here refers to the interaction between the macroeconomy and the financial system) capturing all relevant markets, institutions, services and real-economy linkages. Such a comprehensive ‘transmission framework’ is well beyond current modelling and data capabilities.

Nonetheless, meaningful progress toward something resembling the ideal is possible. In this context, this section discusses key challenges in systemic risk assessment and suggests enhancements that could help overcome them. Section 3.1 briefly summarises the Bank’s current approach to financial stability risk assessment. Sections 3.2 to 3.4 consider possible enhancements in risk mapping, scenario analysis and indicators respectively. Section 3.5 discusses supplementary approaches as an alternative to quantitative analysis. Section 3.6 covers models for policy analysis. Section 3.7 summarises and brings together proposals for further work.

3.1: The Bank’s current approach to financial stability risk assessment

The broad contours of the financial stability risk framework at the Bank of England are similar to those used by other financial stability policymakers (Figure 3). The FPC has most recently applied this framework to MBF and operational resilience. In doing so, consistent with the FPC’s primary task to identify, monitor and remove or reduce systemic risks, the focus has been on identifying vulnerabilities and the system’s resilience to them (left-hand and central sections of Figure 3). Work is ongoing to further develop the understanding of i) the effect of systemic risks crystallising on the real economy and ii) feedbacks between the financial system and real economy (right-hand section of Figure 3).

This focus on vulnerabilities and resilience also applies in routine monitoring of financial stability risks via a set of indicators that contain information about the current state of the financial system.footnote [50] These include, for example, the credit to GDP ratio, private non-financial credit growth, corporate bond spreads, the tier 1 capital ratio of major UK banks, and the share of new residential mortgages with high LTVs. Many of the indicators in the set have been shown to be empirically associated with the probability of financial crises occurring and/or with the depth of recessions and the speed of recoveries.footnote [51]
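As an illustration of how one such indicator can be constructed, the sketch below computes a real-time credit-to-GDP gap using a one-sided Hodrick-Prescott filter, broadly in the spirit of the Basel III guide metric. The data series is synthetic and the implementation is a minimal sketch, not the Bank’s production methodology:

```python
import numpy as np

def hp_trend(y, lam=400_000.0):
    """Two-sided Hodrick-Prescott trend: solve (I + lam * D'D) tau = y,
    where D is the second-difference operator. Basel III guidance uses
    lam = 400,000 for quarterly credit-to-GDP data."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)  # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

def credit_gap(credit_to_gdp, lam=400_000.0, min_obs=12):
    """One-sided (real-time) gap: at each date t, the gap is the latest
    observation minus a trend fitted on data up to t only."""
    gap = np.full(len(credit_to_gdp), np.nan)
    for t in range(min_obs, len(credit_to_gdp) + 1):
        trend = hp_trend(credit_to_gdp[:t], lam)
        gap[t - 1] = credit_to_gdp[t - 1] - trend[-1]
    return gap

# Illustrative quarterly series: a steady trend with a late credit boom.
ratio = np.concatenate([np.linspace(130, 150, 40), np.linspace(150, 175, 12)])
gap = credit_gap(ratio)
print(f"latest gap: {gap[-1]:.1f} percentage points")
```

Because the trend is re-estimated using only data available at each date, the gap is not revised by later observations – one reason this style of indicator is favoured for real-time policy use.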

In parallel, Bank staff have developed models and exercises to quantify systemic risks in particular parts of the financial system. One example is stress testing. The Bank Capital Stress Test (BCST, formerly the Annual Cyclical Scenario) assesses whether UK banks hold sufficient capital to maintain lending to the real economy during severe downturns.footnote [52] It is supported by several models/frameworks, encompassing both disaggregated asset-level impairment models and top-down ‘ready reckoners’. Those models can also be combined with expert judgement to conduct desk-based variants of the BCST without participating banks submitting their own stressed projections, as was done in 2024. The Bank also conducts stress tests for insurers and central counterparties (CCPs).footnote [53] Similar to the BCST, the objective of these tests is to assess sector and firm resilience to defined stress scenarios.
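At its simplest, a top-down ‘ready reckoner’ of the kind mentioned above boils down to stress arithmetic: apply stressed loss rates to exposures and recompute the capital ratio. The sketch below is purely illustrative – the exposures, loss rates and capital figures are invented, not drawn from any actual BCST exercise:

```python
# Hypothetical bank balance sheet (£bn) and stressed loss-rate assumptions.
exposures = {"mortgages": 200.0, "corporate": 100.0, "consumer": 50.0}
stress_loss_rates = {"mortgages": 0.01, "corporate": 0.05, "consumer": 0.10}

cet1, rwa = 40.0, 250.0  # starting CET1 capital and risk-weighted assets, £bn

# Apply stressed loss rates to each exposure class and deduct from capital.
losses = sum(exposures[k] * stress_loss_rates[k] for k in exposures)
stressed_ratio = (cet1 - losses) / rwa

print(f"losses: £{losses:.0f}bn, stressed CET1 ratio: {stressed_ratio:.1%}")
# → losses: £12bn, stressed CET1 ratio: 11.2%
```

Real ready reckoners are far richer (they condition loss rates on the scenario path and allow for income and risk-weight effects), but the same exposure-times-loss-rate logic sits at their core.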

Staff have extended this approach to MBF as risks there have grown. This includes the Bank’s recent system-wide exploratory scenario (SWES).footnote [54] The SWES was conducted with around 50 industry participants – including banks, insurers, CCPs, hedge funds, asset managers and pension funds. It aimed to test the resilience of core UK markets to non-bank and bank behaviour during market stresses.footnote [55] Like the BCST, the SWES offers valuable granular insights into exposures, interconnections and behaviours that would be difficult to obtain otherwise. These efforts are supported by a wide range of data sources, including macroeconomic statistics, regulatory, financial market and transaction data, and firm and loan-level datasets.

These activities are a core element of the Bank’s risk assessment capabilities. They could be usefully supplemented with further work – some of which is already in train – to articulate how: i) frictions and behaviours drive vulnerabilities and ii) vulnerabilities propagate shocks to the real economy. This would help make the risk assessment framework (even) more system-wide and help in refining policy strategy. The following sections suggest ways this work could be taken forward, grouped broadly into mapping, modelling, and indicators.

3.2: Possible enhancements to financial system mapping for risk assessment

This section discusses the development of a map of the key components and features of the financial system relevant to financial stability policy. This exercise aims to advance a structured framework for assessing financial stability risks, building on existing and ongoing work. The approach treats the full system map as an ‘atlas’, from which targeted maps – such as for core markets or major exposures – can be drawn based on specific risks or policy questions. Box F sets out an example of how this mapping approach could be applied in the analysis of risks from private equity in comparison to LDI funds.

3.2.1: Overview of the mapping framework

Figure 4 sets out a high-level schematic of the mapping framework. It shows four major ‘nodes’ of the system.

Starting from the real economy (right-hand side of Figure 4), this connects to the financial system through the end-user servicesfootnote [56] households and firms use. These take two forms. First, provider-intermediated services delivered by firms – like banks and insurers – that fully intermediate between the ‘supply chain’ for such services and their users. Second, market-intermediated services – like equity issuance – that are delivered through financial markets but typically accessed by users via intermediaries like investment banks. The end-user services themselves also rely on intermediate services (for example, payments infrastructure) and markets (for example for funding).

There are two relevant dimensions to the propagation of shocks. Shocks can impact the terms or availability of the supply of services, or they can be transmitted through financial channels including, for example, financial market losses or demands for liquidity, such as collateral calls. And they can be direct, for example, if the availability of services (service channel) is impacted by a cyber attack, or indirect, for example, if higher funding costs for banks result in them charging higher lending rates to end users (financial channel). And the real economy can itself transmit or feed shocks back to the financial system, for example to lenders through losses on lending.footnote [57]

Systemic risk manifests in relation to the real economy in two simultaneous but interdependent ways:

  • ‘Service disruption’ including both outright interruption of services and other deterioration in their availability and pricing (such as unduly procyclical provision).
  • ‘Real economy amplifiers’: household and firm vulnerabilities or constraints that cause them to amplify economic shocks and which arise from – or are worsened by – the way that financial services have been/are provided to them. For example, tighter credit constraints and higher initial debt servicing costs may mean that more indebted borrowers cut back their spending by more when economic shocks hit, thereby amplifying those shocks.footnote [58]

Both financial service disruption and real economy amplifiers matter because they affect welfare via their effects on consumption and production. Consistent with that, the following discussion of how to operationalise the mapping framework starts (in Section 3.2.2) with mapping propagation to the real economy. It then traces propagation ‘backwards’ (Sections 3.2.3 to 3.2.5) to the mechanisms and frictions that drive it. This ‘real economy first’ approach ensures that risk assessment remains grounded in its ultimate purpose.

3.2.2: Mapping to the real economy

The first step in mapping the propagation of shocks to the real economy is to identify touchpoints of the financial system with real economic activity (Breeden (2024)). These can be broadly grouped by type of service provided: i) finance and credit, ii) hedging and insurance, iii) long-term savings vehicles, iv) liquidity and payments services. These services are vital to the functioning of the real economy. They ensure that households and businesses can make transactions and manage and take risks. For example, payments, credit card, mortgage and insurance services for households. And they facilitate corporate financing activity across a range of forms – bank lending, commercial paper, the bond market, leveraged lending, private debt, private equity and venture capital. They also enable firms to insure against and hedge risks. For example, through the use of derivatives to hedge commodity, interest rate and currency risks. Further, financial services facilitate the financing of the Government.

It follows from the importance of these services to end-users that the financial markets and ‘intermediate’ services supporting their provision matter too. For example, the gilt market – and the associated repo market – underpins a wide set of other transactions, through its role in pricing risk-free assets and in helping liquidity to flow around the system. That ultimately supports the provision of services that households and businesses use to borrow, save, invest, make payments and insure themselves against shocks.

The second step in mapping propagation to the real economy is to compile a library of metrics that help size its costs at these touchpoints (ie in terms of real economic impacts). This will reflect, in part, the systemic importance of the touchpoints (Box D).

Mapping the costs of propagation presents several challenges.

Mapping and sizing propagation costs is difficult if shocks (and propagation) occur rarely. For instance, while digitalisation and interconnectedness have increased operational risks, no associated disruptions have yet caused significant macroeconomic harm. Of course, it would be wrong to take from this that the risk is necessarily low. But it does mean that there are no episodes on which to base an assessment of its potential effects.

Empirical evidence is incomplete, often missing key propagation channels. Work to fill these gaps is discussed elsewhere in this section and in Section 4. But some gaps will remain. The use of models of the sort discussed in Section 3.3 can help here.

And it can be difficult to compare metrics that size: i) propagation channels and ii) their impacts on the real economy. For an example of the former (i), it is unclear how a shift in the Excess Bond Premiumfootnote [59] of Gilchrist and Zakrajšek (2012) would compare to a similar shift in bank credit supply. While standardising metrics – eg in terms of standard deviations – could improve consistency, precise comparability requires the more integrated modelling of the sort discussed in the next section. For an example of the latter (ii), estimates of the economic impact of propagation are often – naturally – expressed in varying units (eg consumption, investment), complicating comparison.

The proposals in Section 2 to conduct research and analysis that works towards an approximately welfare-consistent financial stability loss function are relevant here. These could help to start framing propagation and its effects in a more common way (in the context of possible precursors to loss function terms). Expressing propagation effects in common units would aid comparison across propagation mechanisms and over time and so support prioritisation. This is important given limited resources for monitoring and mitigation.footnote [60] Converting estimates into GDP terms, though argued to be a poor proxy for overall welfare,footnote [61] could be a practical way to consistently compare propagation channels.
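As a minimal illustration of the standardisation idea discussed above, the sketch below expresses two hypothetical propagation metrics, measured in incompatible units, on a common standard-deviation scale. The series and stressed readings are invented for exposition:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical propagation metrics in incompatible units:
# a corporate bond spread (basis points) and annual credit growth (per cent).
spread_bp = 150 + 40 * rng.standard_normal(80)
credit_growth_pc = 4 + 2 * rng.standard_normal(80)

# Impose a stressed reading in each metric's own units.
spread_bp[-1] = 280.0        # spreads widen sharply
credit_growth_pc[-1] = 7.0   # credit growth picks up moderately

def z_score(x):
    """Express the latest observation in standard deviations from the
    historical mean, making heterogeneous metrics loosely comparable."""
    return (x[-1] - x.mean()) / x.std(ddof=1)

print(f"spread: {z_score(spread_bp):+.1f} sd, "
      f"credit growth: {z_score(credit_growth_pc):+.1f} sd")
```

On this common scale the spread move stands out as the larger shock, even though the two readings cannot be compared in raw units – which is exactly the prioritisation problem the standardisation is meant to ease. As the text notes, such standardisation improves consistency but is no substitute for integrated modelling.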

3.2.3: Describing key nodes in the supply chain of service provision to end users

Figure 4 shows the four key financial system nodes discussed in Section 3.2.2: markets, end-user and intermediate financial services, and the real economy. More detailed maps can disaggregate these nodes into their constituent entities (such as sectors or firms) or activities, as needed. The appropriate level of disaggregation depends on: i) the diversity of constituents in the node and its relevance for systemic risk and ii) the nature of the shock being considered. For example, shocks to services that are universally used – like payment systems – are likely to affect node constituents more uniformly. Table B1.A in the annex outlines a taxonomy for node mapping.

The node mapping exercise could be formalised through a network model, though data limitations constrain its completeness. Some gaps may be addressed using external sources (Section 4) or through exercises like the SWES, which offer valuable granular insights into exposures and interconnections that would be difficult to obtain otherwise. Existing Bank and regulatory datasets also offer significant potential. For example, for a granular view of how securities and liquidity flow through and across financial markets, it would be desirable to link participating entities by combining Securities Financing Transactions Repository, MiFID II and EMIR datasets, to create a network map of securities, gilt repo and derivatives markets.
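Schematically, once counterparties across such datasets have been matched, building a network map reduces to aggregating bilateral exposures into weighted edges. The sketch below illustrates the idea on invented records – the entities, products and notionals are hypothetical, and real SFTR/MiFID II/EMIR data would require substantial entity-matching work first:

```python
from collections import defaultdict

# Hypothetical linked transaction records of the kind matched SFTR, MiFID II
# and EMIR reporting might yield: (reporter, counterparty, product, notional).
trades = [
    ("Dealer A", "Fund X", "gilt repo", 500),
    ("Dealer A", "Fund Y", "gilt repo", 300),
    ("Dealer B", "Fund X", "IR swap", 200),
    ("Fund X", "Dealer B", "gilt sale", 150),
]

# Aggregate into a weighted directed network: one edge per entity pair.
network = defaultdict(float)
for reporter, counterparty, product, notional in trades:
    network[(reporter, counterparty)] += notional

# A crude systemic-footprint measure: total gross notional touching each node.
footprint = defaultdict(float)
for (a, b), w in network.items():
    footprint[a] += w
    footprint[b] += w

for entity, total in sorted(footprint.items(), key=lambda kv: -kv[1]):
    print(f"{entity}: {total:.0f}")
```

Even this toy aggregation surfaces the kind of insight the text describes: here the most connected node is a fund, not a dealer, which only becomes visible once the datasets are linked.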

For banking, market infrastructure, and insurance, mapping will mainly draw on existing supervisory information. However, supplementary sources, such as market intelligence and qualitative data, including on how these services interact with the real economy, remain valuable.

Supplementary sources matter even more for non-banks and MBF, where vulnerabilities and their real economic effects are less well charted. The use of SWES-style exercises to build and test understanding of the system is particularly valuable here. And given the system’s evolving nature, incorporating horizon scanning and industry insights is essential to keep the risk map current and responsive.

3.2.4: Vulnerabilities

The next part of the mapping exercise is to map vulnerabilities. For this purpose, they can be distinguished between: ‘microfinancial’ vulnerabilities (financial and operational susceptibility to shocks at an entity level) and ‘macrofinancial’ vulnerabilities (‘topological’ features of the system and real economy that broaden the incidence and impact of microfinancial vulnerabilities). Box E sets out further detail.

One way of characterising micro versus macrofinancial vulnerabilities is to distinguish between the ‘depth’ and ‘width’ of vulnerabilities.

  • Depth describes the severity of microfinancial vulnerabilities. That is, the degree of sensitivity to a shock at any given entity/node. For example, leverage can deepen vulnerability to shocks, by reducing entities’ ability to absorb losses.
  • Width describes how broadly these microfinancial vulnerabilities and their impacts are spread across the system by macrofinancial vulnerabilities. For example, shocks in core markets that much of the system is connected to, like gilt markets, can have very broad effects.

Some financial system features can drive both depth and width. For example, leverage creates microfinancial vulnerabilities such as refinancing dependencies. But it can also drive macrofinancial vulnerabilities. For example, by creating or increasing interconnections between borrowers and lenders.

Other features might instead trade off depth and width, particularly when behaviour is overlaid. For instance, some attempts by entities to reduce the impact of microfinancial vulnerabilities may exacerbate the impact of macrofinancial vulnerabilities. For example, firms may sell assets or raise margins on counterparties to protect themselves during stress. Though rational individually, such actions can amplify systemic risks. For example, by driving asset price declines and liquidity strains.
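The interaction of ‘depth’ (leverage) and ‘width’ (a common asset holding) in such dynamics can be illustrated with a stylised fire-sale simulation, loosely in the spirit of models such as Greenwood, Landier and Thesmar (2015). All parameters below are illustrative assumptions, not calibrated values:

```python
# 'Depth': each fund's leverage scales how much it must sell after a loss
# (to restore its target leverage). 'Width': a common asset holding spreads
# one fund's sales to the others via the price impact of those sales.
funds = [
    {"assets": 100.0, "leverage": 10.0, "common_share": 0.5},
    {"assets": 80.0, "leverage": 5.0, "common_share": 0.5},
]
PRICE_IMPACT = 1e-3   # price falls 0.1% per unit of the common asset sold

shock = 0.02          # initial 2% fall in the common asset's price
total_sales = 0.0
for _ in range(20):   # iterate the sales-price feedback loop to convergence
    sales = 0.0
    for f in funds:
        loss = f["assets"] * f["common_share"] * shock
        f["assets"] -= loss
        sales += f["leverage"] * loss   # deleveraging sales = leverage x loss
    total_sales += sales
    shock = PRICE_IMPACT * sales        # next-round price fall from the sales

print(f"cumulative forced sales: {total_sales:.1f}")
```

In this parameterisation the first round of forced sales is 14, but the feedback loop roughly triples the total: individually rational deleveraging amplifies the initial shock system-wide, exactly the depth-width interaction described above.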

3.2.5: Identifying how risks might manifest

Having mapped the nodes and key vulnerabilities, the map can then be used to identify how risks might manifest.

In doing this, the map will need to account for how propagation might change in different conditions. For example, due to non-linearities and changes in vulnerabilities in stressed conditions, such as contingent exposures and loss of entity resilience.

The map will also need to integrate international considerations. This presents several challenges. While the approach outlined here is theoretically location-agnostic, in practice, geography affects vulnerabilities and real economy impacts. For example, where cross-border considerations introduce different risks and/or involve different financial activities. This is especially relevant to the UK, which is an open economy with a large financial sector, and therefore particularly exposed to global developments. This applies to both the real economy (such as forex risk and related hedging) and the financial system (eg cross-border risk transfer and booking models). Another challenge is quantifying how global financial activity contributes to UK systemic risk.footnote [62]

In addition to propagation channels, behavioural responses to them (and the distribution of such responses) also play a critical role in determining how shocks affect the real economy. The modelling of behaviour is discussed in Section 3.3.

3.3: Possible enhancements to scenario analysis

Scenario analysis – such as the BCST, SWES and CCP and insurance stress-testing set out in Section 3.1 – is a key tool in the UK’s risk assessment framework. These exercises draw on and complement the mapping exercise by helping to assess propagation channels in the context of specific shocks or types of shock. They can also be used to account for behavioural responses, using various methods (discussed further later in this section). These can range from simple rules of thumb or assumptions about behaviour, to industry/firm information on their likely reactions to specified events, to complex models that attempt to predict behavioural reactions.

In general, any one stress-testing exercise – while very valuable for bringing different elements of the propagation of shocks together – can be resource-intensive and will not fully capture feedback loops between the financial system and the real economy.

For both these reasons, it is desirable to supplement stress tests with desk-based capability that can be used for scenario analysis in a more nimble and flexible way.

3.3.1: Developing desk-based scenario analysis capability

Scenario analysis is a practical and widely used approach to risk assessment. It can enable ‘joining of the dots’ between different sectors and facilitates ‘end-to-end’ analysis from shocks to their ultimate impact on the real economy. The more specific functions of scenario analysis include:

  • Scenario impacts: quantification of financial propagation to the real economy in a set of adverse scenarios, facilitating comparison both between different scenarios and of the same scenario over time.
  • Counterfactual analysis: quantification of scenario impacts in hypothetical states of the world. This enhances understanding of the effects of specific vulnerabilities and helps identify thresholds at which they become significant.footnote [63]
  • Sensitivity analysis: quantification of the effect of changing assumptions about the behaviour of one or more agents (such as the impact of sudden deleveraging by hedge funds). This will pick up non-linearities.
  • Reverse scenario analysis: starting from a targeted degree of impact, this supports analysis of the conditions – including combinations of vulnerabilities and behaviours – under which such impacts might arise.
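
The sensitivity and reverse scenario functions above can be illustrated with a deliberately stylised sketch. The propagation function, parameter names and numbers here are all hypothetical – real applications would substitute the models and data discussed in this section:

```python
# Illustrative sketch only: a stylised one-parameter propagation function,
# used to organise sensitivity analysis and reverse scenario analysis in
# code. All names and numbers are hypothetical.

def real_economy_impact(shock: float, deleveraging_intensity: float) -> float:
    """Map an initial asset price shock (as a fraction) to a stylised
    real economy impact. Amplification is non-linear in the behavioural
    parameter: forced deleveraging compounds the initial shock."""
    amplification = 1.0 + deleveraging_intensity * shock * 10.0
    return shock * amplification

def sensitivity_analysis(shock, intensities):
    """Impact of the same shock under different behavioural assumptions."""
    return {k: real_economy_impact(shock, k) for k in intensities}

def reverse_scenario(target_impact, intensity, lo=0.0, hi=1.0, tol=1e-9):
    """Find, by bisection, the shock size that produces a target impact."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if real_economy_impact(mid, intensity) < target_impact:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, `sensitivity_analysis(0.05, [0.0, 0.5, 1.0])` shows how the same 5% shock produces increasingly severe impacts as the assumed deleveraging behaviour intensifies, while `reverse_scenario(0.10, intensity=1.0)` recovers the shock size consistent with a targeted 10% impact.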

In practice, the effectiveness of scenario analysis depends on its design – particularly the scenarios (Box G on scenario design) and inputs used. The latter can be grouped into two broad categories:

  • Models of the system and behaviours, where these exist or can be efficiently developed.
  • A modular approach. This combines a range of inputs, which can include models as well as other quantitative/qualitative evidence and assumptions about vulnerabilities and behaviours.

Whichever approach is used, applying an ‘Occam’s razor’ principle suggests two things. First, each approach should include only the financial propagation mechanisms relevant to the specific scenario – only certain vulnerabilities are relevant to each shock. For instance, insurers are typically much less exposed to refinancing risk than highly leveraged firms. Second, system-wide approaches should focus on propagation channels with significant real-economy impact. These points support using multiple targeted models, alongside at least one system-wide model that captures propagation across key sectors in stresses or downturns.

In all cases, scenario analysis will depend on timely, granular data and on behavioural assumptions, which may not accurately describe the real world. And as the financial system evolves, both model structures and behavioural assumptions can become less accurate.

3.3.2: Using models for scenario analysis

Designing an effective modelling approach for scenario analysis involves the usual challenges in applied model design. Feasibility constraints mean not all financial amplification mechanisms can be captured in a single model. And in some areas, there is insufficient knowledge for detailed modelling without significant research. Even within the feasible set of models, there are the usual trade-offs between model size and tractability, and between system-wide coverage and detail.

Recent years have seen major advances in models for financial stability scenario analysis (Aikman et al (2023a)). This new generation of models goes beyond first-round effects by incorporating institutions’ behavioural responses and their systemic impacts. Aikman et al (2023a) identify three key drivers of shock amplification: i) the size of firms’ capital and liquidity buffers; ii) their asset liquidation strategies under stress; and iii) how firms are interconnected. Effective models must therefore include assumptions about firm responses to actual (and ideally, expected) changes to cash flow, profit, and balance sheets and capture system interconnections.footnote [64]
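
These three drivers can be made concrete in a toy fire-sale loop – a minimal sketch in the spirit of common-holdings contagion models, not a representation of any model cited here. The balance sheets, leverage target and price impact parameter are all invented:

```python
# Illustrative sketch only: a toy fire-sale iteration showing the three
# amplification drivers above – (i) the size of equity buffers, (ii) the
# (proportional) liquidation strategy, and (iii) interconnection through
# common asset holdings. All balance sheets and parameters are invented.

PRICE_IMPACT = 1e-4  # fractional price fall per unit of forced sales

def mark_to_market(banks, falls):
    """Apply price falls to holdings; losses are absorbed by equity."""
    for b in banks.values():
        loss = 0.0
        for a, v in b["holdings"].items():
            dv = v * falls.get(a, 0.0)
            b["holdings"][a] = v - dv
            loss += dv
        b["equity"] -= loss

def fire_sale(banks, initial_falls, target_leverage, rounds=50):
    """Iterate shock -> losses -> deleveraging sales -> further price falls.
    Returns the (approximate) cumulative price fall per asset."""
    assets = {a for b in banks.values() for a in b["holdings"]}
    cumulative = {a: 0.0 for a in assets}
    falls = dict(initial_falls)
    for _ in range(rounds):
        mark_to_market(banks, falls)
        for a, f in falls.items():
            cumulative[a] += f
        sales = {a: 0.0 for a in assets}
        for b in banks.values():
            total = sum(b["holdings"].values())
            # Sell whatever restores assets/equity to target (everything,
            # if insolvent); sale proceeds are assumed to repay debt.
            excess = total if b["equity"] <= 0 else max(
                0.0, total - target_leverage * b["equity"])
            if total > 0 and excess > 0:
                for a, v in b["holdings"].items():  # proportional sales
                    sold = excess * v / total
                    sales[a] += sold
                    b["holdings"][a] = v - sold
        falls = {a: PRICE_IMPACT * s for a, s in sales.items()}
        if max(falls.values(), default=0.0) < 1e-12:
            break
    return cumulative
```

With two banks holding a common asset, an initial 5% fall in that asset produces a larger cumulative fall once forced sales are accounted for, and spills over to assets not initially shocked – the common-holdings channel.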

These considerations effectively narrow the choice of model type down to semi-structural models, network models (with some added behavioural assumptions), and agent-based models. Table 3.A sets out the pros and cons of these model types for use in financial stability scenario analysis, as well as some example models from the literature.footnote [65]

Table 3.A: Pros and cons of candidate alternative model types for financial stability scenario analysis

Semi-structural
  Pros: flexible and adaptable; capable of good empirical fit.
  Cons: empirical identification challenges (given many equations and parameters); not well suited to granular modelling; usually (near) linear.
  Examples: Budnik et al (2020); Catalan and Hoffmaister (2022).

Network
  Pros: readily capture granular information and interconnections between financial (and non-financial) entities (though lack of data availability can reduce these benefits).
  Cons: require behavioural assumptions to be added; not dynamic (without the addition of dynamic behavioural assumptions).
  Examples: Covi and Hüser (2024); Hüser et al (2024); Sydow et al (2024).

Agent-based
  Pros: well designed to capture heterogeneity and granular information; behavioural responses embedded in model design.
  Cons: computationally expensive; ad-hoc behavioural assumptions need careful justification; system-wide estimation challenging.
  Examples: Bardoscia et al (2024); Liu et al (2020).

  • Table 9 from Aikman et al (2023a) and the text therein provide further discussion of semi-structural and network models.

Based on scenario design considerations, the existing modelling landscape, and the practical issues discussed here, there are two areas in which investment in modelling for scenario analysis would be valuable:

  1. Further development of Covi and Hüser (2024) for quantifying macroeconomic recessionary scenarios. This microstructural stress-testing model simulates the impact of rising corporate defaults on bank and insurer capital using a network model. It incorporates amplification through fire sales and solvency contagion. Rather than producing just point estimates, the methodology produces full distributions of profit, loss and capital outcomes. The model provides a principled method for modelling how the complex interactions between banks and non-banks shape shock propagation in stresses.footnote [66] Recent in-house extensions include the addition of a banking liquidity channel and exploratory work on adding investment funds (via the Lipper TASS dataset).footnote [67] In practical terms, starting with the Covi and Hüser model is desirable because it is UK-calibrated and already in-house, which lowers the cost of implementation, extensions and data updates.
  2. Development of system-wide models for financial market stress scenarios to include:
  • Development of a desk-based, system-wide model using regulatory data and SWES insights to simulate market price shocks and their amplification via fire sales. This can be viewed as a more granular extension of the representative-agent system-wide model of Aikman et al (2019), which captures key players in the market in the spirit of the Duffie (2011) ‘10-by-10-by-10’ proposals.footnote [68] This will allow for scenario analysis – more quickly and at lower cost than a full SWES exercise – as the financial system and risk-taking behaviours evolve. For example, during the episode of market volatility in April 2025, Bank staff used an early version of the model to estimate the losses LDI funds had incurred and were able to quantify the potential scale of LDI-related activity in real time (as described in Box D of the July 2025 Financial Stability Report). Importantly, this desktop modelling approach will rely on continued collaboration with SWES participants to corroborate modelled results and the behavioural assumptions used.
  • Production of a network financial market simulation model with appropriate propagation mechanisms. Building on planned network mapping of entities in securities, gilt repo, and derivatives markets (Section 3.2.3), this model could adopt approaches such as Koijen and Yogo’s (2019) ‘demand system asset pricing’ or a semi-structural method drawing on the post-SWES modelling agenda described above. Relative to that work, this would draw on broader data sources and capture more interconnections. However, it may give an incomplete picture of entities’ balance sheet positions (due to flow-based data) and face tractability challenges given dataset complexity. It remains to be seen whether data limitations could be overcome.

Relative to the ideal framework, however, this would still leave some gaps in capability.

First, neither proposed modelling approach includes a ‘macroeconomic block’, meaning that propagation to and from the real economy is incomplete. Impacts stop at intermediate financial outcomes (eg asset prices), requiring off-model analysis to flesh out empirical links and complete the risk assessment. Feedback loops from the real economy back to the financial system are also omitted.

Adding such propagation to these models is possible in principle,footnote [69] but is likely to be very challenging in practice. Absent a fully-integrated approach, the best alternative is to include models explicitly designed to capture macrofinancial amplification (Box H).

Second, neither of the modelling avenues proposed for building a macrofinancial scenario capability caters for operational shocks. As summarised in Box B of Adeney et al (2024), the literature is largely confined to modelling of cyber events with a US focus. Substantial further work would be needed to develop a UK-focused model capable of quantifying operational risk scenarios. More subtly, broader operational considerations and structural changes (like the adoption of AI) can also affect financial propagation mechanisms. For example, operational issues materially exacerbated the fire sale amplification in gilt markets during the LDI episode (Table 1).footnote [70]

3.3.3: Using a modular approach for scenario analysis

Using a modular approach – as discussed in this section – can help address the two key gaps in the use of modelling for scenario analysis discussed in Section 3.3.2. First, it can provide an end-to-end approach to scenario analysis – from the scenario through the system to real economy impacts. Second, it can enable operational risk analysis, by incorporating information on operational dependencies and behavioural reactions to operational shocks.

This could approximate some benefits of a comprehensive model – like capturing feedback loops and expressing risks in comparable terms. This approach can also blend sophisticated analytics with more ‘rough and ready’ assumptions or estimates where necessary. It can draw on a very wide range of information. This can include:

  • Information and insights from the mapping exercise. As discussed previously, this can draw on a wide range of data, including supervisory information and market intelligence.
  • Models of the type discussed in the previous section. These can be combined with other modules that translate the outputs of these models into real economic effects. Different parts of the system could also be connected using system-wide models or by joining the dots between existing different stress testing exercises – banking, insurance, central counterparties, and market-focused SWES scenarios.
  • Firm information on their responses to specified scenarios. This can include fully modelled responses as in stress tests with industry participation. But desk-based versions can also draw on collaboration with industry to validate assumptions, as discussed in the previous section.
  • Systemic risk (and other) indicators discussed later in this section. These can for example enable assumptions to be made about relationships between different parts of the transmission chain, even where a full understanding of the drivers of that relationship is not (yet) possible.
  • Simple assumptions like ‘rules of thumb’ on behaviour and exposures where necessary.

In this spirit, Figure 5 illustrates a modular approach to systemic risk assessment. It starts by mapping micro and macrofinancial vulnerabilities (boxes). Behavioural assumptions (arrows labelled ‘behaviour’) are then overlaid to reflect how entities respond to these vulnerabilities, ‘activating’ the map and enabling assessment of real economy impacts.footnote [71] In the absence of a single unified model, different behavioural ‘modules’ can be linked to relevant parts of the map to simulate similar dynamics.

For instance, one module could estimate how fund redemptions respond to shocks using simple regressions or rules of thumb. The outputs of that module (redemption sensitivities) could then calibrate fund behaviour in a system-wide stress model. The resulting price impacts – eg on bonds – could feed into further modules estimating effects on bank lending or collateral values. This modular approach supports practical, step-by-step quantification of shock transmission to the real economy. While it may miss complex interactions captured by more integrated models, it offers a foundation that can be built on over time.
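
A hypothetical version of this chain can be sketched as follows. The module boundaries, regression data, market depth and collateral elasticity are all invented for illustration:

```python
# Illustrative sketch only: three hypothetical 'modules' chained together,
# as in the modular approach described above. The data, market depth and
# elasticity parameters are invented.

def estimate_redemption_sensitivity(shocks, redemptions):
    """Module 1: OLS slope of fund redemptions on past market shocks."""
    n = len(shocks)
    mx, my = sum(shocks) / n, sum(redemptions) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(shocks, redemptions))
    var = sum((x - mx) ** 2 for x in shocks)
    return cov / var

def price_impact_module(redemptions_bn, depth_bn_per_pct=2.0):
    """Module 2: bond price fall (per cent) implied by forced sales."""
    return redemptions_bn / depth_bn_per_pct

def bank_lending_module(price_fall_pct, collateral_elasticity=0.3):
    """Module 3: change in bank lending (per cent) via collateral values."""
    return -collateral_elasticity * price_fall_pct

def run_chain(shock_pct):
    """Chain the modules: shock -> redemptions -> prices -> lending."""
    beta = estimate_redemption_sensitivity(
        shocks=[1, 2, 3, 5, 8],                  # past shocks (per cent)
        redemptions=[0.5, 1.1, 1.4, 2.6, 4.0],   # past redemptions (£bn)
    )
    price_fall = price_impact_module(beta * shock_pct)
    return bank_lending_module(price_fall)
```

Each module can later be replaced by a more sophisticated component – for example, swapping the simple regression for fund-level models – without redesigning the rest of the chain.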

Further work – some of which is already ongoing – is required to build this mapping. This includes advancing understanding of the criticality of each type of service and, by extension, their providers. Of relevance to this is how ‘systemic’ each service is (Box D). Other work – discussed later in this section – includes progressing the FPC’s macroprudential approach to operational resilience.

3.4: Additional systemic risk (and more heterodox) indicators

As a complement to the mapping and scenario analysis discussed in the previous sections, it is also desirable to develop a small suite of indicators that measure systemic risk over time. Relative to naturally disaggregated mapping and modelling approaches, these indicators are more aggregate by design and so may facilitate more consistent comparison over time. Importantly, they do not always rely on an understanding of the exact mapping of vulnerabilities and behaviour to systemic risk. For these reasons, systemic risk indicators provide a natural ‘top-down’ cross-check on conclusions drawn from more disaggregated or ‘bottom-up’ risk assessments.

Examples of potentially useful systemic risk indicators – in addition to indicators already in use at the Bank – include:

  • The Composite Indicator of Systemic Stress (CISS), which uses standard portfolio theory to aggregate five market-specific indices created from 15 separate financial stress measures (Kremer et al (2012)).
  • Asset-price based indicators of systemic risk, like SRISK (Brownlees and Engle (2017)), the Excess Bond Premium (Gilchrist and Zakrajšek (2012)), MES (Acharya et al (2017)), and crash risk (Martin and Shi (2023)).
  • A measure of financial market risk perceptions shown to be associated with macroeconomic outcomes (Pflueger et al. (2020)).
  • A credit market sentiment index, comprising economic activity and credit market sentiment factors and the probability that the economy is in an adverse state, constructed by the Federal Reserve Bank of Richmond.footnote [72]
  • A measure of aggregate leverage in financial markets combined with a stability condition which can be used by policymakers as an early warning indicator (Adrian, Borowiecki and Tepper (2022)).
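
The portfolio-theory aggregation behind the first of these indicators can be illustrated with a toy calculation – a sketch in the spirit of the CISS, not a reproduction of it (the actual CISS uses recursively estimated, time-varying correlations on daily data; the sub-indices, weights and correlation matrices below are invented):

```python
# Illustrative sketch only, in the spirit of the CISS (Kremer et al (2012)):
# market-level stress sub-indices in [0, 1] are aggregated as (w*s)' C (w*s),
# so the composite is highest when stress is both elevated AND co-moving
# across markets. All inputs below are invented.

def ciss_style_aggregate(sub_indices, weights, corr):
    """Portfolio-theory aggregation of stress sub-indices."""
    ws = [w * s for w, s in zip(weights, sub_indices)]
    n = len(ws)
    return sum(ws[i] * corr[i][j] * ws[j]
               for i in range(n) for j in range(n))

stress = [0.8, 0.7, 0.9]   # eg money, bond and equity market sub-indices
weights = [1 / 3] * 3

# Same stress levels, different co-movement.
uncorrelated = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
crisis = [[1, 0.9, 0.9], [0.9, 1, 0.9], [0.9, 0.9, 1]]
```

With identical sub-index readings, the composite under the crisis correlation matrix is well above that under the uncorrelated one – the ‘systemic’ signal comes from co-movement across markets, not just the level of stress in each.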

Consideration should be given to the extent to which individual indicators:

  • Provide new information relative to other approaches and indicators. Most valuable are those indicators that shed light on areas where data gaps and challenges to mapping/modelling are greater, such as for non-banks and MBF.
  • Predict the build-up of risk, or instead summarise the current state of the system. For example, there is evidence that some of the asset-price based indicators referenced here have predictive power (Acharya et al (2024)). By contrast, the CISS metric is more of a summary statistic of current conditions.
  • Distinguish between effects driven by i) financial frictions and ii) fundamentals. For example, rising corporate bond spreads due to weaker business conditions do not indicate financial propagation, whereas those driven by bond fire sales do.footnote [73] Broader Bank research on how to distinguish between such effects includes Banks et al (2024). This looks at the extent to which changes in bank lending are unwarranted given changes in macroeconomic conditions.footnote [74] Extending this analysis to include non-banks/MBF would be a valuable next step.footnote [75]

Alternative ways of thinking about the financial system may also help summarise its current state. For example, the ecological network approach described in Ulanowicz (2020) could be applied to analysis of market stability, recognising that systems can be inherently unstable and sustainability requires a balance between efficiency and flexibility. This approach could be applied to the proposed network modelling of core UK financial markets described earlier. More broadly, adapting stability metrics from the natural sciences, as suggested by Haldane and Turrell (2017), warrants further exploration.

3.5: Broader limitations to quantitative analysis and supplementary approaches

Implementing the proposals discussed in preceding sections would materially improve the financial stability risk assessment framework used at the Bank. Nonetheless, several factors mean that gaps to the ideal framework will inevitably persist. For example:

  • Models are necessarily gross simplifications of the real world. They can provide insights, but not definitive answers. In particular, behaviour is always difficult to model accurately, especially where there is little evidence on which to base assumptions or where sectors include both domestic and global participants who may respond to shocks differently.
  • The financial system is complex and is constantly evolving. This means that despite efforts to update them, the map and models can be out-of-date in certain respects, to varying degrees.
  • Data gaps can materially hamper quantitative risk assessment. Some can be filled, including via exploratory exercises, but some will always remain.

These types of limitation are generic to all analytical frameworks supporting policy, but are especially relevant for financial stability, given the complexity of the policy problem and the pace of financial system evolution.

More generally, the limitations of quantitative analysis imply that complementary, more qualitative approaches should also be part of the risk assessment framework. These include horizon scanning, supervisory and market intelligence, and alternative approaches to risk identification that are cruder and/or less data-dependent – for example, exploring parallels between historical episodes of financial instability and present circumstances. Other approaches could include the identification of rapid shifts in risks, activity or profitability in certain parts of the financial system, or simple indicators like rapid growth in credit to the real economy.

Ultimately, policymaking is a matter of informed judgement, given the inherent limitations of models. This means risk assessment should be done holistically, combining all relevant quantitative and qualitative information to guide judgement.

3.6: Models for policy analysis

Risk assessment frameworks discussed in preceding sections can be useful for assessing: i) candidate policy interventions and ii) the value of policy interventions in stresses, by identifying how the financial system propagates shocks.

But such analysis of shock propagation is less useful for evaluating the costs of policy outside of stresses.footnote [76] Further, the types of model best suited to system-wide scenario analysis – like network or semi-structural models – are typically less well-suited to policy analysis.footnote [77]

For these reasons, models explicitly designed for policy analysis are considered separately here.

The models discussed in this section should not be considered as anything like definitive guides for real-world policy, given their inherent simplifications. Rather, they serve as experimental tools to explore the rationale for and potential effects of interventions – for example, counterfactual analyses of the impact of different policy interventions (or reaction functions) on the real economy and, ideally, economic welfare. Another important use of these models is to analyse how appropriate policy responses vary with different types of shock. In this way, model outputs can inform state-contingent policy design and cost-benefit assessments of interventions, and so are a valuable input to policy strategy.

When evaluating models for financial stability policy analysis, the following criteria matter:

  • General equilibrium. The model should be capable of tracing the effects of policy interventions through the financial system to the real economy, accounting for both direct and indirect effects.
  • System-wide. Models should encompass as many relevant parts of the financial system as possible, to account for interconnections, risk migration and ‘leakages’ between sectors.
  • Policy interactions. The model should incorporate other policies that affect (or are affected by) the intervention in question.
  • Micro-founded. The model should ideally be built up from ‘deep’ behavioural assumptions and associated parameters, ensuring that resulting decision rules are invariant to different policies (per the Lucas critique). The inclusion of utility-maximising agents would also allow for explicit welfare analysis.
  • Heterogeneity. As discussed in Section 2, financial policy interventions can have distributional effects; assessing these requires sufficiently granular models. Adequately capturing the propagation of shocks through the financial system to the real economy also requires some heterogeneity (eg because financial constraints bind on a subset of entities).
  • Empirical realism. Policy models should approximate relevant empirical features – such as observed effects of an unanticipated policy intervention.
  • Tractability. Model behaviour needs to be comprehensible and explicable (including to non-specialists as required). The model should be deployable quickly – including in terms of compute time – so it can be used to analyse ‘live’ policy questions.

As discussed in Section 3.3.2, model design involves practical trade-offs. For example, a model that performs well along the dimensions of ‘general equilibrium’, ‘system wide’, ‘policy interactions’, ‘micro founded’ and ‘heterogeneity’ would perform less well on ‘tractability’. And while there is some continuity across model classes, key differences often make the choice of model and solution method discrete. Ultimately, the choice depends on the specific policy question. In some cases, using multiple approaches may be beneficial. Figure 6 offers a high-level RAG assessment of key model classes against several of the above criteria.footnote [78]

The use of near-representative agent, near-linear DSGE models is well-established in policy institutions. These models benefit from mature macrofinancial linkage modelling techniques and tractability. They enable the application of powerful toolkits for policy analysis (including those developed at the Bank – Harrison and Waldron (2021)). Although linear, these models can be applied in a piecewise manner – incorporating the effect of occasionally-binding constraints (under a perfect foresight assumption) – to generate non-linearity and asymmetry. They are also well-suited to analysing policies – and their interactions – where: i) theoretical foundations matter (eg for explicit welfare analysis), ii) rich heterogeneity is not critical, and iii) policy questions are more easily tackled in a linear setting (eg De Paoli and Paustian (2017), and Ferrero et al (2024)). Examples of such policy questions include analysis of the interaction between optimal financial stability and monetary policy with a zero lower bound (ZLB) constraint.

Heterogeneous agent DSGE models are based on the same theoretical paradigm but recognise differences between agents.footnote [79] These models are inherently better suited to financial stability policy analysis than near-representative agent models for several reasons. First, they recognise market incompleteness as central to financial stability issues. Second, they capture the uneven distribution of leverage and other state variables that drive amplification, enabling it to be more accurately modelled. Third, they reflect distributional effects of financial stability policies more effectively. Overall, these models are preferable when: i) heterogeneity is crucial to the policy question, ii) theoretical foundations matter (eg, for explicit welfare analysis), and iii) linearity doesn’t. Examples of such policy questions include cost-benefit analyses of all types of financial stability policy.

However, heterogeneous agent DSGE models can be less tractable than representative agent models: compute time is expensive and solution methods are more model-specific. The scale of these costs – and these models’ ability to capture rich heterogeneity and non-linearity – depends heavily on how they are specified and solved. Fully non-linear models with high heterogeneity (eg Bewley-Aiyagari-type) suffer from the computational cost of the ‘curse of dimensionality’, which limits their size and so makes them ill-suited to system-wide policy analysis. But there is something of a continuum of model types between linear (or piecewise-linear) representative agent models and fully non-linear heterogeneous agent models (as captured in the third row of Figure 6). Some of the recent monetary policy-focused Heterogeneous Agent New Keynesian literature falls within this middle ground (eg Ravn and Sterk (2021)).footnote [80] This includes macrofinancial models (examples in Figure 6) but could usefully be developed to include explicit modelling of financial stability policy.

Agent-based models (ABMs) are an alternative class of micro-founded, heterogeneous-agent model that is not built around the utility maximisation and rationality assumptions typical of DSGE models. By dispensing with optimising behaviour, they break the curse of dimensionality, offering several advantages over fully non-linear heterogeneous agent DSGE models. They can be much larger, with rich heterogeneity across multiple dimensions (eg households and financial institutions). They can also incorporate more sophisticated and realistic interactions between agents (eg auction mechanisms) and multiple constraints (eg LTV, LTI and debt service constraints on household borrowing). Dispensing with optimisation and rational expectations means that ABMs can also generate a greater range of non-linear dynamics. These advantages make ABMs better suited to analysis that requires a high degree of heterogeneity and/or a fuller modelling of the financial system – for example, where policy questions are system-wide and/or involve policy interaction, or where ABMs are judged to be more realistic along some dimensions (eg where non-rationality adds empirically relevant dynamics, such as endogenous boom-bust financial cycles).footnote [81] Greater computational tractability also allows exploration of a broader range of scenarios, including counterfactual policy experiments.
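
As a minimal illustration of this last point – and emphatically not a representation of any cited model – a few dozen lines suffice for an ABM in which trend-following ‘chartists’ and mean-reverting ‘fundamentalists’ interact to generate endogenous price cycles around a fundamental value. All behavioural rules and parameters are invented:

```python
# Illustrative sketch only: a minimal agent-based market. Trend-chasing
# agents amplify price moves while fundamentalists lean against them; their
# interaction generates endogenous boom-bust cycles around the fundamental
# value. All rules and parameters are invented.

import random

def simulate(steps=500, n_chartists=60, n_fundamentalists=40, seed=1):
    random.seed(seed)
    fundamental = 100.0
    prices = [100.0, 101.0]  # seed an initial uptrend
    depth = 64.0             # market depth scaling price impact
    for _ in range(steps):
        trend = prices[-1] - prices[-2]
        gap = fundamental - prices[-1]
        demand = 0.0
        for _ in range(n_chartists):        # chartists chase the trend
            demand += 1.0 * trend + random.gauss(0.0, 0.5)
        for _ in range(n_fundamentalists):  # fundamentalists lean against it
            demand += 0.5 * gap + random.gauss(0.0, 0.5)
        prices.append(prices[-1] + demand / depth)
    return prices
```

With these parameters the linearised price dynamics have complex roots, so the simulated path repeatedly overshoots and undershoots the fundamental value in noise-sustained cycles.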

That said, some of these advantages of ABMs involve trade-offs. Flexibility and the potential for complexity can lead to arbitrary behavioural rules that make overall model behaviour harder to understand.footnote [82] And while the absence of an explicit utility function precludes formal welfare analysis, ABMs – like heterogeneous-agent DSGE models – can still support broader policy evaluation like cost-benefit analysis. ABMs have been applied to a diverse range of topics including interbank markets, housing, financial markets, climate change, and payment systems (Borsos et al (2025)).

Semi-structural models cover a wide range of alternative models, including system-wide models of the sort described in Section 2 and DSGE variants (eg Aikman et al (2024)). Their main drawback in policy analysis is a lack of grounding in economic theory, subjecting them to the Lucas critique. While they can be more empirically congruent than micro-founded models, semi-structural models are not without empirical issues. For example, they are prone to identification issues associated with their flexible structure. And larger variants suffer from practical estimation issues given the volume of data required to estimate a (typically) larger number of parameters. Semi-structural models are best suited to policy analysis where microfoundations and associated advantages are not important and where the flexibility or tractability of the model class is. For example, where system-wide analysis is needed, such as in understanding policy ‘leakages’, or where system-wide propagation is a key feature (eg in modelling crystallisation of liquidity risk).

In all cases, there is a question of how the model should be estimated or calibrated. As well as the usual general considerations (not described here), there are some specific challenges for financial stability applications. In particular, time-series approaches risk overfitting to rare events like the GFC and Covid, which may not reflect current dynamics or future shocks.footnote [83] For models with heterogeneity, cross-sectional or panel data can offer alternative estimation or calibration strategies, though with their own limitations. Given these issues and broader model uncertainty, testing the sensitivity of model outputs to alternative parameterisations is desirable (as is the use of alternative models altogether where possible).

In summary, models of the type discussed in this section can play a valuable role in evaluating policy, including to: i) routinely examine interactions between monetary and macroprudential policy; ii) enhance model-based analyses of policies in use; and iii) help explore emerging areas of policy development, such as policy for non-banks/MBF.

3.7: Bringing the proposals together

The frameworks underpinning UK financial policymaking aim to tackle several of the general challenges in risk assessment discussed in the preceding sections. However, as for other aspects of the analytical framework, there is always room for improvement, particularly as the nature of threats to UK financial stability and the analytical tools available to address them both evolve.

These general challenges, how the UK’s financial policy framework and supporting analytical framework have helped to address them – and how the analytical framework could be further enhanced – are summarised as follows.

General challenge 1: difficulties in systematically identifying vulnerabilities and amplification of shocks to real economic outcomes via financial service provision.

What the UK’s financial policy frameworks do to address this challenge.

Like those of other financial stability policy institutions, the UK’s risk assessment framework has focused on identifying vulnerabilities. It has broadened and deepened over time, shifting its focus towards building threats to stability, such as those posed by MBF and operational risks. And ongoing work is further developing understanding of how the financial system propagates shocks to and from the real economy.

This includes work (eg the FPC’s macroprudential approach to operational resilience) assessing operational risks to key nodes in systemically important markets and how these risks can be amplified through systemic vulnerabilities and transmission channels, ultimately affecting the delivery of vital services to the real economy. Market-wide mapping and scenario analysis can enhance understanding of how operational disruptions propagate, particularly when they occur alongside existing financial stress. This approach can reveal how and where operational disruption exacerbates financial instability, highlighting areas where regulatory or firm-level intervention may be necessary to mitigate risks.

Suggested enhancements to the analytical framework.

Construct a ‘map’ of the financial system that systematically identifies – and where possible quantifies – propagation of shocks to real economic outcomes. This work would include:

  • Compiling a taxonomy of information of the types set out in the annex tables. It is essential here to distinguish between effects driven by i) financial frictions and ii) fundamentals (as discussed earlier, along with work that could support this).
  • Building a library of estimates and ready reckoners that help size propagation mechanisms (and identify gaps).
  • Producing a system-wide dashboard of indicators for the vulnerabilities and propagation channels identified in the map, and collating estimates and data that help size them.
  • Analysing further the role of different financial services in real economic activity, and the impact of disruption to them.
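As a sketch of what a library of ready reckoners could look like in practice, the snippet below keeps named pass-through estimates in a simple lookup and flags unmapped mechanisms as gaps. The mechanism names and values are hypothetical placeholders, not Bank estimates:

```python
# Hypothetical ready-reckoner library: each entry maps a named propagation
# mechanism to a simple linear pass-through estimate (illustrative values only).
READY_RECKONERS = {
    "credit_supply_to_gdp": 0.30,     # ppt GDP impact per ppt fall in credit growth
    "fund_outflow_to_spreads": 0.15,  # bps spread move per unit of fund outflows
}

def size_impact(mechanism: str, shock: float) -> float:
    """Return a first-pass impact estimate, or raise to flag a library gap."""
    try:
        return READY_RECKONERS[mechanism] * shock
    except KeyError:
        raise KeyError(f"no ready reckoner for '{mechanism}': record as a gap to fill")

# First-pass sizing of a 2ppt credit supply shock
print(size_impact("credit_supply_to_gdp", 2.0))
```

The design choice is that a missing mechanism fails loudly rather than returning zero, so the library doubles as a register of gaps to be filled.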

General challenge 2: stress tests can be resource-intensive and may not fully capture feedback within and between the financial system and the real economy.

What the UK’s financial policy frameworks do to address this challenge.

Bank staff have developed models and top-down ‘ready reckoners’ that can be used, in conjunction with supervisory and other data, to help run desk-based variants of stress tests. Importantly, these draw on bottom-up industry exercises to fine-tune desk-based capabilities.

Suggested enhancements to the analytical framework.

1. Build on existing work to supplement the Bank Capital Stress Tests and exploratory exercises like the SWES with models that can be used for scenario analysis in a more nimble and flexible way:

  • Development of system-wide models for financial market stress scenarios: i) Using regulatory data, insights from the SWES, and ongoing contact with SWES participants to construct a system-wide model along the lines of Aikman et al (2019), but with heterogeneity within sectors following a variant of the Duffie (2011) ‘10-by-10-by-10’ proposal.footnote [84] ii) Drawing on planned network modelling work to consider whether to build a network financial market simulation model or a demand system asset pricing model.
  • Exploration of modelling approaches that incorporate macrofinancial feedback mechanisms not captured by other proposals here: i) Continued development of an in-house semi-structural model to capture feedback between the banking sector and real economy in adverse macroeconomic scenarios. ii) Further investigation of the merits of extending the semi-structural model of Aikman et al (2024) to improve its empirical realism and mapping to the regular real economic risk assessment and/or investigation of the merits of using any model adopted for capital analysis.
  • A model or models for assessing, over time, the potential implications of macroeconomic and financial system stresses for bank capital buffers, as an input to the FPC's quarterly CCyB decisions. This could be done by extending Covi and Hüser (2024) and developing an in-house semi-structural model.footnote [85]
  • Further development and extension of Covi and Hüser (2024) to assess propagation from banks, insurers, and investment funds in macroeconomic recessionary scenarios.

2. Develop a modular approach to scenario analysis. This would build on the mapping, indicators and models discussed earlier to provide an end-to-end approach to scenario analysis – from the scenario through the system to real economy impacts. Depending on the risk being assessed, different inputs would be needed to simulate different parts of the propagation chain supplied by the mapping and indicators. The aim would be to standardise inputs as far as possible and – while recognising the caveats and qualifications – use a modular approach to combine them, to more fully simulate the propagation of shocks to the real economy. Scenarios for different parts of the system could also be connected using system-wide models or by joining the dots between existing different stress testing exercises – banking, insurance, central counterparties, and SWES scenarios.
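One way to picture the modular idea: each module maps a description of system state to an updated state, so modules covering different parts of the propagation chain can be composed end-to-end, from the initial scenario through to real economy impacts. A minimal sketch, in which the module behaviours and all coefficients are purely illustrative:

```python
from typing import Callable

# Each module transforms a dictionary of system state, so modules for
# different parts of the propagation chain compose into one pipeline.
# Module names and numbers are illustrative, not Bank models.
Module = Callable[[dict], dict]

def market_shock(state: dict) -> dict:
    state["asset_prices"] *= 1 - state["shock_size"]          # initial repricing
    return state

def fund_redemptions(state: dict) -> dict:
    outflow = 0.5 * state["shock_size"]                       # stylised redemptions
    state["asset_prices"] *= 1 - 0.2 * outflow                # second-round impact
    return state

def real_economy(state: dict) -> dict:
    state["gdp_impact"] = 0.1 * (1 - state["asset_prices"])   # pass-through to output
    return state

def run_scenario(modules: list[Module], state: dict) -> dict:
    for module in modules:   # apply each stage of the propagation chain in turn
        state = module(state)
    return state

result = run_scenario([market_shock, fund_redemptions, real_economy],
                      {"asset_prices": 1.0, "shock_size": 0.1})
print(round(result["gdp_impact"], 4))
```

Because inputs and outputs are standardised, a module could be swapped for a richer model of the same stage (for example, a system-wide model of fund behaviour) without changing the rest of the chain, which is the point of the modular approach.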

General challenge 3: limits to mapping and modelling. Some, such as data and modelling gaps, may be overcome in time. But others are inherent: models are necessarily simplifications of the real world.

What the UK’s financial policy frameworks do to address this challenge.

Bank staff draw on a broad spectrum of data sources – including supervisory insights, horizon scanning, and industry and market intelligence – to close data gaps and enhance quantitative analyses (see Section 4 for further discussion of staff work with data). The Bank also continuously advances its risk assessment framework by developing, extending, and integrating innovative modelling approaches, such as those proposed by Covi and Hüser (2024), ensuring that it keeps pace with the latest advances in modelling technology.

Suggested enhancements to the analytical framework.

  • To use SWES-type exercises to investigate risks or areas of the financial system that are not mapped or understood with enough clarity, but where previous events, intelligence or market features suggest potential systemic risks. Such exercises can shed light on vulnerabilities and behaviour, and then underpin subsequent desk-based stress tests.
  • To invest further in systemic risk (and more heterodox) indicators as robust cross-checks on the bottom-up, disaggregated risk assessment.
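A common, simple form of cross-check of this kind is to standardise individual indicator series and average them into a composite. A hedged sketch with made-up data; neither the series nor the equal weighting reflect any actual Bank indicator:

```python
import statistics

def z_scores(series):
    """Standardise a series: subtract its mean, divide by its population std dev."""
    mu, sd = statistics.fmean(series), statistics.pstdev(series)
    return [(x - mu) / sd for x in series]

def composite(indicators):
    """Average the latest z-score across indicator series (equal weights)."""
    return statistics.fmean(z_scores(s)[-1] for s in indicators)

credit_gap = [1.0, 2.0, 4.0, 8.0]   # stylised credit-to-GDP gap readings
spreads    = [2.0, 1.5, 1.0, 3.5]   # stylised corporate bond spread readings
print(round(composite([credit_gap, spreads]), 3))
```

A composite of this kind is deliberately crude: its value as a cross-check comes from being transparent and independent of the disaggregated, bottom-up assessment, so disagreement between the two prompts further investigation.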

General challenge 4: the absence of a fully specified, comprehensive transmission framework means that separate models are needed for policy analysis (distinct from those used for risk assessment). And different models have different pros and cons – many are limited by computational and data needs, and by constraints on specification (such as functional form). As with risk assessment models, some challenges may be overcome in time but some are inherent.

What the UK’s financial policy frameworks do to address this challenge.

Bank staff draw on a broad spectrum of data sources and modelling approaches including drawing on risk assessment models where appropriate. Several strands of ongoing research in the Bank have applicability to policy questions and some have already been used by Bank staff in this way.

Suggested enhancements to the analytical framework.

In addition to the proposals for risk assessment models – which can be used in a limited way for policy analysis given the drawbacks outlined above – the following are proposals for investment in models for policy analysis.

  • Explore the development of a set of models – built from a common ‘trunk’ – for analysing financial stability and monetary policy responses to different shocks under different assumptions.footnote [86] As well as informing how policy should respond to different shocks, such models could also be used to assess the interaction between monetary and financial stability policy,footnote [87] the implications of the zero lower bound (ZLB), and the effects of unconventional monetary policy. The trunk would be built from the DSGE paradigm and would be solved with non-linear methods. It would feature a banking sector, non-banks which also lend to producers, housing financed by mortgages, and an interbank market (to allow for repo and associated central bank policies).
  • Continue to investigate and develop capabilities in modelling agent heterogeneity in support of analysis of a range of policy interventions. For example, an overlapping generations (OLG) model like that in Kaplan et al (2020) would be well suited to analysis of mortgage market interventions. Alternatives include Greenwald (2016) and Garriga et al (2021); these offer some of the same capabilities (though arguably with less direct real-world applicability) in more tractable frameworks. Agent-based models – such as an extension of Bardoscia et al (2024) – could supplement this approach. More broadly, agent-based models could be used to assess a range of prudential policies and their interactions, expanding the literature in this area.
  • Investigate the development of a heterogeneous agent model that studies the financial stability policy problem when financing constraints are linked to economic supply-side outcomes via the level and distribution of corporate sector capital (and, potentially, total factor productivity growth). As noted in Annex A, the literature has not studied the optimal policy response to externalities arising from supply-side frictions. Such frictions have the potential to create meaningful trade-offs for policy, including between growth and stability.
  • Explore development of structural modelling of the effects of policies that enhance resilience in core markets. For example, the gilt market, like other government bond markets globally, has experienced several financial shocks in recent years. A structural model of it, capturing the behaviour of its participants, both domestic and international, and, potentially, interlinkages with other markets (like the repo and interest-rate derivatives market) would be particularly useful at this juncture. Candidate starting points include the search-and-bargaining models of Duffie et al (2005), Uslu (2019) and Coen and Coen (2022).

The above list would ideally be supplemented with models capable of shedding light on areas of increasing policy interest, like operational risks and AI. In practice, however, the current state of the literature and available resources constrain what can be reasonably achieved, in the near-term at least. Development of models for policy analysis should be kept under review, including to reflect advances in the literature.
