Funding opportunity: Pre-announcement: Responsible AI UK keystone projects

Apply for funding for consortia-led research into responsible artificial intelligence (AI) to ensure that AI technologies are designed, deployed and used responsibly within societies.

This funding is meant for world-leading researchers from across all disciplines to undertake a variety of activities focusing on one strategic research theme.

You and your organisation must be eligible for UK Research and Innovation (UKRI) funding.

Successful projects must begin by 1 May 2024 and last up to 48 months.

This opportunity is part of the Responsible AI UK programme.

This is a pre-announcement and the information may change. The funding opportunity will open on 30 November. More information will be available on this page by then.

Who can apply

Responsible AI needs an interdisciplinary ecosystem that adopts equality, diversity and inclusivity (EDI), trusted research, and responsible research and innovation (RRI) as fundamental principles. We are therefore particularly interested in supporting diverse, multidisciplinary teams that co-create research with industry and the public. Successful applicants will be expected to collaborate with the wider Responsible AI UK programme.

We would encourage researchers from all the disciplines that are supported by UKRI to apply, including but not limited to:

  • applied ethics
  • management science
  • computer science
  • engineering
  • humanities
  • innovation studies
  • medicine and health studies
  • law
  • organisational management
  • philosophy
  • politics
  • psychology
  • sociology

Responsible AI UK is leading this funding opportunity on behalf of UKRI. Standard UKRI eligibility rules apply. Research grants are open to:

  • UK higher education institutions
  • public sector research establishments
  • research council institutes
  • UKRI-approved independent research organisations and NHS bodies with research capacity

Read the guidance on institutional eligibility.

You can apply if you are resident in the UK and meet at least one of the following criteria:

  • you are employed at the submitting research organisation at lecturer level or above
  • you hold a fixed-term contract that extends beyond the duration of the proposed project, and the host research organisation is prepared to give you all the support normal for a permanent employee
  • you hold a UKRI, Royal Society or Royal Academy of Engineering fellowship aimed at later career stages (excluding industry employees)
  • you hold fellowships under other schemes (please contact us to check eligibility, which is considered on a case-by-case basis)

Holders of postdoctoral level fellowships are not eligible to apply for a UKRI grant.

Due to the nature of this underlying funding, we are looking for a small number of high-quality bids that extend the network of researchers engaged in responsible AI. You can therefore be named in only one proposal. UKRI reserves the right to reject proposals that do not meet the requirements of the funding opportunity.

What we're looking for

This funding opportunity aims to support projects that address strategic themes that underpin responsible AI. It aims to grow the network of researchers engaged in these topics and complement other key aligned programmes funded by UKRI (for example, ATI, Ada, AI Hubs, Centres for Doctoral Training) and AI institutes funded by third parties.

Your proposed keystone project will form one of the pillars of Responsible AI UK and will deliver flexible funding to bring together the best talent to address the multidisciplinary and interdisciplinary challenges posed by responsible AI.

We particularly encourage proposals that fit the Responsible AI UK vision:

  • help enable responsible AI to power benefits for everyday life
  • work in collaboration with researchers, industry professionals, policymakers and stakeholders to drive an ecosystem that will be responsive to the needs of society

Before publishing the full funding opportunity, Responsible AI UK will refine the topics to be covered by the keystone projects based on the series of roundtables that have been run at the townhall events in London, Cardiff, Belfast and Glasgow.

Further consultation with the AI ecosystem will take place through a series of roundtables, and feedback from members of the Responsible AI UK strategy group and other key members of the UK AI community.

To date, the following overarching themes have emerged from these consultations. We will look to refine them over the coming weeks, including by targeting specific areas where UK research can make the most impact. We expect all projects to consider both the technology development that embeds trust and responsibility and how this can deliver social benefits through technology implementation, regulation and education.

Responsible AI-powered organisations

Organisations are rapidly evolving new ways of working in response to AI. This has wide-ranging implications for organisations and their employees, and the end-users of the products and services offered by these organisations.

Responsible AI practices need to include systems-based approaches, going beyond testing of individual AI algorithms, to evaluating the wider implications of AI deployment into such complex human and AI collaborative systems.

This includes ensuring that innovators understand the overall consequences, and the measures, such as reskilling and upskilling, that will need to be developed for the uptake of AI to be successful. What are the responsible AI principles and sector-specific approaches to AI innovations that are safe, trustworthy, and provide confidence to organisations, customers, stakeholders, and wider society?

Addressing harms, maximising benefits of AI

How AI is framed and applied introduces new opportunities but also trade-offs for individuals, society, and industry where the purported benefit may be outweighed by negative impacts on a wide range of issues. This includes issues such as:

  • privacy
  • bias
  • accessibility
  • labour rights
  • social justice
  • sustainability (of people, organisations, environment)

Many of these have direct legal ramifications irrespective of future specific AI regulation.

Moreover, AI solutions often do not generalise to previously unseen settings, exacerbating uncertainties as to whether existing successes translate to new domains, sectors, cultural and global contexts. This introduces the need for deployment, validation, provenance, and auditing regimes for AI, so decision makers can thoroughly understand and manage the limitations of AI systems to ensure they are safe, ethical, and beneficial, while simultaneously highlighting where further development is needed to extend their scope.

AI law, governance, and regulation in an international context

The UK will need to meet the challenges of when and how to govern and regulate AI within the international digital economy. In the National AI Strategy, the UK government emphasises the desire to encourage startups and small and medium-sized enterprises to adopt AI while acknowledging the increasing need to meet AI regulation. This is set against currently different approaches taken by the EU, China, the US, and the Global South, a global debate to which the UK must make a strong contribution. Against this background, we would seek the most effective research contributions that the UK can make to this global debate, to drive for clarity around AI regulation that promotes trust, fairness, and accountability for users, and certainty for international commerce.

Projects will need to fit within at least one of these themes.


The consortium should bring together a diverse range of voices by engaging different academic institutions, policymakers or advisory groups, and commercial enterprises. The team should reflect the strengths represented across the UK. It is expected this group will have:

  • thematic expertise reflecting the communities that will be engaged in the investment
  • in-depth understanding of relevant disciplines, technologies, policy challenges and evidence needs
  • an ability to engage with diverse stakeholders, including non-academics
  • an ability to articulate a clear vision for engagement with communities working in this area and new partners from within the appropriate disciplines or elsewhere
  • an ability to deliver complex projects to time and on budget, considering the variety of activities and outputs
  • specialist expertise (academic or non-academic) to support the desired outcomes, which may include:
    • sector representatives
    • programme management expertise
    • a communications function
    • knowledge mobilisation expertise

Key features of a keystone project

Quality and ambition

A keystone project is seen as a scheme that matches best with best and allows researchers to tackle bigger, more open-ended problems, addressed through a more coherent or holistic approach.

The stability in tackling a longer-range vision helps motivate teams, provides the freedom required to take risks, and enables longer term planning.

Partnership and ecosystem development

The scale of activity is seen to create stronger links between the universities involved and greater visibility at a national and international level, leveraging the Responsible AI UK network, partners, and international connections.

The size of keystone projects allows for the assembly of the best team and collaborators, all with complementary expertise leading to the development of effective multidisciplinary and cross-disciplinary working. You will be expected to promote and champion responsible AI in your respective domains, to help grow the community across sectors and disciplines.

The duration of keystone projects allows you to invest in building effective collaborations. The scale of a keystone project should look to attract partnerships beyond the original project partners and seek greater input from the wider community, including public and private sector, resulting in more external visibility on the research direction for the area.

Keystone projects will form part of the Responsible AI UK core pillars and contribute activities that will help connect and drive efficiencies within the UK AI ecosystem. You will have access to the broad network of partners brought together by Responsible AI UK and support international conversations on responsible AI using the reach that Responsible AI UK affords. You will also bring in other parts of the AI ecosystem that are currently disconnected from national conversations and research programmes. These activities should be specifically costed into the programme to allow for this flow of knowledge across organisational boundaries.

Please make it clear where the project looks to build on connections to existing networks and research programmes. Responsible AI UK is keen to understand how this funding will build new connections and develop existing relationships within the ecosystem, providing additional value within the landscape.


The keystone mechanism provides freedom to scope new opportunities and allows you to cross-fertilise ideas and build up new skill sets. This allows you to develop new themes, and to trade ideas and resources. The stability of the grant allows early career researchers (ECRs) to express their creativity and to lead on part of the investigation.

Impact and advocacy

Keystone projects are seen to have greater visibility and recognition within the universities involved and the relevant research communities at both a national and international level, leveraging the Responsible AI UK brand. This gives you more influence than smaller scale research activities.

You can attract more visits and engagement with high quality researchers and external stakeholders, leverage other funding, and influence wider strategies. The visibility also enhances the opportunities for outreach and advocacy, promoting UK science.

In your application, we expect you to demonstrate how you will deliver or support the desired outcomes, supporting the UK’s transition to an AI-enabled economy and resulting in growth, prosperity and mutual benefit for sectors and citizens.

Career development

Keystone projects will be a good environment for ECRs’ longer term career development. The flexibility and longer durations allow the project lead to empower junior team members, giving them greater independence through more responsibility and leadership over activities.

Postdoctoral staff gain broader experience due to the breadth of expertise in the team, and there are greater opportunities for secondments, mentoring and involvement in management. This makes keystone projects an attractive employment prospect, leading to higher quality recruitment. PhD students would also be expected to be aligned to keystone project teams, likewise benefiting from interacting with a team of broader expertise and activity.


The flexibility of keystone projects is seen as a real strength of the scheme. It enables a more dynamic allocation of resources, a nimble approach to recruitment, and flexibility in defining the individual projects undertaken. Specifically, integrative activities across the Responsible AI UK ecosystem are expected, and 20% of funding and researcher time should be reserved for such dynamically defined activities, to be undertaken in collaboration with:

  • Responsible AI UK
  • UKRI AI investments
  • keystone projects
  • soon to be announced AI hubs and Centres for Doctoral Training

In addition to collaborating with the wider programme, it is anticipated that successful applicant teams will also engage with stakeholders and users of the research, who are essential to the design, conduct and impact of application-orientated research.

It is also expected that keystone projects demonstrate strong institutional support. Letters of support from participating organisations (at least from the principal investigator’s institution) should demonstrate alignment to the organisations’ strategy and ambitions and indicate contributions to the project.

While substantial contributions (cash or in-kind) are expected from project partners and participating organisations, a minimum requirement is not expected, and partnerships will instead be assessed on their relevance and alignment to the programme.

Note that all proposals submitted will be assessed equally, irrespective of which themes the proposal aligns with. However, a balanced range of projects across disciplines, sectors and themes will be funded.

If you are planning to bring international collaborators (for example, industry or academic partners), you will need to complete the Responsible AI UK’s Trusted Research section and the checklist for academia by National Protective Security Authority and National Cyber Security Centre (PDF, 115KB).

Note that based on the answers to this checklist, you may need to escalate this within your institution or department for a decision.

Management and monitoring

Keystone projects should have effective management and monitoring arrangements for the investment. This should include a risk management strategy and a strategy for how the flexibility of resources will be managed.

Responsible AI UK expects all keystone projects to establish and run an independent advisory board that will include at least one member of the Responsible AI UK leadership team, to provide advice and recommendations on the strategic scientific and research direction and activities (such as impact, advocacy and outreach) of the programme grant.

This independent advisory board must meet at least annually. This group should have at least 50% independent membership and an independent chair.

Responsible AI UK strongly encourages you to consider costing in project management and other administrative support such as employing a full-time equivalent project manager, and not relying on the project lead for these duties. Projects will be able to rely on the centralised communications, networking, and event management resources offered by the Responsible AI UK operations team for activities that look to bring together the wider ecosystem.

What this scheme is not for

We will not fund proposals that:

  • do not embed responsible research and innovation and equality, diversity and inclusion considerations into the research theme itself as well as into the research practices
  • are too focused on applied research and do not demonstrate significant ambitions or risk-taking
  • do not build on the broader set of activities going on across the UK and international AI ecosystem
  • do not have a mix of technical and non-technical disciplines engaged in joint activities
  • do not demonstrate clear impact pathways
  • do not demonstrate an interest in engaging with the ecosystem beyond the investigators’ own disciplines

Learn about the Responsible AI UK programme.

Funding available

Up to £10 million funding is available through the Responsible AI UK programme to support up to four grants for up to 48 months (at 80% full economic cost (FEC)).

Standard UKRI eligibility rules apply; for details on who is eligible to receive funding, please refer to the ‘Who can apply’ section.

We expect to fund projects requesting between £2 million and £3.5 million (at 80% FEC).

Please note that due to the nature of this funding, additional requirements on spending profile, reporting, monitoring and evaluation, as well as on grant extensions, will apply. These will be reflected in the grant additional conditions, and those funded will need to comply with them. Further details are provided in the additional information section.

Please note that any projects funded through this funding opportunity will have a fixed start and end date, and that no slippage to this date will be permitted.

Grants will be funded at 80% of the stated FEC (except for non-academic partners). The remaining 20% must be contributed by the academic or industry partners submitting the proposal.

The grant can support any directly incurred costs, such as research staff time, consumables, travel and subsistence, and directly allocated costs, such as investigator time, and associated overheads.

You are not required to have existing collaborations or contacts within the Responsible AI UK programme.

Current Responsible AI UK investigators may not lead a project or be costed on the grant, but they may be named as either a co-investigator or a project advisor, depending on their contribution.


Non-capital equipment over £10,000 in value (including VAT) will be funded only in exceptional, well-justified circumstances. Items of equipment should be included under the ‘directly incurred – other costs’ heading and will need robust justification. Items over £10,000 will be scrutinised especially closely and permitted only if their relevance to this funding opportunity is clearly justified.

Note that deviation from the spending profile beyond 5% in any given year is not allowed: any underspend will not be refunded, and no overspend will be permitted. No-cost extensions will not be permitted.

How to apply

You should ensure you are aware of, and comply with, any internal institutional deadlines that may be in place.

An intention to submit should be registered by 15 December.

Proposals will need to be prepared using the submission template, completing all of the sections, and submitted in PDF format via the online application portal.

Proposals must contain an explanation of how the proposed work aligns with the objectives of the Responsible AI UK programme and how it fits within the wider Responsible AI UK framework. You should be explicit about the need for this funding and the added value your proposed activity brings to a specific area of the programme. All proposals must also demonstrate how you will ensure adherence to your spending profile.

Responsible AI UK reserves the right not to fund a project if ethical concerns exist or are raised by the reviewers or panel members. Concerns may include overlooked aspects, or issues not appropriately accounted for. You must complete the equality, diversity and inclusion and responsible research and innovation section to identify and demonstrate how challenges will be addressed as part of the research.

An eligible member of the research investigation team will be identified as the main contact. They will submit the bid and be the point of contact with Responsible AI UK for all communication during the bid and post award (if successful).

The main contact must register their intent to submit a proposal during the registration period. The registration and submission system will be activated by the stated opening date. Details will be published on the website.

More details on how to complete the bid submission will be provided in the full funding opportunity announcement.

Responsible AI UK must receive your application by 4:00pm UK time on 2 February 2024.

How we will assess your application

Assessment process

A sift panel will consider all proposals to identify those that meet the assessment criteria. These will then be considered by a panel of experts to select the final successful proposals. While this panel of experts will include members of the core Responsible AI UK team (including UKRI representatives), the panel will also consist of experts outside of the Responsible AI UK consortium.

All criteria will be assessed in determining the final rank-ordered list, using the panel introducers’ scale of one to 10. Responsible AI UK will follow UKRI’s principles of peer review to ensure fairness and transparency within the decision-making process.

Funding decisions will be made based on the rank ordered lists as well as the nature of the projects. To ensure a balanced portfolio of activities, we will aim to fund at least one project against each of the themes.

Assessment criteria

Details of the assessment process and specific criteria which will reflect the scheme objectives will be published in the full funding opportunity.

Contact details

Get help with developing your proposal

For help and advice on costings and writing your proposal please contact your research office in the first instance, allowing sufficient time for your organisation’s submission process.

Ask about this funding opportunity

All queries regarding the submission of proposals should be directed to the Responsible AI UK operations team.


Our working hours are Monday to Friday, 9:00am to 4:00pm UK time, excluding bank holidays and other holidays.

Additional information

The Responsible AI UK programme of which this funding opportunity is a part is a £31 million strategic investment by the UK government in responsible and trustworthy AI.

Responsible AI UK acts as a focal point for a broad range of initiatives to work across the community to co-create with all stakeholders an ecosystem that meets society’s needs for justice, equity, sustainability and inclusivity. Responsible AI UK adopts a strong human-centred approach aligned with the UK AI strategy, to provide platforms for technological futures that promote inclusive and positive outcomes.

Responsible AI UK moves beyond “making technology responsible and trustworthy” by ensuring the benefits and risks of AI can be recognised and governed by all those whose lives and livelihoods are affected by it.

Responsible AI UK works to ensure society deploys and uses AI in a responsible way. Our approach is to equip the AI community with a toolkit that includes technological innovations, case studies, guidelines, policies and frameworks for all key sectors of the economy.

To achieve this, Responsible AI UK works in collaboration with researchers, industry professionals, policymakers and stakeholders to drive an ecosystem that will be responsive to the needs of society. It is led by a team of experienced, well-connected leaders from all four nations of the UK, with complementary backgrounds, committed to an inclusive approach to the management of the programme.

This ecosystem consists of mechanisms to:

  • co-create research with industry and the public
  • establish contextual understandings of responsible AI for users, customers and developers
  • develop pathways to scale the use of human-centred AI across society, industry and commerce

Grant additional conditions

Please note that due to the nature of this funding stream, there will be specific spending requirements, monitoring and evaluation.

Projects will also be expected to commit to adhere to open-source, open-data and open-innovation guidelines.

Awards will be confirmed upon acceptance of the non-negotiable terms and conditions, which will be set out in the award letter.

The project team of all funded projects will be required to engage fully with the programme, including participating in Responsible AI UK activities and events and attending partner meetings as required during the lifetime of the project. Reporting will be required at the start, mid-point and end of the project. In addition, within one month of the end of the project, a final report must be submitted to the Responsible AI UK executive management team highlighting the project outcomes and impact.
