DataCentreNews UK - Specialist news for cloud & data centre decision-makers

CommonAI wins £16m grant for new AI inference lab

Fri, 27th Feb 2026

CommonAI has added the UK's Advanced Research and Invention Agency (ARIA) as a member and secured an initial £16 million grant as part of a wider £50 million commitment to "scaling inference" for artificial intelligence systems.

The funding comes through ARIA's Scaling Compute programme and centres on a new Scaling Inference Lab. Embedded within live data centres, the lab will focus on the operational phase of AI systems: running models in production rather than training them.

Policy and investor attention has largely focused on the cost and complexity of training ever larger models. The new programme instead targets inference, which developers and infrastructure operators often cite as a major driver of long-term compute costs and energy use once AI services reach sustained deployment.

Lab In Data Centres

CommonAI will establish and operate the Scaling Inference Lab, while ARIA will lead and fund the initiative through its membership. The lab is designed around real-world operating conditions rather than stand-alone research environments.

The lab will integrate hardware, software and operational design, with a focus on cost, efficiency and reliability. It will also provide working clusters and open benchmarks to test systems under real-world data-centre constraints.

The programme will involve researchers, start-ups, scale-ups and larger companies. Planned sector areas include finance, healthcare and national infrastructure, alongside scientific and industrial use cases.

Focus On Inference

Inference is the phase in which a trained model generates outputs in response to new inputs. This can include responding to user queries, analysing data streams, or running background processing tasks. It differs from training, which uses large upfront compute resources to build a model. Inference workloads scale with usage and can become a persistent cost as services grow.
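The cost structure described above can be sketched with some hypothetical numbers: training is roughly a one-off expense, while inference spend grows with every query served, so it eventually dominates total compute cost. The figures below are illustrative assumptions only, not CommonAI or ARIA estimates.

```python
# Illustrative sketch: training as a one-off cost vs inference cost
# that scales with usage. All figures are hypothetical assumptions.

TRAINING_COST_GBP = 2_000_000   # assumed one-off cost to train a model
COST_PER_QUERY_GBP = 0.002      # assumed compute cost per inference request

def cumulative_cost(queries_served: int) -> float:
    """Total compute spend after serving a given number of queries."""
    return TRAINING_COST_GBP + queries_served * COST_PER_QUERY_GBP

# As usage grows, inference's share of total spend approaches 100%:
for queries in (1_000_000, 1_000_000_000, 10_000_000_000):
    total = cumulative_cost(queries)
    inference_share = (queries * COST_PER_QUERY_GBP) / total
    print(f"{queries:>14,} queries: GBP {total:,.0f} total, "
          f"{inference_share:.0%} from inference")
```

With these assumed numbers, inference overtakes the training outlay after one billion queries, which is the dynamic that makes sustained-deployment efficiency a long-term cost lever.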

The initiative aims to "dramatically reduce compute costs" for inference and shift the work from theory to industrial-scale delivery. ARIA programme materials include a target of cutting compute costs by a factor of 1,000, implying major improvements in system design and operations.

Sir Andy Hopper, Chairman of CommonAI CIC, said:

"The Scaling Inference Lab creates a practical environment where new AI infrastructure can be tested and proven at system scale. It builds on CommonAI's vision of shared infrastructure, allowing organisations to innovate without needing the scale or resources of large technology providers. By improving access to efficient, trusted computing platforms, we can help create a more accessible AI ecosystem and unlock greater economic opportunity across the UK."

Shared Infrastructure Model

CommonAI describes itself as a collaborative engineering and compute platform that provides shared AI infrastructure through a membership model. Organisations join specific programmes, while CommonAI oversees delivery and operations.

Launched in September 2025, the group has positioned its approach as an alternative to building separate infrastructure stacks. It also frames shared facilities as a way to reduce reliance on hyperscalers, particularly for smaller firms that struggle to secure capacity and specialist expertise.

Dr Gavin Ferris, CEO of CommonAI CIC, said: "CommonAI is focused on delivery, building shared infrastructure that organisations can use to run and improve AI systems in real conditions. Scaling Inference brings partners from industry, academia and the public sector together around working clusters, open benchmarks, and measurable progress. By creating shared infrastructure that organisations can build on, it supports emerging companies, reduces development risk and helps attract investment into the UK AI ecosystem."

UK Compute Agenda

ARIA and CommonAI have linked the programme to the UK's broader compute agenda and the government's aim of connecting research with national testbeds and deployment. The lab's emphasis on operational testing reflects wider interest in infrastructure that developers can use without building their own data-centre estates.

Suraj Bramhavar, ARIA Programme Director, said: "To reduce compute costs by 1000x, we need to move from theory to delivery. CommonAI is the right partner because their DNA is built on translating research into working, industrial-quality foundations. By leveraging their ability to build and operate shared infrastructure in live settings, combined with a proper institutional framework for collaborative research, we're giving startups the rigorous, independent platform they need to prove their hardware is ready for the real world."

Scaling Inference is the first collaborative engineering programme on CommonAI's platform. The organisation is also establishing a High Assurance programme focused on regulated and mission-critical applications, with early members already joining.