THE UK Government’s Sovereign AI Fund this week co-invested in a $1.1 billion seed round raised by Ineffable Intelligence, the startup founded by AI pioneer and University College London professor David Silver, at a valuation of $5.1 billion.

But AI experts have questioned whether the UK government investing small amounts alongside significant US venture capital in Ineffable — a UK-built, self-learning AI that can uncover new knowledge rather than simply copying what humans already know — is the right structure for building UK AI independence.

They also queried whether, as part of the investment, the necessary safeguards, controls and accountability that public money should demand are in place.

They said this is particularly important in relation to the downstream use of a technology that, the Government claims, has “the potential to transform entire sectors”. If the right safeguards aren’t in place, they warn, there is also the potential for a critical “governance gap”.

The Ineffable round was led by US behemoths Sequoia and Lightspeed, with Nvidia, Google and Index Ventures also among the backers, and the British Business Bank investing around £15 million alongside the Government.

The exact amount the UK government invested via its Sovereign AI Fund is unknown as it is “commercially sensitive”, though the fund states that its typical equity investments are worth around £1-10 million.

Published and proven

In the UK AI community, there is wide agreement that Silver is exactly the kind of founder the UK should be backing. Silver is not a pitch-deck entrepreneur but has two decades of published, proven work behind him, having built AlphaGo, AlphaZero and AlphaProof at DeepMind.

But concerns have arisen around the governance terms, which AI experts say matter as much as the investment terms when it comes to a technology with such potential — and so many unknowns.

After all, who will decide what an AI agent that discovers its own knowledge gets pointed at, how it is used and who ultimately benefits from the breakthroughs in science, medicine, engineering and other sectors that Ineffable has set as its target?

Also, who audits outputs that sit beyond existing human understanding? What guardrails are in place to make a good investment like this a responsible one? Governance conditions are critical, AI specialists say, given the number of overseas backers.

Influence is key

Rohit Parmar-Mistry, Founder at Burton-on-Trent-based Pattrn Data, an AI consultancy, said: “Public money at seed stage should buy more than a press release and a minority stake.

“If the Government is backing frontier AI alongside major international venture capital, the minimum conditions should include clear governance rights, transparency on downstream use and credible safeguards around where the benefits, control and risks actually land.

“Otherwise the public helps de-risk the upside while private investors capture the strategic value.

“But the harder question is less whether AI can discover new knowledge than who gets to govern what happens next.

“If a system produces commercially or scientifically significant breakthroughs, it cannot be treated as if funding source and public interest are irrelevant once the cap table is set.

“A Sovereign AI Fund only makes sense if sovereignty means influence, not branding. Small cheques are fine, but only if they buy meaningful public use rather than ceremonial association.”

Governance gap

Katrina Young, Chief Technology Officer at KYC Digital, another AI consultancy, also had concerns.

She said: “The UK should invest in frontier AI, but investing millions in a $1.1bn US-led round is participation, not sovereignty. That is roughly one percent influence over systems that could generate strategically significant knowledge.

“If Ineffable’s ‘self-learning’ model succeeds, the question is not just ownership, but control over how new knowledge is applied, commercialised or restricted. Today, those terms are not visible. That is where the real governance gap sits.

“Public capital should come with enforceable conditions: UK anchoring of capability, independent safety evaluation, transparency on outputs, and defined rights over downstream use where public funding de-risked development.

“The current model prioritises speed. That has value. But sovereignty is not achieved through presence on a cap table. It is secured through control, leverage and accountability before the next funding round closes.”

Maker or taker?

Commenting on the Ineffable investment, Science and Technology Secretary, Liz Kendall, said: “This investment in Ineffable will support a company at the very frontier of AI, with the potential to transform entire sectors, underlining our determination to ensure that the UK isn’t just an AI taker but an AI maker.”

AI Minister, Kanishka Narayan, added: “With support from Sovereign AI and the British Business Bank, we are together showing what British AI can be: the best talent, backed by exceptional state capacity, building AI in Britain, changing the world with it.”

But at what level of investment does a government genuinely become a maker rather than a taker? And can a company with such sizeable US investment genuinely be called a UK AI startup? Or is this, as Parmar-Mistry points out, more of a “ceremonial association”?

The Department for Science, Innovation and Technology has invested in Ineffable in the hope that its AI can discover new and groundbreaking knowledge that humans have never found before.

As part of its investment, the UK Government gets a London address, a small equity stake and first refusal on the next round. But what it doesn’t get is any structural guarantee that Ineffable’s discoveries, IP or commercial value stay in the UK.

But most importantly of all, perhaps, exactly what’s done with those discoveries and who regulates and controls them is, for now at least, unclear.
