Images of RAF Air Controllers aboard an RAF Boeing E-3D Sentry, aka AWACS, conducting a mission in support of NATO. (British Ministry of Defence)

BELFAST — A top British defense official said today he’s not yet satisfied with the United Kingdom military’s adoption of artificial intelligence and, while it is already in use in some “important areas,” AI application needs to “go a lot further.”

“Are we yet at the stage where we can say, in every aspect of the MoD [Ministry of Defence] we’re at the highest level of AI preparedness? No,” James Cartlidge, minister of defence procurement, told lawmakers in a defence committee hearing. “I think the rate of progress is very good but I would never be relaxed and say that we’re now in a steady state that is entirely satisfactory.”

Among the “important areas” where Cartlidge said AI has been deployed is platform development, notably the “spiral” development of British drones sent to Ukraine and Leonardo’s use of predictive maintenance for Royal Navy Wildcat helicopters.

Aside from supporting Ukraine’s war against Russia, the main use case for AI remains “the ability to hoover up data to analyse it at a far greater scale, far more quickly, than if you rely on” older methods, explained Cartlidge.

Such an approach has obvious implications for improved intelligence gathering. Significant inroads have been made in tackling problems around data access for software developers after the MoD recently opened up 60 different datasets, equivalent to 1 million gigabytes of data, to developers “at higher classification,” said Paul Lincoln, second permanent secretary at the UK MoD.

Developers are also being “signposted” to “high-quality open source” datasets through the UK’s Defence AI Centre, he added.

Without data for algorithms to learn from, AI and machine learning are impossible, but in defense settings the ways in which data is shared with external agencies, combined with strict vetting processes for the new organizations or individuals involved, tend to delay progress. Data labelling must also be implemented so algorithms can classify information accurately.

“We’re also looking at working with industry to think about synthetic labelling so that we can look at our own datasets and look at where we can actually increase the datasets which might be available to support that kind of development,” said Lincoln.

The MoD is also in the process of “looking for” a commercial cloud computing contractor so the department can work off a single classified cloud network. Such a secure network is typically required for algorithms to process data.

“There’s not too much more I can say about [the cloud acquisition] because of commercial confidentiality, but we are in the process of doing that,” Lincoln shared.

RELATED: JWCC 1 year in: Military branches test the waters as DoD envisions cloud service 2.0

The steps to better manage cloud computing and data access complement efforts to use AI in the development of “complex weapons” and lasers, so the UK can gain competitive advantages over adversaries, acknowledged Cartlidge.

He also explained that a push to better engage with AI small and medium enterprises has led to a “ramping up” of discussions at a classified level, while the MoD also wants to “bring in” dual-use suppliers so it can “sensitively” share data with them.

Although the UK envisages AI being integrated into the vast majority of its future weapon systems, it has ruled out integration of the technology for nuclear programs because of ethical implications tied to such an approach, said Lt. Gen. Tom Copinger-Symes, deputy commander of UK Strategic Command.

Revising Acquisition With Expertise

When it comes to AI and basically any other high-tech acquisition, Cartlidge said the UK is about to get a second set of expert eyes.

A new procurement system policy is set to be implemented for the first time on April 8 and will allow Cartlidge to receive “revised advice” about the “technological viability” of high-value acquisitions before he approves them. The UK’s Defence Science and Technology Laboratory (DSTL), an innovation unit, will play a “key part” in providing such advice.

He described the new power as a “failsafe watchdog” and one that “I don’t think exists in the current system.”