
UK Police Blame Microsoft Copilot for Intelligence Mistake

Alex Chen
Tech Journalist & Product Reviewer

UK Police Admit AI Assistant Mistake Led to Israeli Football Fans Being Banned

The chief constable of one of Britain's largest police forces has admitted that Microsoft's Copilot AI assistant introduced an error into a football intelligence report. The report, which led to Israeli football fans being banned from a match last year, cited a non-existent fixture between West Ham and Maccabi Tel Aviv.

AI Assistant 'Hallucinated' Non-Existent Match

According to The Verge [1], the assistant 'hallucinated' the fixture, and the fabricated detail made it into the intelligence report that was used to justify banning the Israeli fans from the match.

Implications of AI Mistake

The implications go beyond a single fixture list. The incident raises questions about how reliable AI assistants are when their output feeds into sensitive work such as police intelligence gathering, and it underscores the need for human oversight and verification before AI-generated claims are acted on, as sketched below.
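As a rough illustration of what "verification of AI-generated information" can mean in practice, the sketch below checks a claimed fixture against an authoritative schedule before it is allowed into a report. This is a minimal, hypothetical example: the team names, dates, and the CONFIRMED_FIXTURES set are placeholders, not details of any actual police workflow.

from dataclasses import dataclass

@dataclass(frozen=True)
class Fixture:
    """A single match claimed in a draft report."""
    home: str
    away: str
    date: str  # ISO date string, e.g. "2024-11-07"

# Hypothetical source of truth: fixtures confirmed against an official
# competition schedule, maintained separately from any AI assistant output.
CONFIRMED_FIXTURES = {
    Fixture("Example Home FC", "Example Away FC", "2024-11-07"),
}

def verify_claimed_fixture(claim: Fixture) -> bool:
    """Return True only if the claimed fixture appears in the confirmed schedule."""
    return claim in CONFIRMED_FIXTURES

# A drafting assistant might assert a match that never took place; flag it
# for human review instead of letting it flow into the final report.
claimed = Fixture("West Ham", "Maccabi Tel Aviv", "2024-11-07")
if not verify_claimed_fixture(claimed):
    print(f"UNVERIFIED: {claimed.home} v {claimed.away} on {claimed.date} - hold for human review")

The pattern matters more than the code: any claim generated by an assistant is treated as unverified until it matches a record a human trusts, and anything that fails the check is routed to a person rather than published.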

Microsoft's Response

Microsoft has not commented on the incident, but the episode adds to the pressure on the company to show what steps it is taking to prevent such fabrications in the future.

Conclusion

The UK police's admission that Microsoft's Copilot made a mistake in a football intelligence report is a wake-up call for anyone deploying AI in sensitive settings: AI-generated information cannot be treated as fact until a human has verified it.

Sources

[1] The Verge, "UK police blame Microsoft Copilot for intelligence mistake"