Consumers may soon start delegating everyday decisions to AI systems. From cancelling subscriptions to switching providers or negotiating refunds, AI agents are designed not just to answer questions but to take actions on a user’s behalf.
The idea is attractive. But it also raises difficult questions. What happens if the system gets it wrong? What if it prioritises commercial incentives over the consumer’s interests? And if people begin relying on AI agents to make financial or purchasing decisions, are existing consumer protection rules still enough?
These questions are explored in new research from the UK’s Competition and Markets Authority (CMA), which has published analysis on ‘Agentic AI and consumers’, alongside guidance on complying with consumer law when using AI agents.
What is an AI agent?
Unlike traditional AI tools that simply answer prompts, AI agents can plan and execute tasks autonomously. For example, a personal shopping agent might monitor spending, track subscriptions, automatically search for better deals across providers, and even switch services on the user’s behalf.
Such tools could reduce financial leakage, save consumers time and help them compare complex offers more effectively. The CMA recognises this potential, noting that AI “has the potential to boost economic growth and improve people’s everyday lives”.
Why AI agents worry regulators
The CMA’s research highlights several risks associated with AI agents:
- Misaligned incentives: agents may not act as a ‘faithful servant’ to users and could instead optimise for commercial objectives such as engagement or conversion.
- Hallucinations: incorrect outputs could lead to service disruption or poor financial decisions.
- Loss of consumer agency: consumers may become overly reliant on automated decisions they cannot easily scrutinise.
If widely adopted, these systems could become a new intermediary between consumers and markets, influencing how people choose products and services.
The CMA’s approach
The CMA acknowledges the UK’s current position as the third largest AI market worldwide.
For now, the CMA does not favour sweeping new rules in the UK which may be “heavy-handed”. Instead, it emphasises enabling innovation while ensuring that existing consumer protection frameworks continue to apply.
If agentic AI systems are developed in a compliant way that consumers can rely on, the CMA considers that the UK can position itself “at the forefront of trusted agentic innovation”.
The CMA emphasises the importance of regulatory join-up to facilitate such positive AI innovation. Regulatory collaboration on AI happens at a national level (e.g. via the Digital Regulation Cooperation Forum, comprising four UK regulators – the CMA, FCA, ICO and Ofcom) and through global fora such as the International Competition Network.
What businesses should do
Alongside its research, the CMA has issued guidance for businesses deploying AI agents. In particular, companies should:
- Be transparent about when consumers are interacting with an AI agent.
- Design agents to comply with consumer law, including statutory rights under the Consumer Rights Act 2015.
- Monitor performance, with regular human oversight to guard against errors or misleading outputs.
- Act quickly if problems arise, particularly where agents interact with large numbers of consumers.
The CMA is using AI too
The guidance also coincides with the CMA’s own use of AI tools. According to its draft Annual Plan for 2026 to 2027, the authority is developing systems to detect consumer harms and exploring the use of AI to screen for cartels and identify potential bid rigging in public procurement.
For businesses experimenting with AI agents, the CMA’s message is straightforward. Innovation is welcome, but consumer protection obligations still apply.
