In June 2025, Bristows’ Data Protection, Privacy & Cyber team posted about the imminent passing of the Data (Use and Access) Bill into law.
On 19 June 2025, the Data (Use and Access) Act 2025 (the “Act”) received Royal Assent. Its implementation is being phased in over a twelve-month period until June 2026.
The Act governs access to and use of customer and business data, supplements existing law such as the UK General Data Protection Regulation (“UK GDPR”), the Data Protection Act 2018 (“DPA”) and the Privacy and Electronic Communications Regulations (“PECR”), and strengthens the powers of the Information Commissioner’s Office (“ICO”). The primary goal of the Act is to drive economic growth, innovation and competition by facilitating responsible data sharing between organisations, customers and third parties.
But in facilitating easier data sharing, does the Act open the door to increased risks of disputes (including potential litigation)? Here we consider some risks that might arise.
Summary of risks
Access to customer data and business data
Although the roll-out is still at the discussion stage, Part 1 of the Act provides a framework for introducing “Smart Data schemes”, with the provisions to be given full effect through subsequent secondary legislation. Smart Data is the secure sharing of customer and business data with authorised third parties. Smart Data schemes are “trust frameworks” which set the standards under which such data must be shared, used and protected, alongside the roles and responsibilities of different scheme participants.
Smart Data schemes may apply across many different sectors including transportation, telecommunications and finance. Their implementation will allow customers access to their own data as well as to certain business data including pricing, usage data and the performance or quality of goods and services they use.
Authorised third parties (acting on behalf of customers) will be permitted to use this data to provide individuals with personalised market comparisons and automatic switching services. This is similar to the Open Banking framework introduced in 2018, which may have had some positive impact on the UK economy.
The Act introduces certain obligations for organisations subject to Smart Data schemes. They are required to:
- use and maintain specified facilities or services that comply with specified standards, including dashboard services, electronic communication services or application programming interfaces (see the illustrative sketch after this list);
- publish information relating to the rights and obligations of persons under the regulations, including information about the rights of customers in relation to customer data processed and information about the activities carried out under the regulations; and
- implement effective procedures for the handling of complaints.
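The Act leaves the design of these facilities to scheme-specific secondary legislation, so any concrete interface is speculative at this stage. Purely as a sketch, a customer-data facility under a Smart Data scheme might look something like the following, where every name and the response format are hypothetical placeholders for whatever standards a scheme eventually specifies:

```python
# Hypothetical sketch only: the Act does not prescribe an API design, and
# every name here stands in for scheme-specific requirements.
from dataclasses import dataclass
from datetime import datetime, timezone

SCHEME_STANDARD_VERSION = "1.0"  # placeholder for a scheme-mandated standard


@dataclass
class CustomerDataRequest:
    customer_id: str
    requested_fields: list[str]        # e.g. pricing or usage data
    third_party_id: str | None = None  # set when an authorised third party acts for the customer


@dataclass
class CustomerDataResponse:
    standard_version: str
    generated_at: str
    data: dict[str, object]


def fetch_field(customer_id: str, field_name: str) -> object:
    # Placeholder for a lookup against the data holder's own records.
    return f"<{field_name} for {customer_id}>"


def handle_customer_data_request(req: CustomerDataRequest) -> CustomerDataResponse:
    """Return customer data in the scheme's specified format.

    A real implementation would first authenticate the caller, check the
    scheme register (see the verification sketch below) and log the
    disclosure for audit purposes.
    """
    data = {f: fetch_field(req.customer_id, f) for f in req.requested_fields}
    return CustomerDataResponse(
        standard_version=SCHEME_STANDARD_VERSION,
        generated_at=datetime.now(timezone.utc).isoformat(),
        data=data,
    )


if __name__ == "__main__":
    req = CustomerDataRequest("cust-42", ["tariff", "monthly_usage"], third_party_id="atp-001")
    print(handle_customer_data_request(req))
```

The point of the sketch is structural rather than prescriptive: data holders will need a well-defined request and response surface, versioned against the scheme’s published standards, with authentication and audit logging around it.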
The ICO will monitor compliance with these obligations and investigate non-compliance. It may issue a compliance notice, which will have the same effect as a court or tribunal order. The Act has also introduced mandatory compliance interviews, which will most likely require attendance by an organisation’s Data Protection Officer. Making false statements in such interviews is a criminal offence.
Failure to comply with requirements imposed by the regulations above or with a compliance notice will likely lead to financial penalties. These can be increased in the event of late payment. They may also be appealed to a court or tribunal.
Beyond the risks of non-compliance, organisations receiving a request for data must, much as with data subject access requests, be able to distinguish legitimate requests from illegitimate ones. Disputes may arise where data holders wrongly deny legitimate requests or unknowingly disclose confidential data to unauthorised recipients.
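How that verification is done will be dictated by each scheme’s trust framework, which is not yet written. As a minimal sketch of the gating logic, assuming a hypothetical scheme register and consent token (neither of which the Act defines):

```python
# Hypothetical sketch: the register, token format and error handling are
# placeholders; actual verification requirements will come from scheme rules.
AUTHORISED_THIRD_PARTIES = {"atp-001", "atp-002"}  # stand-in for a scheme register


class UnauthorisedRequest(Exception):
    """Raised when a data request should be refused rather than fulfilled."""


def verify_request(third_party_id: str, customer_consent_token: str | None) -> None:
    # Refuse anyone not on the scheme's register of authorised participants.
    if third_party_id not in AUTHORISED_THIRD_PARTIES:
        raise UnauthorisedRequest(f"{third_party_id} is not an authorised scheme participant")
    # Refuse requests carrying no evidence of the customer's authority.
    if not customer_consent_token:
        raise UnauthorisedRequest("no evidence of customer authority supplied")
    # A real check would validate the token cryptographically against the
    # scheme's trust framework, not merely test for its presence.
```

Both failure modes in the sketch map onto the dispute risks above: too strict a gate wrongly denies legitimate requests, while too lax a gate discloses to unauthorised recipients.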
Data protection and privacy
Automated decision-making is another area where risks may increase, mainly because the framework for this is more permissive than before.
The Act provides for a complete replacement of Article 22 of the UK GDPR (automated individual decision-making, including profiling) with four new Articles, as follows.
Article 22A confirms that, for the purposes of Articles 22B and 22C, a decision is based solely on “automated processing” if there is no meaningful human involvement in the taking of the decision. Additionally, such a decision will be regarded as a “significant decision” in relation to a data subject if it produces a legal effect for the data subject or has a similarly significant effect for them. Article 22A also provides that, when considering whether there is meaningful human involvement in the taking of a decision, a person must consider, amongst other things, the extent to which the decision is reached by means of profiling.
Article 22B sets out restrictions on automated decision-making. For significant decisions involving processing, entirely or partly, of special categories of personal data (such as race or ethnic origin, health data or religious/philosophical beliefs: see Article 9(1) UK GDPR), the decision may not be taken on a solely automated basis unless certain conditions are met. These conditions are:
- Explicit consent of the data subject.
- That the decision is necessary for entering into, or performing, a contract between the data subject and a data controller, or is required or authorised by law and Article 9(2)(g) UK GDPR applies.
Article 22C ensures that safeguards for the data subject’s rights, freedoms and legitimate interests are in place when a significant decision is taken in relation to a data subject which is based entirely or partly on personal data and based solely on automated processing. These safeguards include enabling the data subject to make representations about such decisions and to obtain human intervention on the part of the controller in relation to such decisions. Safeguards must also enable a data subject to contest such decisions.
Article 22D sets out further provisions about automated decision-making and regulations that the Secretary of State may make.
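To make the interaction of these Articles concrete, the statutory tests described above can be paraphrased as a short decision function. This is an illustration of the logic as summarised here, not a compliance tool: whether human involvement is “meaningful” or an effect is “similarly significant” are judgment calls that no boolean flag can capture.

```python
# Paraphrase of Articles 22A-22C as summarised above; all field names are
# hypothetical, and the real tests require case-by-case legal judgment.
from dataclasses import dataclass


@dataclass
class DecisionContext:
    meaningful_human_involvement: bool    # Article 22A: solely automated if False
    legal_or_similarly_significant: bool  # Article 22A: "significant decision"
    uses_special_category_data: bool      # Article 9(1) UK GDPR data involved
    explicit_consent: bool                # Article 22B condition (a)
    contract_or_law_and_art9_2_g: bool    # Article 22B condition (b)


def restriction_engaged(d: DecisionContext) -> bool:
    # Article 22B bites only on significant, solely automated decisions
    # involving special category data.
    return (not d.meaningful_human_involvement
            and d.legal_or_similarly_significant
            and d.uses_special_category_data)


def decision_permitted(d: DecisionContext) -> bool:
    if not restriction_engaged(d):
        return True  # outside Article 22B; Article 22C safeguards may still apply
    return d.explicit_consent or d.contract_or_law_and_art9_2_g


# Article 22C safeguards owed where a significant decision is taken on a
# solely automated basis:
ARTICLE_22C_SAFEGUARDS = (
    "enable the data subject to make representations about the decision",
    "enable the data subject to obtain human intervention by the controller",
    "enable the data subject to contest the decision",
)
```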
Failure to implement the required safeguards will amount to a serious breach of data protection laws. Currently, the ICO can issue enforcement notices and, for serious breaches of the UK GDPR, fines of up to £17.5 million or 4% of global turnover, whichever is higher. The Act increases the maximum fine under PECR to align with this.
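As a quick worked example of that cap (with a hypothetical turnover figure; actual penalties are discretionary and fact-specific):

```python
# The "whichever is higher" cap from the paragraph above, illustrated with
# a hypothetical £1bn global turnover.
def max_fine_gbp(global_annual_turnover_gbp: float) -> float:
    return max(17_500_000.0, 0.04 * global_annual_turnover_gbp)


print(max_fine_gbp(1_000_000_000))  # 40000000.0, i.e. the 4% limb applies
```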
As well as regulatory action, individuals affected by unlawful automated decisions may also bring civil claims for damages arising from the decision, which may lead to costly litigation and the risk of being ordered to pay significant sums in compensation.
There also remains the risk of discrimination claims or legal challenges, for example in areas such as employment, recruitment and HR, and insurance and financial decisions.
Purpose limitation: further processing
Under the UK GDPR and the DPA, personal data could only be collected for specified, explicit and legitimate purposes. Any further processing had to be compatible with the original purpose or justified by a new lawful basis. In some cases, consent from the data subject might also be required.
The Act clarifies when personal data can be reused for a new purpose different from the original one. Under the new provisions, compatibility assessments are no longer required in certain cases, for example where the processing is for a recognised compatible purpose or a legitimate interest.
Where the data includes special category information, further processing is permitted if one of the 12 conditions under Schedule 1 of the DPA is met, along with other applicable safeguards.
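Taken together, those rules amount to a rough decision flow, which can be sketched as follows. The boolean flags are hypothetical simplifications: each one (a recognised purpose, a legitimate interest, compatibility, a Schedule 1 condition) is itself a fact-specific legal assessment.

```python
# Rough sketch of the further-processing rules summarised above; each flag
# stands in for an assessment that needs legal analysis, not a lookup.
from dataclasses import dataclass


@dataclass
class ReuseAssessment:
    special_category: bool               # Article 9(1) UK GDPR data involved?
    schedule1_condition_met: bool        # one of the 12 DPA Schedule 1 conditions
    recognised_compatible_purpose: bool  # no compatibility assessment needed
    legitimate_interest_basis: bool      # no compatibility assessment needed
    compatible_with_original: bool       # outcome of a compatibility assessment


def further_processing_permitted(a: ReuseAssessment) -> bool:
    # Special category data always needs a Schedule 1 condition
    # (alongside any other applicable safeguards).
    if a.special_category and not a.schedule1_condition_met:
        return False
    # Recognised compatible purposes and legitimate interests skip the
    # compatibility assessment entirely.
    if a.recognised_compatible_purpose or a.legitimate_interest_basis:
        return True
    # Otherwise fall back to compatibility with the original purpose.
    return a.compatible_with_original
```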
In terms of risks, extra care must be taken to apply these provisions correctly: wrongly relying on a compatible purpose can render the processing unlawful, which in turn can lead to costly and protracted litigation.
Key takeaway
The Act has brought about some welcome changes, especially regarding the re-use of data for new purposes, greater protection from automated decisions affecting individuals and wider access to automatic switching services. Nonetheless, this remains a complex area to navigate, and care needs to be taken in doing so.
Getting it wrong makes an organisation more likely to face some combination of ICO investigation (including potential compliance interviews), regulatory action and financial penalties, and satellite litigation, with a consequent increased risk of reputational damage.