The Data (Use and Access) Act 2025 Will Expose Weak Data Habits

The Data (Use and Access) Act 2025 will change how UK organisations use, share, and automate data.

Many organisations believe they handle data responsibly because nothing has gone wrong so far. However, the way data is used in practice has shifted significantly in recent years. Cloud platforms, SaaS tools, and now generative AI move information through organisations faster than governance models typically adapt.

As the phased commencement of the Act progresses towards its final stage in early 2026, a growing gap between formal policy and day-to-day behaviour has become harder to ignore. That gap sits at the centre of what the legislation is designed to address.


Why This Is Not Just Another Compliance Update

The Act refines existing UK data legislation, including UK GDPR, the Data Protection Act 2018, and PECR. Government guidance positions these changes as a way to reduce unnecessary administrative burden while continuing to support innovation and responsible data use (GOV.UK, 2025).

In practice, the legislation shifts attention towards how organisations actually use data, not just how they document it. As a result, regulators now focus less on checkbox compliance and more on accountability and demonstrable control.

The Information Commissioner’s Office has repeatedly highlighted this shift. In particular, enforcement increasingly targets everyday practices such as poor visibility of data flows, unclear ownership, and weak oversight of cloud and third-party services (ICO, 2025).


Data Use Has Changed Faster Than Oversight

Recent industry research helps explain why this matters.

Netskope Threat Labs reports that the average organisation now experiences around 223 incidents per month in which employees send sensitive data to generative AI applications. Typically, this happens during routine tasks such as summarising documents, checking code, or drafting content, rather than through deliberate misuse (Netskope, 2026).

At the same time, government and regulatory commentary shows that AI and cloud adoption continue to accelerate. Meanwhile, many organisations still struggle to maintain visibility and control at the same pace (GOV.UK, 2025; ICO, 2025).

Therefore, intent is rarely the issue. Instead, organisations increasingly use data in ways that remain difficult to see without deliberate, technical oversight.
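That oversight does not have to be elaborate to be useful. As a purely illustrative sketch (not a recommendation of any particular tool), the fragment below shows a minimal pre-check that scans text for obvious sensitive-data patterns before it is handed to an external generative AI service. The patterns, the example prompt, and the idea of a hand-rolled check are all simplified assumptions; in practice a proper classification or DLP capability would sit behind this.

    import re

    # Illustrative patterns only; a real deployment would rely on a proper data
    # classification or DLP service rather than a handful of regexes.
    SENSITIVE_PATTERNS = {
        "national_insurance_number": re.compile(r"\b[A-Z]{2} ?\d{6} ?[A-D]\b", re.IGNORECASE),
        "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    }

    def find_sensitive_data(text: str) -> list[str]:
        """Return the names of any sensitive-data patterns found in the text."""
        return [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(text)]

    def safe_to_send(prompt: str) -> bool:
        """Block (and surface) prompts that appear to contain sensitive data."""
        matches = find_sensitive_data(prompt)
        if matches:
            print(f"Blocked: prompt appears to contain {', '.join(matches)}")
            return False
        return True

    # A routine "summarise this" task that accidentally includes personal data.
    draft = "Please summarise: applicant AB123456C, contact jane@example.com, ..."
    if safe_to_send(draft):
        pass  # hand the prompt to the approved AI service here

The point of the sketch is the control point, not the patterns: the check sits between everyday use and the external service, which is exactly where visibility tends to be missing.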


Where the Real Change Happens

Rather than introducing a single new obligation, the Act affects several practical areas where modern data use often outpaces control.


Smart Data and Structured Sharing

The Act prepares the ground for expanded Smart Data schemes, building on principles established through Open Banking. Over time, these schemes will extend into sectors such as energy, finance, and telecommunications. As a result, individuals and organisations will be able to share their data securely with trusted third parties to access better services and outcomes (GOV.UK, 2025).

However, these opportunities depend on a clear understanding of what data organisations hold, where it sits, and how teams can share it safely. When data estates remain fragmented or poorly documented, organisations struggle to participate in Smart Data ecosystems with confidence.
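To make "knowing what you hold" slightly more concrete, the sketch below models a minimal data inventory entry. The fields, names, and example records are hypothetical; a real inventory would reflect your own systems, classifications, and sharing arrangements.

    from dataclasses import dataclass, field

    @dataclass
    class DataAsset:
        """One entry in a minimal data inventory (illustrative fields only)."""
        name: str                              # e.g. "customer_contacts"
        owner: str                             # accountable team or role
        location: str                          # platform or system where the data sits
        contains_personal_data: bool
        shared_with: list[str] = field(default_factory=list)  # third parties or schemes

    inventory = [
        DataAsset("customer_contacts", "Sales Ops", "CRM (SaaS)", True, ["billing-provider"]),
        DataAsset("energy_usage_readings", "Data Team", "Analytics warehouse", True),
    ]

    # A basic question the inventory should answer before joining any sharing scheme:
    # what personal data leaves the organisation, and who is accountable for it?
    for asset in inventory:
        if asset.contains_personal_data and asset.shared_with:
            print(f"{asset.name} (owner: {asset.owner}) is shared with: {', '.join(asset.shared_with)}")

Even a simple register like this makes it far easier to answer the questions Smart Data participation will raise: what is held, where it sits, and who can safely share it.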


Digital Identity and Access Control

The Act introduces a regulated framework for Digital Verification Services (DVS) and establishes a public register of trusted identity providers.

This change reflects a wider shift in how organisations protect data. Both the ICO and the National Cyber Security Centre now emphasise identity as a primary control layer in cloud-first environments, particularly where staff access data across multiple services and devices (NCSC, 2025).

Therefore, organisations that treat identity as an afterthought increase exposure as data flows become more distributed.
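To show what "identity as a primary control" means in practice, here is a deliberately simplified sketch in which access to a dataset is decided from verified identity claims rather than from network location. The claim names and the allow_access helper are assumptions made for illustration; a real deployment would rely on validated tokens from an established identity provider.

    def allow_access(claims: dict, allowed_roles: set[str]) -> bool:
        """Grant access only when the identity is verified and carries a permitted role."""
        return bool(claims.get("identity_verified")) and claims.get("role") in allowed_roles

    # In practice these claims would come from a validated token issued by a trusted
    # identity provider, not from a hand-built dictionary.
    claims = {"sub": "user-42", "role": "analyst", "identity_verified": True}

    if allow_access(claims, allowed_roles={"analyst", "dpo"}):
        print("Access granted to the reporting dataset")
    else:
        print("Access denied: identity not verified or role not permitted")

The design choice matters more than the code: every request to data is evaluated against who is asking and whether that identity is trusted, regardless of where the request comes from.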


Automation, Research, and Accountability

The Act also clarifies and relaxes several operational areas of data protection law. Specifically, it introduces:

  • Wider lawful bases for certain types of automated decision-making

  • Clearer rules for scientific research, including the use of broad consent

  • Reduced cookie consent requirements for specific low-risk analytics cookies

Although these changes remove friction, they do not lower standards. Organisations that use automation must still understand how decisions are made, what data those decisions rely on, and how teams can explain outcomes when questions arise.

Industry research reinforces this point. In many cases, AI-related data issues occur because employees use tools outside approved or monitored environments, not because automation itself creates risk (Netskope, 2026).
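One practical way to keep automated decisions explainable is simply to record what each decision relied on at the time it was made. The sketch below is a minimal, hypothetical audit record, not a prescribed format; the field names and the example decision are invented for illustration.

    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone
    import json

    @dataclass
    class DecisionRecord:
        """Minimal audit trail for one automated decision (illustrative structure)."""
        decision_id: str
        model_version: str
        inputs_used: dict      # the data the decision actually relied on
        outcome: str
        reason: str            # a human-readable explanation of the outcome
        timestamp: str

    def log_decision(decision_id: str, model_version: str, inputs_used: dict,
                     outcome: str, reason: str) -> DecisionRecord:
        record = DecisionRecord(decision_id, model_version, inputs_used, outcome, reason,
                                timestamp=datetime.now(timezone.utc).isoformat())
        # In practice this would be written to durable, access-controlled storage.
        print(json.dumps(asdict(record), indent=2))
        return record

    log_decision(
        decision_id="app-0001",
        model_version="eligibility-check-v3",
        inputs_used={"income_band": "B", "account_age_months": 18},
        outcome="referred_for_manual_review",
        reason="Income band below the automated approval threshold",
    )

A record of this kind is what turns "we use automation" into "we can explain this outcome when asked".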


International Data Transfers and Cloud Services

The Act introduces a more flexible approach to assessing third-country adequacy for international data transfers.

While this flexibility reduces friction, it also increases responsibility. Organisations now need a clearer view of how data moves through cloud platforms, APIs, and third-party services. According to the ICO, poor visibility of cross-border processing remains a common weakness in compliance assessments (ICO, 2025).

Over time, this approach could also affect the UK’s EU adequacy status. Therefore, organisations should prioritise documentation and adaptability as the final stage of the Act approaches.
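A first step towards that visibility can be as simple as keeping a register of where each processing activity sends data, and flagging transfers outside the UK that lack a documented safeguard. The sketch below uses made-up entries and a deliberately crude rule; actual adequacy decisions and transfer mechanisms would need to be checked against current ICO guidance.

    # Hypothetical register of processing activities and where the data ends up.
    transfers = [
        {"service": "email-marketing", "destination": "UK", "safeguard": None},
        {"service": "analytics-platform", "destination": "US", "safeguard": "IDTA"},
        {"service": "support-ticketing", "destination": "IN", "safeguard": None},
    ]

    def unreviewed_transfers(register: list[dict]) -> list[dict]:
        """Flag non-UK transfers with no documented transfer mechanism (deliberately crude)."""
        return [t for t in register if t["destination"] != "UK" and not t["safeguard"]]

    for transfer in unreviewed_transfers(transfers):
        print(f"Review needed: {transfer['service']} sends data to {transfer['destination']} "
              f"with no documented safeguard")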


Public Sector Data Sharing

The Act also supports improved data sharing across public services. For example, it enables more connected data systems through initiatives such as the modernisation of civil registration and the National Underground Asset Register.

Consequently, organisations that work with public sector data will need systems and processes that integrate securely with increasingly connected public data environments.


Enforcement and Regulatory Direction

The Act is being implemented in stages through 2025, with final provisions expected to come into force in early 2026. Alongside this, the ICO gains enhanced enforcement powers, including the ability to impose GDPR-level fines for PECR breaches.

As a result, organisations face a dual responsibility:

  • First, they should take advantage of opportunities such as reduced administrative burden, Smart Data participation, and modern digital identity models

  • Second, they must ensure governance and oversight keep pace as enforcement tightens


A Practical Readiness Check

To prepare, organisations should ask:

  • Do we understand where sensitive data is actually being used, including within AI tools?

  • Can we explain how automated decisions are made?

  • Do we treat digital identity as a foundational control?

  • Have we mapped data sharing and data lifecycle processes across cloud services?

  • Are international data flows visible and governed?

Gaps in these areas are common, but they are also the areas regulators are most likely to scrutinise as expectations evolve.


Closing Thought

For most organisations, the Data (Use and Access) Act 2025 will not require a complete redesign of systems.

However, as commencement draws closer, it does require a clearer and more honest view of how data is actually used across modern tools and services. Organisations that already understand real behaviour, rather than relying solely on policy, will be better placed as the Act takes full effect.


Not Sure What This Means for You?

The Data (Use and Access) Act 2025 affects organisations differently depending on how they use data.

If you want to talk it through or ask a few questions, get in touch.


Sources and references

  • Netskope, Netskope Cloud and Threat Report: 2026, Netskope Threat Labs, 2026.

  • GOV.UK, “Data (Use and Access) Act 2025: policy overview and implementation guidance”, 2025.

  • Information Commissioner’s Office, “Regulatory action policy and data protection enforcement approach”, 2025.

  • National Cyber Security Centre, “Identity and access management guidance for cloud services”, 2025.