Data sovereignty is a hot topic. For commercial and public sector organisations, keeping personal data secure – and beyond the reach of foreign laws or interference – is a primary compliance objective.
Data sovereignty is also a matter of international relations, with states striving to keep citizen and organisation data secure from foreign interference. For states, it is also a way of protecting and developing national economies.
In this article, we look at data sovereignty, and the key steps CIOs need to take to build their data sovereignty strategy. This centres on auditing, classification and building controls over data location and movement.
What is data sovereignty, and why is it an issue?
At the most general level, data sovereignty is the retention of data within the jurisdiction – usually state boundaries – whose laws govern its use.
Interest in data sovereignty has been building for some time. In one sense, it looks a lot like the law catching up with the “wild west” early years of cloud computing, when organisations rushed to this new, highly flexible environment to process and store data, only later discovering the risks to which they – and their customer data – had become exposed.
More recently, the drive for digital sovereignty has stepped up to the level of states. That trend got a big boost during US president Donald Trump’s first term, which saw the introduction of the Clarifying Lawful Overseas Use of Data (Cloud) Act, a law that potentially allows US law enforcement to access data held by US companies anywhere in the world. Alarm bells started ringing, especially in Europe.
Organisations achieve digital sovereignty in their operations by making data subject to the laws and control of the state they operate in, or from. But we are far from that position when, for example, Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP) hold around 70% of the European cloud market, and many European state organisations depend completely or overwhelmingly on US hyperscalers for cloud services.
What are the concerns about data sovereignty, and what do CIOs plan to do?
Surveys regularly find IT decision-makers are concerned about data sovereignty. A Gartner survey conducted among 241 IT decision-makers globally found the majority (75%) of those outside the US plan to have a digital sovereignty strategy in place by 2030. Meanwhile, 53% said concerns over geopolitics would restrict future use of global cloud providers, and 61% said such worries would increase their use of regional or local cloud providers.
Complexity – and the potential for contradictory regulations and increased costs – is also a major concern, says Simon Robinson, principal analyst for storage and data infrastructure at Omdia.
“Our research found 74% of organisations say sovereign clouds have become more important over the last two years,” he says.
“However, it is a complex and fast-moving area. The regulatory and compliance environment is evolving rapidly. But the challenge for global organisations is that some regulations may actually conflict, potentially forcing them to contemplate whether they might break one law or regulation to satisfy another.”
Robinson adds: “At the very least it pushes up costs, may lead to inconsistent data policies around retention, and could slow down the adoption of advanced technologies, such as AI [artificial intelligence].”
So, while data held in datacentres in a foreign country, on foreign infrastructure and subject to that country’s laws, is a major worry, resolving that situation can bring its own issues too.
What is a data sovereignty audit, and why is it so important?
Core to an organisation’s response to an unknown or uncontrolled data sovereignty situation is an audit of its data. This is the first step towards ensuring data is kept and processed within the appropriate state boundaries.
That will likely take the form of identifying the risks around different classes of data, according to Jon Collins, vice-president of engagement and field chief technology officer at GigaOm.
“Not all data is created equal, and not all parts of the architecture are created equal,” he says. “The first step is to classify what you’ve got. Identify whether it needs to fall within the scope of sovereignty, understand what kind of data it is, and consider how it might be impacted in terms of privacy, localisation and compliance.”
Key parts of a digital sovereignty strategy include mapping digital assets and data flows throughout their lifecycle, along with the laws to which they are subject at each stage, then classifying the data to assess the risk level of each class.
This can include geo-tagging, and should be part of an ongoing process, says Bettina Tratz-Ryan, vice-president and analyst at Gartner. “Automated discovery tools help identify and tag sensitive data, whether in physical storage or incidental locations like shared drives and folders,” she adds.
“Regular audits and compliance checks are non-negotiable and require strong governance policies and periodic manual reviews.”
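As a minimal illustration of what such automated discovery and tagging might look like, here is a hedged Python sketch that scans a shared drive for patterns suggesting personal data and records a sensitivity class and residency tag for each file. The patterns, class names and tag schema are illustrative assumptions, not any specific vendor’s tool.

```python
import json
import re
from pathlib import Path

# Illustrative patterns only: real discovery tools use far richer detection
PII_PATTERNS = {
    "email_address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_ni_number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),  # rough approximation
}

def classify_file(path: Path) -> dict:
    """Assign an assumed sensitivity class and residency requirement to one file."""
    text = path.read_text(errors="ignore")
    hits = [name for name, rx in PII_PATTERNS.items() if rx.search(text)]
    if hits:
        # Hypothetical policy: anything resembling personal data must stay in the EU
        return {"file": str(path), "class": "personal", "residency": "EU", "matched": hits}
    return {"file": str(path), "class": "general", "residency": "any", "matched": []}

def audit(root: Path) -> list[dict]:
    """Walk a folder tree and emit one classification record per text file."""
    return [classify_file(p) for p in root.rglob("*.txt")]

if __name__ == "__main__":
    for record in audit(Path("./shared-drive")):
        print(json.dumps(record))
```

In practice, records like these would feed a metadata catalogue, so later controls can act on the tags rather than re-scanning content.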
How to minimise exposure to data storage risks
A data storage strategy that addresses data sovereignty builds on the classification produced by the data audit to limit what data can go where.
As part of the classification process, data is made subject to a policy that manifests in metadata tagging, indicating its sensitivity and its tolerance for movement.
“Organisations should adopt a data governance as code approach, automating compliance through infrastructure as code techniques for consistent enforcement and rapid remediation,” says Tratz-Ryan.
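As a hedged sketch of what “data governance as code” can mean in practice, the following Python check, of the kind that might run in a deployment pipeline, fails the build when a resource’s declared region is not permitted for the data class it holds. The policy table and resource records are hypothetical.

```python
import sys

# Hypothetical policy: the regions each data class may be stored in
ALLOWED_REGIONS = {
    "personal": {"eu-west-1", "eu-central-1"},              # residency-constrained
    "general": {"eu-west-1", "eu-central-1", "us-east-1"},  # free to move
}

# In a real pipeline, these records would be parsed from infrastructure-as-code files
RESOURCES = [
    {"name": "customer-db", "data_class": "personal", "region": "eu-west-1"},
    {"name": "analytics-bucket", "data_class": "personal", "region": "us-east-1"},
]

def violations(resources):
    """Yield one message per resource whose region breaches the policy table."""
    for r in resources:
        if r["region"] not in ALLOWED_REGIONS[r["data_class"]]:
            yield f"{r['name']}: {r['data_class']} data may not live in {r['region']}"

errors = list(violations(RESOURCES))
for e in errors:
    print("POLICY VIOLATION:", e)
sys.exit(1 if errors else 0)  # a non-zero exit blocks the deployment
```

The point of expressing policy as code is consistent enforcement: every deployment is checked the same way, and a breach is caught before data lands in the wrong place.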
In practice, that means sensitive data should be stored locally or in regional datacentres to meet residency requirements, with the cloud used for scalability under strict, region-specific compliance requirements.
“Continuous monitoring, encryption and geo-fencing are essential, and governance must be built in, not bolted on,” adds Tratz-Ryan.
Such approaches address the difficulties that can arise with data in transit. With compliance monitoring and auditability built in via classification and tagging, critical workloads can be more easily segregated from less sensitive data, both at rest and in transit.
“Strict governance over location and movement is the cornerstone of risk mitigation,” says Tratz-Ryan.
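As one example of what that continuous monitoring might look like, the sketch below – assuming the AWS boto3 SDK, configured credentials and a hypothetical allow-list of regions – flags any S3 bucket sitting outside the sovereign boundary. It is a starting point for geo-fencing, not a complete control.

```python
import boto3  # AWS SDK for Python; assumes credentials are already configured

# Hypothetical allow-list of regions that satisfy residency requirements
ALLOWED_REGIONS = {"eu-west-1", "eu-central-1"}

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    # get_bucket_location returns None for the default us-east-1 region
    region = s3.get_bucket_location(Bucket=name)["LocationConstraint"] or "us-east-1"
    if region not in ALLOWED_REGIONS:
        print(f"ALERT: bucket {name} is in {region}, outside the sovereign boundary")
```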
Challenges in maintaining knowledge and control
There are many challenges to data sovereignty auditing. Data moves, and it moves across borders. We might believe we have nailed down data in our infrastructure, while in reality it finds backdoor routes across frontiers. Meanwhile, proprietary systems present huge challenges to audits and tagging, and staff create shadow IT, send emails, attach files, and so on.
In short, data movement in an organisation can be very complex indeed. It is potentially simple to audit and control the vast bulk of our data, but the problems come with incidental cases of data movement, says Tratz-Ryan.
“In globally connected organisations, sovereignty risks will occur even if data is stored in local servers. Remote access, backups, and software-as-a-service integrations can create cross-border exposure, triggering compliance challenges under laws like the US Cloud Act. Also, governance can be bypassed by incidental data movement via virtual private networks, personal devices, or email,” she says.
“And, for example, an automotive manufacturer may store design files on-premise in one location, but metadata and backups can flow through global product lifecycle management systems, creating sovereignty exposure.
“Incidental data movement, such as emails, shared drives and collaboration tools, often push data into unsanctioned cloud folders, outside sovereign governance. Shadow IT compounds the problem when employees use external apps without IT oversight, creating blind spots.”
GigaOm’s Collins believes that for most, the key elements needed to incorporate data sovereignty compliance are already present in their organisation.
“It’s practical to consider it within your broader governance, risk and compliance framework,” he says. “The advantage is, as a larger organisation, you already have practices, processes and people in place for audit, reporting and oversight. Sovereignty requirements can be incorporated into those mechanisms.”
Collins says we should not assume all data needs to meet sovereignty rules, and that in many cases, it’s not possible to do so.
“For example, it’s not realistic to make email a fully sovereign, locally contained application because it’s inherently distributed,” says Collins. “But you can prevent sovereign data from being transmitted by email. That’s where data loss prevention and data protection policies come in, to make sure data from certain repositories, or of certain classifications, is not emailed out.”
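A minimal sketch of the kind of data loss prevention check Collins describes: before a message leaves, its attachments are looked up in the classification catalogue built during the audit, and anything that carries a residency restriction, or that is missing from the catalogue, is blocked. The catalogue and its tags are hypothetical.

```python
# Hypothetical catalogue produced by the classification audit: file -> residency tag
CATALOGUE = {
    "design-spec.docx": "EU",     # sovereign: must not leave the EU
    "press-release.docx": "any",  # free to move
}

def may_email(attachments: list[str]) -> tuple[bool, list[str]]:
    """Allow a message only if every attachment is explicitly tagged as free to move."""
    blocked = [a for a in attachments if CATALOGUE.get(a, "unknown") != "any"]
    return (not blocked, blocked)

ok, blocked = may_email(["design-spec.docx", "press-release.docx"])
if not ok:
    print("Message blocked: sovereign or unclassified attachments:", blocked)
```

Note the default-deny design choice: a file the audit has never seen is treated as sovereign until classified, which closes the gap left by incidental or shadow data.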
The same applies to cloud. Rather than try to make all cloud folders sovereign, we should instead decide what data can and cannot be stored there. And if data must be kept locally, it goes to an on-premise system or a domestic cloud service or availability zone.
“The core debate is deciding whether a particular dataset is sovereign,” says Collins. “If you operate in a given country and you hold customer data about people in that country, then that data stays in that country. That gives you a clear list of what cannot go into cloud folders, be sent by email, or managed by a system that can’t guarantee localisation. Once you frame it that way, the whole thing becomes much more straightforward.”
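Collins’ framing reduces to a simple decision rule. The sketch below assumes, purely for illustration, that the only trigger is customer data about residents of a country where the organisation operates; real policies would weigh many more factors.

```python
OPERATING_COUNTRIES = {"UK", "DE", "FR"}  # hypothetical operating footprint

def required_residency(data_class: str, subject_country: str) -> str | None:
    """Return the country the data must stay in, or None if it may move freely."""
    if data_class == "customer" and subject_country in OPERATING_COUNTRIES:
        return subject_country  # customer data stays in the customer's country
    return None

print(required_residency("customer", "DE"))   # "DE": must remain in Germany
print(required_residency("telemetry", "DE"))  # None: free to move
```

Everything that returns a country then goes straight onto the list of data that cannot enter cloud folders, email or any system that cannot guarantee localisation.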