If privacy law has been on your bedtime reading list recently, you’ll be aware that data minimisation and storage limitation are hot topics right now.
With sensitive data spreading uncontrollably across SaaS applications, it's becoming more difficult to discover, control, and protect your data. We're going to break down these vast topics to help you get a clearer idea of what they are and how you can use them to your advantage.
Data minimisation is one of three principles in GDPR regarding data standards, along with accuracy and storage limitation. It essentially refers to collecting and keeping only the personal data that you need, which in turn reduces the risk of over-exposure.
It's also increasingly making the shift from a best practice and condition of safe harbour, particularly in the US, to being explicitly required in modern privacy legislation such as the California Privacy Rights Act (CPRA) and the current draft federal bill for the American Data Privacy and Protection Act.
In a more practical light for growing scale-ups, data minimisation is an enabler for many elements of a strong data security posture. It involves defining how and why the data is being collected, collecting anonymised data or the least personal data possible, and having tools and systems in place to erase stale sensitive data.
Balancing data minimisation with the development of AI tools is a growing challenge for many businesses.
As AI technologies become increasingly integral to business operations (with 40% of global companies reporting the use of AI in their business), the need to manage data responsibly becomes more complex.
Following the Samsung ChatGPT leak and the temporary ban that followed, businesses need to secure generative AI tools. Banning them outright simply isn't practical given how essential they're becoming, so companies need to find ways to use them securely to stay competitive.
The crux of the dilemma lies in the requirement for extensive data to effectively train AI systems, versus the need to adhere to data minimisation principles.
Data minimisation, a key component of regulations like GDPR, dictates that only the data necessary for a specific purpose should be collected and retained.
This presents a conflict for businesses that need to accumulate large datasets to power their AI tools.
Finding the right balance between compliance with data minimisation laws and pursuing AI innovation can be challenging. On one hand, strict adherence to data minimisation is crucial for avoiding legal and reputational risks. On the other hand, comprehensive AI models often require vast amounts of data to function effectively. Businesses must carefully navigate this tension, ensuring they are not only compliant with regulations but also able to leverage AI to its fullest potential.
The security risks of retaining excessive data are also significant. Many organisations struggle with data retention compliance because the guidelines aren't clear. For example, UK GDPR doesn't specify exact retention periods for personal data. Instead, it's up to businesses to justify the duration for which they hold data based on their processing purposes. This can lead to inconsistent data retention practices, where data is kept longer than necessary, increasing the risk of breaches and misuse.
To manage this dilemma, implementing a comprehensive data governance framework is essential. Businesses need to develop clear data retention policies that balance regulatory requirements with the practical needs of AI development. Effective data governance includes maintaining an accurate data inventory and regularly reviewing data retention practices to ensure compliance.
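As an illustration of what a machine-readable retention policy could look like (the purposes and periods below are hypothetical examples, not a prescribed format), mapping each processing purpose to a justified retention period makes the policy easy to review and enforce consistently:

```python
from datetime import timedelta

# Hypothetical retention schedule: each processing purpose maps to the
# maximum period for which holding the associated personal data is justified.
RETENTION_POLICY = {
    "customer_support_tickets": timedelta(days=365),
    "marketing_consent_records": timedelta(days=730),
    "ai_training_snapshots": timedelta(days=90),
}

def retention_for(purpose: str) -> timedelta:
    """Look up the retention period for a purpose.

    Unknown purposes fall back to the shortest period in the policy,
    so the default errs on the side of minimisation.
    """
    return RETENTION_POLICY.get(purpose, min(RETENTION_POLICY.values()))
```

Encoding the schedule this way means a regular review is a diff on one file, and enforcement tooling can consume the same source of truth as the auditors.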
Additionally, building AI tools with security and privacy in mind from the outset is crucial. This means incorporating data protection measures into AI development processes and regularly auditing these tools to ensure they adhere to data minimisation principles. By focusing on secure AI development, businesses can mitigate the risks associated with data retention while still benefiting from AI innovations.
Navigating the balance between data minimisation and AI development is no easy task, but it is vital for businesses aiming to stay compliant and competitive.
By implementing strong data governance practices and prioritising security in AI development, businesses can manage their data responsibly while harnessing the power of AI.
Storage limitation is the principle of keeping data only as long as is necessary. This is important, beyond simply complying with regulations. Personal and sensitive data held for too long quickly becomes excessive, inaccurate or redundant.
Storing sensitive data carries significant risk on its own, without inflating that risk with redundant data that could have been erased in the first place. Since GDPR was introduced, many companies have implemented data retention schedules and information asset registers to comply specifically with the wording that data must not be held for 'longer than is necessary'.
If you're currently in a scale-up with an ever-growing list of SaaS applications, you'll no doubt feel a bit uncomfortable about storage limitation in practice for both personal and sensitive data. The risk is heightened as the amount of data grows exponentially.
Good security posture is more of a process than an outcome, so we're sharing three ways we've seen scale-ups improve their data security posture through their day to day processes and operations.
The first step in securing data is knowing what you have, which means building a data inventory: a record of what information your business holds, where it is stored, who has access to it, how sensitive it is, and how old it is. This visibility forms the backbone of the next two areas.
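A minimal sketch of what one entry in such an inventory might capture (the field names and sample assets are hypothetical, not a standard schema):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataAsset:
    """One row in a simple data inventory: what, where, who, how sensitive, how old."""
    description: str    # what information it is
    location: str       # which system or SaaS app holds it
    owners: list        # who has access to it
    sensitivity: str    # e.g. "public", "internal", "sensitive"
    created: date       # how old it is

# Hypothetical example inventory
inventory = [
    DataAsset("customer emails", "CRM", ["sales"], "sensitive", date(2022, 3, 1)),
    DataAsset("press releases", "CMS", ["marketing"], "public", date(2024, 6, 1)),
]

# With the inventory in place, questions like "where is our sensitive
# data?" become simple queries rather than guesswork.
sensitive_assets = [a for a in inventory if a.sensitivity == "sensitive"]
```

Even a spreadsheet with these columns is a workable starting point; the value is in keeping it accurate, not in the tooling.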
Data security is more than just a compliance activity. When embedded into the ways of working and culture of the business, it can create significant value to both the operations and revenue. Naturally, training is part of this. But employees are not all the same and any approach to security education needs to be tailored to an individual, in terms of their role and meeting them where they are in their security knowledge.
Likewise, we're seeing a greater shift to considering the impact that good security posture has on employee behaviours, particularly in SaaS applications where the risk of over-exposure is heightened. We know that most data leaks are a result of accidental mistakes rather than malicious intent, so policies and general training are only as effective as the behaviours they consistently drive.
High growth companies are constantly hiring, onboarding and trying to train and align new staff with preferred ways of working. In a scrappy environment, it can feel like an impossible task, but it’s an essential part of minimising data and thus protecting your business long term.
One of our favourite words at Metomic. Automation lightens the load, which is vital during periods of growth, ensuring security teams are focused on more impactful initiatives (like cultural awareness).
Metomic can look across your entire surface—from the infrastructure layer to applications, and across multiple environments—to identify and map sensitive data. It can enforce policies, such as automatically deleting sensitive data when it's no longer needed.
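Conceptually, automated stale-data cleanup boils down to comparing each record's age against a retention window; the sketch below illustrates the idea only and is not Metomic's actual API (the record shape and function name are assumptions):

```python
from datetime import datetime, timedelta

def split_stale_records(records, max_age, now=None):
    """Split records into (kept, stale) based on age.

    Conceptual sketch: each record is a dict with a 'last_used' datetime.
    In practice the 'stale' list would feed a deletion or review workflow.
    """
    now = now or datetime.utcnow()
    kept, stale = [], []
    for record in records:
        if now - record["last_used"] > max_age:
            stale.append(record)
        else:
            kept.append(record)
    return kept, stale
```

The important design point is that the retention rule lives in one place and runs continuously, instead of relying on individuals remembering to tidy up.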
An unfortunate but recurring symptom of the explosion of SaaS applications is how much easier it now is for sensitive data to spread. It is therefore more important than ever for scale-ups to shore up their data security posture and take steps to protect their customers' sensitive and personal data.
Metomic's sensitive data discovery tool for SaaS apps helps you discover and control sensitive data in cloud applications so that you can focus on growing your business.
Book a personalised demo today for expert insight into the data minimisation effort at your business and recommendations for remediation.