"False"
Skip to content
printicon
Main menu hidden.
Published: 2024-10-29

New Algorithms Increase the Privacy of Sensitive Data

NEWS Sensitive and personal data, such as medical records and bank information, can now be stored more securely thanks to new algorithms developed at Umeå University. The improved algorithms reduce the risk of data leakage, for example during system updates.

Text: Hanna Nordin

When you visit a doctor, information such as medication prescriptions, X-rays, and genetic tests is recorded to assist the physician. In these cases, a technology called federated learning, or collaborative learning, is used to reduce the risk of exposing sensitive data. This technology allows multiple devices to work together without sharing actual data with each other.
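To give a concrete picture of the idea, the sketch below shows a toy round of federated averaging in Python: each client trains a small model on its own data and shares only the resulting model weights with a central server, never the data itself. All data, names, and parameters are invented for illustration; this is not the specific setup studied in the thesis.

```python
import numpy as np

def local_update(X, y, weights, lr=0.1, epochs=5):
    """One client's local gradient-descent update on a linear model."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Three clients, each with private data that never leaves the device.
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):
    # Each client trains locally and sends back only its weights...
    local_ws = [local_update(X, y, global_w) for X, y in clients]
    # ...and the server aggregates them by simple averaging.
    global_w = np.mean(local_ws, axis=0)

print("aggregated weights:", global_w)  # approaches [2.0, -1.0]
```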

Saloni Kwatra, doctoral student at the Department of Computing Science, has in her dissertation identified flaws in the technology and developed new algorithms to enhance user security.

"Federated learning is often used to protect user privacy. However, during system updates, sensitive information can still leak. My research has led to algorithms that can prevent such leaks," says Saloni Kwatra.

To achieve this, she has used two techniques: k-anonymity and differential privacy. With k-anonymity, data is organized so that each combination of identifying details (such as height, age, or eye color) is shared by at least k individuals. This makes it difficult to single out or identify anyone, as each person is grouped with others who have the same characteristics. Differential privacy, on the other hand, ensures that the result of an analysis changes only negligibly whether or not a specific individual is included in the dataset. This way, individual privacy is protected even when data is used for research or studies.
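As a rough illustration of the two ideas, the sketch below first generalises quasi-identifiers so that every released combination covers at least k = 3 records, and then releases an average with Laplace noise calibrated to the query's sensitivity. The table, column names, value bounds, and parameters are invented for the example and are not taken from the thesis.

```python
import numpy as np
import pandas as pd

people = pd.DataFrame({
    "age":    [34, 36, 35, 52, 50, 53],
    "height": [172, 175, 170, 181, 184, 180],
    "salary": [410, 390, 430, 520, 540, 510],
})

# k-anonymity (here k = 3): generalise the identifying columns into ranges
# so that each released combination is shared by at least k individuals.
anon = people.copy()
anon["age"] = pd.cut(anon["age"], bins=[30, 40, 60], labels=["30-40", "40-60"])
anon["height"] = pd.cut(anon["height"], bins=[160, 175, 190], labels=["160-175", "175-190"])
print(anon.groupby(["age", "height"], observed=True).size())  # every group has >= 3 rows

# Differential privacy: publish an average salary with Laplace noise scaled to
# the query's sensitivity, so the result barely changes whether or not any
# single person is in the data (salary bounds 300-600 are assumed).
epsilon = 1.0
sensitivity = (600 - 300) / len(people)
noisy_mean = people["salary"].mean() + np.random.laplace(scale=sensitivity / epsilon)
print("noisy average salary:", round(noisy_mean, 1))
```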

Methods to Combat Inference Attacks

Saloni Kwatra has also explored how synthetic data, which imitates real patterns without containing actual personal information, can be protected against so-called attribute inference attacks, in which an adversary attempts to reconstruct specific characteristics of an individual. The new algorithms are particularly relevant for sectors where data privacy is crucial, such as healthcare, finance, and telecommunications.
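To make the threat concrete, the toy example below shows how an adversary could train a model on released synthetic records and use it to guess a sensitive attribute of a real person from attributes they already know. The data and model choice are invented; this illustrates the attack itself, not the defence developed in the thesis.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Released synthetic records: two "known" attributes per person and one
# sensitive attribute (e.g. a diagnosis, encoded 0/1). All values are made up.
known = rng.normal(size=(500, 2))
sensitive = (known[:, 0] + 0.5 * known[:, 1]
             + rng.normal(scale=0.3, size=500) > 0).astype(int)

# The adversary learns the correlation between known and sensitive attributes...
attack_model = RandomForestClassifier(n_estimators=50, random_state=0)
attack_model.fit(known, sensitive)

# ...and applies it to a real target whose known attributes they possess.
target_known = np.array([[1.2, -0.3]])
print("inferred sensitive attribute:", attack_model.predict(target_known)[0])
```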

"In those areas, these algorithms can help maintain user privacy while making systems both more secure and efficient," says Saloni Kwatra.

About the Thesis

On Monday, November 4, Saloni Kwatra from the Department of Computing Science at Umeå University will defend her doctoral thesis titled "Navigating Data Privacy and Tools: A Strategic Perspective." The public defense will take place at 9:15 in BIO.A.206 Aula Anatomica, Biologihuset. The opponent is Sébastien Gambs, professor at the Department of Computer Science, Université du Québec à Montréal (UQAM).

Read the full thesis.