Engineering risk-based anonymisation solutions for complex data environments
Abstract
Technological advancements have dramatically increased the ability to collect, store and process vast quantities of data. The general applicability and precision of analytical tools in artificial intelligence and machine learning have driven organisations to leverage these advances to process personal data in new and innovative ways. As stewards of personal data, organisations need to keep that data safe and ensure processing is legal and appropriate. Having more data, however, has also led to an increased interest in processing personal data for purposes other than those for which they were originally collected, known as secondary purposes. The reuse of personal data introduces important regulatory challenges, increasing the need to disassociate data used for secondary purposes from personal data, be it to safeguard the data, support a legitimate interest, or anonymise the data. Whereas some academics have focused on specific issues preventing more widespread adoption of this privacy-enhancing technology, others have reframed the discussion around anonymisation as risk management. Combining technology-enabled processes with measures of identifiability provides an opportunity to meet complex business needs while ensuring best practice is adopted in reusing sensitive data. This paper examines these considerations and demonstrates how risk-based anonymisation can and should be detailed, evidence-based and objectively supported through measures of identifiability. The engineering of privacy solutions, through the application of risk-based anonymisation, is also briefly explored for complex use cases involving data lakes and hub-and-spoke data collection, to provide the reader with a deeper understanding of real-world risk-based anonymisation in practice.
The full article is available to subscribers to the journal.
Authors' Biographies
Luk Arbuckle is Chief Methodologist at Privacy Analytics, providing strategic leadership in how to responsibly share and use data. Luk was previously Director of Technology Analysis at the Office of the Privacy Commissioner (OPC) of Canada, leading a highly skilled team that conducted privacy research and assisted in investigations involving a technology component. Before joining the OPC, he worked on developing anonymisation methods and identifiability measurement tools, participated in the development and evaluation of secure computation protocols, and led a research and consulting team that developed and delivered data anonymisation solutions. He is the author of two books on data anonymisation as well as numerous papers and guidance documents. Previously, Luk did both graduate and industry research in applied statistics and digital image processing and analysis.
Muhammad Oneeb Rehman Mian is a Senior Data Scientist at Privacy Analytics, specialising in cutting-edge privacy engineering solutions. His work entails developing scalable risk-based anonymisation technologies, improving threat modelling and identifiability metrics, and exploring solutions to practical challenges faced in privacy engineering. Previously, Muhammad did graduate and postdoctoral work in biomedical research, utilising applied statistics and Big Data analytics.