Asia's Leading Law & Technology Review


TechLaw.Fest 2023: This Is What’s Next

Reading time: 12 minutes

By Hannah Loo Yuet Ying and Leong Tzi An (Zaine) | Edited by Josh Lee Kok Thong

The theme of this year’s TechLaw.Fest is ‘This is What’s Next’. I thought this is very apt in the realm of law and technology. Both are forward-looking and multi-faceted, such that we constantly, even in practice, ask ourselves ‘what’s next’.

Second Minister for Law and Minister for Culture, Community and Youth Edwin Tong S.C.
Opening Remarks at TechLaw.Fest 2023

Introduction

Since the last edition of TechLaw.Fest in 2022, technology has developed at a rapid pace. It is now trite to say that technology touches every aspect of our lives. It has transformed, and continues to transform, how people work, interact and play. This is embodied not only in the rise of large language models (“LLMs”) and generative AI applications such as ChatGPT, but also in questions about the future of cryptocurrency, immersive technologies, and online safety. Amidst rapid technological developments on multiple fronts, it is important to have robust conversations on the workings of these technologies and their impact – positive or negative – on people and society.

As one of Asia’s largest law and technology conferences, TechLaw.Fest is an important forum bringing together industry leaders, government, legal professionals, technologists, academics, and civil society to have these robust conversations. As the first fully physical rendition of the event since 2019, TechLaw.Fest 2023 brought together thought leaders from various domains to answer “what’s next” in the vast field of law and technology. This article offers a glimpse into the key insights and themes discussed across both days of Singapore’s signature law and technology conference.

Digging digital dirt: Rethinking the evidentiary landscape in the age of social media

Reading time: 13 minutes

Written by Sim Qian Hui | Edited by Josh Lee Kok Thong

We’re all law and tech scholars now, says every law and tech sceptic. That is only half-right. Law and technology is about law, but it is also about technology. This is not obvious in many so-called law and technology pieces which tend to focus exclusively on the law. No doubt this draws on what Judge Easterbrook famously said about three decades ago, to paraphrase: “lawyers will never fully understand tech so we might as well not try”.

In open defiance of this narrative, LawTech.Asia is proud to announce a collaboration with the Singapore Management University Yong Pung How School of Law’s LAW4032 Law and Technology class. This collaborative special series is a collection featuring selected essays from students of the class. Ranging across a broad range of technology law and policy topics, the collaboration is aimed at encouraging law students to think about where the law is and what it should be vis-a-vis technology.

This piece, written by Sim Qian Hui, demonstrates the need to rethink the evidentiary landscape in the age of social media. The use of social media evidence in court proceedings is plagued with uncertainty. By assuming that people present themselves in the same way online and offline, courts misinterpret the relevance of certain types of social media posts. Courts also lack an understanding of social media culture and draw mistaken inferences from common types of social media conduct. Further, overly broad discovery of social media content violates an individual’s right to privacy. Accordingly, courts must consider the unique properties of, and social norms surrounding, social media when utilising social media evidence. Given that social media has become part of today’s society, courts ought to ensure the continued relevance of the evidentiary regime.

The value of differential privacy in establishing an intermediate legal standard for anonymisation in Singapore’s data protection landscape

Reading time: 11 minutes

Written by Nanda Min Htin | Edited by Josh Lee Kok Thong


This piece, written by Nanda Min Htin, examines the value of differential privacy in establishing an intermediate legal standard for anonymisation in Singapore’s data protection landscape. Singapore’s data protection framework treats privacy-protected data that could potentially be re-identified as anonymised data, so long as there is no serious possibility that re-identification would occur. As a result, such data are not considered personal data and fall outside the protection of Singapore law. In contrast, major foreign legislation such as the GDPR in Europe sets a clearer and stricter standard for anonymised data by requiring re-identification to be impossible; anything less is considered pseudonymised data and subjects the data controller to legal obligations. The lack of a similar intermediate standard in Singapore risks depriving reversibly de-identified data of legal protection. One key example is differential privacy, a popular privacy standard for a class of data de-identification techniques. It prevents the re-identification of individuals at a high confidence level by adding random noise to computational results queried from the data. However, like many other data anonymisation techniques, it does not completely prevent re-identification. This article first highlights the value of differential privacy in exposing the need for an intermediate legal standard for anonymisation under Singapore data protection law. It then explains how differential privacy’s technical characteristics would help establish regulatory standards for privacy by design and help organisations fulfil data breach notification obligations.
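To make the noise-adding mechanism described above concrete: the sketch below (not drawn from the article, with hypothetical function names) shows the standard Laplace mechanism for a counting query. A counting query has sensitivity 1 — adding or removing one person changes the result by at most 1 — so Laplace noise with scale 1/ε suffices for ε-differential privacy.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential variates with mean
    # `scale` follows a Laplace(0, scale) distribution.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(records, predicate, epsilon: float) -> float:
    """Answer a counting query with epsilon-differential privacy.

    Counting queries have sensitivity 1, so noise drawn from
    Laplace(0, 1/epsilon) is sufficient.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical dataset: how many individuals are over 40?
ages = [23, 35, 41, 52, 29, 67, 38]
noisy = dp_count(ages, lambda a: a > 40, epsilon=0.5)
print(noisy)  # randomised, but concentrated around the true count of 3
```

The randomised answer limits what any query can reveal about a single individual, yet — as the article notes — it does not make re-identification strictly impossible, which is precisely why differential privacy sits awkwardly between Singapore’s anonymisation threshold and the GDPR’s stricter one.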

Is the PDPA really sufficient to protect our data?

Reading time: 14 minutes

Written by Moo Wen Si, Amelia | Edited by Josh Lee Kok Thong


This piece, written by Moo Wen Si, Amelia, examines the sufficiency of the PDPA in today’s world. In a technologically advanced world where e-commerce, cloud computing and data mining are flourishing, data has become one of the most valuable assets in the economy. This has raised concerns over whether our data is fully protected from misuse, and over the remedial actions available in cases of data breaches. In response, the Singapore Parliament enacted the Personal Data Protection Act 2012 (“PDPA”) to protect individuals’ data from misuse by organisations in the private sector. The PDPA, though intended to be a comprehensive data protection law, is severely lacking in the protection it affords to individuals. This paper argues that the PDPA is insufficient to protect one’s data from misuse, and that individuals have limited recourse even when their data privacy has been compromised.

Legal Implications of Digital Surveillance: Individual Protection

Reading time: 14 minutes

Written by Lim Hong Wen, Amelia | Edited by Josh Lee Kok Thong


This piece, written by Lim Hong Wen, Amelia, analyses three key concerns that may arise from the use of digital surveillance: privacy, harassment, and algorithmic bias. The paper then examines how the four modalities expounded by Lawrence Lessig (the law, architecture, social norms, and the market) come into play in regulating the use of digital surveillance. Part II first explores developments in the use of digital surveillance by the state, employers, and individuals. Digital surveillance has transformed over the years, and current laws may be insufficient to protect individuals against certain unwanted forms of it. Part III identifies the inadequacies of current laws in addressing the key concerns identified earlier (i.e. privacy, harassment, and algorithmic bias). Given the lack of legal recourse available, Part IV analyses how the use or misuse of digital surveillance can be regulated by the remaining three modalities (i.e. architecture, social norms, and the market).
