Asia's Leading Law & Technology Review

Tag: POHA

Poon Chong Ming: Fake porn, real harm: Examining the laws against deepfake pornography in Singapore

Reading time: 15 minutes

Written by Poon Chong Ming | Edited by Josh Lee Kok Thong

We’re all law and tech scholars now, says every law and tech sceptic. That is only half-right. Law and technology is about law, but it is also about technology. This is not obvious in many so-called law and technology pieces which tend to focus exclusively on the law. No doubt this draws on what Judge Easterbrook famously said about three decades ago, to paraphrase: “lawyers will never fully understand tech so we might as well not try”.

In open defiance of this narrative, LawTech.Asia is proud to announce a collaboration with the Singapore Management University Yong Pung How School of Law’s LAW4032 Law and Technology class. This collaborative special series is a collection featuring selected essays from students of the class. Ranging across a broad range of technology law and policy topics, the collaboration is aimed at encouraging law students to think about where the law is and what it should be vis-a-vis technology.

This piece, written by Poon Chong Ming, examines the laws against deepfake pornography in Singapore. Years after deepfake pornography first emerged, the law still deals with it inadequately. As a result, deepfake pornography continues to proliferate, inflicting ever greater harm on victims while leaving them without proper recourse. This paper looks at the issue of deepfake pornography specifically within Singapore, in light of the stark increase in local sexual abuse cases involving technology. The paper first explains why the nature of deepfake pornography (hyper-realism combined with ease of production) demands a strong legal framework. It then examines the efficacy of Singapore's current laws (civil, criminal, and regulatory measures) in dealing with deepfake pornography. Finally, drawing on measures taken in the United Kingdom, the paper offers suggestions on the direction the law in Singapore should take, with the most viable recommendation being to build upon Sections 377BE and 377BD of the Penal Code.

Criminalising Offensive Speech Made by AI Chatbots in Singapore

Reading time: 16 minutes

Written by Loh Yu Tong | Edited by Josh Lee Kok Thong

This piece, written by Loh Yu Tong, demonstrates how Singapore’s present criminal framework is ill-prepared to address offensive speech made by autonomous AI chatbots. The author examines the regulatory challenges that may arise, and identifies a negligence-based framework – under which a duty of care is imposed on developers, deployers and malicious third-party interferers – as preferable to an intent-based one. Other viable solutions include employing regulatory and civil sanctions. While AI systems are likely to become more complex in the future, the author holds out hope that Singapore’s robust legal system can satisfactorily balance the deterrence of harm against the risk of stifling innovation.

Legal Implications of Digital Surveillance: Individual Protection

Reading time: 14 minutes

Written by Lim Hong Wen, Amelia | Edited by Josh Lee Kok Thong

This piece, written by Lim Hong Wen, Amelia, analyses three key concerns that may arise from the use of digital surveillance: privacy, harassment, and algorithmic bias. The paper then examines how the four modalities expounded by Lawrence Lessig (i.e. the law, architecture, social norms, and the market) come into play in regulating the use of digital surveillance. Part II first explores developments in the use of digital surveillance by the state, employers, and individuals. Digital surveillance has transformed over the years, and current laws may be insufficient to protect individuals against certain unwanted forms of it. Part III identifies the inadequacies of current laws in addressing the key concerns identified earlier (i.e. privacy, harassment, and algorithmic bias). Given the lack of legal recourse available, Part IV analyses how the use or misuse of digital surveillance can be regulated by the remaining three modalities (i.e. the architecture, social norms, and the market).
