Edited by Rajvant Kaur and Boo Kok Chuon
As AI systems become deeply integrated into our lives, questions of liability for harm caused by these systems become increasingly pressing. The recent lawsuit against Character.AI in the United States, where a chatbot allegedly encouraged harmful behavior in a 15-year-old, offers a compelling lens to examine potential legal implications in Singapore. This thought leadership piece explores whether an AI platform could face liability under Singapore law through the lenses of contract law and tort law.
1. Contractual Liability: A Minor’s Breach of Terms
In contractual disputes, the enforceability of the terms of use and the age of the user play pivotal roles in determining liability.
Capacity to Contract
Under the Minors’ Contracts Act 1987, individuals below the age of 18 generally lack the legal capacity to enter into binding contracts unless:
- The contract is for “necessaries” (e.g., essential goods or services); or
- The contract is deemed beneficial to the minor.
If a 15-year-old accesses an AI platform with age restrictions set at 18 or older, the agreement may be voidable. However, if the minor falsely represented their age or used false credentials, they could be seen as having misled the platform. Even so, the platform would need to show that reasonable measures were in place to verify user eligibility.
Breach of Terms
If the platform’s terms explicitly prohibit use by minors under 18, the user’s access would constitute a breach. This breach could:
- Limit the Platform’s Liability: The company might argue that it has no responsibility for harm arising from unauthorized use.
- Shift Responsibility to Guardians: The platform might claim that parents or guardians failed to supervise the minor.
Implications for Liability
While a contractual breach by the minor weakens their standing to sue, it does not necessarily absolve the AI platform of liability. Singapore courts may still examine whether the platform’s age verification processes were robust enough to prevent unauthorized access. An inadequate verification process could expose the platform to claims of negligence, even in a contractual context.
2. Tortious Liability: Duty of Care and Foreseeable Harm
When contractual claims falter, tort law provides an alternative avenue for redress. In Singapore, tortious claims focus on whether the AI platform owed a duty of care, breached this duty, and caused harm as a result.
Duty of Care
Under the framework established in Spandeck Engineering (S) Pte Ltd v Defence Science & Technology Agency, a duty of care is assessed against a threshold requirement of factual foreseeability, followed by a two-stage test of proximity and policy considerations:
- Foreseeability of Harm: It is foreseeable that vulnerable users, including minors, could be harmed through interactions with a chatbot designed for public use.
- Proximity: Proximity may be established if the platform’s design enabled direct, personalised interactions that exposed its users to harm.
- Policy Considerations: Imposing a duty of care must balance user protection against the need not to stifle innovation by AI developers.
Breach of Duty
A breach occurs when the AI platform fails to act reasonably to prevent foreseeable harm. For instance, the absence of content moderation systems or safeguards against harmful outputs could constitute negligence. Courts would examine whether the platform took sufficient steps to:
- Filter out harmful or violent suggestions.
- Incorporate disclaimers to guide vulnerable users (a simple illustrative sketch of such safeguards follows this list).
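By way of illustration only, the sketch below shows what a basic output-safety gate might look like. It is not how Character.AI or any other platform actually moderates content: the pattern list, the moderate_reply function, and the disclaimer text are invented for this example, and a production system would rely on trained classifiers, human review, and escalation workflows rather than simple keyword matching.

```python
# Illustrative sketch only: a hypothetical, minimal output-safety gate.
# The pattern list and disclaimer are invented; real moderation pipelines
# combine trained classifiers, human review, and escalation workflows.

HARMFUL_PATTERNS = (
    "hurt yourself",
    "hurt someone",
    "how to make a weapon",
)

SAFETY_DISCLAIMER = (
    "This is an AI-generated response, not professional advice. "
    "If you are in distress, please speak to a trusted adult or a local helpline."
)


def moderate_reply(reply: str, user_is_minor: bool) -> str:
    """Screen a chatbot reply before it is shown to the user."""
    lowered = reply.lower()

    # Refuse outright if the reply matches a known harmful pattern.
    if any(pattern in lowered for pattern in HARMFUL_PATTERNS):
        return "I can't help with that. " + SAFETY_DISCLAIMER

    # For minors, always append a disclaimer to manage expectations.
    if user_is_minor:
        return reply + "\n\n" + SAFETY_DISCLAIMER

    return reply


if __name__ == "__main__":
    print(moderate_reply("You should hurt yourself...", user_is_minor=True))
    print(moderate_reply("Here is a study plan for your exams.", user_is_minor=True))
```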
Causation and Damages
The claimant must prove causation—that the chatbot’s outputs directly caused the harm. In this scenario, expert evidence may be required to demonstrate the link between the chatbot’s suggestions and the minor’s harmful actions. If the harm resulted from multiple factors, courts would evaluate whether the chatbot’s role was a significant contributing factor.
Defenses Available to the AI Platform
- Intervening Acts: The platform might argue that the user’s actions were independent and unforeseeable, breaking the chain of causation.
- Contributory Negligence: If the parents failed to supervise the minor’s interactions, the platform could claim shared responsibility.
3. Likelihood of Liability: A Balancing Act
Contractual Claims
The limited enforceability of contracts against minors significantly restricts the AI platform’s ability to rely on its terms of use as a defense. While the minor’s breach of those terms might weaken their standing, the lack of robust age verification measures could still expose the platform to scrutiny.
Tortious Claims
Tort law offers a stronger foundation for liability claims. Courts are likely to focus on the platform’s duty to foresee and mitigate harm to vulnerable users. If safeguards such as content filters or disclaimers are absent, liability becomes more probable. However, the platform could still reduce its exposure by:
- Demonstrating reasonable efforts to prevent harm.
- Highlighting the role of external factors, such as parental oversight.
4. Key Takeaways for AI Platforms
AI developers operating in Singapore should proactively address potential liabilities by:
- Implementing robust age verification systems (a brief illustrative sketch follows this list).
- Designing comprehensive safeguards to filter harmful content.
- Including clear disclaimers to manage user expectations.
- Regularly auditing and updating AI algorithms to address emerging risks.
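As a purely illustrative example, the sketch below shows the simplest form of age gating: checking a declared date of birth at sign-up. The function names and the minimum-age constant are assumptions made for this sketch, not any platform’s actual implementation, and a self-declared date of birth alone would rarely count as “robust” verification; it would need to be paired with stronger checks and ongoing monitoring.

```python
# Illustrative sketch only: a hypothetical date-of-birth age gate at sign-up.
# On its own this is a weak control; it merely mirrors a terms-of-use
# restriction to users aged 18 or older.

from datetime import date

MINIMUM_AGE = 18


def years_between(born: date, today: date) -> int:
    """Whole years elapsed between a date of birth and a reference date."""
    had_birthday = (today.month, today.day) >= (born.month, born.day)
    return today.year - born.year - (0 if had_birthday else 1)


def may_register(date_of_birth: date, today: date | None = None) -> bool:
    """Return True only if the declared date of birth meets the minimum age."""
    today = today or date.today()
    return years_between(date_of_birth, today) >= MINIMUM_AGE


if __name__ == "__main__":
    reference = date(2025, 1, 15)
    print(may_register(date(2009, 6, 1), today=reference))  # False: user is 15
    print(may_register(date(2000, 6, 1), today=reference))  # True: user is 24
```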
Conclusion
While liability under contract law may be mitigated by user breaches or age-related restrictions, tortious claims provide a stronger basis for redress. Singapore courts would likely scrutinize the AI platform’s efforts to prevent harm, balancing user protection with the need to encourage innovation. As AI systems evolve, so too must legal frameworks and corporate practices to ensure accountability without stifling technological progress.