For Josh Lee Kok Thong, Singapore’s effortless digital habits obscure a real issue

The managing director of the Future of Privacy Forum shows how standards, education and inquisitive leadership can shift Singapore from reactive compliance to confident capability.

Photo: Isabelle Seah/SPH Media

Scroll, tap, swipe. Singaporeans are among the world’s most connected people. Phones, payment apps, and artificial intelligence (AI) chat tools are integral to daily life. 

Yet, that comfort, says Josh Lee Kok Thong, managing director (APAC) of the Future of Privacy Forum (FPF) and adjunct lecturer at the Singapore Management University, can be deceptive. Many, he reveals, are “digitally familiar, not digitally savvy” — able to use technology with ease but not as aware of how it actually works.

Lee first saw this in his law classroom. Students could move effortlessly between online platforms but struggled to explain where their data went or how it might be used. That gap between use and understanding, Lee believes, is where vulnerability begins.

Mistaking familiarity for expertise breeds a misplaced sense of confidence. The problem is psychological as much as technical. People who consider themselves savvy assume they are less likely to fall for scams, or, worse, lapse into a complacent sense of immunity. That assurance encourages risky behaviour: clicking unknown links, sharing personal information or trusting default settings.

“Lowering your guard invites exploitation”

A 2022 survey by the National University of Singapore’s Institute for the Public Understanding of Risk found that Singaporeans under 25 were more susceptible to scams than those over 65, despite their digital fluency.

Researchers linked the pattern to “optimism bias”, the belief that misfortune happens to others. “It’s a classic blind spot,” Lee says. “In the digital world, that bias scales faster than ever.”

Lee argues that the same assumption pervades corporate life. Many leaders equate digital fluency with organisational readiness simply because their employees are comfortable with Slack or Teams. “But readiness isn’t the same as competence,” he emphasises.

He goes on to caution: “Lowering your guard invites exploitation. And that applies as much to institutions as it does to individuals.”

The consequences can be severe. In February 2024, an employee at Arup, a British engineering firm, was duped by a deepfaked video call into transferring HK$200 million (about $33.5 million) to fraudsters. The year before, Samsung engineers accidentally uploaded internal source code to a public generative AI platform, prompting a company-wide ban on such tools.

“These aren’t rookie mistakes,” Lee says. “They happened in large, well-resourced organisations.” The underlying problem, he adds, is not ignorance but the absence of structured governance.

He points to several common weaknesses: reliance on off-the-shelf digital systems whose data origins are not always clear; inadequate staff training on responsible AI use; and procurement decisions made without a full grasp of data-handling implications. 

“Without those fundamentals, you can’t claim to be digitally prepared,” he asserts. And when incidents occur, the fallout is invariably measured in trust and confidence, “the fragile currency of today’s data economy”.

Building credible standards

Restoring that confidence requires more than reactive compliance. Singapore’s Infocomm Media Development Authority has introduced the Privacy-Enhancing Technologies (PETs) Guide and the Singapore Standard 714 to provide businesses with practical tools for responsible innovation.


The PETs Guide covers privacy-enhancing techniques such as federated learning, homomorphic encryption and secure multi-party computation. At a fundamental level, Lee explains, these techniques let organisations learn from, or make use of, information without actually seeing it.
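The idea of computing on data without seeing it can be sketched with a toy example of one such technique, additive secret sharing, a building block of secure multi-party computation. The scenario and figures below are hypothetical, purely for illustration: three parties learn the total of their private values without any one party disclosing its own input.

```python
# Toy sketch of secure multi-party computation via additive secret
# sharing. Each party splits its private value into random shares
# that sum back to the value modulo a large prime, so no single
# share reveals anything on its own.
import random

MODULUS = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret: int, n_parties: int) -> list[int]:
    """Split a secret into n random shares summing to it (mod MODULUS)."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine shares into the original value (mod MODULUS)."""
    return sum(shares) % MODULUS

# Hypothetical private inputs, one per party.
salaries = [5200, 6100, 4800]
all_shares = [share(s, 3) for s in salaries]

# Party i holds the i-th share of every input and sums them locally;
# each partial sum still looks like random noise.
partial_sums = [sum(column) % MODULUS for column in zip(*all_shares)]

# Only the combined partial sums reveal the total - nothing else.
total = reconstruct(partial_sums)
print(total)  # 16100
```

Real deployments add layers this sketch omits (secure channels, malicious-party protections), but the principle is the one Lee describes: useful answers emerge from data no single participant ever sees in the clear.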

Singapore Standard 714, meanwhile, embeds data-protection-by-design principles across a product’s life cycle. Together, the two frameworks signal a shift from policy intent to operational practice.

“They move the conversation from compliance to capability,” Lee says, showing that responsible data use and innovation can reinforce each other.

Such standards also help Singapore differentiate itself internationally. As data governance becomes a condition for trade and investment, the ability to demonstrate accountable innovation is emerging as a competitive edge.

“Global investors and partners look for ecosystems that can safeguard data,” Lee says. “That’s where Singapore can lead.”

Regulation can only go so far. “The law sets the outer boundaries,” he says, but adds that awareness and the right mindsets must develop within organisations and society. Without an understanding of why privacy and governance matter, compliance risks becoming a checklist exercise.

Many people appreciate the risks only after seeing how easily information flows across systems. Lee notes that once they realise how easily data can be combined and repurposed, “you begin to understand why guardrails are needed”.

From compliance to curiosity

In Lee’s view, education remains the most durable defence. True digital competence is not about coding but curiosity — asking how an app makes money, what data it collects, and who can see it. He urges boards to adopt the same inquisitiveness.

“If boards don’t understand the systems they’re buying, they can’t set meaningful oversight,” he states. “Governance must be informed, not delegated.”

That shift in mindset will determine how societies adapt to the next wave of data-driven tools. Digital adoption, he warns, is accelerating faster than comprehension. “Unless awareness keeps pace, convenience will continue to masquerade as competence.”

Lee is realistic but hopeful. “We can’t afford to treat technology as magic,” he says. “Once we look under the hood, we realise that confidence isn’t automatic — it’s something we have to build and maintain.”

For now, Singapore’s new standards and growing emphasis on public data awareness offer a pragmatic path forward: not slower innovation, but smarter risk management. “If we want a trusted digital ecosystem,” he says, “we have to start with informed people and accountable systems. Everything else follows from that.”
