Ignition AI Accelerator is setting the standard for ethical AI innovation
Its co-founder, Rachel Chng, is guiding startups to embed fairness, transparency, and accountability into their innovations — before regulation forces them to.
By Lyn Chan
Forget the whirring servers and algorithmic prophecies. Rachel Chng’s story isn’t about predicting the future of artificial intelligence (AI) — it’s about building the scaffolding that lets that future rise. The co-founder and director of Ignition AI Accelerator started in planning, research, and policy in the civil service, focusing on cross-sector partnerships and regulatory frameworks before moving into venture-building. This atypical career shift gave her a broad perspective on how industries evolve.
Her time in the public sector exposed her to the complexities of policymaking and the challenges of translating technology into real-world impact. These experiences proved invaluable when she stepped into the world of AI startups, where innovation often outpaces regulation.
Presently, Chng connects startups, corporations, and policymakers to shape AI’s future in Singapore and beyond. The accelerator provides founders with technical expertise, computing power, industry partnerships, and regulatory guidance, ensuring that AI solutions don’t just remain ideas but become scalable businesses.
What makes a startup work — or fail
Having worked with countless founders, Chng, 32, has seen both successes and failures up close.
The difference, she says, often comes down to persistent resourcefulness. “Instead of seeing obstacles as failures that set us back, these can be seen as data points providing insights on a product or market that may not yet be ready.”
She observes that founders often treat failure as a dead end rather than a step toward success. “Each mistake brings us closer to understanding what success may look like,” she says.
But overcoming failure is only one piece of the puzzle. For startups to succeed, they need more than resilience — they must also deal with the realities of working with larger organisations.
After all, startups move fast; corporations, more cautiously. One prioritises experimentation, while the other relies on predictability and scale. Working with both, Chng understands the friction that arises. Nevertheless, both groups “can work well together once they nail down their key objectives”.
Yet, alignment isn’t enough. Startups working with corporations often face long onboarding processes and sales cycles that can slow progress. Chng adds that the challenge is staying focused on measurable outcomes rather than getting lost in the process.
The real value of AI
Another obstacle: AI is often hyped as a universal solution. Chng takes a more measured view. “A common misconception is that AI alone creates value,” she says. “Real impact comes from solving practical problems, like optimising logistics or streamlining decision-making, where AI enhances efficiency rather than being a standalone solution.”
She points to companies like Nanyang Biologics, which applies AI to biotech research; RIDA, which improves complex deliveries with AI-powered routing; and HeyHi, which redefines personalised education. These startups don’t just automate — they use AI to tackle real-world challenges.
A pattern seems to be emerging: Some of the most exciting AI applications aren’t just about automating processes but about enabling entirely new ways of thinking and decision-making.
AI ethics: Beyond the buzzword
With AI’s rapid expansion comes the burden of responsible use. While many companies talk about “ethical AI”, Chng is focused on making it a reality.
She says Ignition AI Accelerator guides startups toward responsible AI use by embedding governance frameworks early, ensuring that transparency, fairness, and accountability remain intact as AI startups scale.
One startup she worked with implemented strict internal policies to ensure biometric data was never collected or stored in its retail applications — a move that went beyond regulatory requirements and reflected a long-term commitment to ethical AI. “Compliance is only part of the equation; establishing trust from the outset is just as crucial,” she says.
Today, even as startups grapple with AI ethics, Chng believes AI accelerators must evolve, too.
“AI startups need more than funding. They require access to computing power, industry collaboration, regulatory guidance, and global market access.” For her, the focus should be on long-term ecosystem building rather than short-term sprints.
She notes that AI startups in Asia are particularly agile, quickly localising their solutions for different markets. By contrast, Western counterparts often excel in deep-tech research but struggle with localisation.
On a personal level, Chng hopes “to redefine how innovation is nurtured — connecting startups, corporates, and policymakers to create sustainable ecosystems where transformative ideas are built, scaled and able to make an impact” — and, in the process, to leave her mark.
She sidesteps trend-chasing. “Continuous learning and growth, both personally and professionally, guide me,” she says. “Staying curious, adapting, and striving for meaningful impact ensures that innovation and progress remain at the core of what I do.”