How Idle is embedding AI into Singapore’s favourite care services without letting algorithms and automation take over
CEO Kritika Seth built Idle to end the chaos of spreadsheets, WhatsApp messages, and guesswork, giving smaller operators structure without stripping away nuance.
By Zat Astha
When Kritika Seth and her co-founder, Alisha Wadkar, tried to book basic grooming services at university, the inconvenience felt small. They could not access on-campus beauty appointments, so they did what many young founders do: they built a solution. The first idea was an at-home beauty marketplace.
But conversations with salon owners across Singapore quickly reframed the problem. Demand was there. Skill was there. What was missing was operational capacity. For many owners, expansion simply felt impossible.
What appeared to be an access issue was, in fact, a systems issue. Behind the calm façade of neighbourhood salons, barbershops and spas, Kritika found businesses stitched together by WhatsApp threads, notebooks, fragmented software, and Excel sheets that never reconciled. None of the tools spoke to one another; the owner became the integration layer.
Idle emerged from months spent inside these stores, observing what actually broke. It is not merely a booking app but operational infrastructure built for service businesses — salons, wellness studios, barbershops — where care and trust define the experience. Idle brings bookings, staffing, payments, memberships, and inventory into one system, reducing the administrative fragmentation that quietly drains operators.
“The work required a human touch,” Kritika says. “But the systems supporting it were anything but human-aware.”
Override over automation
Idle places AI inside environments built on trust, which demands restraint. “The hardest boundary is knowing where certainty ends,” Kritika explains. In care settings, truth is negotiated in real time — a stylist runs late because a client needs reassurance; a therapist extends a session. Loyalty often lives in that flexibility.
AI, however, defaults to throughput — closing gaps and standardising interactions. In service businesses, the pause can matter more than the optimisation. Efficiency assumes goals are shared; in care contexts, they rarely are.
So Idle prioritises override over pure efficiency. The system acts confidently when conditions are stable and steps back when nuance or emotion enters. Staff and owners retain authority. Over-automation, Kritika learned, creates friction: confused customers, staff who feel managed by algorithms, and businesses stuck doing damage control. Guardrails require discipline but deliver durable systems.
Unlike louder AI products, Idle operates quietly. “A salon staff member doesn’t want cognitive overload,” she says. The AI handles confirmations, reminders and workflows behind the scenes, avoiding performative dashboards or authoritative prescriptions. Recommendations remain legible and overridable.
This restraint slows releases and demands more testing. Yet in trust-heavy environments, losing visible control erodes adoption faster than missing functionality does. Kritika prefers systems that businesses have relied on for years over products that dazzle briefly.
The most common misunderstanding, she adds, is that Idle is “just an appointment calendar”. Bookings are only the visible surface. Beneath each slot sit staffing logic, pricing rules, memberships, inventory and client history. Businesses operate in rhythms, not static templates — and one adjustment can ripple through the day.
Structure without dehumanisation
While most software treats businesses as static templates, Kritika sees them as living systems — an insight that led to Tetra. Where Idle serves beauty and wellness, Tetra extends the same philosophy to gyms, cafes, design studios, tuition centres and other service operators.
Rather than forcing rigid templates, it models how a business actually runs — what it sells, how customers move through it, how scheduling links to payments and inventory — and adapts as it grows.
During early testing, however, the limits of modelling became painfully clear. The system flagged “a performance dip pattern for certain staff members because they didn’t contribute enough to monthly revenue”. From a distance, she admits, it “looks like a simple decision for a business owner to make to trim costs”.
But something felt off. “This was our model taking a narrow proxy metric (revenue) and quietly upgrading it into a verdict about a person,” she says, “encouraging a destructive dynamic that we didn’t want our software to enable.” The system “wasn’t technically wrong,” she concedes, “but the objective function was just incomplete.” A staff member, after all, “isn’t a metric, they’re a person operating within context”.
That moment sharpened her principles. “In care-adjacent businesses, automation must behave less like persuasion and more like service. It should help the owner show up reliably and not pressure people through opaque scoring.” That’s when Idle tightened its safeguards: triggers that prompt action became more conservative, recommendations required clearer explanations, and human oversight remained central, especially when outputs influence pay, performance or reputation.
Kritika draws a sharp line between assistance and substitution. AI can replace repetitive tasks — rescheduling, reconciling, and forecasting. It should not substitute responsibility. Substitution occurs when systems make social or moral judgments: which client to prioritise, when to escalate a conflict, or when to protect a staff member’s time over a customer’s demand.
Early on, Kritika thought the boundary was primarily technical — avoid unsafe actions. Over time, she recognised its psychological dimension. Over-automation erodes agency. A technically correct system can still make a business brittle if staff feel dispossessed. “The aspiration isn’t an AI-run business,” she says. “It’s a human-run business with a nervous system.”