COVA.OS

Industry

AI & Technology

Collaborators

Swedish & Norwegian police

Project Team

Rachel Chuman Zhang & Lin Shen

Timeline

6 weeks

Context

Young people today live in rich, fast-moving digital environments where their online identity is deeply tied to friendships, belonging, and self-expression. But the same spaces introduce threats they cannot always understand, such as privacy breaches, coercion, harassment, or account misuse.

This tension creates a silent digital anxiety.

“I know something could go wrong, but I don’t know what to do if it happens.”

Challenge

Existing cybersecurity tools feel technical, authoritative, and overwhelming. And when the situation involves police or formal reporting, most teenagers feel intimidated and avoid seeking help, even when they urgently need it.

So the core problem became:

How might we make “cyber safety” feel approachable, supportive, & emotionally safe, without diminishing agency or identity?

Research Insights

From interviews and workshops, three strong tensions emerged. These insights shaped the tone and structure of the service.

01/
“I don’t want to look weak or like I can’t handle things.”
Need
Teens often equate independence with strength. Admitting fear or confusion feels like losing social credibility, even when they’re overwhelmed.
Implication
Support must feel peer-aligned and affirming, not corrective. The system should position help-seeking as an act of confidence, not weakness.

02/
“I know I should be careful online, but I don’t really know how.”
Need
Digital risk feels abstract until it happens. Teens lack intuitive frameworks to understand warning signs or boundaries.
Implication
Escalation should be gentle and user-controlled, allowing young people to choose when and how to involve others.

03/
“Once adults get involved, it becomes a whole big deal.”
Need
Teens fear losing control or being overprotected once adults step in. This prevents early reporting or open dialogue.
Implication
Create gentle, choice-driven escalation where the user stays in control of what is shared and when. Help should expand at their pace, not ours.

Design Strategy

Security as empowerment, not restriction. To build trust, we designed around three core principles:

Humanize the AI
The tone is warm, conversational, and non-judgmental.

Proactive, not reactive
The system identifies early warning patterns and offers help before harm escalates.

Respect agency
The user remains in control, choosing when and how to escalate to support or authorities.

Solution

Cova.os is a companion system that helps teenagers understand digital risk, reflect on what they’re experiencing, and seek the right help without fear or overwhelm.

01/

Onboarding introduces the system as a peer-aligned guide, not a monitoring tool.

02/

The risk visualisation dashboard translates complex threat signals into clear emotional language.

03/

Conversation-based reporting allows users to express what happened in their own tone and pace.

04/

When necessary, gentle, choice-driven escalation connects them to trusted adults or local authorities, always with emotional support first.

Information interception and alert

Cova detects an inappropriate image and automatically blurs it.

Conversations & Compassion

Teens often hesitate to ask for help online. Through empathetic language and tone, Ebba normalizes help-seeking, gently prompting users to lean on trusted people and grow stronger through connection.

Guidance for Reporting

Platform Report lets them flag online threats directly where they happen, making reporting feel natural and immediate.

Provisional Report allows quick sharing with law enforcement, helping reveal online hotspots and patterns of risk.

Finalizing reports and ensuring follow-up

Cova thanks the user for helping identify digital threat hotspots and strengthen online safety by reporting the incident.

Digital report for the police

Every six months, the police receive a digital report summarizing reported, resolved, and pending online crimes. These updates help them track emerging digital hotspots, stay vigilant, and act quickly when needed.

Feedback from teenagers was clear

“This feels like something I could actually use. I wouldn’t hesitate, it’s not scary, and it doesn’t make it bigger than it is.”

Outcome

While exploratory, the project demonstrated how cybersecurity can shift from fear-based messaging to shared care and self-respect in digital identity.

It helped define a service model where:

• Support feels immediate, not delayed
• Help feels empathetic, not intimidating
• Safety becomes something a young person chooses, not something forced upon them

My Role

This was not just a UI challenge; it required navigating emotion, responsibility, and power dynamics.

• Led UX research and synthesized behavioral insights
• Designed and facilitated co-creation workshops with teenagers
• Created the interaction model and UI system for the solution
• Collaborated with social workers and law enforcement representatives

Reflections

This project taught me how to design in spaces where the stakes are emotional, social, and relational, not just functional. I learned to balance support and agency, especially when involving both young people and the police, where trust and tone are everything.