EILI — Emotional Intelligence Leadership Institute


How your conversations are used

Last updated: April 2026

Ellie is an AI coach built by the Emotional Intelligence Leadership Institute (EILI). She is here for the working moments that land hard — difficult feedback, a frustrating meeting, a decision you're carrying. Your conversations are sensitive, and this page is the plain-language version of how we treat them.

What stays private to you

Your full conversation history, the memory Ellie keeps about your patterns, and the reflections she leaves you are visible only to you when you sign in. No employer, manager, or third party can see them. EILI staff do not read your chats as part of normal operations.

You can delete anything Ellie remembers about you at any time from the Memory page, and you can delete individual conversations from the Reflections page. Deleting your account removes your memory, conversations, and reflections.

What we learn across the whole community — anonymized

EILI is a learning institute. To keep making Ellie more useful, and to decide what educational content to produce each week, we extract a small, anonymized signal from each conversation that has enough substance for a theme to emerge. That signal contains four fields, none of which identify you.

These rows are written to a table that has no user identifier on it. There is no database link, no hash, and no pepper that would let us trace a row back to you. Once written, the signal is severed from your identity — including from us. This is enforced in code, not just in policy.

Aggregates are shown to EILI staff only when the active cohort has at least 10 learners, and a single theme appears only after it has at least 3 mentions in a week. This is a guard against small-number re-identification.
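The small-number guard above can be pictured as a simple check. This is a hypothetical sketch, not EILI's actual code; the function and field names are ours, and only the two thresholds (10 learners, 3 mentions) come from the policy text:

```typescript
// Hypothetical sketch of the small-number guard described above.
// Thresholds come from the policy; everything else is illustrative.
interface ThemeCount {
  theme: string;
  mentionsThisWeek: number;
}

const MIN_COHORT_SIZE = 10;   // active learners required before any aggregate is shown
const MIN_THEME_MENTIONS = 3; // weekly mentions required before a theme surfaces

function visibleThemes(activeCohortSize: number, counts: ThemeCount[]): string[] {
  if (activeCohortSize < MIN_COHORT_SIZE) return []; // below 10 learners, nothing shows
  return counts
    .filter((c) => c.mentionsThisWeek >= MIN_THEME_MENTIONS)
    .map((c) => c.theme);
}
```

A theme mentioned only twice in a week stays hidden even in a large cohort, and a small cohort shows nothing at all.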

If your organization licenses Ellie for you

Some learners join Ellie through an organization (employer, program, or institute) that has purchased a cohort license. In that case, the anonymized signals described above are also tagged with your organization's identifier (a single tag shared by every learner in the organization, not a personal identifier), so that EILI staff can see cohort-level patterns: what kinds of moments are common across your organization, and which regulation tools tend to land.

Your organization never receives raw data, dashboards, or self-serve insights. EILI uses these patterns inside our consulting practice — our consultants interpret the patterns and translate them into recommendations for your organization's People and Wellbeing leaders. The recommendations are about culture and practice; the underlying data stays with EILI.

The privacy posture is the same as for the platform-wide pool. Your conversations, memory, and reflections remain private to you; your employer cannot see them. Cohort patterns require at least 5 cohort members and at least 15 anonymized signals in the rolling 90-day window before they surface, with each individual pattern requiring 3 underlying observations.
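The cohort-level thresholds above can be sketched the same way. Again a hypothetical illustration: only the numbers (5 members, 15 signals in 90 days, 3 observations per pattern) come from the policy, and the names are ours:

```typescript
// Hypothetical sketch of the cohort-level surfacing thresholds described above.
const MIN_COHORT_MEMBERS = 5;       // distinct learners in the cohort
const MIN_SIGNALS_90_DAYS = 15;     // anonymized signals in the rolling 90-day window
const MIN_PATTERN_OBSERVATIONS = 3; // observations behind any single pattern

// Gate 1: may any cohort patterns surface at all?
function cohortPatternsVisible(members: number, signalsIn90Days: number): boolean {
  return members >= MIN_COHORT_MEMBERS && signalsIn90Days >= MIN_SIGNALS_90_DAYS;
}

// Gate 2: does an individual pattern have enough support to appear?
function patternSurfaces(observations: number): boolean {
  return observations >= MIN_PATTERN_OBSERVATIONS;
}
```

Both gates have to pass: a cohort with 6 members but only 10 signals surfaces nothing, and even in a qualifying cohort a pattern seen twice stays hidden.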

Opting out of anonymized insights

Contributing to anonymized trend data is on by default — it is how a learning institute keeps learning. If you'd prefer your conversations not contribute even in severed form, email us at privacy@eiliemotionalintelligence.com and we will flag your account to skip insight extraction going forward. (Signals already written cannot be retrieved and removed because they carry no link to you.)

Voice features

You can speak to Ellie and have her speak her replies back. Voice is handled by the Web Speech API built into your browser — when you hold the mic, your audio is sent to your browser vendor (for example Google or Apple) for transcription under their privacy terms. Anthropic receives only the final transcript, not the audio itself. Ellie's speaking voice is generated locally on your device. You can turn voice features off entirely on the Memory page.

Safety and crisis support

Ellie is an AI coach, not a therapist or crisis service. If she detects language suggesting you may be in crisis or at risk of harm — to yourself or someone else — she will slow down, acknowledge what you're carrying, and surface crisis resources like your local crisis line or emergency number. She will not notify your employer or anyone else.

The crisis resources Ellie surfaces are selected based on your browser's time zone, so they are relevant to where you are. We don't store your location; the lookup happens locally in your browser.
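A local lookup of this kind can run entirely on-device. The sketch below is hypothetical (the region mapping is ours, not EILI's actual resource table); it only shows that the browser's IANA time zone is available without any network call or stored location:

```typescript
// Hypothetical sketch: derive a coarse region from the browser's IANA time zone,
// entirely on-device, without storing or transmitting location.
function regionOf(timeZone: string): string {
  // IANA zone names look like "Europe/London" or "America/New_York";
  // the segment before the slash is a coarse region ("UTC" has none).
  return timeZone.split("/")[0];
}

// The current zone comes from the standard Intl API (browser or Node):
const zone = Intl.DateTimeFormat().resolvedOptions().timeZone;
const region = regionOf(zone);
```

A real implementation would map the full zone name to a specific crisis line; the point here is only that the lookup never leaves the device.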

AI model providers

Conversations are sent to Anthropic's Claude API to generate Ellie's replies, and to a smaller model for memory and insight extraction. Anthropic does not train its public models on this API traffic. Learn more at anthropic.com/privacy.

Contact

Questions, corrections, or requests are welcome at privacy@eiliemotionalintelligence.com.