Lumis and Privacy
When we started Lumis, we knew we were stepping into one of the most personal and vulnerable parts of digital life: your conversations.
The messages sitting inside Telegram, Slack, Gmail, and countless other apps aren’t just lines of text. They carry your late-night thoughts, your work struggles, your friendships, even the tiny moments of everyday life that you’d never want to lose, or have exposed. In many ways, they are the truest reflection of you.
So when we say Lumis can help you manage this ocean of conversations, the very first question that naturally arises is: Can I trust you with this?
The honest answer is: you shouldn’t have to.
No one should ever need blind faith to use a product that touches their private messages. If Lumis is going to live alongside your conversations, it has to make crystal clear what happens to your data, where it goes, how it’s handled, and, just as importantly, where it doesn’t.
Why privacy is the hardest problem for Lumis
Most AI tools today face a trust gap. The moment people hear “AI” and “messages” in the same sentence, the fears come flooding in: Will all my chats get uploaded to some mysterious cloud? Are my words being used to “train” a model I never agreed to? Could some stranger, human or machine, peek into the things I only meant for close friends or colleagues?
And those fears are valid. Messaging data is the most intimate digital footprint we leave behind. Unlike a search query or a browsing history, chats capture the raw texture of our lives: our work deadlines, our unfiltered humor, our doubts, our families, our relationships.
That’s why, for Lumis, privacy isn’t just another problem to solve. It’s the foundation the whole product has to stand on.
Our approach: Privacy by design
From day one, we’ve made deliberate, sometimes harder, choices so that Lumis doesn’t just talk about privacy; it enforces it.
Your raw data stays on your device. Full chat histories never leave your phone or computer. Sensitive parsing and summarization are done locally, by models running right next to you.
A hybrid model strategy. Smaller, private models take care of confidential tasks on-device. Larger, more powerful models in the cloud are only used when they clearly add value, and never on raw, sensitive logs.
Minimal data, always. Instead of uploading a full thread, Lumis only extracts what’s necessary. For example: not the entire back-and-forth with Alex, but a single reminder, “Reply to Alex about the budget by Friday.”
Your control comes first. Pause processing anytime. Delete local memory anytime. Export your usage history anytime. No hidden pipelines. No secret training. Nothing happening in the shadows.
Building trust into the mission
For us, privacy isn’t a checkbox in a compliance document. It’s part of the soul of Lumis.
We want Lumis to grow with you, eventually becoming a kind of digital twin, capturing your style, your habits, your priorities. But a digital twin only works if it respects your boundaries more faithfully than even a human assistant would.
Trust isn’t earned through polished marketing claims. It’s earned through clear architecture, full transparency, and giving you absolute agency over your data.
Looking ahead
As Lumis evolves toward long-term memory, agent-to-agent communication, and contextual task execution with Atlas, privacy will always be the lens through which we make decisions.
Because we don’t believe convenience should ever cost you your private life. The only future worth building is one where you never have to make that trade.
That’s the commitment at the heart of Lumis and Meland Labs.
- Ethan, Founder, Meland Labs