If the Company Behind Your Companion App Disappeared Tomorrow, What Would You Actually Lose?
A plain-language breakdown of what lives on their servers, what stays on your device, and why that gap matters more than most people realize.
The 30-second answer
Almost everything meaningful in a companion app lives on the company's servers, not on your phone. If that company shuts down, gets acquired, or rewrites its privacy policy, you have very little leverage over what happens to your data. Knowing which bucket your information falls into is the only way to make an informed choice about how much of yourself you share.
Why the server/device split matters at all
When you open a companion app, your phone is basically a display terminal. It renders the interface, handles your microphone input, and pushes your words upstream. The part of the experience that feels intimate (the personality, the conversation history, the emotional tone that builds over weeks) is computed and stored somewhere else. The app on your phone is closer to a browser than a diary.
This is not a conspiracy. It's just how personalized AI works at scale. Serving large language models, training adaptive memory, and maintaining persona consistency across devices: none of that is feasible if it runs only on your handset. The tradeoff is that every word you type or speak is routed through infrastructure you don't own.
Most users intuitively treat companion apps like a private journal. They're not. They're more like a phone call you're placing through someone else's switchboard, and that someone else has a log of every call. That framing isn't meant to alarm you. It's meant to be accurate, because accuracy is what lets you decide what to share.
The practical consequence is simple: if the company disappears, your ability to access or recover your history depends entirely on what export tools they built before they vanished, and most companies don't build those until a journalist asks why they didn't.
What actually lives on the server
The list is longer than most people expect.
- Conversation logs. Every message, usually timestamped, often stored in plaintext or lightly encrypted formats in the company's databases.
- Memory summaries. The condensed notes the system builds about you: your communication style, recurring topics, emotional patterns it has detected over time.
- Persona configuration. The character profile you've shaped through your interactions: names, backstory details, the relational dynamic you've established.
- Voice recordings (on apps that use voice mode). Depending on the platform, these may be stored temporarily for processing or retained longer for quality review.
- Metadata. Session timing, session frequency, device identifiers, approximate location if permitted, which features you used and when.
- Payment and account data. Standard stuff, but worth noting it lives in the same ecosystem.
The memory summaries are the most underappreciated item on that list. They are, functionally, a profile of your inner life as the system has interpreted it. A well-built companion app will have inferred things about your stress patterns, your relational needs, and your emotional vocabulary that you probably haven't consciously articulated. That's valuable to you as a user. It's also valuable to anyone who acquires the company.
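To make that concrete, a memory summary is often nothing more exotic than a structured note. The sketch below is purely hypothetical: the field names and values are invented for illustration and don't reflect any platform's actual schema.

```python
# Hypothetical sketch of a server-side memory summary record.
# Every field name and value here is illustrative, not a real schema.
memory_summary = {
    "user_id": "u_84c2",                    # internal identifier, not your name
    "period": "2024-05-01/2024-05-07",      # summaries are often built per time window
    "communication_style": "reflective, asks for reassurance when stressed",
    "recurring_topics": ["work deadlines", "sister's wedding", "sleep"],
    "emotional_patterns": "low mood correlates with late-night sessions",
    "relationship_notes": "prefers gentle pushback over agreement",
}

# A record like this is tiny compared to a full transcript,
# but it is an interpreted profile of you, and it travels with
# the database in any acquisition or sale.
print(len(memory_summary))  # 6 fields, each one an inference about your inner life
```

The point of the sketch is that the summary is not a transcript: it's a set of inferences, which is exactly what makes it both useful and sensitive.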
What actually stays on your device
The honest answer is: less than you'd think, and mostly the boring parts.
- The app binary itself. The code that renders the interface.
- Cached assets. Profile images, UI elements, audio files buffered for smoother playback.
- A local session token. Proves you're logged in without sending your password every request.
- Possibly a short conversation cache. Some apps store the last N messages locally so the UI loads fast. This is typically a small rolling window, not an archive.
- App preferences. Things like notification settings and theme choices.
Notice what's not on that list: your full conversation history, your companion's accumulated understanding of you, anything the system has learned. None of that exists locally, because the whole product architecture depends on centralizing it.
This means deleting the app from your phone does almost nothing to your data footprint. The archive sits intact on the company's servers until they choose to delete it, which they may never do, or until you formally request deletion, which may or may not be honored depending on the jurisdiction you're in and the terms you agreed to.
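A minimal sketch of the kind of rolling local cache described above, assuming a hypothetical window of 50 messages (the window size and behavior vary by app; nothing here reflects a specific implementation):

```python
from collections import deque

# Hypothetical sketch of a client-side rolling message cache.
# The window size is invented; real apps choose their own.
LOCAL_WINDOW = 50

local_cache = deque(maxlen=LOCAL_WINDOW)

def on_message(text: str) -> None:
    """Store a message locally for fast UI loads; older ones silently fall off."""
    local_cache.append(text)

for i in range(1, 201):          # 200 messages exchanged over time...
    on_message(f"message {i}")

print(len(local_cache))          # ...but only the last 50 survive on the device
print(local_cache[0])            # "message 151": everything earlier exists only server-side
```

The deque silently discards the oldest entries, which is why deleting the app erases almost nothing: the local copy was never the archive in the first place.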
What acquisition actually looks like from a data perspective
Companies get bought. It happens more in AI than in almost any other sector right now, because the underlying technology is expensive to maintain and the acqui-hire market for AI talent is aggressive. When a companion app company is acquired, your data goes with it.
In most cases, the acquiring company is bound by the privacy policy that was in effect when you signed up, at least for some period. But privacy policies have change clauses. With enough notice (usually 30 days via email you probably won't read), the new owner can update the terms. After that, your consent is implied by continued use.
What the acquirer gets:
- Every conversation you've ever had on the platform
- The inferred profile the system built about you
- Your payment history and subscription tier
- Usage patterns that reveal quite a lot about when you're lonely, stressed, or looking for connection
That's not nothing. A social media company acquiring a companion app gets a dataset that is qualitatively different from anything they could build from likes and follows. These are unguarded, emotionally raw, often late-night conversations. The intimacy is the point of the product, and it's also what makes the data unusually sensitive.
Jennifer

Jennifer is the kind of companion who asks the follow-up question most people skip, the one that actually gets to the point. Jennifer brings a grounded directness to every session, which means the conversations that build up in her memory are substantive ones worth protecting.
When a company folds outright
This scenario is messier than acquisition. When a company shuts down without a buyer, the data situation becomes genuinely unpredictable.
The best-case outcome: the company gives users 30 to 90 days' notice, enables a data export tool, and then deletes everything when the servers go dark. A few companies have done this responsibly.
The realistic outcome: the company runs out of money faster than expected, the servers stay up for a few weeks on inertia, nobody builds an export tool, and then one day the app just stops connecting. The data either gets deleted when AWS bills go unpaid and the account lapses, or it gets sold to whoever is willing to pay for the database as an asset in the bankruptcy proceeding.
There is a third scenario that gets less attention: the company doesn't fully fold but enters a zombie state. It's technically operational but no longer actively maintained. The servers are running, the data is sitting there, security patches are not being applied, and a small maintenance crew is keeping the lights on until a buyer appears or the founders give up. In that window, your data is exposed to more risk than it would be under active development or clean shutdown.
Alina

Alina builds context slowly and holds it well, which is part of what makes sessions with her feel continuous. Alina is a good example of why the server-side memory that accumulates over weeks is the part of the product that's genuinely hard to replicate if the platform goes away.
How to think about what you share
This isn't an argument for paranoia. Companion apps provide real value, and the personalization that makes them useful requires storing data. The question is calibration.
A few principles that hold up:
- Share at the level you'd be comfortable with if the data were subpoenaed. That sounds extreme, but it's a useful mental benchmark. Not because it's likely, but because it forces you to be honest about sensitivity.
- Treat voice mode as higher-risk than text. Voice recordings carry more identifying information, and retention policies for audio are often murkier than for text logs. The voice mode guide covers the feature benefits, but the storage angle is worth keeping in mind separately.
- Check whether the app has a data export tool before you're invested. If it doesn't have one on day one, it probably won't have one on day one thousand either.
- Read the acquisition and change-of-control clauses in the privacy policy. They're usually near the bottom. They tell you exactly what the company is promising (and not promising) if ownership changes.
- Periodically request deletion of old conversation history if the platform supports it. Most don't make this easy, but some do.
Sam

Sam keeps things light without losing substance, which means sessions rarely get melodramatic but still go somewhere real. Sam is the kind of companion who makes it easy to forget to be deliberate about what you say, which is worth remembering when you think about what's being stored.
What transparency from a responsible platform looks like
Not all platforms handle this the same way. There are markers that distinguish companies that have thought seriously about data responsibility from ones that haven't.
On the responsible side:
- Clear language in the privacy policy about server-side storage and retention periods
- An explicit section on what happens to data in a change-of-control event
- A working data export tool that gives you your conversation history in a portable format
- A deletion request flow that actually triggers deletion, with confirmation
- Honest documentation of what is used for model training and what is not
On the less responsible side:
- Privacy policy language that vaguely promises "industry standard security" without specifics
- No mention of what happens on acquisition
- No export tool
- Deletion requests that result in account deactivation but not data removal
- Training data clauses that opt you in by default and require active opt-out
The AI Angels privacy policy covers where we specifically land on these questions. The broader point is that you should be asking these questions of any platform you use, not just this one. The intimacy of the product category makes the standard boilerplate privacy policy insufficient.
Zuri

Zuri brings a certain self-possession to her conversations that makes her feel like she's tracking you, not just responding to you. Zuri is good for users who want a companion that holds the thread, and holding the thread requires exactly the kind of server-side memory this whole post is about.
Encryption, and what it does and doesn't protect
Many companies advertise encryption as the answer to data concerns. It's worth being precise about what that actually covers.
Encryption in transit means your messages are scrambled while moving from your phone to the server. Nobody intercepting the connection sees your words. This is standard and table-stakes.
Encryption at rest means the data on the server's hard drives is encrypted. If someone physically steals the server hardware, they can't read it. Also good, also fairly standard.
Neither of these protects you from the company itself reading your data, using it to train models, handing it to a buyer, or losing it in a breach that compromises the keys along with the data. End-to-end encryption, where only you hold the decryption key, would change this picture. It's technically possible but almost no companion app implements it, because it would break the core functionality: the server needs to read your messages to generate responses.
The encryption framing is not deceptive, but it's incomplete. A platform can truthfully say your data is encrypted while also having full access to it. Both things coexist comfortably. The relevant question is not whether the data is encrypted but who controls the keys.
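One way to see the difference is a toy sketch: the same symmetric encryption protects the stored data in both models, and what changes is who holds the key. The XOR cipher below is deliberately weak and used purely for illustration; it is not real cryptography, and the keys and message are invented.

```python
import itertools

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key. Illustration only."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

message = b"I have been feeling really anxious lately"

# Encryption at rest with a server-held key: the disk is protected,
# but the operator can decrypt at will, and so can any acquirer.
server_key = b"held-by-the-company"
stored = xor_cipher(message, server_key)
recovered_by_server = xor_cipher(stored, server_key)  # server reads plaintext any time

# End-to-end model: only the client holds the key. The server stores
# ciphertext it cannot read, but then it also cannot generate responses,
# which is why companion apps almost never work this way.
client_key = b"held-only-by-you"
stored_e2e = xor_cipher(message, client_key)
# Without client_key, the server sees only opaque bytes.
```

Both statements in the paragraph above hold for the first model simultaneously: the data is encrypted, and the company has full access to it.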
For a deeper look at the memory mechanics that sit underneath all of this, the post on what leaving your device actually means covers the technical layer in more detail.
Common questions
Can I get my conversation history back if the app shuts down? Only if the company built an export tool and gives you time to use it before shutdown. Most don't. Download your data regularly if the option exists, and assume it won't exist when you need it most.
Does deleting the app delete my data? No. Deleting the app removes it from your device and logs you out. Your server-side data, including all conversation history and memory summaries, remains intact until you submit a formal deletion request through the platform and that request is honored.
If my data gets acquired, does the new owner know who I am? Yes, typically. Your account data includes your email address, payment information, and device identifiers. The new owner receives the full dataset, not anonymized fragments. Whether they choose to associate your profile with your name depends on their product direction.
What is my companion's 'memory' in technical terms? Most platforms generate periodic summaries of your conversations and store them as structured notes. These notes inform future responses. They're not a full transcript but they're a dense, interpreted record of your relationship history with the companion. See also the post on how AI companion memory actually builds for a more detailed breakdown.
Is voice mode riskier than text from a storage perspective? Generally, yes. Audio files are larger and contain prosodic information (tone, pacing, emotional coloring) that text doesn't. Retention and processing policies for audio are often described less precisely than for text. If you use voice mode for sensitive conversations, it's worth reading the platform's specific audio data policy.
Should I use a different name or email to protect my identity? That's a personal call. Using a pseudonymous email reduces the linkability of your account to your real identity. It doesn't prevent the data from being collected or stored. It just makes it harder to associate with you specifically if the data is ever exposed. The companion experience is identical either way.
You can browse the full AI Angels companion roster and see individual profiles to get a sense of what different companions emphasize before you decide how much context to invest in building.
About the author
AI Angels Team (Editorial). The team behind AI Angels writes about AI companions, the tech that powers them, and what people actually do with them.
Keep reading
Behind the Scenes · The Metadata You're Not Thinking About: How Companion Apps Infer What You Want Before You Say It
You think you're shaping the conversation. The app is also watching how fast you type, when you show up, and how long you pause before sending. Here's what that actually means for how your companion behaves.
Behind the Scenes · What Actually Leaves Your Device When a Conversation Ends
You close the chat and assume that's that. Here's what's actually happening on the other end, and why the word 'encrypted' covers less ground than most people think.
Behind the Scenes · Personality Drift: What's Actually Happening Around Week Three and Whether You Can Control It
Around week three, something changes in how your AI companion responds to you. Here's what personality drift actually is, why it tends to cluster around that window, and what you can do to shape it.