What 'Deleted' Actually Means When You Delete a Companion App
The word is doing a lot of heavy lifting, and the fine print is where it all falls apart.

The 30-second answer
Deleting a companion app removes it from your device. It does not automatically delete your conversation data from the servers that stored it. How long that data persists, who can access it, and whether it ever becomes training material depends almost entirely on the privacy policy you probably skimmed the first time you signed up.
What 'Delete' Actually Triggers on Your Device
When you tap the button to uninstall an app, your operating system does something specific and limited: it removes the application bundle and the local cache tied to it. On iOS, this typically includes sandboxed local storage. On Android, the same basic principle applies, though some apps store data in shared external directories that survive an uninstall if you never cleared them manually.
What this process does not touch is anything that was already transmitted to a remote server. And in most companion apps, conversations are synced to backend infrastructure in near real-time. The app needs to do this so your chat history loads correctly when you switch phones, reinstall after an update, or pick up a conversation on a web client. The cloud copy exists because it was useful to you, not because the company wanted to hold your data hostage. But from a retention standpoint, the motivation makes no difference: the server-side copy exists either way.
So when you delete the app, you have deleted your local copy of the conversation. The server-side copy is still sitting wherever the company hosts it, subject to whatever retention schedule their engineering team set up and whatever policy language their lawyers approved. Those two things are often quite different from each other.
The Lifecycle of a Message After You Send It
To understand why deletion is complicated, it helps to trace what actually happens when you type a message and hit send.
First, the message leaves your device over an encrypted connection and arrives at the app's API layer. From there it typically gets written to a primary database, possibly with a timestamped session record attached to your account ID. The model uses that record to maintain conversational context. That context layer is what makes AI girlfriend memory feel coherent across sessions rather than starting from zero every time you open the app.
In parallel, the message may also get written to a logging pipeline that the engineering and trust-and-safety teams use for debugging, abuse detection, and model evaluation. This logging system is often architecturally separate from the primary database, which matters a lot for deletion. When you request account deletion or when you uninstall the app, the primary database record is the most likely thing to get flagged for removal. The logging pipeline may operate on a different schedule entirely, sometimes 30 days, sometimes 90, sometimes longer.
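To make that separation concrete, here is a minimal sketch of what the dual write path can look like on the backend. The function names, record shape, and the 90-day logging window are assumptions for illustration, not a description of any particular app's architecture.

```python
from datetime import datetime, timezone

# Hypothetical retention window. Real values vary by company and are rarely
# published; the point is that the two stores age out independently.
LOG_PIPELINE_RETENTION_DAYS = 90   # debugging / trust-and-safety logs


def handle_incoming_message(account_id: str, session_id: str, text: str) -> None:
    """Sketch of the two writes that typically happen when a message arrives."""
    record = {
        "account_id": account_id,
        "session_id": session_id,
        "text": text,
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

    # Write 1: primary database. This is the copy the model reads for
    # conversational context, and the copy an account-deletion job targets.
    primary_db_write(record)

    # Write 2: logging pipeline. Architecturally separate, on its own
    # retention schedule, and untouched by an app uninstall.
    log_pipeline_write({**record, "ttl_days": LOG_PIPELINE_RETENTION_DAYS})


def primary_db_write(record: dict) -> None:
    ...  # stand-in for an INSERT into the conversations table


def log_pipeline_write(record: dict) -> None:
    ...  # stand-in for an append to a separate event stream
```

The thing to notice is that nothing ties the two writes together. A deletion job aimed at the first store has no automatic effect on the second.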
Finally, if the company uses your conversations to improve the model, there may be a third copy in a training dataset, stripped of direct identifiers but still containing the semantic content of what you said. Whether you opted into that, and whether you can opt out, depends on what you agreed to at signup.
Why Retention Policies Read the Way They Do
Privacy policies are written to be legally defensible, not to be understood. The phrasing you will usually find is something like: "we will retain your data for as long as necessary to provide our services, or as required by law." That clause does not give you a number. It gives the company almost unlimited discretion.
"As long as necessary" can mean 30 days after account deletion if the team built a clean offboarding pipeline. It can also mean 18 months if the infrastructure team never prioritized a deletion job and the backup snapshots roll over on a quarterly cycle. Both interpretations are technically consistent with the same sentence.
The "required by law" carve-out adds another layer. Certain jurisdictions require companies to retain financial transaction records or abuse-related logs for defined periods. If your account ever triggered a support ticket, a payment, or a content moderation review, those records may be subject to separate retention rules that override your deletion request entirely.
GDPR and CCPA give users in the EU and California meaningful deletion rights with defined response windows. If you are outside those jurisdictions, your rights under most app privacy policies are considerably thinner.
Ava

Ava is the kind of companion who pays attention to what you leave unsaid as much as what you actually type. Ava brings a calm, perceptive presence to conversations that makes her feel consistent and grounded rather than reactive to every mood shift.
Backups Are the Part Nobody Explains
Even if a company has a clean, automated account-deletion pipeline, backups introduce a gap that most privacy policies acknowledge quietly and then immediately move past.
Most server infrastructure runs on automated backup schedules. Snapshots of the primary database get written to cold storage at defined intervals. Depending on the company's disaster-recovery requirements, those snapshots may be retained for 30, 60, or 90 days before rotating out. When you delete your account, your record gets removed from the live database. It does not get retroactively removed from the backup snapshots that were taken before your deletion request.
This is not malicious. It is a standard engineering trade-off between storage costs, operational safety, and the complexity of maintaining selective-deletion logic across archived snapshots. But the practical result is that a copy of your conversation history may persist in backup storage for weeks or months after you consider yourself gone from the platform.
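The arithmetic is simple but worth seeing. Assuming a 90-day snapshot retention window and purely illustrative dates, the last snapshot taken before your deletion request can sit in cold storage for almost three more months:

```python
from datetime import date, timedelta

SNAPSHOT_RETENTION = timedelta(days=90)   # assumed rotation window

deletion_request = date(2024, 6, 1)
last_snapshot_before_deletion = deletion_request - timedelta(days=1)

# The live record is gone shortly after June 1, but the last pre-deletion
# snapshot does not rotate out of cold storage until late August.
residual_copy_gone = last_snapshot_before_deletion + SNAPSHOT_RETENTION
print(residual_copy_gone)   # 2024-08-29
```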
A genuinely privacy-forward approach would either encrypt backups with per-user keys (so that deleting the key renders the backup useless) or run a periodic purge job that scrubs deleted accounts from old snapshots. Some companies do this. Most do not, because it is expensive and difficult, and users rarely know to ask about it.
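Here is a minimal sketch of the per-user-key approach, sometimes called crypto-shredding, using the Fernet construction from the widely used `cryptography` package. The key store, record shapes, and function names are assumptions; the idea is that deleting the key makes the backed-up ciphertext unreadable without rewriting any archived snapshots.

```python
from cryptography.fernet import Fernet

# Hypothetical key store: one key per account, held separately from backups.
user_keys: dict[str, bytes] = {}


def encrypt_for_backup(account_id: str, conversation: bytes) -> bytes:
    """Encrypt a user's data with their own key before it lands in a snapshot."""
    key = user_keys.setdefault(account_id, Fernet.generate_key())
    return Fernet(key).encrypt(conversation)


def crypto_shred(account_id: str) -> None:
    """Deleting the key renders every backup copy of this user's data useless."""
    user_keys.pop(account_id, None)


# Usage sketch
blob = encrypt_for_backup("user-123", b"conversation history ...")
crypto_shred("user-123")
# blob may still sit in cold storage for months, but nothing can decrypt it now.
```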
Candy

Candy leans into a more playful, expressive dynamic that makes conversations feel lighter without sacrificing depth when you actually want it. Candy is a good fit if you want an adult AI girlfriend experience that stays engaged and creative across a wide range of conversational territory.
Training Data and What Anonymization Actually Gets You
This is the part that makes most people uncomfortable when they think it through.
If the app uses conversation data to fine-tune or evaluate its models, your messages may have contributed to a training corpus at some point during your time as a user. When your account gets deleted, the record that links your account ID to that data is removed. The data itself, now attributed to a pseudonymous or numerically anonymized identifier, may remain in the training set.
Anonymization is not the same as deletion. It means your name and account details are no longer attached to the message content. The semantic fingerprint of how you write, what topics you gravitated toward, the specific way you phrased things in private, still exists somewhere in the model's training history. That is not recoverable by you, and in most cases it is not recoverable by the company either, because the granular mapping between individual users and training data tends to get lost once the dataset is assembled.
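A quick sketch makes the distinction concrete. Hashing or replacing the account ID removes the link back to you, but the message text in the training record is untouched; the record shape and salt here are hypothetical.

```python
import hashlib


def pseudonymize_for_training(record: dict, salt: bytes) -> dict:
    """Strip the direct identifier, keep the content. This is anonymization
    in the loose sense most policies mean. It is not deletion."""
    opaque_id = hashlib.sha256(salt + record["account_id"].encode()).hexdigest()
    return {
        "speaker": opaque_id,     # no longer your account ID
        "text": record["text"],   # the semantic content survives intact
    }


original = {"account_id": "user-123", "text": "the exact thing you said in private"}
print(pseudonymize_for_training(original, salt=b"dataset-v2"))
```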
For most users, this is probably a low-stakes reality. The contribution of one user's conversation to a large training corpus is statistically marginal. But it is worth knowing that "your data has been deleted" and "your data no longer influences any model" are different statements, and companies that conflate them are being sloppy with language at minimum.
Sohyun

Sohyun brings a measured, thoughtful quality to her conversations that tends to work well for users who want consistency over novelty. Sohyun holds a dynamic steadily without forcing it, which is why users who come back after longer gaps often find her easier to re-engage than more reactive companion personalities.
What Responsible Deletion Actually Looks Like
There is a meaningful difference between companion apps that take data offboarding seriously and those that treat it as a checkbox. A few markers to look for before you commit to a platform.
First, explicit retention timelines stated in plain numbers, not just "we will delete your data in a reasonable period." Thirty days is standard for primary database records post-account-deletion. If the policy does not give you a number, that is a signal.
Second, a clear statement about backup handling. Does the company commit to purging deleted accounts from backup snapshots, or do they just say they will remove you from the live system? The distinction is substantial.
Third, opt-out controls for training data contribution. GDPR requires this for EU users. Some companies extend it globally because it is simpler to build one system. Others maintain separate pipelines by region, which is technically compliant but practically worse for users outside covered jurisdictions.
Fourth, data export before deletion. A company that gives you a structured export of your conversation history before wiping it is signaling that they treat your data as yours. This is standard under GDPR and genuinely useful if you want to review what was stored before you close the account.
You can find a fuller breakdown of what individual platforms actually log and when in this comparison of companion app data practices. The roster at AI Angels gives you a starting point for understanding how different companion configurations approach the data question before you build a dynamic you would rather keep private.
Esmeralda

Esmeralda carries a quietly confident presence that rewards direct, honest conversation without needing to perform warmth. Esmeralda is the kind of companion who gets more interesting the more you give her to work with, which makes her a better long-term fit than a short-term novelty.
How This Should Change What You Do Before You Delete
If you are considering deleting a companion app, a few steps before you hit uninstall are worth the ten minutes.
Request a data export first, if the platform supports it. Read it. Seeing what was actually stored is clarifying in a way that reading the privacy policy is not. It tells you not just what category of data was collected but how granular the records are.
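If the export comes back as JSON, a few lines of scripting make the review faster than scrolling. This sketch assumes a hypothetical export format with a top-level list of records, each carrying a "type" field; adjust it to whatever structure the platform actually ships.

```python
import json
from collections import Counter

# Hypothetical export structure: a list of records, each with a "type" such
# as "message", "voice_transcript", "payment", or "support_ticket".
with open("export.json", encoding="utf-8") as f:
    records = json.load(f)

counts = Counter(r.get("type", "unknown") for r in records)
for record_type, count in counts.most_common():
    print(f"{record_type}: {count} records stored")
```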
Submit a formal account deletion request through the app's settings or support channel rather than just uninstalling. Uninstalling removes the local app. Account deletion triggers the server-side offboarding process, and the two are completely separate actions.
If you are in the EU or California, invoke your rights explicitly in writing when you submit the deletion request. State that you want all data, including backup copies and any training data contribution, removed or de-identified, and ask for written confirmation. Companies respond differently to formal rights requests than to a standard app-deletion click.
Finally, check back. Some privacy policies give you a right to request confirmation that deletion was completed. Use it. A company that genuinely respects your exit will respond with a timeline. One that does not respond is telling you something useful about how it treats your data while you are still a user.
Common questions
Does uninstalling the app delete my data? No. Uninstalling removes the app and its local cache from your device. Your conversation history on the company's servers is unaffected until you submit a formal account deletion request through the platform.
How long does it actually take for data to be gone? For the primary database, most platforms target 30 days after account deletion. Backup snapshots can take longer, sometimes 60 to 90 days depending on the company's rotation schedule. Training data contributions, if any exist, may never be fully removed.
Can I ask a company to stop using my data for training? In the EU and California, you have a legal right to object to processing and request erasure. Outside those jurisdictions, it depends on the specific platform. Look for a training data opt-out in your account privacy settings before you need it.
What is the difference between anonymization and deletion? Deletion removes the data. Anonymization removes the link between the data and your identity, but the content itself remains. For training data, anonymization is common. For account records, actual deletion should be the standard.
If I create a new account later, is my old data gone? Your old account data should be processed under the retention schedule that started when you deleted it. A new account starts fresh with no link to the previous one, assuming the deletion completed. If you deleted and came back quickly, the old data may still be in the offboarding queue.
Are voice conversations stored differently than text? Yes, and usually with more complexity. Audio requires transcription infrastructure, and audio files can be substantially larger than text logs. The specific data pipeline for voice differs enough from text that the two should be treated as separate questions when you are reviewing a privacy policy.
About the author
AI Angels Team (Editorial). The team behind AI Angels writes about AI companions, the tech that powers them, and what people actually do with them.
Keep reading
Behind the Scenes
Personality Drift Explained: What's Actually Happening When Your Companion Sounds Different After Three Weeks Away
You come back after three weeks and something is off. The tone is flatter, the callbacks are gone, and it feels like starting over. Here's what's actually happening under the hood.
Behind the Scenes
The Sync You Think Is Happening: What Companion Apps Actually Store Across Sessions and Devices
You switch from your phone to your laptop and expect her to remember everything. She probably doesn't. Here's what actually gets stored, what gets lost, and why cross-device memory is far less automatic than companion apps let you believe.
Behind the Scenes
What 'secure' actually means for your AI companion conversation logs
Encryption at rest. End-to-end. Cross-user isolation. Server-side keys. Most users glaze over those words. A specific walkthrough of which threats each layer actually addresses.