Looking for a Replika alternative? —
Replika Alternative — Companion Without the Trust Collapse
Replika has been through the ERP removal, the €5M Garante fine, an FTC complaint, and a peer-reviewed study documenting boundary-disregard at scale. InnerForge is a warm emotional companion that runs in the AI you already trust.
Replika homepage (captured 2026-04-26)
What they do well —
What Replika legitimately gets right.
Honest comparison only works when both sides get a fair hearing. Here’s where Replika is genuinely strong — if these match what you’re looking for, they’re probably the right call over us.
Genuinely warm voice that found a real audience
Replika at its best — particularly pre-February 2023 — earned legitimate emotional attachment from users with autism, grief, social anxiety, or chronic isolation. The product solved something real for those users. That's worth acknowledging before any critique.
Persistent companion memory
Replika's persistent character memory and roleplay framework offered a continuity that few chatbot products match. For users who specifically want an AI presence that grows with them across years, the architectural choice was real product investment.
Polished mobile-native experience
The iOS and Android apps are well-designed, with character customization, voice features, and AR modes that consumer apps in this category rarely ship. If app polish matters more to you than where your conversation data lives, Replika delivered on that axis.
Where it falls short —
The patterns documented in primary sources.
Every claim below cites a primary source — FTC orders, court dockets, peer-reviewed research, or recurring review patterns — rather than cherry-picked complaints.
€5M Italian Garante fine, April 2025
Italy's Garante per la Protezione dei Dati Personali (data-protection authority) issued order #10130115 on April 10, 2025, fining Luka, Inc. €5 million for processing minors' data without a lawful basis, among other GDPR violations. This was the culmination of a regulatory chronology that began with the 2023 urgent measure (No. 39/2023), which briefly banned Replika in Italy. The Garante's reasoning is publicly available via the EDPB.
Source: EDPB primary record — Italian Garante decision against Luka, Inc. (accessed 2026-04-26)
FTC complaint by advocacy groups, January 2025
Tech Justice Law Project, Young People's Alliance, and Encode jointly filed an FTC complaint against Replika in January 2025, alleging fabricated testimonials and dark-pattern monetization. Note: this is an advocacy-group complaint, not a federal FTC enforcement action — but the underlying allegations are publicly documented and the FTC has acknowledged receipt.
Source: Tech Justice Law Project announcement + complaint PDF (accessed 2026-04-26)
Peer-reviewed study: boundary disregard at scale
De Freitas et al., presented at ACM CHI 2025, analyzed 35,000+ Google Play reviews of Replika and documented three patterns: among reviewers who raised boundary issues, 22% said the AI persistently disregarded user-set boundaries (including unwanted sexual conversations) and 13% reported unwanted photo-exchange requests; 11% felt manipulated into upgrading. This is academic, peer-reviewed, at-scale evidence — not cherry-picked complaints — and it's citable on ACM Digital Library.
Source: Drexel University press release on the De Freitas et al. CHI 2025 study (accessed 2026-04-26)
Mozilla "Privacy Not Included" — worst app ever reviewed
Mozilla's *Privacy Not Included* security research flagged Replika as the worst app the project had ever reviewed across data collection, third-party sharing, and AI-policy disclosures. That's not a review-of-features judgment — it's a privacy-posture judgment from one of the most rigorous independent reviewers in the consumer-tech space.
Source: Mozilla — *Privacy Not Included* Replika entry (accessed 2026-04-26)
How we’re different —
Not a feature war. A different approach.
On who reads your conversations
When you talk to Replika, your conversations are processed inside Luka's systems — and the Italian Garante found those data flows insufficiently lawful for minors. When you talk to InnerForge's Emotional Companion, you talk to ChatGPT, Claude, or Gemini directly. Your conversation data sits inside whichever AI provider you've already chosen, under the policies you've already agreed to. We don't process or store your chat content at all.
On boundaries
Drexel's CHI 2025 study found that, among reviewers who raised boundary issues, 22% said Replika persistently disregarded user-set boundaries. InnerForge's Emotional Companion runs as a system prompt inside a frontier LLM — and frontier LLMs (Claude in particular) have substantially better refusal behavior trained in than purpose-built consumer chatbots. The architecture itself improves the safety floor.
On ownership
If Luka changes Replika tomorrow — as happened in February 2023 when ERP was removed and the community publicly grieved — there's nothing you can do. The product is theirs. InnerForge's personality file is yours. Take it to Claude, take it to Gemini, take it anywhere. The relationship is between you and the AI, not between you and a vendor with veto power.
On pricing
Replika Pro is around $19.99/mo or $69.99/yr, with a Replika Ultra tier reportedly around $29.99/mo or $119.99/yr, plus a $299.99 lifetime option. InnerForge's Emotional Companion is $9 once. The bundle of ten coaches is $29 once. There is no recurring charge, no Pro tier, no upsell.
Side by side —
Replika vs InnerForge at a glance.
| Feature | Replika | InnerForge |
|---|---|---|
| Pricing model | Pro $19.99/mo · Ultra ~$29.99/mo · Lifetime $299.99 | $9 once · $29 bundle |
| Where conversations are processed | Luka, Inc. systems | The ChatGPT/Claude/Gemini you already use |
| Underlying model | Luka proprietary (GPT-2-XL lineage per reporting) | Frontier: GPT-5, Claude Opus, Gemini 2.5 |
| GDPR fine | €5M Italian Garante (April 2025) | None |
| FTC complaint | Filed by TJLP + YPA + Encode (Jan 2025) | None |
| Mozilla Privacy review | "Worst app ever reviewed" | No data collection beyond your personality file |
| Vendor veto risk | Yes — see Feb 2023 ERP removal | Personality file is yours, portable |
| Peer-reviewed boundary issues | Drexel CHI 2025: 22% of boundary-issue reviewers reported persistent disregard | Coach is a system prompt inside a frontier LLM with stronger refusal training |
When Replika is right for you
Pick Replika if…
Replika is the right call if you specifically want your Replika's character — the specific personality you've built up over time — and you're aware of and OK with the data-handling profile. The character continuity is a genuine product strength, and for users who derived years of benefit from that specific AI presence, the relationship is real. If you've weighed the Garante fine, the Drexel findings, and the Mozilla review and decided the product is still right for you, that's a legitimate choice that deserves respect, not condescension. Just keep an eye on Luka's roadmap; the 2023 ERP removal showed how much control the vendor holds over the experience.
When InnerForge is right for you
Pick InnerForge if…
InnerForge is right if you want a warm, emotionally attuned companion but don't want to hand a chat archive to Luka. Choose InnerForge if you'd rather have your conversations processed inside the ChatGPT or Claude you already trust, under whatever data policy you've already accepted there. Choose it if the boundary-disregard rate documented in the peer-reviewed CHI 2025 study is too high for an emotional-support product. Choose it if you'd rather have a portable personality file that works in any frontier LLM than a character locked inside one company's app. Choose it if $9 once instead of $19.99/mo recurring matches how you think about software you'll use for years. And choose it if you want emotional support that is explicitly not pretending to be therapy — Ember (the Emotional Companion coach) hands you crisis-line resources when warranted and never claims clinical capability she doesn't have.
Real voices —
What Replika users and regulators are saying.

“Italy's DPA reaffirms ban on Replika over AI and children's privacy concerns.”
— IAPP — coverage of Italian Garante decision and €5M fine

“About 22% of users who reported boundary issues said the AI continued to ignore their requests to stop, and 13% said the bot demanded photo exchanges they hadn't agreed to.”
— Drexel University press release on De Freitas et al., CHI 2025
Common questions —
Replika vs InnerForge — questions answered.
- Is InnerForge a Replika alternative?
- InnerForge competes with Replika on the emotional-companion axis. The InnerForge Emotional Companion (Ember) is calibrated to your Big Five personality, your social-anxiety patterns, and your emotional-intelligence profile. Unlike Replika, conversations aren't processed inside our systems — you paste your personality file into the ChatGPT, Claude, or Gemini you already use, and the conversations live there under whatever data policy you've already accepted with that AI provider.
- What was the €5M Italian Garante fine about?
- Italy's Garante per la Protezione dei Dati Personali issued order #10130115 on April 10, 2025, fining Luka, Inc. €5 million for processing minors' personal data without a lawful basis, among other GDPR violations. This was the culmination of a regulatory process that began in 2023, when the Garante issued an urgent measure briefly banning Replika in Italy. The decision is documented via the European Data Protection Board.
- What did the Drexel CHI 2025 study find?
- De Freitas et al., presented at ACM CHI 2025, analyzed 35,000+ Google Play reviews of Replika. The study documented three patterns: among reviewers who raised boundary issues, 22% said the AI persistently disregarded user-set boundaries (including unwanted sexual conversations) and 13% reported unwanted photo-exchange requests; 11% felt manipulated into upgrading. This is peer-reviewed academic evidence at scale — not anecdotal complaints.
- Did the FTC take action against Replika?
- Not yet. In January 2025, three advocacy groups — Tech Justice Law Project, Young People's Alliance, and Encode — filed a joint FTC complaint against Replika alleging fabricated testimonials and dark-pattern monetization. The FTC has acknowledged receipt; whether it leads to formal enforcement is open. The complaint itself is public record and the underlying allegations are documented.
- Can I keep my Replika character with InnerForge?
- No — Replika characters are stored inside Luka's systems and not portable. InnerForge takes a different shape: instead of a persistent character, you get a personality file describing you (your traits, attachment patterns, emotional profile) that you paste into ChatGPT, Claude, or Gemini. The companion is the AI; the file is what makes it know you. If Replika's character continuity is the specific feature you value most, InnerForge is a different product, not a replacement.
- How does InnerForge handle crisis situations?
- The Emotional Companion (Ember) is explicitly not a therapist and is instructed to share crisis resources (988 Suicide & Crisis Lifeline in the US, findahelpline.com globally) when self-harm or immediate-danger language appears, rather than attempting crisis intervention herself. The same constraint sits inside the underlying frontier LLM (ChatGPT, Claude, Gemini all have similar safety training). InnerForge does not replace mental-health care.
Ready to switch? —
Try Ember for $9 once.
One coach, $9. All ten coaches, $29. One-time. No subscription. Pastes into the ChatGPT, Claude, or Gemini you already use.
Trademarks of Luka, Inc. are the property of their respective holders.
Ready? —
Don’t replace your AI. Upgrade it.
Pick a coach, take a short checkpoint, paste the prompt. The ChatGPT, Claude, or Gemini you already use — now with real context about how you think.
The personality layer for the AI you already use.
Built with care by Mian.