“Hey, is this you?”
Bunni gets these DMs regularly: random alerts from strangers flagging fake profiles impersonating her online. As an OnlyFans creator, she’s learned to live with the exhausting, infuriating cycle of impersonation that comes with the territory. Five years in, she knows the drill.
But this time felt different. The account in question hit too close. The photo? No doubt, it’s her shirt, her tattoos, her room. Everything checks out, but that isn’t her face.
A reverse deepfake
What’s happening to Bunni is one of the more unusual, and unsettling, evolutions of deepfake abuse. Deepfakes, typically AI-generated or AI-manipulated media, are most commonly associated with non-consensual porn involving celebrities, where a person’s face is convincingly grafted onto someone else’s body. This form of image-based sexual exploitation is designed to humiliate and exploit, and it spreads quickly across porn sites and social platforms. One of the most prominent hubs for this kind of content, Mr. Deepfake, recently shut down after a key service provider terminated support, cutting off access to its infrastructure.
The shutdown came a week after Congress passed the “Take It Down Act,” a bill requiring platforms to remove deepfake and revenge porn content within 48 hours of a takedown request. The legislation, expected to be signed into law by President Donald Trump, is part of a broader push to regulate AI-generated abuse.
But Bunni’s case complicates the conversation. This isn’t a matter of her face being pasted into explicit content; she’s already an OnlyFans creator. Instead, her images were digitally altered to erase her identity, repackaged under a different name, and used to build an entirely new persona.
Chasing an AI catfisher: Bunni’s situation
In February, Bunni posted a video to Instagram. It showed a surreal side-by-side: the real Bunni pointing at an image from a Reddit post that barely resembled her. The fake photo had been meticulously scrubbed of many of her defining features. The facial piercings were gone, her dark hair lightened, her expression softened. In their place was a face engineered for anonymity: big green eyes, smooth skin, and a sanitized alt-girl aesthetic.
The real Bunni points at a fake photograph.
Credit: Screenshot from Instagram user @bunnii_squared

Original photo of Bunni from Instagram
Credit: Screenshot from Instagram user @bunnii_squared
The Reddit profile, now deleted but partially resurrected via the Wayback Machine, presented “Sofía”: a self-proclaimed 19-year-old from Spain with an “alt style” and a love of rock music, who was “open to meeting new people.” Bunni is 25 and lives in the UK. She isn’t, and has never been, Sofía.

The fake Reddit profile for “Sofía,” a fabricated persona claiming to be a 19-year-old from Spain.
Credit: Screenshot from Wayback Machine
“I’m so used to my content being stolen,” Bunni told Mashable. “It kind of just happens. But this was like — such a completely different way of doing it that I’ve not had happen to me before. It was just, like, really weird.”
It gets weirder. The Sofía account, which first popped up in October 2023, started off innocently enough, posting to feel-good forums like r/Awww. But soon, it migrated to more niche, and more disconcerting, subreddits like r/teenagers, r/TeenagersButBetter, and r/teenagersbuthot. The latter two, offshoots of the main subreddit, exist in an irony-pilled gray zone with more than 200,000 combined members.

Screenshot showing the “Sofía 🖤🎀🌙” account posting in the subreddits r/teenagersbuthot and r/TeenagersButBetter, making casual and book-related posts to appear authentic.
Credit: Screenshot from Wayback Machine
Using edited selfies lifted from Bunni’s socials, the account posted under the guise of seeking fashion advice and approval, even sharing photos of her pets.
“Do my outfits look weird?” one caption asked under a photo of Bunni trying on jeans in a fitting room.
“I bought those jeans,” Bunni recalled. “What do you mean?”
But the game wasn’t just about playing dress-up. The Sofía persona also posted in r/FaceRatings and r/amiuglyBrutallyHonest, subreddits where users rate strangers’ attractiveness with brutal candor. The motive was more than likely building credibility and validation.

Credit: Screenshot from Wayback Machine
The final stage of the impersonation edged toward adult content. In the last archived snapshot of the account, “Sofía” had begun posting in subreddits like r/Selfie (a general selfie forum where NSFW images are prohibited, though links to OnlyFans accounts in user profiles are allowed) and r/PunkGirls, a far more explicit space featuring a mix of amateur and professional alt-porn. One Sofía post in r/PunkGirls read: “[F19] finally posting sexy pics in Reddit, should I keep posting?” Another followed with: “[F19] As you all wanted to see me posting more.”
![Screenshot of a Reddit post in r/PunkGirls from user “Sofía 🖤🎀🌙” with the title “[F19] finally posting sexy pics in Reddit, should I keep posting?”](https://helios-i.mashable.com/imagery/articles/07quief580gTDApFwgKzM7R/images-4.fill.size_2000x1109.v1747245191.png)
The account used altered photos of Bunni and posed as a 19-year-old seeking validation through sexually suggestive posts.
Credit: Screenshot from Wayback Machine
The last post from the account was in an r/AskReddit thread asking users to describe the weirdest sexual experience they’d ever had.

Another comment blurring the line between persona-building and sexual baiting, helping the impersonator appear more real while engaging in attention-farming behavior.
Credit: Screenshot from Wayback Machine
Bunni surmised that the endgame was likely a scam targeting men, tricking them into buying nudes potentially lifted from her own OnlyFans. The profile itself didn’t post links to external platforms like Snapchat or OnlyFans, but she suspects the real activity happened in private messages.
“What I imagine they’ve done is they’ll be posting in SFW subreddits, using SFW pictures, and then messaging people that interact with them and being like, ‘Oh, do you want to buy my content’ — but it’s my content with the face replaced,” she said.
Fortunately for Bunni, after she reached out to moderators on r/teenagers, the impersonator’s account was removed for violating Reddit’s terms of service. But the incident raises a larger, murkier question: How often do incidents like this, involving digitally altered identities designed to evade detection, actually happen?
Popular-but-not-famous creators are the perfect targets
In typical cases of stolen content, imposters might repost images under Bunni’s name or under a fake one, as catfishers often do. But this operation was more sophisticated. By altering her face (removing piercings, changing eye shape, subtly shifting features), the impersonator appeared to be taking steps to avoid recognition by followers, friends, and even reverse image searches. It wasn’t just identity theft. It was identity obfuscation.
Reddit’s Transparency Report from the second half of 2024 paints a partial picture. The platform removed 249,684 instances of non-consensual intimate media and just 87 cases flagged specifically as impersonation. But that data only reflects removals by Reddit’s central trust and safety team. It doesn’t include content removed by subreddit moderators, the unpaid volunteers who enforce their own community-specific rules. Mods from r/teenagers and r/amiugly, two of the subreddits where “Sofía” had been active, said they couldn’t recall a similar incident. Neither keeps formal records of takedowns or reasons for removal.
Reddit declined to comment when Mashable reached out regarding this story.
If Trump signs the “Take It Down Act” into law, platforms will soon be required to remove nonconsensual intimate imagery within 48 hours.
It’s not hard to see why creators like Bunni make ideal targets for impersonators like this. As an OnlyFans creator with a multi-year presence on platforms like Instagram, TikTok, and Reddit, Bunni has amassed a huge archive of publicly available images: a goldmine for anyone looking to curate a fake persona with minimal effort. And because she sits in the mid-tier of OnlyFans creators, popular but not internet-famous, the odds of a casual Reddit user recognizing her are low. For scammers, catfishers, and trolls, that sweet spot of visibility-without-virality makes her the perfect mark: familiar enough to seem real, obscure enough to stay undetected.
More troubling is the legal ambiguity surrounding this kind of impersonation. According to Julian Safarian, a California-based attorney who represents online content creators, likenesses are protected under U.S. copyright law, and potentially even more so under California’s evolving deepfake legislation.
“It gets complicated when a creator’s likeness is modified,” Safarian explained. “But if a reasonable person can still recognize the original individual, or if the underlying content is clearly identifiable as theirs, there may still be grounds for legal action.”
Because a Reddit user recognized the edited photos as Bunni’s, Safarian says she could potentially bring a case under California law, where Reddit is headquartered.
But Bunni says the cost of pursuing justice simply outweighs the benefits.
“I did get some comments like, ‘Oh, you should take legal action,’” she said. “But I don’t feel like it’s really worth it. The amount you pay for legal action is just ridiculous, and you probably wouldn’t really get anywhere anyway, to be honest.”
AI impersonation is not going away
While this may seem like an isolated incident (a lone troll with time, access to AI image tools, and bad intentions), the growing accessibility of AI-powered editing tools suggests otherwise. A quick search for “AI face swap” yields a long list of drag-and-drop platforms capable of convincingly altering faces in seconds, no advanced skills required.
“I can’t imagine I’m the first, and I’m definitely not the last, because this whole AI thing is kind of blowing out of proportion,” Bunni said. “So I can’t imagine it’s going to slow down.”
Ironically, the fallout didn’t hurt her financially. If anything, Bunni said, the video she posted exposing the impersonation actually boosted her visibility. But that visibility came with its own cost: waves of victim-blaming and condescending commentary.
“It’s shitty guys that are just on Instagram that are like, ‘You put this stuff out there, this is what you get, it’s all your fault,’” she said. “A lot of people don’t understand that you own the rights to your own face.”
Have a story to share about a scam or security breach that impacted you? Tell us about it. Email [email protected] with the subject line “Safety Net” or use this form. Someone from Mashable will get in touch.