The Infuriating MAGA Influencer Scam That Duped Far Too Many

The woman never held a rifle. She never wore scrubs. She never set foot in the United States.

Emily Hart — gun-toting, God-fearing, flag-waving — was a character conjured by algorithms and operated out of a bedroom in India by a 22-year-old male medical student with a plane ticket in mind and tuition bills on his desk.

Hart’s creator, who spoke to Wired magazine, is an aspiring orthopedic surgeon. He did not stumble into the scheme; he engineered it.

He deployed Google’s Gemini AI to mine demographic data, searching for a gap in the market — a segment of social media users who were financially comfortable, fiercely loyal to their in-group, and hungry for content that reflected their values back at them.

The data pointed him toward older conservative men in the United States. He built Emily Hart for them.

Marketed as a registered nurse and a committed American patriot, Hart’s feed functioned as a relentless ideological signal — a daily volley of content calibrated to resonate with a specific worldview. 

Photographs depicted her discharging a rifle. Others showed the AI-generated woman in bikinis against wintry backdrops. Her captions left no room for ambiguity.

One post showed Hart at a shooting range. It carried the caption: “If you want a reason to unfollow: Christ is king, abortion is murder, and all illegals must be deported.”

The creator described the rhythm of the operation to Wired in plain terms. “Every day I’d write something pro-Christian, pro-Second Amendment, pro-life, anti-abortion, anti-woke, and anti-immigration,” he said.

Ten thousand followers arrived in the first month alone. The account grew. So did the money.

“I was spending maybe 30 to 50 minutes of my day, and I was making good money for a medical student.” — Hart’s creator, speaking to Wired magazine.

The financial architecture behind Hart extended well past Instagram engagement. 

Her creator sold MAGA-branded merchandise through the account and established a parallel presence on Fanvue — a subscription platform that, unlike its competitors, explicitly allows AI-generated material — where paying users accessed explicit content featuring the fictional character. 

The creator told Wired the enterprise generated thousands of dollars per month.

“In India, even in professional jobs, you can’t make this amount of money,” he said. “I haven’t seen any easier way to make money online.” The end goal, he acknowledged, was to use the profits to relocate to America — the very country whose voters he was targeting.

Instagram removed Hart’s account in February, classifying the activity as fraudulent. The takedown marked the end of the operation, though not the end of the pattern.

A figure called Jessica Foster ran a parallel operation at a far greater scale. 

To her one million Instagram followers, Foster was the consummate conservative soldier — a striking blonde photographed beside President Donald Trump on an airport tarmac in high heels, snapping selfies in front of fighter jets, and completing military assignments in Greenland alongside fellow servicewomen. 

Her account launched in December with a two-word biography: “America First.” The comment sections beneath her posts filled with men requesting introductions.

Foster was not a soldier. She was not a woman. She was not a person. Like Hart, she was constructed entirely through artificial intelligence, and her account has since been removed.

The same blueprint has appeared in international influence operations. Hundreds of deepfake videos have circulated online depicting glamorous Middle Eastern women dressed in military uniforms, carrying pro-Iran messaging. 

The clips present them as Iranian female soldiers and fighter pilots — a scenario with a built-in impossibility: Iranian law bars women from serving in those roles, so no real Iranian servicewomen could occupy the positions depicted.

Separate accounts have posted images of a woman posing with Elon Musk inside SpaceX facilities. Many of these profiles have since vanished from major platforms. 

The money, in each case, had already been collected before the accounts disappeared.
