A woman who booked a long-term Airbnb stay in Manhattan says she was shocked to be billed more than $7,000 for damage after the host allegedly used artificial intelligence to fake photos of destruction she insists never happened.
The guest, who is based in London, had reserved a one-bedroom apartment in Manhattan for two and a half months. But after staying for just seven weeks, she checked out early, citing safety concerns in the neighborhood.
That’s when the situation took a sharp turn.
The host, listed on the platform as a “superhost,” submitted a claim to Airbnb accusing the guest of causing over $16,000 worth of damage. He included photos to back up his claims, showing a cracked coffee table, stained mattress, and broken items ranging from a robot vacuum cleaner to a microwave and air conditioner.
Airbnb reviewed the photos and initially sided with the host. The company told the guest she would be required to pay more than $7,000.
But the guest said something didn’t add up.
She told The Guardian she examined the images closely and noticed something off — especially in the pictures of the wooden coffee table. According to her, the photos showed visual inconsistencies that suggested image manipulation.
Airbnb Issues Full Refund After Host Allegedly Used AI-Generated Photos To Blame Guest For $16K In Damages
pic.twitter.com/rZS2AcLaFI
— Raphousetv (RHTV) (@raphousetv2) August 7, 2025
She didn’t stop there.
She also told Airbnb she had an eyewitness who could testify under oath that the apartment was left in “clean, undamaged and good order.” She explained to the company that she had only two visitors the entire time she was there and believed the host had filed the claim in retaliation for her early departure.
Despite her efforts, the claim remained active — until a reporter began asking questions.
Just days after The Guardian contacted Airbnb about the situation, the company told the guest that her appeal had been accepted. She was given a $580 credit. After she made it clear she would not be using Airbnb again, the platform refunded her the full cost of the stay.
The guest expressed concern for future users, especially those without the time or resources to dispute a claim. She warned that AI-generated images can easily be passed off as evidence, especially when hosts know how to manipulate digital tools.
Her worry wasn’t just about her own experience — it was about what could happen to others who don’t push back.
In a statement released to FOX Business, Airbnb said the woman’s experience “fell below our usual high standards.” The company confirmed that the host had been warned and would be removed from the platform if a similar incident occurred again.
“We have been in touch with the guest to apologize and assure them that they will not be charged for the reported damage,” the company said. “We are reviewing the original handling of this case. We take damage claims seriously.”
Why isn’t @AirbnbHelp at least naming the “superhost” for doctoring images to claim guest damages? Its users need to know about this New York scammer before they book. https://t.co/oLDAD3BK9l
— Greg Piper (@gregpiper) August 7, 2025
Airbnb also said it has a specialist team to review evidence and reach fair decisions. Still, it’s unclear how often image manipulation is caught during the review process, or how many guests may have paid for damages they didn’t cause.
The case has now gone viral and sparked a debate about the growing role of AI in disputes that rely on visual proof.
If a photo is no longer proof — what is? And how many more images out there are being trusted without question?
The woman got her money back, but the story is raising bigger questions about digital trust, accountability, and how far someone might go to make a claim stick.