How AI can be scammy


Remember Charlie and the Chocolate Factory? That cute children’s story where some lucky kids get to tour Willy Wonka’s secret factory. The story, written by Roald Dahl in 1964, has had three movie interpretations over the years, each very different from the others.

But one thing the movies do have in common is their colorful sets. In the original story, each room in the factory is described in rich, beautiful detail, and the movie versions did their best to bring those descriptions to life. No matter which movie is your favorite, you can’t fault the amazing sets, or the sheer magic of these imaginary worlds with their Oompa Loompas, chocolate rivers, and trees made of candy.

So when parents in Glasgow, Scotland saw an ad for a local Willy Wonka-inspired Chocolate Experience in February of 2024, many jumped at the chance to bring their kids. The pictures in the ad were chock-full of amazing landscapes, bright colors, and lollipops, a magical land with a level of detail you’d only expect to find in places like Disney World. Except it was right there in a warehouse in Glasgow, for tickets costing the equivalent of about $45 in US dollars. Way cheaper than Disney World, and right around the corner.

The Willy Wonka experience…or not

I can just imagine these parents picturing the look of delight on their kids’ faces when they got to experience this “world of pure imagination.” Oh, what a wonderful day it would be for the kiddos!

Parents were too excited to notice some strange red flags: the words in the ad were badly misspelled yet perfectly placed, in a way that signaled to those of us who watch the artificial intelligence space that this ad was almost certainly generated by AI.

AI-generated ad for Willy Wonka experience, with misspelled words and colorful imagery.

When the day came, there was a line around the block to get in. But when these families entered, they got… a bare warehouse with a few pictures hung on the wall, a small candy cane sculpture thing, a bouncy castle, and a few actors in costume who tried their best despite knowing the experience was incredibly lame.

It was a disaster. Angry parents demanded their money back, and a spokesperson for House of Illuminati, the company that promoted the event, apologized profusely.

It turns out that House of Illuminati had used AI-generated images in its promotion, not actual pictures of the venue. And the difference between the AI images and reality was huge. Like, not-even-on-the-same-planet huge.

On top of that, they gave the actors these AI-generated scripts that made no sense at all. For example, the script called for this grim reaper kind of character to jump out and scare the kids. I don’t remember that being in the original Willy Wonka story.

Parents sued for their money back, and the event was shut down.

The whole mess was a perfect example of how not to use AI.

And then the scammers arrived


While the Willy Wonka event was more like a mistake than an intentional scam, actual scammers have started to use AI to avoid detection. Here’s how they do it.

Before AI, if you got a text from an extremely attractive person wanting to be your boyfriend or girlfriend, you could do a reverse image search on any pictures they sent you and find the original. You would discover pretty quickly that it was a photo lifted from some other person’s Instagram, someone whose name didn’t match the texts you were getting. Or if it was a celebrity, you could find those original images in that person’s own feed and realize the scammer had simply copied them. That was your clue that it was a scam, that the person was trying to rope you into an online relationship so they could start asking for money.

But with AI, a scammer can create pictures that look a lot like the original but are different enough that they won’t come up in a reverse image search, or convincing enough that you might start to believe they’re sending you real pictures, live and in real time.

It can happen to anyone: the parents in Glasgow who thought they were getting a unique Willy Wonka experience; the unfortunate men and women who think they’re in an online relationship with an attractive stranger, or with Johnny Depp or Taylor Swift, who for some reason can’t get access to their millions and need you to send them money; any of the countless people who have fallen for scams because the image just looked so real.

AI can be fun, if you understand it

Anything good in life will eventually be exploited by scammers, and that includes AI. But you can protect yourself by knowing what AI does, and how you can spot scammers who are using AI images to try and get something from you.

You can have some fun with this. If you’re randomly sent a photo of an extremely attractive person, ask for a photo of them with a spoon sitting on their head, or of them holding up a sign that says “I love donuts” or some other phrase. Then rejoice silently while they either try to create the image with AI, or give up and stop bothering you.

I’m not one of those people who think AI will eventually revolt against us, or that my Alexa is suddenly going to grow a brain. I think the bigger threat is criminals using AI to manipulate people into thinking something is true when it’s not.

So go forth safely into the world, and use AI responsibly.
