The joy and horror of targeted Facebook ads
Dan Nosowitz was scrolling through Instagram when he saw it: an ad for a cooking device whose sole function was to heat up raclette cheese.
“I had to click through because I had no idea what it actually was,” he explains. “Finding out that an algorithm believed I would be interested in a discount ‘traditional Swiss-style electric cheese melter’ is sort of comfortingly bumbling. It’s like watching a Roomba bonk into a wall.”
Whether the humor of the ad comes from the gadget being so oddly specific, or from the fact that raclette is an incredibly high-maintenance cheese and therefore hardly a common grocery item for most people, is difficult to say. What we do know, however, is that the targeted ads these complicated sets of algorithms serve on social media are the most brutal, most incisive owns of our time.
In Nosowitz’s case, he figures he likely saw the raclette warmer because he’s a food writer who Amazon surely knows has previously browsed cooking tools on its site. That’s because Amazon, Facebook, Instagram, and the rest of the internet track your every keystroke and will then use your history to show you things they think will make them money. So it’s no wonder that it feels so deeply personal when we get targeted ads for, say, “dressy sweatpants,” colonoscopies, underwear whose selling point is that they are easy to take off, preparing for your own funeral, or, somehow the biggest attack of all: tickets to Jagged Little Pill: The Musical.
The simplest explanation for why targeted ads are so creepily intimate: your phone, your computer, and the internet in general contain a gargantuan amount of information about you. Google, for instance, knows essentially every website you have ever visited, and thanks to geolocation it can tell where you live, where you work, and where you’ve traveled and when. Credit card companies know what you buy, and the brands that sell those items can use that data to predict what you’ll buy in the future — Target, famously, can tell that you’re pregnant before even your family knows.
There are ways to prevent at least some of this, but the more the internet entrenches itself in our lives, the more difficult and time-consuming it is to opt out. The consequences are, of course, potentially democracy-shattering. For our purposes here, however, the thing in danger of being shattered is our self-esteem.
Seth Stephens-Davidowitz, who has written a book on how the internet uses your data, has himself experienced the strangeness of being targeted by a Facebook ad for hair loss cream despite never having posted anything about balding.
“It was a little like being in a Seinfeld episode,” he explains. “I had never worried about my hair and always thought hair products were a total waste of money. And now I had to wonder, ‘Am I crazy? Should I actually be taking a product for hair loss?’” (He, however, ended up deducing that it was probably because two-thirds of men start losing their hair by the time they’re 35, and that the ad simply targeted all men around that age.)
I just got a Facebook ad for hair loss product. Are they using my pictures to figure out I am balding? I am pretty sure there is no other way, using my internet behavior, for them to know that.
— Seth Stephens-Davidowitz (@SethS_D) March 29, 2018
Facebook, undoubtedly the platform with the worst and most prolific targeted ads, said in a blog post this April that while it allows companies to target their ads to users who fit a certain profile, it keeps those users’ actual identities hidden from the advertisers.
But companies are able to target specific people by other means, namely by sending Facebook a list of email addresses, which Facebook can then use to find the associated accounts. If you’ve ever bought anything from, say, Urban Outfitters, the brand could use either the email you used to make the purchase online or the one you gave at the checkout counter to target you specifically. And if you happened to browse Glossier.com while still logged into Facebook, you might return to the social media app to find ads for Boy Brow.
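That email matching doesn’t involve handing over raw addresses, at least not in the documented version of the process: Facebook’s Custom Audiences tool asks advertisers to normalize and SHA-256-hash each address before upload, then matches those digests against hashes of its own users’ emails. A minimal sketch of that flow in Python, with placeholder addresses:

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Lowercase and trim an address, then SHA-256 hash it,
    the normalization Custom Audiences uploads expect."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# A brand hashes its customer list before uploading it; Facebook
# hashes its own users' addresses the same way and matches accounts
# wherever the digests collide.
customer_emails = ["shopper@example.com", "  Another.Shopper@Example.com "]
print([normalize_and_hash(e) for e in customer_emails])
```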
Plus, the blog post doesn’t mention that marketers can take advantage of data about you that isn’t simply demographic — they could, in theory, reach users who seem to match a specific personality type or emotional state, thereby taking advantage of already vulnerable people. So ads for funeral preparations or musicals about mid-’90s female angst could be more than just coincidences, and instead referendums on your actual current mood.
The most horrific item I have ever seen in a targeted Facebook ad was a sweatshirt emblazoned with a bunch of Celtic knots that implied the superiority of having “Jennings blood.” Ignoring the possible white supremacist connotations, the ad was ironic mostly because you can buy the exact same sweatshirt with literally any vaguely Irish-sounding last name swapped in, plus about a zillion other versions, too. “God made the strongest and named them Rubin,” reads one. “Never underestimate the power of a person with name’s Brooke,” shouts another, despite the fact that this sentence does not make sense.
It’s obvious why this specific ad showed up on my feed: Facebook knows that my last name is Jennings, and marketers can easily target users with such information. What’s more complicated is how the hell all those last names ended up on a sweatshirt.
To be clear, they didn’t. The reason so many T-shirts and sweatshirts bear oddly specific phrases is that online clothing companies have tasked algorithms with the heavy lifting of actually filling in the specifics and photoshopping the results onto digital images of clothing. The sweatshirts themselves don’t physically exist until you hit “purchase.”
Michael Fowler had been in the T-shirt business for 20 years before writing a simple computer program that would change his life in 2011. It took a common phrase, such as “Kiss Me, I’m a [blank],” compiled hundreds of thousands of words from digital dictionaries, created a list of phrase variations using those words, and then generated images of T-shirts bearing each phrase. According to The Hustle, Fowler’s company went from just 1,000 T-shirts designed by actual humans to more than 22 million code-generated ones. Through targeted Facebook ads, he was eventually able to sell 800 shirts a day.
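Fowler hasn’t published his code, but the process he describes is simple to sketch: cross a handful of phrase templates with word lists pulled from dictionaries, and emit every combination. A minimal illustration in Python, with made-up templates and word lists standing in for his dictionaries:

```python
from itertools import product

# Hypothetical stand-ins for the phrase templates and dictionary
# lists a program like Fowler's would draw on.
TEMPLATES = [
    "Kiss Me, I'm a {}",                    # one slot: a noun
    "Never underestimate a {} born in {}",  # two slots: a noun, then a month
]
NOUNS = ["forklift operator", "pizza delivery driver", "air force veteran"]
MONTHS = ["January", "February", "March"]

def generate_slogans():
    """Fill every template slot with every combination of words:
    the brute-force expansion that turns a few short lists into
    millions of distinct shirt designs."""
    for template in TEMPLATES:
        slots = template.count("{}")
        # Treat the first slot as a noun and any later slots as months.
        pools = [NOUNS] + [MONTHS] * (slots - 1)
        for combo in product(*pools):
            yield template.format(*combo)

for slogan in generate_slogans():
    # In a real pipeline each slogan would be photoshopped onto a
    # shirt mockup; the shirt is only printed once someone buys it.
    print(slogan)
```

Every word added to a list multiplies the catalog, which is how 22 million designs can come out of lists no human ever reads end to end, and why nobody noticed when a toxic combination slipped through.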
Unfortunately, his success was not the reason Fowler would make international headlines. Two years later, his algorithm was responsible for shirts that read “Keep calm and rape a lot,” among other disturbing and misogynistic variations on the famous World War II slogan. Fowler said he had no knowledge of the items; in fact, they’d been available for more than a year before anyone noticed. But even though he quickly deleted the offending shirts, his company still ended up folding.
Robot-written word salad T-shirts, however, have managed to become one of the internet’s purest inside jokes. On the subreddit r/TargetedShirts, members share the most egregious versions they come across, be they weirdly antagonistic (“Walk away, this forklift operator has anger issues and a serious dislike of stupid people”), uncomfortably sexual (“I don’t need therapy, I just need to get f#ed in public by fourteen werewolves”), birthday month-related (“Never underestimate an old man who is also an air force veteran and was born in November”), or utterly nonsensical (“Good girls go to heaven, January girl go hunting with Dean”).
The sub even has its own parody versions, like “These titties are protected by a skinny white guy in his mid-thirties who wears DC shoes, yells at me in public and is addicted to percs who was born in February,” or “Only heros with an IQ of 121, work as a pizza delivery driver, have 3 spoons of sugar in their coffee and love reptiles & mice, were born in March by C-section 2 weeks before their due date.”
Its founder, David Moreno, launched the subreddit just ten months ago, but it already has more than 40,000 subscribers. He explained to Vox that the first time he saw a targeted ad, back in 2011 or 2012, “it did fuck with my brain for a while because it had my last name and month of birth and at the time I didn’t realize what was going on.”
These days, however, the practice makes sense to him. “Funnily enough, I work in marketing, so while it might seem like a desperate strategy, it is actually a very good way to target a very specific group of people without spending too much cash,” he said.
The best versions, of course, are the ones seen in the wild. The sub is regularly populated with surreptitious photographs of people wearing the offending shirts, accompanied by comments that lightly roast the wearer. They’re the best because they are the saddest — a catalog of folks who were not only owned by the algorithm but scammed by it.
That’s the other part of what it’s like to see a hyper-targeted ad for something incredibly on-brand: sometimes the algorithms read us more clearly than any actual human does. This is an inherently depressing thought, considering that reading us is supposed to be the job of the people we love and the society we live in. And the more entwined our phones and our data become with our lives, the more often that will be the case.
The prevailing cynical attitude towards targeted ads — tweets that say things like, “i just got an ad for preparing for your own funeral, what are you trying to say to me youtube” — can sort of be compared to the FBI agent meme of the past year and a half or so. The idea is that every internet user has their own personal agent monitoring their behavior through their devices, but instead of this being incredibly creepy, the joke is that the agent acts as a friend or frustrated mentor to the subject.
me: (sitting back down on my bed with a bowl of chips ready to binge a new series) hey so what does “fbi” stand for anyway
fbi agent inside my computer: uh Faraway.. Buddy.. Insideyourcomputer
me: cool. so what do u wanna watch next
fbi agent: i heard grace and frankie is fun
— jonny sun (@jonnysun) February 1, 2018
A Mashable article earlier this year explored the surprising poignance of the meme: “The agent wants the best for their subject,” writes its author, Chloe Bryan. “The narrator, conscious of how boring their life must be to observe, tries to entertain the FBI agent. They have pleasant conversations. They develop a forbidden friendship. They become quiet, lightly subversive allies.”
In both cases, we’re taking our deepest technological anxieties — that the internet stores and sells our data and that the government is spying on us — and turning them into lighthearted jokes. Which is fair! It’s a lot more fun to pretend Big Data is actually just there to dunk on our most embarrassing shopping habits instead of manipulating U.S. elections or contributing to the rising wealth of the world’s richest people.
Which means there will probably come a day when an Instagram ad for an enormous cheese-warming gadget, targeted at one specific person through a complex web of his own internet data, will no longer be funny. But we may as well laugh while it still is.